Dec 06 06:46:23 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Dec 06 06:46:23 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 06 06:46:23 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 06 06:46:23 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 06 06:46:23 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 06 06:46:23 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 06 06:46:23 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 06 06:46:23 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Dec 06 06:46:23 localhost kernel: signal: max sigframe size: 1776
Dec 06 06:46:23 localhost kernel: BIOS-provided physical RAM map:
Dec 06 06:46:23 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 06 06:46:23 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 06 06:46:23 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 06 06:46:23 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 06 06:46:23 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 06 06:46:23 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 06 06:46:23 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 06 06:46:23 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Dec 06 06:46:23 localhost kernel: NX (Execute Disable) protection: active
Dec 06 06:46:23 localhost kernel: SMBIOS 2.8 present.
Dec 06 06:46:23 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 06 06:46:23 localhost kernel: Hypervisor detected: KVM
Dec 06 06:46:23 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 06 06:46:23 localhost kernel: kvm-clock: using sched offset of 1817873558 cycles
Dec 06 06:46:23 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 06 06:46:23 localhost kernel: tsc: Detected 2799.998 MHz processor
Dec 06 06:46:23 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 06 06:46:23 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 06 06:46:23 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Dec 06 06:46:23 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 06 06:46:23 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 06 06:46:23 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 06 06:46:23 localhost kernel: Using GB pages for direct mapping
Dec 06 06:46:23 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Dec 06 06:46:23 localhost kernel: ACPI: Early table checksum verification disabled
Dec 06 06:46:23 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 06 06:46:23 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 06:46:23 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 06:46:23 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 06:46:23 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 06 06:46:23 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 06:46:23 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 06:46:23 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 06 06:46:23 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 06 06:46:23 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 06 06:46:23 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 06 06:46:23 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 06 06:46:23 localhost kernel: No NUMA configuration found
Dec 06 06:46:23 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Dec 06 06:46:23 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Dec 06 06:46:23 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Dec 06 06:46:23 localhost kernel: Zone ranges:
Dec 06 06:46:23 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 06 06:46:23 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 06 06:46:23 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Dec 06 06:46:23 localhost kernel:   Device   empty
Dec 06 06:46:23 localhost kernel: Movable zone start for each node
Dec 06 06:46:23 localhost kernel: Early memory node ranges
Dec 06 06:46:23 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 06 06:46:23 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 06 06:46:23 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Dec 06 06:46:23 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Dec 06 06:46:23 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 06 06:46:23 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 06 06:46:23 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 06 06:46:23 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 06 06:46:23 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 06 06:46:23 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 06 06:46:23 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 06 06:46:23 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 06 06:46:23 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 06 06:46:23 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 06 06:46:23 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 06 06:46:23 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 06 06:46:23 localhost kernel: TSC deadline timer available
Dec 06 06:46:23 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Dec 06 06:46:23 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 06 06:46:23 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 06 06:46:23 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 06 06:46:23 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 06 06:46:23 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 06 06:46:23 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 06 06:46:23 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 06 06:46:23 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 06 06:46:23 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 06 06:46:23 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 06 06:46:23 localhost kernel: Booting paravirtualized kernel on KVM
Dec 06 06:46:23 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 06 06:46:23 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 06 06:46:23 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Dec 06 06:46:23 localhost kernel: pcpu-alloc: s188416 r8192 d28672 u262144 alloc=1*2097152
Dec 06 06:46:23 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Dec 06 06:46:23 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 06 06:46:23 localhost kernel: Fallback order for Node 0: 0 
Dec 06 06:46:23 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 4128475
Dec 06 06:46:23 localhost kernel: Policy zone: Normal
Dec 06 06:46:23 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 06 06:46:23 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Dec 06 06:46:23 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Dec 06 06:46:23 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 06 06:46:23 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 06 06:46:23 localhost kernel: software IO TLB: area num 8.
Dec 06 06:46:23 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Dec 06 06:46:23 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Dec 06 06:46:23 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 06 06:46:23 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Dec 06 06:46:23 localhost kernel: ftrace: allocated 176 pages with 3 groups
Dec 06 06:46:23 localhost kernel: Dynamic Preempt: voluntary
Dec 06 06:46:23 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 06 06:46:23 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 06 06:46:23 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Dec 06 06:46:23 localhost kernel:         Rude variant of Tasks RCU enabled.
Dec 06 06:46:23 localhost kernel:         Tracing variant of Tasks RCU enabled.
Dec 06 06:46:23 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 06 06:46:23 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 06 06:46:23 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 06 06:46:23 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 06 06:46:23 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 06 06:46:23 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Dec 06 06:46:23 localhost kernel: Console: colour VGA+ 80x25
Dec 06 06:46:23 localhost kernel: printk: console [tty0] enabled
Dec 06 06:46:23 localhost kernel: printk: console [ttyS0] enabled
Dec 06 06:46:23 localhost kernel: ACPI: Core revision 20211217
Dec 06 06:46:23 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 06 06:46:23 localhost kernel: x2apic enabled
Dec 06 06:46:23 localhost kernel: Switched APIC routing to physical x2apic.
Dec 06 06:46:23 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 06 06:46:23 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Dec 06 06:46:23 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 06 06:46:23 localhost kernel: LSM: Security Framework initializing
Dec 06 06:46:23 localhost kernel: Yama: becoming mindful.
Dec 06 06:46:23 localhost kernel: SELinux:  Initializing.
Dec 06 06:46:23 localhost kernel: LSM support for eBPF active
Dec 06 06:46:23 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 06 06:46:23 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 06 06:46:23 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 06 06:46:23 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 06 06:46:23 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 06 06:46:23 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 06 06:46:23 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 06 06:46:23 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Dec 06 06:46:23 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Dec 06 06:46:23 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 06 06:46:23 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 06 06:46:23 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 06 06:46:23 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 06 06:46:23 localhost kernel: Freeing SMP alternatives memory: 36K
Dec 06 06:46:23 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 06 06:46:23 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Dec 06 06:46:23 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 06 06:46:23 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 06 06:46:23 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 06 06:46:23 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 06 06:46:23 localhost kernel: ... version:                0
Dec 06 06:46:23 localhost kernel: ... bit width:              48
Dec 06 06:46:23 localhost kernel: ... generic registers:      6
Dec 06 06:46:23 localhost kernel: ... value mask:             0000ffffffffffff
Dec 06 06:46:23 localhost kernel: ... max period:             00007fffffffffff
Dec 06 06:46:23 localhost kernel: ... fixed-purpose events:   0
Dec 06 06:46:23 localhost kernel: ... event mask:             000000000000003f
Dec 06 06:46:23 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 06 06:46:23 localhost kernel: rcu:         Max phase no-delay instances is 400.
Dec 06 06:46:23 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 06 06:46:23 localhost kernel: x86: Booting SMP configuration:
Dec 06 06:46:23 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 06 06:46:23 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 06 06:46:23 localhost kernel: smpboot: Max logical packages: 8
Dec 06 06:46:23 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Dec 06 06:46:23 localhost kernel: node 0 deferred pages initialised in 23ms
Dec 06 06:46:23 localhost kernel: devtmpfs: initialized
Dec 06 06:46:23 localhost kernel: x86/mm: Memory block size: 128MB
Dec 06 06:46:23 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 06 06:46:23 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Dec 06 06:46:23 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 06 06:46:23 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 06 06:46:23 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Dec 06 06:46:23 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 06 06:46:23 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 06 06:46:23 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 06 06:46:23 localhost kernel: audit: type=2000 audit(1765003581.413:1): state=initialized audit_enabled=0 res=1
Dec 06 06:46:23 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 06 06:46:23 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 06 06:46:23 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 06 06:46:23 localhost kernel: cpuidle: using governor menu
Dec 06 06:46:23 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Dec 06 06:46:23 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 06 06:46:23 localhost kernel: PCI: Using configuration type 1 for base access
Dec 06 06:46:23 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 06 06:46:23 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 06 06:46:23 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Dec 06 06:46:23 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Dec 06 06:46:23 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Dec 06 06:46:23 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 06 06:46:23 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 06 06:46:23 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 06 06:46:23 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 06 06:46:23 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 06 06:46:23 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Dec 06 06:46:23 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Dec 06 06:46:23 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Dec 06 06:46:23 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 06 06:46:23 localhost kernel: ACPI: Interpreter enabled
Dec 06 06:46:23 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 06 06:46:23 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 06 06:46:23 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 06 06:46:23 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 06 06:46:23 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 06 06:46:23 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 06 06:46:23 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [3] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [4] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [5] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [6] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [7] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [8] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [9] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [10] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [11] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [12] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [13] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [14] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [15] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [16] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [17] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [18] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [19] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [20] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [21] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [22] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [23] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [24] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [25] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [26] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [27] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [28] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [29] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [30] registered
Dec 06 06:46:23 localhost kernel: acpiphp: Slot [31] registered
Dec 06 06:46:23 localhost kernel: PCI host bridge to bus 0000:00
Dec 06 06:46:23 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 06 06:46:23 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 06 06:46:23 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 06 06:46:23 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 06 06:46:23 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Dec 06 06:46:23 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 06 06:46:23 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Dec 06 06:46:23 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Dec 06 06:46:23 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Dec 06 06:46:23 localhost kernel: pci 0000:00:01.1: reg 0x20: [io  0xc140-0xc14f]
Dec 06 06:46:23 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io  0x01f0-0x01f7]
Dec 06 06:46:23 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io  0x03f6]
Dec 06 06:46:23 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io  0x0170-0x0177]
Dec 06 06:46:23 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io  0x0376]
Dec 06 06:46:23 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Dec 06 06:46:23 localhost kernel: pci 0000:00:01.2: reg 0x20: [io  0xc100-0xc11f]
Dec 06 06:46:23 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Dec 06 06:46:23 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 06 06:46:23 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 06 06:46:23 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Dec 06 06:46:23 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Dec 06 06:46:23 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 06 06:46:23 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Dec 06 06:46:23 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Dec 06 06:46:23 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 06 06:46:23 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Dec 06 06:46:23 localhost kernel: pci 0000:00:03.0: reg 0x10: [io  0xc080-0xc0bf]
Dec 06 06:46:23 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Dec 06 06:46:23 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 06 06:46:23 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Dec 06 06:46:23 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Dec 06 06:46:23 localhost kernel: pci 0000:00:04.0: reg 0x10: [io  0xc000-0xc07f]
Dec 06 06:46:23 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Dec 06 06:46:23 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 06 06:46:23 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Dec 06 06:46:23 localhost kernel: pci 0000:00:05.0: reg 0x10: [io  0xc0c0-0xc0ff]
Dec 06 06:46:23 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 06 06:46:23 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Dec 06 06:46:23 localhost kernel: pci 0000:00:06.0: reg 0x10: [io  0xc120-0xc13f]
Dec 06 06:46:23 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 06 06:46:23 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 06 06:46:23 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 06 06:46:23 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 06 06:46:23 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 06 06:46:23 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 06 06:46:23 localhost kernel: iommu: Default domain type: Translated 
Dec 06 06:46:23 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode 
Dec 06 06:46:23 localhost kernel: SCSI subsystem initialized
Dec 06 06:46:23 localhost kernel: ACPI: bus type USB registered
Dec 06 06:46:23 localhost kernel: usbcore: registered new interface driver usbfs
Dec 06 06:46:23 localhost kernel: usbcore: registered new interface driver hub
Dec 06 06:46:23 localhost kernel: usbcore: registered new device driver usb
Dec 06 06:46:23 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 06 06:46:23 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 06 06:46:23 localhost kernel: PTP clock support registered
Dec 06 06:46:23 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 06 06:46:23 localhost kernel: NetLabel: Initializing
Dec 06 06:46:23 localhost kernel: NetLabel:  domain hash size = 128
Dec 06 06:46:23 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 06 06:46:23 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Dec 06 06:46:23 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 06 06:46:23 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 06 06:46:23 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 06 06:46:23 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Dec 06 06:46:23 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 06 06:46:23 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 06 06:46:23 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 06 06:46:23 localhost kernel: vgaarb: loaded
Dec 06 06:46:23 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 06 06:46:23 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 06 06:46:23 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 06 06:46:23 localhost kernel: pnp: PnP ACPI init
Dec 06 06:46:23 localhost kernel: pnp 00:03: [dma 2]
Dec 06 06:46:23 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 06 06:46:23 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 06 06:46:23 localhost kernel: NET: Registered PF_INET protocol family
Dec 06 06:46:23 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 06 06:46:23 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Dec 06 06:46:23 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 06 06:46:23 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 06 06:46:23 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 06 06:46:23 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Dec 06 06:46:23 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Dec 06 06:46:23 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 06 06:46:23 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 06 06:46:23 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 06 06:46:23 localhost kernel: NET: Registered PF_XDP protocol family
Dec 06 06:46:23 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 06 06:46:23 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 06 06:46:23 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 06 06:46:23 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 06 06:46:23 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Dec 06 06:46:23 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 06 06:46:23 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 06 06:46:23 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 06 06:46:23 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 26855 usecs
Dec 06 06:46:23 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 06 06:46:23 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 06 06:46:23 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 06 06:46:23 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 06 06:46:23 localhost kernel: ACPI: bus type thunderbolt registered
Dec 06 06:46:23 localhost kernel: Initialise system trusted keyrings
Dec 06 06:46:23 localhost kernel: Key type blacklist registered
Dec 06 06:46:23 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Dec 06 06:46:23 localhost kernel: zbud: loaded
Dec 06 06:46:23 localhost kernel: integrity: Platform Keyring initialized
Dec 06 06:46:23 localhost kernel: NET: Registered PF_ALG protocol family
Dec 06 06:46:23 localhost kernel: xor: automatically using best checksumming function   avx       
Dec 06 06:46:23 localhost kernel: Key type asymmetric registered
Dec 06 06:46:23 localhost kernel: Asymmetric key parser 'x509' registered
Dec 06 06:46:23 localhost kernel: Running certificate verification selftests
Dec 06 06:46:23 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 06 06:46:23 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 06 06:46:23 localhost kernel: io scheduler mq-deadline registered
Dec 06 06:46:23 localhost kernel: io scheduler kyber registered
Dec 06 06:46:23 localhost kernel: io scheduler bfq registered
Dec 06 06:46:23 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 06 06:46:23 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 06 06:46:23 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 06 06:46:23 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 06 06:46:23 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 06 06:46:23 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 06 06:46:23 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 06 06:46:23 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 06 06:46:23 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 06 06:46:23 localhost kernel: Non-volatile memory driver v1.3
Dec 06 06:46:23 localhost kernel: rdac: device handler registered
Dec 06 06:46:23 localhost kernel: hp_sw: device handler registered
Dec 06 06:46:23 localhost kernel: emc: device handler registered
Dec 06 06:46:23 localhost kernel: alua: device handler registered
Dec 06 06:46:23 localhost kernel: libphy: Fixed MDIO Bus: probed
Dec 06 06:46:23 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Dec 06 06:46:23 localhost kernel: ehci-pci: EHCI PCI platform driver
Dec 06 06:46:23 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Dec 06 06:46:23 localhost kernel: ohci-pci: OHCI PCI platform driver
Dec 06 06:46:23 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Dec 06 06:46:23 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 06 06:46:23 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 06 06:46:23 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 06 06:46:23 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 06 06:46:23 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 06 06:46:23 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 06 06:46:23 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 06 06:46:23 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Dec 06 06:46:23 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 06 06:46:23 localhost kernel: hub 1-0:1.0: USB hub found
Dec 06 06:46:23 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 06 06:46:23 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 06 06:46:23 localhost kernel: usbserial: USB Serial support registered for generic
Dec 06 06:46:23 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 06 06:46:23 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 06 06:46:23 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 06 06:46:23 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 06 06:46:23 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 06 06:46:23 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 06 06:46:23 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 06 06:46:23 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-06T06:46:22 UTC (1765003582)
Dec 06 06:46:23 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 06 06:46:23 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 06 06:46:23 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 06 06:46:23 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 06 06:46:23 localhost kernel: usbcore: registered new interface driver usbhid
Dec 06 06:46:23 localhost kernel: usbhid: USB HID core driver
Dec 06 06:46:23 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 06 06:46:23 localhost kernel: Initializing XFRM netlink socket
Dec 06 06:46:23 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 06 06:46:23 localhost kernel: Segment Routing with IPv6
Dec 06 06:46:23 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 06 06:46:23 localhost kernel: mpls_gso: MPLS GSO support
Dec 06 06:46:23 localhost kernel: IPI shorthand broadcast: enabled
Dec 06 06:46:23 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 06 06:46:23 localhost kernel: AES CTR mode by8 optimization enabled
Dec 06 06:46:23 localhost kernel: sched_clock: Marking stable (824422579, 175960483)->(1128676633, -128293571)
Dec 06 06:46:23 localhost kernel: registered taskstats version 1
Dec 06 06:46:23 localhost kernel: Loading compiled-in X.509 certificates
Dec 06 06:46:23 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Dec 06 06:46:23 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 06 06:46:23 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 06 06:46:23 localhost kernel: zswap: loaded using pool lzo/zbud
Dec 06 06:46:23 localhost kernel: page_owner is disabled
Dec 06 06:46:23 localhost kernel: Key type big_key registered
Dec 06 06:46:23 localhost kernel: Freeing initrd memory: 74232K
Dec 06 06:46:23 localhost kernel: Key type encrypted registered
Dec 06 06:46:23 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 06 06:46:23 localhost kernel: Loading compiled-in module X.509 certificates
Dec 06 06:46:23 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Dec 06 06:46:23 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 06 06:46:23 localhost kernel: ima: No architecture policies found
Dec 06 06:46:23 localhost kernel: evm: Initialising EVM extended attributes:
Dec 06 06:46:23 localhost kernel: evm: security.selinux
Dec 06 06:46:23 localhost kernel: evm: security.SMACK64 (disabled)
Dec 06 06:46:23 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 06 06:46:23 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 06 06:46:23 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 06 06:46:23 localhost kernel: evm: security.apparmor (disabled)
Dec 06 06:46:23 localhost kernel: evm: security.ima
Dec 06 06:46:23 localhost kernel: evm: security.capability
Dec 06 06:46:23 localhost kernel: evm: HMAC attrs: 0x1
Dec 06 06:46:23 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 06 06:46:23 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 06 06:46:23 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 06 06:46:23 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 06 06:46:23 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 06 06:46:23 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 06 06:46:23 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 06 06:46:23 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 06 06:46:23 localhost kernel: Freeing unused decrypted memory: 2036K
Dec 06 06:46:23 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Dec 06 06:46:23 localhost kernel: Write protecting the kernel read-only data: 26624k
Dec 06 06:46:23 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Dec 06 06:46:23 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Dec 06 06:46:23 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 06 06:46:23 localhost kernel: Run /init as init process
Dec 06 06:46:23 localhost kernel:   with arguments:
Dec 06 06:46:23 localhost kernel:     /init
Dec 06 06:46:23 localhost kernel:   with environment:
Dec 06 06:46:23 localhost kernel:     HOME=/
Dec 06 06:46:23 localhost kernel:     TERM=linux
Dec 06 06:46:23 localhost kernel:     BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64
Dec 06 06:46:23 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 06 06:46:23 localhost systemd[1]: Detected virtualization kvm.
Dec 06 06:46:23 localhost systemd[1]: Detected architecture x86-64.
Dec 06 06:46:23 localhost systemd[1]: Running in initrd.
Dec 06 06:46:23 localhost systemd[1]: No hostname configured, using default hostname.
Dec 06 06:46:23 localhost systemd[1]: Hostname set to <localhost>.
Dec 06 06:46:23 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 06 06:46:23 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 06 06:46:23 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 06 06:46:23 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 06 06:46:23 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 06 06:46:23 localhost systemd[1]: Reached target Local File Systems.
Dec 06 06:46:23 localhost systemd[1]: Reached target Path Units.
Dec 06 06:46:23 localhost systemd[1]: Reached target Slice Units.
Dec 06 06:46:23 localhost systemd[1]: Reached target Swaps.
Dec 06 06:46:23 localhost systemd[1]: Reached target Timer Units.
Dec 06 06:46:23 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 06 06:46:23 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 06 06:46:23 localhost systemd[1]: Listening on Journal Socket.
Dec 06 06:46:23 localhost systemd[1]: Listening on udev Control Socket.
Dec 06 06:46:23 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 06 06:46:23 localhost systemd[1]: Reached target Socket Units.
Dec 06 06:46:23 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 06 06:46:23 localhost systemd[1]: Starting Journal Service...
Dec 06 06:46:23 localhost systemd[1]: Starting Load Kernel Modules...
Dec 06 06:46:23 localhost systemd[1]: Starting Create System Users...
Dec 06 06:46:23 localhost systemd[1]: Starting Setup Virtual Console...
Dec 06 06:46:23 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 06 06:46:23 localhost systemd-journald[284]: Journal started
Dec 06 06:46:23 localhost systemd-journald[284]: Runtime Journal (/run/log/journal/0b20d7bd13414912afa7eec4e2b0c648) is 8.0M, max 314.7M, 306.7M free.
Dec 06 06:46:23 localhost systemd-modules-load[285]: Module 'msr' is built in
Dec 06 06:46:23 localhost systemd[1]: Started Journal Service.
Dec 06 06:46:23 localhost systemd[1]: Finished Load Kernel Modules.
Dec 06 06:46:23 localhost systemd[1]: Finished Setup Virtual Console.
Dec 06 06:46:23 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 06 06:46:23 localhost systemd[1]: Starting dracut cmdline hook...
Dec 06 06:46:23 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 06 06:46:23 localhost systemd-sysusers[286]: Creating group 'sgx' with GID 997.
Dec 06 06:46:23 localhost systemd-sysusers[286]: Creating group 'users' with GID 100.
Dec 06 06:46:23 localhost systemd-sysusers[286]: Creating group 'dbus' with GID 81.
Dec 06 06:46:23 localhost systemd-sysusers[286]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 06 06:46:23 localhost systemd[1]: Finished Create System Users.
Dec 06 06:46:23 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 06 06:46:23 localhost dracut-cmdline[289]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Dec 06 06:46:23 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 06 06:46:23 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 06 06:46:23 localhost dracut-cmdline[289]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 06 06:46:23 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 06 06:46:23 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 06 06:46:23 localhost systemd[1]: Finished dracut cmdline hook.
Dec 06 06:46:23 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 06 06:46:23 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 06 06:46:23 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 06 06:46:23 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Dec 06 06:46:23 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 06 06:46:23 localhost kernel: RPC: Registered udp transport module.
Dec 06 06:46:23 localhost kernel: RPC: Registered tcp transport module.
Dec 06 06:46:23 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 06 06:46:23 localhost rpc.statd[407]: Version 2.5.4 starting
Dec 06 06:46:23 localhost rpc.statd[407]: Initializing NSM state
Dec 06 06:46:23 localhost rpc.idmapd[412]: Setting log level to 0
Dec 06 06:46:23 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 06 06:46:23 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 06 06:46:23 localhost systemd-udevd[425]: Using default interface naming scheme 'rhel-9.0'.
Dec 06 06:46:23 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 06 06:46:23 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 06 06:46:23 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 06 06:46:23 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 06 06:46:23 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 06 06:46:23 localhost systemd[1]: Reached target System Initialization.
Dec 06 06:46:23 localhost systemd[1]: Reached target Basic System.
Dec 06 06:46:23 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 06 06:46:23 localhost systemd[1]: Reached target Network.
Dec 06 06:46:23 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 06 06:46:23 localhost systemd[1]: Starting dracut initqueue hook...
Dec 06 06:46:23 localhost kernel: libata version 3.00 loaded.
Dec 06 06:46:23 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Dec 06 06:46:23 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Dec 06 06:46:23 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Dec 06 06:46:23 localhost kernel: scsi host0: ata_piix
Dec 06 06:46:23 localhost kernel: GPT:20971519 != 838860799
Dec 06 06:46:23 localhost kernel: scsi host1: ata_piix
Dec 06 06:46:23 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Dec 06 06:46:23 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Dec 06 06:46:23 localhost kernel: GPT:20971519 != 838860799
Dec 06 06:46:23 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Dec 06 06:46:24 localhost systemd-udevd[457]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 06:46:24 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Dec 06 06:46:24 localhost kernel:  vda: vda1 vda2 vda3 vda4
Dec 06 06:46:24 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Dec 06 06:46:24 localhost systemd[1]: Reached target Initrd Root Device.
Dec 06 06:46:24 localhost kernel: ata1: found unknown device (class 0)
Dec 06 06:46:24 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 06 06:46:24 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 06 06:46:24 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 06 06:46:24 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 06 06:46:24 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 06 06:46:24 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 06 06:46:24 localhost systemd[1]: Finished dracut initqueue hook.
Dec 06 06:46:24 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 06 06:46:24 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 06 06:46:24 localhost systemd[1]: Reached target Remote File Systems.
Dec 06 06:46:24 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 06 06:46:24 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 06 06:46:24 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Dec 06 06:46:24 localhost systemd-fsck[513]: /usr/sbin/fsck.xfs: XFS file system.
Dec 06 06:46:24 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Dec 06 06:46:24 localhost systemd[1]: Mounting /sysroot...
Dec 06 06:46:24 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 06 06:46:24 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Dec 06 06:46:24 localhost kernel: XFS (vda4): Ending clean mount
Dec 06 06:46:24 localhost systemd[1]: Mounted /sysroot.
Dec 06 06:46:24 localhost systemd[1]: Reached target Initrd Root File System.
Dec 06 06:46:24 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 06 06:46:24 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 06 06:46:24 localhost systemd[1]: Reached target Initrd File Systems.
Dec 06 06:46:24 localhost systemd[1]: Reached target Initrd Default Target.
Dec 06 06:46:24 localhost systemd[1]: Starting dracut mount hook...
Dec 06 06:46:24 localhost systemd[1]: Finished dracut mount hook.
Dec 06 06:46:24 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 06 06:46:24 localhost rpc.idmapd[412]: exiting on signal 15
Dec 06 06:46:24 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 06 06:46:24 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 06 06:46:24 localhost systemd[1]: Stopped target Network.
Dec 06 06:46:24 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 06 06:46:24 localhost systemd[1]: Stopped target Timer Units.
Dec 06 06:46:24 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 06 06:46:24 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 06 06:46:24 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 06 06:46:24 localhost systemd[1]: Stopped target Basic System.
Dec 06 06:46:24 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 06 06:46:24 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 06 06:46:24 localhost systemd[1]: Stopped target Path Units.
Dec 06 06:46:24 localhost systemd[1]: Stopped target Remote File Systems.
Dec 06 06:46:24 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 06 06:46:24 localhost systemd[1]: Stopped target Slice Units.
Dec 06 06:46:24 localhost systemd[1]: Stopped target Socket Units.
Dec 06 06:46:24 localhost systemd[1]: Stopped target System Initialization.
Dec 06 06:46:24 localhost systemd[1]: Stopped target Local File Systems.
Dec 06 06:46:24 localhost systemd[1]: Stopped target Swaps.
Dec 06 06:46:24 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Stopped dracut mount hook.
Dec 06 06:46:24 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 06 06:46:24 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 06 06:46:24 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 06 06:46:24 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 06 06:46:24 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 06 06:46:24 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Stopped Load Kernel Modules.
Dec 06 06:46:24 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 06 06:46:24 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 06 06:46:24 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 06 06:46:24 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 06 06:46:24 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 06 06:46:24 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 06 06:46:24 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 06 06:46:24 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Closed udev Control Socket.
Dec 06 06:46:24 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Closed udev Kernel Socket.
Dec 06 06:46:24 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 06 06:46:24 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 06 06:46:24 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 06 06:46:25 localhost systemd[1]: Starting Cleanup udev Database...
Dec 06 06:46:25 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 06 06:46:25 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 06 06:46:25 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 06 06:46:25 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 06 06:46:25 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 06 06:46:25 localhost systemd[1]: Stopped Create System Users.
Dec 06 06:46:25 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 06 06:46:25 localhost systemd[1]: Finished Cleanup udev Database.
Dec 06 06:46:25 localhost systemd[1]: Reached target Switch Root.
Dec 06 06:46:25 localhost systemd[1]: Starting Switch Root...
Dec 06 06:46:25 localhost systemd[1]: Switching root.
Dec 06 06:46:25 localhost systemd-journald[284]: Journal stopped
Dec 06 06:46:25 localhost systemd-journald[284]: Received SIGTERM from PID 1 (systemd).
Dec 06 06:46:25 localhost kernel: audit: type=1404 audit(1765003585.122:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 06 06:46:25 localhost kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 06:46:25 localhost kernel: SELinux:  policy capability open_perms=1
Dec 06 06:46:25 localhost kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 06:46:25 localhost kernel: SELinux:  policy capability always_check_network=0
Dec 06 06:46:25 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 06:46:25 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 06:46:25 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 06:46:25 localhost kernel: audit: type=1403 audit(1765003585.221:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 06 06:46:25 localhost systemd[1]: Successfully loaded SELinux policy in 100.414ms.
Dec 06 06:46:25 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 23.172ms.
Dec 06 06:46:25 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 06 06:46:25 localhost systemd[1]: Detected virtualization kvm.
Dec 06 06:46:25 localhost systemd[1]: Detected architecture x86-64.
Dec 06 06:46:25 localhost systemd-rc-local-generator[583]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 06:46:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 06:46:25 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 06 06:46:25 localhost systemd[1]: Stopped Switch Root.
Dec 06 06:46:25 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 06 06:46:25 localhost systemd[1]: Created slice Slice /system/getty.
Dec 06 06:46:25 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 06 06:46:25 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 06 06:46:25 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 06 06:46:25 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Dec 06 06:46:25 localhost systemd[1]: Created slice User and Session Slice.
Dec 06 06:46:25 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 06 06:46:25 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 06 06:46:25 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 06 06:46:25 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 06 06:46:25 localhost systemd[1]: Stopped target Switch Root.
Dec 06 06:46:25 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 06 06:46:25 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 06 06:46:25 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 06 06:46:25 localhost systemd[1]: Reached target Path Units.
Dec 06 06:46:25 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 06 06:46:25 localhost systemd[1]: Reached target Slice Units.
Dec 06 06:46:25 localhost systemd[1]: Reached target Swaps.
Dec 06 06:46:25 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 06 06:46:25 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 06 06:46:25 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 06 06:46:25 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 06 06:46:25 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 06 06:46:25 localhost systemd[1]: Listening on udev Control Socket.
Dec 06 06:46:25 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 06 06:46:25 localhost systemd[1]: Mounting Huge Pages File System...
Dec 06 06:46:25 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 06 06:46:25 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 06 06:46:25 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 06 06:46:25 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 06 06:46:25 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 06 06:46:25 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 06 06:46:25 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 06 06:46:25 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 06 06:46:25 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 06 06:46:25 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 06 06:46:25 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 06 06:46:25 localhost systemd[1]: Stopped Journal Service.
Dec 06 06:46:25 localhost kernel: fuse: init (API version 7.36)
Dec 06 06:46:25 localhost systemd[1]: Starting Journal Service...
Dec 06 06:46:25 localhost systemd[1]: Starting Load Kernel Modules...
Dec 06 06:46:25 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 06 06:46:25 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 06 06:46:25 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 06 06:46:25 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 06 06:46:25 localhost systemd[1]: Mounted Huge Pages File System.
Dec 06 06:46:25 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 06 06:46:25 localhost systemd-journald[619]: Journal started
Dec 06 06:46:25 localhost systemd-journald[619]: Runtime Journal (/run/log/journal/4b30904fc4748c16d0c72dbebcabab49) is 8.0M, max 314.7M, 306.7M free.
Dec 06 06:46:25 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 06 06:46:25 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 06 06:46:25 localhost systemd-modules-load[620]: Module 'msr' is built in
Dec 06 06:46:25 localhost systemd[1]: Started Journal Service.
Dec 06 06:46:25 localhost kernel: ACPI: bus type drm_connector registered
Dec 06 06:46:25 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 06 06:46:25 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 06 06:46:25 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 06 06:46:25 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 06 06:46:25 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 06 06:46:25 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 06 06:46:25 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 06 06:46:25 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 06 06:46:25 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 06 06:46:25 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 06 06:46:25 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 06 06:46:25 localhost systemd[1]: Finished Load Kernel Modules.
Dec 06 06:46:25 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 06 06:46:25 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 06 06:46:25 localhost systemd[1]: Mounting FUSE Control File System...
Dec 06 06:46:25 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 06 06:46:25 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 06 06:46:25 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 06 06:46:25 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 06 06:46:25 localhost systemd[1]: Starting Load/Save Random Seed...
Dec 06 06:46:25 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 06 06:46:25 localhost systemd-journald[619]: Runtime Journal (/run/log/journal/4b30904fc4748c16d0c72dbebcabab49) is 8.0M, max 314.7M, 306.7M free.
Dec 06 06:46:25 localhost systemd-journald[619]: Received client request to flush runtime journal.
Dec 06 06:46:25 localhost systemd[1]: Starting Create System Users...
Dec 06 06:46:25 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 06 06:46:25 localhost systemd[1]: Mounted FUSE Control File System.
Dec 06 06:46:25 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 06 06:46:25 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 06 06:46:25 localhost systemd[1]: Finished Load/Save Random Seed.
Dec 06 06:46:25 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 06 06:46:25 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 06 06:46:25 localhost systemd-sysusers[632]: Creating group 'sgx' with GID 989.
Dec 06 06:46:25 localhost systemd-sysusers[632]: Creating group 'systemd-oom' with GID 988.
Dec 06 06:46:25 localhost systemd-sysusers[632]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Dec 06 06:46:25 localhost systemd[1]: Finished Create System Users.
Dec 06 06:46:25 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 06 06:46:25 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 06 06:46:25 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 06 06:46:25 localhost systemd[1]: Set up automount EFI System Partition Automount.
Dec 06 06:46:26 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 06 06:46:26 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 06 06:46:26 localhost systemd-udevd[636]: Using default interface naming scheme 'rhel-9.0'.
Dec 06 06:46:26 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 06 06:46:26 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 06 06:46:26 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 06 06:46:26 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 06 06:46:26 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 06 06:46:26 localhost systemd-udevd[640]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 06:46:26 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Dec 06 06:46:26 localhost systemd[1]: Mounting /boot...
Dec 06 06:46:26 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Dec 06 06:46:26 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Dec 06 06:46:26 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Dec 06 06:46:26 localhost kernel: XFS (vda3): Ending clean mount
Dec 06 06:46:26 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Dec 06 06:46:26 localhost systemd[1]: Mounted /boot.
Dec 06 06:46:26 localhost systemd-fsck[688]: fsck.fat 4.2 (2021-01-31)
Dec 06 06:46:26 localhost systemd-fsck[688]: /dev/vda2: 12 files, 1782/51145 clusters
Dec 06 06:46:26 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Dec 06 06:46:26 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 06 06:46:26 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 06 06:46:26 localhost kernel: SVM: TSC scaling supported
Dec 06 06:46:26 localhost kernel: kvm: Nested Virtualization enabled
Dec 06 06:46:26 localhost kernel: SVM: kvm: Nested Paging enabled
Dec 06 06:46:26 localhost kernel: SVM: LBR virtualization supported
Dec 06 06:46:26 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 06 06:46:26 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 06 06:46:26 localhost kernel: Console: switching to colour dummy device 80x25
Dec 06 06:46:26 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 06 06:46:26 localhost kernel: [drm] features: -context_init
Dec 06 06:46:26 localhost kernel: [drm] number of scanouts: 1
Dec 06 06:46:26 localhost kernel: [drm] number of cap sets: 0
Dec 06 06:46:26 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Dec 06 06:46:26 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Dec 06 06:46:26 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 06 06:46:26 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 06 06:46:26 localhost systemd[1]: Mounting /boot/efi...
Dec 06 06:46:26 localhost systemd[1]: Mounted /boot/efi.
Dec 06 06:46:26 localhost systemd[1]: Reached target Local File Systems.
Dec 06 06:46:26 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 06 06:46:26 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 06 06:46:26 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 06 06:46:26 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 06 06:46:26 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 06 06:46:26 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 06 06:46:26 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 06 06:46:26 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 708 (bootctl)
Dec 06 06:46:26 localhost systemd[1]: Starting File System Check on /dev/vda2...
Dec 06 06:46:26 localhost systemd[1]: Finished File System Check on /dev/vda2.
Dec 06 06:46:26 localhost systemd[1]: Mounting EFI System Partition Automount...
Dec 06 06:46:26 localhost systemd[1]: Mounted EFI System Partition Automount.
Dec 06 06:46:26 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 06 06:46:26 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 06 06:46:26 localhost systemd[1]: Starting Security Auditing Service...
Dec 06 06:46:26 localhost systemd[1]: Starting RPC Bind...
Dec 06 06:46:26 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 06 06:46:26 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 06 06:46:26 localhost auditd[725]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Dec 06 06:46:26 localhost auditd[725]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Dec 06 06:46:26 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 06 06:46:26 localhost systemd[1]: Starting Update is Completed...
Dec 06 06:46:26 localhost systemd[1]: Started RPC Bind.
Dec 06 06:46:26 localhost systemd[1]: Finished Update is Completed.
Dec 06 06:46:26 localhost augenrules[731]: /sbin/augenrules: No change
Dec 06 06:46:26 localhost augenrules[743]: No rules
Dec 06 06:46:26 localhost augenrules[743]: enabled 1
Dec 06 06:46:26 localhost augenrules[743]: failure 1
Dec 06 06:46:26 localhost augenrules[743]: pid 725
Dec 06 06:46:26 localhost augenrules[743]: rate_limit 0
Dec 06 06:46:26 localhost augenrules[743]: backlog_limit 8192
Dec 06 06:46:26 localhost augenrules[743]: lost 0
Dec 06 06:46:26 localhost augenrules[743]: backlog 4
Dec 06 06:46:26 localhost augenrules[743]: backlog_wait_time 60000
Dec 06 06:46:26 localhost augenrules[743]: backlog_wait_time_actual 0
Dec 06 06:46:26 localhost augenrules[743]: enabled 1
Dec 06 06:46:26 localhost augenrules[743]: failure 1
Dec 06 06:46:26 localhost augenrules[743]: pid 725
Dec 06 06:46:26 localhost augenrules[743]: rate_limit 0
Dec 06 06:46:26 localhost augenrules[743]: backlog_limit 8192
Dec 06 06:46:26 localhost augenrules[743]: lost 0
Dec 06 06:46:26 localhost augenrules[743]: backlog 3
Dec 06 06:46:26 localhost augenrules[743]: backlog_wait_time 60000
Dec 06 06:46:26 localhost augenrules[743]: backlog_wait_time_actual 0
Dec 06 06:46:26 localhost augenrules[743]: enabled 1
Dec 06 06:46:26 localhost augenrules[743]: failure 1
Dec 06 06:46:26 localhost augenrules[743]: pid 725
Dec 06 06:46:26 localhost augenrules[743]: rate_limit 0
Dec 06 06:46:26 localhost augenrules[743]: backlog_limit 8192
Dec 06 06:46:26 localhost augenrules[743]: lost 0
Dec 06 06:46:26 localhost augenrules[743]: backlog 4
Dec 06 06:46:26 localhost augenrules[743]: backlog_wait_time 60000
Dec 06 06:46:26 localhost augenrules[743]: backlog_wait_time_actual 0
Dec 06 06:46:26 localhost systemd[1]: Started Security Auditing Service.
Dec 06 06:46:26 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 06 06:46:26 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 06 06:46:26 localhost systemd[1]: Reached target System Initialization.
Dec 06 06:46:26 localhost systemd[1]: Started dnf makecache --timer.
Dec 06 06:46:26 localhost systemd[1]: Started Daily rotation of log files.
Dec 06 06:46:26 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 06 06:46:26 localhost systemd[1]: Reached target Timer Units.
Dec 06 06:46:26 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 06 06:46:26 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 06 06:46:26 localhost systemd[1]: Reached target Socket Units.
Dec 06 06:46:26 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Dec 06 06:46:27 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 06 06:46:27 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 06 06:46:27 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 06 06:46:27 localhost dbus-broker-lau[752]: Ready
Dec 06 06:46:27 localhost systemd[1]: Reached target Basic System.
Dec 06 06:46:27 localhost systemd[1]: Starting NTP client/server...
Dec 06 06:46:27 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 06 06:46:27 localhost systemd[1]: Started irqbalance daemon.
Dec 06 06:46:27 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 06 06:46:27 localhost systemd[1]: Starting System Logging Service...
Dec 06 06:46:27 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 06:46:27 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 06:46:27 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 06:46:27 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 06 06:46:27 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 06 06:46:27 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 06 06:46:27 localhost chronyd[762]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 06 06:46:27 localhost rsyslogd[760]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="760" x-info="https://www.rsyslog.com"] start
Dec 06 06:46:27 localhost rsyslogd[760]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Dec 06 06:46:27 localhost chronyd[762]: Using right/UTC timezone to obtain leap second data
Dec 06 06:46:27 localhost chronyd[762]: Loaded seccomp filter (level 2)
Dec 06 06:46:27 localhost systemd[1]: Starting User Login Management...
Dec 06 06:46:27 localhost systemd[1]: Started System Logging Service.
Dec 06 06:46:27 localhost systemd[1]: Started NTP client/server.
Dec 06 06:46:27 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 06 06:46:27 localhost rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 06:46:27 localhost systemd-logind[766]: New seat seat0.
Dec 06 06:46:27 localhost systemd-logind[766]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 06 06:46:27 localhost systemd-logind[766]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 06 06:46:27 localhost systemd[1]: Started User Login Management.
Dec 06 06:46:27 localhost cloud-init[771]: Cloud-init v. 22.1-9.el9 running 'init-local' at Sat, 06 Dec 2025 06:46:27 +0000. Up 5.65 seconds.
Dec 06 06:46:27 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Dec 06 06:46:27 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Dec 06 06:46:27 localhost systemd[1]: Starting Hostname Service...
Dec 06 06:46:27 localhost systemd[1]: run-cloud\x2dinit-tmp-tmppp438cfj.mount: Deactivated successfully.
Dec 06 06:46:27 localhost systemd[1]: Started Hostname Service.
Dec 06 06:46:27 np0005548789.novalocal systemd-hostnamed[785]: Hostname set to <np0005548789.novalocal> (static)
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Finished Initial cloud-init job (pre-networking).
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Reached target Preparation for Network.
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Starting Network Manager...
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.8621] NetworkManager (version 1.42.2-1.el9) is starting... (boot:a2c5bf5a-4be9-4ef7-a12e-aeb290b897cb)
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.8627] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Started Network Manager.
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.8652] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Reached target Network.
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.8707] manager[0x563828ced020]: monitoring kernel firmware directory '/lib/firmware'.
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Starting Enable periodic update of entitlement certificates....
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.8833] hostname: hostname: using hostnamed
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.8833] hostname: static hostname changed from (none) to "np0005548789.novalocal"
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.8838] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Started Enable periodic update of entitlement certificates..
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.8975] manager[0x563828ced020]: rfkill: Wi-Fi hardware radio set enabled
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.8977] manager[0x563828ced020]: rfkill: WWAN hardware radio set enabled
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9039] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9040] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9042] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9046] manager: Networking is enabled by state file
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9059] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9060] settings: Loaded settings plugin: keyfile (internal)
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9084] dhcp: init: Using DHCP client 'internal'
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9090] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9104] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9108] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9116] device (lo): Activation: starting connection 'lo' (1c0ca10a-4a5b-41dd-9a55-58f9b21f8cc0)
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9124] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9128] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9160] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9163] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9164] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9166] device (eth0): carrier: link connected
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9168] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9172] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9177] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9183] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9184] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9187] manager: NetworkManager state is now CONNECTING
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9189] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9194] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9197] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Reached target NFS client services.
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9270] dhcp4 (eth0): state changed new lease, address=38.102.83.150
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9273] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9291] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Reached target Remote File Systems.
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9423] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9424] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9429] device (lo): Activation: successful, device activated.
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9433] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9435] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9437] manager: NetworkManager state is now CONNECTED_SITE
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9440] device (eth0): Activation: successful, device activated.
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9443] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 06 06:46:27 np0005548789.novalocal NetworkManager[790]: <info>  [1765003587.9445] manager: startup complete
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 06 06:46:27 np0005548789.novalocal systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: Cloud-init v. 22.1-9.el9 running 'init' at Sat, 06 Dec 2025 06:46:28 +0000. Up 6.44 seconds.
Dec 06 06:46:28 np0005548789.novalocal systemd[1]: Starting Authorization Manager...
Dec 06 06:46:28 np0005548789.novalocal polkitd[1032]: Started polkitd version 0.117
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: |  eth0  | True |        38.102.83.150         | 255.255.255.0 | global | fa:16:3e:11:88:44 |
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: |  eth0  | True | fe80::f816:3eff:fe11:8844/64 |       .       |  link  | fa:16:3e:11:88:44 |
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec 06 06:46:28 np0005548789.novalocal cloud-init[1030]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 06 06:46:28 np0005548789.novalocal systemd[1]: Started Dynamic System Tuning Daemon.
Dec 06 06:46:28 np0005548789.novalocal polkitd[1032]: Loading rules from directory /etc/polkit-1/rules.d
Dec 06 06:46:28 np0005548789.novalocal polkitd[1032]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 06 06:46:28 np0005548789.novalocal polkitd[1032]: Finished loading, compiling and executing 4 rules
Dec 06 06:46:28 np0005548789.novalocal systemd[1]: Started Authorization Manager.
Dec 06 06:46:28 np0005548789.novalocal polkitd[1032]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 06 06:46:29 np0005548789.novalocal useradd[1116]: new group: name=cloud-user, GID=1001
Dec 06 06:46:29 np0005548789.novalocal useradd[1116]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Dec 06 06:46:29 np0005548789.novalocal useradd[1116]: add 'cloud-user' to group 'adm'
Dec 06 06:46:29 np0005548789.novalocal useradd[1116]: add 'cloud-user' to group 'systemd-journal'
Dec 06 06:46:29 np0005548789.novalocal useradd[1116]: add 'cloud-user' to shadow group 'adm'
Dec 06 06:46:29 np0005548789.novalocal useradd[1116]: add 'cloud-user' to shadow group 'systemd-journal'
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: Generating public/private rsa key pair.
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: The key fingerprint is:
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: SHA256:afUs/aXJem1MbQYsKivIGrnY561c0pW4IR2IXc3Bbfk root@np0005548789.novalocal
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: The key's randomart image is:
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: +---[RSA 3072]----+
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |      .+.o .     |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |   o o  + +      |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |  . o .  ... .   |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |     . o + +E o  |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |    . + S ..+. .o|
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |   . o =. .. o +=|
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |  o...+  o    =* |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: | o +++. .    .. +|
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |. +++...    .. . |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: +----[SHA256]-----+
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: Generating public/private ecdsa key pair.
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: The key fingerprint is:
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: SHA256:7xY1g5s/YjqrzdVuxiBI0g2l850jNJ3wntt6EYV6+ys root@np0005548789.novalocal
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: The key's randomart image is:
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: +---[ECDSA 256]---+
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |       .o    .   |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |      .. + .. .  |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |     .ooo +o .   |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |    . o+.+oo*    |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |     o .S *= =   |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |      . .o==o    |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |         .+=oo   |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |       o.o+oE .  |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |      ..=*o=.o.. |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: +----[SHA256]-----+
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: Generating public/private ed25519 key pair.
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: The key fingerprint is:
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: SHA256:qtMm4Ybt7afrbmQTpdQ7d3hongkiN8O5Y7mswrjOXBw root@np0005548789.novalocal
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: The key's randomart image is:
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: +--[ED25519 256]--+
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |       .         |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |      . o        |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |     o + . o     |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |    . X + = o    |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |   E o BS* =     |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |  . o O.  +      |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: | o * B.+         |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |+ = *o* .        |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: |o= ++XB+         |
Dec 06 06:46:31 np0005548789.novalocal cloud-init[1030]: +----[SHA256]-----+
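Annotation: the `SHA256:...` fingerprints cloud-init prints above are the SHA-256 digest of the raw public-key blob, base64-encoded with padding stripped. A minimal sketch (the key line below is a hypothetical stand-in, not one of the host keys from this log):

```python
import base64
import hashlib

def ssh_fingerprint(pubkey_line: str) -> str:
    """Compute the OpenSSH-style SHA256 fingerprint of a public key line.

    pubkey_line looks like "ssh-ed25519 AAAA... comment"; the second
    field is the base64-encoded key blob that gets hashed.
    """
    blob = base64.b64decode(pubkey_line.split()[1])
    digest = hashlib.sha256(blob).digest()
    # OpenSSH strips the base64 '=' padding from the printed fingerprint.
    return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

# Hypothetical key blob, just to show the output shape:
blob = base64.b64encode(b"\x00\x00\x00\x0bssh-ed25519" + b"\x00" * 36).decode()
print(ssh_fingerprint("ssh-ed25519 " + blob + " root@host"))
```

Running this against `/etc/ssh/ssh_host_ed25519_key.pub` on the instance should reproduce the `SHA256:qtMm4Ybt...` value logged above (equivalent to `ssh-keygen -lf`).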
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Reached target Cloud-config availability.
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Reached target Network is Online.
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Starting Apply the settings specified in cloud-config...
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Starting Crash recovery kernel arming...
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Starting OpenSSH server daemon...
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Starting Permit User Sessions...
Dec 06 06:46:32 np0005548789.novalocal sm-notify[1129]: Version 2.5.4 starting
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Started Notify NFS peers of a restart.
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Finished Permit User Sessions.
Dec 06 06:46:32 np0005548789.novalocal sshd[1130]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Started Command Scheduler.
Dec 06 06:46:32 np0005548789.novalocal sshd[1130]: Server listening on 0.0.0.0 port 22.
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Started Getty on tty1.
Dec 06 06:46:32 np0005548789.novalocal sshd[1130]: Server listening on :: port 22.
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Started Serial Getty on ttyS0.
Dec 06 06:46:32 np0005548789.novalocal crond[1136]: (CRON) STARTUP (1.5.7)
Dec 06 06:46:32 np0005548789.novalocal crond[1136]: (CRON) INFO (Syslog will be used instead of sendmail.)
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Reached target Login Prompts.
Dec 06 06:46:32 np0005548789.novalocal crond[1136]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 29% if used.)
Dec 06 06:46:32 np0005548789.novalocal crond[1136]: (CRON) INFO (running with inotify support)
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Started OpenSSH server daemon.
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Reached target Multi-User System.
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 06 06:46:32 np0005548789.novalocal kdumpctl[1133]: kdump: No kdump initial ramdisk found.
Dec 06 06:46:32 np0005548789.novalocal kdumpctl[1133]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Dec 06 06:46:32 np0005548789.novalocal cloud-init[1226]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Sat, 06 Dec 2025 06:46:32 +0000. Up 10.47 seconds.
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Finished Apply the settings specified in cloud-config.
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Starting Execute cloud user/final scripts...
Dec 06 06:46:32 np0005548789.novalocal sshd[1334]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:32 np0005548789.novalocal sshd[1355]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:32 np0005548789.novalocal sshd[1355]: Unable to negotiate with 38.102.83.114 port 45518: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Dec 06 06:46:32 np0005548789.novalocal sshd[1376]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:32 np0005548789.novalocal sshd[1376]: Connection reset by 38.102.83.114 port 45530 [preauth]
Dec 06 06:46:32 np0005548789.novalocal sshd[1385]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:32 np0005548789.novalocal sshd[1385]: Unable to negotiate with 38.102.83.114 port 45544: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Dec 06 06:46:32 np0005548789.novalocal sshd[1396]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:32 np0005548789.novalocal sshd[1396]: Unable to negotiate with 38.102.83.114 port 45552: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Dec 06 06:46:32 np0005548789.novalocal sshd[1334]: Connection closed by 38.102.83.114 port 45516 [preauth]
Dec 06 06:46:32 np0005548789.novalocal sshd[1408]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:32 np0005548789.novalocal sshd[1408]: Connection closed by 38.102.83.114 port 45560 [preauth]
Dec 06 06:46:32 np0005548789.novalocal sshd[1425]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:32 np0005548789.novalocal cloud-init[1430]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Sat, 06 Dec 2025 06:46:32 +0000. Up 10.86 seconds.
Dec 06 06:46:32 np0005548789.novalocal dracut[1432]: dracut-057-21.git20230214.el9
Dec 06 06:46:32 np0005548789.novalocal sshd[1433]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:32 np0005548789.novalocal sshd[1433]: fatal: mm_answer_sign: sign: error in libcrypto
Dec 06 06:46:32 np0005548789.novalocal sshd[1449]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:46:32 np0005548789.novalocal sshd[1449]: Unable to negotiate with 38.102.83.114 port 45596: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Dec 06 06:46:32 np0005548789.novalocal cloud-init[1453]: #############################################################
Dec 06 06:46:32 np0005548789.novalocal cloud-init[1454]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 06 06:46:32 np0005548789.novalocal sshd[1425]: Connection closed by 38.102.83.114 port 45576 [preauth]
Dec 06 06:46:32 np0005548789.novalocal cloud-init[1456]: 256 SHA256:7xY1g5s/YjqrzdVuxiBI0g2l850jNJ3wntt6EYV6+ys root@np0005548789.novalocal (ECDSA)
Dec 06 06:46:32 np0005548789.novalocal cloud-init[1459]: 256 SHA256:qtMm4Ybt7afrbmQTpdQ7d3hongkiN8O5Y7mswrjOXBw root@np0005548789.novalocal (ED25519)
Dec 06 06:46:32 np0005548789.novalocal cloud-init[1464]: 3072 SHA256:afUs/aXJem1MbQYsKivIGrnY561c0pW4IR2IXc3Bbfk root@np0005548789.novalocal (RSA)
Dec 06 06:46:32 np0005548789.novalocal cloud-init[1468]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 06 06:46:32 np0005548789.novalocal cloud-init[1471]: #############################################################
Dec 06 06:46:32 np0005548789.novalocal dracut[1435]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Dec 06 06:46:32 np0005548789.novalocal cloud-init[1430]: Cloud-init v. 22.1-9.el9 finished at Sat, 06 Dec 2025 06:46:32 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.11 seconds
Dec 06 06:46:32 np0005548789.novalocal chronyd[762]: Selected source 23.159.16.194 (2.rhel.pool.ntp.org)
Dec 06 06:46:32 np0005548789.novalocal chronyd[762]: System clock TAI offset set to 37 seconds
Dec 06 06:46:32 np0005548789.novalocal systemd[1]: Reloading Network Manager...
Dec 06 06:46:32 np0005548789.novalocal dracut[1435]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 06 06:46:32 np0005548789.novalocal dracut[1435]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 06 06:46:32 np0005548789.novalocal dracut[1435]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 06 06:46:33 np0005548789.novalocal NetworkManager[790]: <info>  [1765003593.0082] audit: op="reload" arg="0" pid=1573 uid=0 result="success"
Dec 06 06:46:33 np0005548789.novalocal NetworkManager[790]: <info>  [1765003593.0092] config: signal: SIGHUP (no changes from disk)
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 06 06:46:33 np0005548789.novalocal systemd[1]: Reloaded Network Manager.
Dec 06 06:46:33 np0005548789.novalocal systemd[1]: Finished Execute cloud user/final scripts.
Dec 06 06:46:33 np0005548789.novalocal systemd[1]: Reached target Cloud-init target.
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'ifcfg' will not be installed, because it's in the list to be omitted!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'plymouth' will not be installed, because it's in the list to be omitted!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'resume' will not be installed, because it's in the list to be omitted!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'earlykdump' will not be installed, because it's in the list to be omitted!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: memstrack is not available
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: memstrack is not available
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
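Annotation: every "will not be installed" line above follows the same two rules — a module is dropped if it was explicitly omitted (the `-o "plymouth resume ifcfg earlykdump"` flag in the dracut command) or if any command it depends on is missing from PATH. The block appears twice because dracut runs the module check in two passes. A simplified Python sketch of that rule (dracut itself implements this in shell; module and command names below are taken from the log):

```python
import shutil

def module_installable(module: str, required_cmds, omit_list=()):
    """Mimic dracut's module-inclusion decision and its log wording.

    Returns (ok, reason): ok is False with a dracut-style reason when the
    module is omitted or a required command cannot be found.
    """
    if module in omit_list:
        return False, "it's in the list to be omitted!"
    for cmd in required_cmds:
        if shutil.which(cmd) is None:
            return False, f"command '{cmd}' could not be found!"
    return True, ""

# From the log: plymouth is on the omit list, lvm's binary is absent.
print(module_installable("plymouth", ["plymouthd"], omit_list=("plymouth", "resume", "ifcfg", "earlykdump")))
```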
Dec 06 06:46:33 np0005548789.novalocal dracut[1435]: *** Including module: systemd ***
Dec 06 06:46:34 np0005548789.novalocal dracut[1435]: *** Including module: systemd-initrd ***
Dec 06 06:46:34 np0005548789.novalocal dracut[1435]: *** Including module: i18n ***
Dec 06 06:46:34 np0005548789.novalocal dracut[1435]: No KEYMAP configured.
Dec 06 06:46:34 np0005548789.novalocal dracut[1435]: *** Including module: drm ***
Dec 06 06:46:34 np0005548789.novalocal dracut[1435]: *** Including module: prefixdevname ***
Dec 06 06:46:34 np0005548789.novalocal dracut[1435]: *** Including module: kernel-modules ***
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]: *** Including module: kernel-modules-extra ***
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]: *** Including module: qemu ***
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]: *** Including module: fstab-sys ***
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]: *** Including module: rootfs-block ***
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]: *** Including module: terminfo ***
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]: *** Including module: udev-rules ***
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]: Skipping udev rule: 91-permissions.rules
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]: *** Including module: virtiofs ***
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]: *** Including module: dracut-systemd ***
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]: *** Including module: usrmount ***
Dec 06 06:46:35 np0005548789.novalocal dracut[1435]: *** Including module: base ***
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]: *** Including module: fs-lib ***
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]: *** Including module: kdumpbase ***
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:   microcode_ctl module: mangling fw_dir
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: configuration "intel" is ignored
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]:     microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]: *** Including module: shutdown ***
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]: *** Including module: squash ***
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]: *** Including modules done ***
Dec 06 06:46:36 np0005548789.novalocal dracut[1435]: *** Installing kernel module dependencies ***
Dec 06 06:46:37 np0005548789.novalocal dracut[1435]: *** Installing kernel module dependencies done ***
Dec 06 06:46:37 np0005548789.novalocal dracut[1435]: *** Resolving executable dependencies ***
Dec 06 06:46:38 np0005548789.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: *** Resolving executable dependencies done ***
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: *** Hardlinking files ***
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: Mode:           real
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: Files:          1099
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: Linked:         3 files
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: Compared:       0 xattrs
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: Compared:       373 files
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: Saved:          61.04 KiB
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: Duration:       0.025114 seconds
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: *** Hardlinking files done ***
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: Could not find 'strip'. Not stripping the initramfs.
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: *** Generating early-microcode cpio image ***
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: *** Constructing AuthenticAMD.bin ***
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: *** Store current command line parameters ***
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: Stored kernel commandline:
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: No dracut internal kernel commandline stored in the initramfs
Dec 06 06:46:38 np0005548789.novalocal dracut[1435]: *** Install squash loader ***
Dec 06 06:46:39 np0005548789.novalocal dracut[1435]: *** Squashing the files inside the initramfs ***
Dec 06 06:46:40 np0005548789.novalocal dracut[1435]: *** Squashing the files inside the initramfs done ***
Dec 06 06:46:40 np0005548789.novalocal dracut[1435]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Dec 06 06:46:40 np0005548789.novalocal dracut[1435]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Dec 06 06:46:41 np0005548789.novalocal kdumpctl[1133]: kdump: kexec: loaded kdump kernel
Dec 06 06:46:41 np0005548789.novalocal kdumpctl[1133]: kdump: Starting kdump: [OK]
Dec 06 06:46:41 np0005548789.novalocal systemd[1]: Finished Crash recovery kernel arming.
Dec 06 06:46:41 np0005548789.novalocal systemd[1]: Startup finished in 1.265s (kernel) + 2.105s (initrd) + 15.985s (userspace) = 19.356s.
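Annotation: the per-phase figures in the "Startup finished" line need not sum exactly to the printed total — systemd keeps the phase boundaries as microsecond-precision monotonic timestamps and rounds each value independently for display, so here the rounded phases sum to 19.355 s against a printed total of 19.356 s:

```python
# Rounded per-phase times from the log (kernel, initrd, userspace), in seconds.
phases = [1.265, 2.105, 15.985]
total_printed = 19.356

# The displayed phases sum to 19.355 s; the 1 ms gap is rounding of the
# underlying microsecond timestamps, not a measurement error.
print(round(sum(phases), 3), total_printed)
```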
Dec 06 06:46:57 np0005548789.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 06 06:48:24 np0005548789.novalocal sshd[4175]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:48:24 np0005548789.novalocal sshd[4175]: Accepted publickey for zuul from 38.102.83.114 port 42384 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Dec 06 06:48:24 np0005548789.novalocal systemd[1]: Created slice User Slice of UID 1000.
Dec 06 06:48:24 np0005548789.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 06 06:48:24 np0005548789.novalocal systemd-logind[766]: New session 1 of user zuul.
Dec 06 06:48:24 np0005548789.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 06 06:48:24 np0005548789.novalocal systemd[1]: Starting User Manager for UID 1000...
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: Queued start job for default target Main User Target.
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: Created slice User Application Slice.
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: Reached target Paths.
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: Reached target Timers.
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: Starting D-Bus User Message Bus Socket...
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: Starting Create User's Volatile Files and Directories...
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: Finished Create User's Volatile Files and Directories.
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: Listening on D-Bus User Message Bus Socket.
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: Reached target Sockets.
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: Reached target Basic System.
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: Reached target Main User Target.
Dec 06 06:48:24 np0005548789.novalocal systemd[4179]: Startup finished in 109ms.
Dec 06 06:48:24 np0005548789.novalocal systemd[1]: Started User Manager for UID 1000.
Dec 06 06:48:24 np0005548789.novalocal systemd[1]: Started Session 1 of User zuul.
Dec 06 06:48:24 np0005548789.novalocal sshd[4175]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 06:48:25 np0005548789.novalocal python3[4231]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 06:48:35 np0005548789.novalocal python3[4250]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 06:48:42 np0005548789.novalocal python3[4303]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 06:48:43 np0005548789.novalocal python3[4333]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 06 06:48:46 np0005548789.novalocal python3[4349]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVgIoETU+ZMXzSQYJdf7tKLhQsLaB9easlDHbhHsBFXd1+Axjoyg338dVOvCx68r/a15lecdlSwbLqd4GXxUOdHnWLa1I9u6bd6azOwE0Dd6ZjnquN3BRq9dLJXMlKHhXMddL6WHNfxT/JOL+gKp0CM74naUBGqrzV05qlb19n7xZJtmxVohAGGeQdFwQJBVoQ6yZOjcJZ5CpbWCs4pFXZT/31fA0KIAJkrzAeUGRRkQEnzXY1riF0RHwvXaNJ0ZoAYfT7q263Pd5gnQEmpiBirUBH6CXJn4lIQyNMyVRbnKWemW9P1kyv2bjZUPg2b1xWBE7MBTs/wMt1RjdO9p+sxtwOd2IQMf1t3JLa2p3xqgxtGTMugpJUBr1TWwdLoHl+eAMuWZwAWofLWICHUlPzyTN8L8acu0im2eR60FEl9XdUjp8DYCBGxhhIVx+xZxj6nTnNc5T7GJpJlCTF+9YPlDVrLg8y/YXly0BoOqr7p+RaqMAJnoZymNDbuu9V3Vs= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:48:47 np0005548789.novalocal python3[4363]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:48:48 np0005548789.novalocal python3[4422]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:48:48 np0005548789.novalocal python3[4463]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765003728.1588957-393-159816955677880/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=00c07cb1874e45b180d5c151333e96b1_id_rsa follow=False checksum=59556e0a2f4b936183817041ae1f59f0f3c92dd9 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:48:51 np0005548789.novalocal python3[4536]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:48:51 np0005548789.novalocal python3[4577]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765003730.9362123-494-144815595449017/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=00c07cb1874e45b180d5c151333e96b1_id_rsa.pub follow=False checksum=2b77fe3fb3441abe077d8d93b68745bd8f418f92 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:48:53 np0005548789.novalocal python3[4605]: ansible-ping Invoked with data=pong
Dec 06 06:48:55 np0005548789.novalocal python3[4619]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 06:48:58 np0005548789.novalocal python3[4672]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 06 06:49:01 np0005548789.novalocal python3[4694]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:02 np0005548789.novalocal python3[4708]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:02 np0005548789.novalocal python3[4722]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:04 np0005548789.novalocal python3[4736]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:04 np0005548789.novalocal python3[4750]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:04 np0005548789.novalocal python3[4764]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:06 np0005548789.novalocal sudo[4778]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgutchlcestgzaysejnjkgwpxfkxhgui ; /usr/bin/python3
Dec 06 06:49:06 np0005548789.novalocal sudo[4778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:07 np0005548789.novalocal python3[4780]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:07 np0005548789.novalocal sudo[4778]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:08 np0005548789.novalocal sudo[4827]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qefdpuaqzhaxirdhnifavrammlptertu ; /usr/bin/python3
Dec 06 06:49:08 np0005548789.novalocal sudo[4827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:08 np0005548789.novalocal python3[4829]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:49:08 np0005548789.novalocal sudo[4827]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:08 np0005548789.novalocal sudo[4870]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljbcvoccbudegforxzwdolitjuwmcehu ; /usr/bin/python3
Dec 06 06:49:08 np0005548789.novalocal sudo[4870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:08 np0005548789.novalocal python3[4872]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765003748.294465-103-75605470268299/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:08 np0005548789.novalocal sudo[4870]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:16 np0005548789.novalocal python3[4900]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:16 np0005548789.novalocal python3[4914]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:17 np0005548789.novalocal python3[4928]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:17 np0005548789.novalocal python3[4942]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:17 np0005548789.novalocal python3[4956]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:17 np0005548789.novalocal python3[4970]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:18 np0005548789.novalocal python3[4984]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:18 np0005548789.novalocal python3[4998]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:18 np0005548789.novalocal python3[5012]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:18 np0005548789.novalocal python3[5026]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:19 np0005548789.novalocal python3[5040]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:19 np0005548789.novalocal python3[5054]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:19 np0005548789.novalocal python3[5068]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:20 np0005548789.novalocal python3[5082]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:20 np0005548789.novalocal python3[5096]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:20 np0005548789.novalocal python3[5110]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:20 np0005548789.novalocal python3[5124]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:21 np0005548789.novalocal python3[5138]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:21 np0005548789.novalocal python3[5152]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:21 np0005548789.novalocal python3[5166]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:21 np0005548789.novalocal python3[5180]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:22 np0005548789.novalocal python3[5194]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:22 np0005548789.novalocal python3[5208]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:22 np0005548789.novalocal python3[5222]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:23 np0005548789.novalocal python3[5236]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:23 np0005548789.novalocal python3[5250]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 06:49:24 np0005548789.novalocal sudo[5264]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utcdrsgjcopxbwjsfwzkjpweythmhkct ; /usr/bin/python3
Dec 06 06:49:24 np0005548789.novalocal sudo[5264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:25 np0005548789.novalocal python3[5266]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 06 06:49:25 np0005548789.novalocal systemd[1]: Starting Time & Date Service...
Dec 06 06:49:25 np0005548789.novalocal systemd[1]: Started Time & Date Service.
Dec 06 06:49:25 np0005548789.novalocal systemd-timedated[5268]: Changed time zone to 'UTC' (UTC).
Dec 06 06:49:25 np0005548789.novalocal sudo[5264]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:26 np0005548789.novalocal sudo[5285]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtcubsyxexjzcgdiouysqerdrlzljwzy ; /usr/bin/python3
Dec 06 06:49:26 np0005548789.novalocal sudo[5285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:26 np0005548789.novalocal python3[5287]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:26 np0005548789.novalocal sudo[5285]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:28 np0005548789.novalocal python3[5333]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:49:28 np0005548789.novalocal python3[5374]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1765003767.8538969-495-205773419454080/source _original_basename=tmplovyvf94 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:29 np0005548789.novalocal python3[5434]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:49:30 np0005548789.novalocal python3[5475]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765003769.4579241-584-257946452858827/source _original_basename=tmp299r9vl9 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:31 np0005548789.novalocal sudo[5535]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqivaxvaphzjasxicootoifthezflsko ; /usr/bin/python3
Dec 06 06:49:31 np0005548789.novalocal sudo[5535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:31 np0005548789.novalocal python3[5537]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:49:31 np0005548789.novalocal sudo[5535]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:31 np0005548789.novalocal sudo[5578]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryhhdltquzdtdybfzloifvdvttgmdntk ; /usr/bin/python3
Dec 06 06:49:31 np0005548789.novalocal sudo[5578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:32 np0005548789.novalocal python3[5580]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765003771.5337546-729-220628508346141/source _original_basename=tmpivkrylp5 follow=False checksum=12efaaf67f4d002c9317067f1840bb831c38c306 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:32 np0005548789.novalocal sudo[5578]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:33 np0005548789.novalocal python3[5608]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 06:49:33 np0005548789.novalocal python3[5624]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 06:49:34 np0005548789.novalocal sudo[5672]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scmrcbtgupfhbiwpywhzeinnwrsjqtph ; /usr/bin/python3
Dec 06 06:49:34 np0005548789.novalocal sudo[5672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:34 np0005548789.novalocal python3[5674]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:49:34 np0005548789.novalocal sudo[5672]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:34 np0005548789.novalocal sudo[5715]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isaeoldlbvrdihtceumppxgtxompdang ; /usr/bin/python3
Dec 06 06:49:34 np0005548789.novalocal sudo[5715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:35 np0005548789.novalocal python3[5717]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1765003774.4274964-849-148715715435939/source _original_basename=tmp98l69hd1 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:35 np0005548789.novalocal sudo[5715]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:36 np0005548789.novalocal sudo[5746]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cuogfpqbfcdpunzrywuuxnesgwtllkwi ; /usr/bin/python3
Dec 06 06:49:36 np0005548789.novalocal sudo[5746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:49:36 np0005548789.novalocal python3[5748]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-8d81-2216-000000000023-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 06:49:36 np0005548789.novalocal sudo[5746]: pam_unix(sudo:session): session closed for user root
Dec 06 06:49:38 np0005548789.novalocal python3[5766]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-8d81-2216-000000000024-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 06 06:49:39 np0005548789.novalocal python3[5784]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:49:55 np0005548789.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 06 06:50:21 np0005548789.novalocal sshd[5788]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:50:37 np0005548789.novalocal sudo[5803]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oovoatsnrcbtaszglarjlpzwhpgzukbg ; /usr/bin/python3
Dec 06 06:50:37 np0005548789.novalocal sudo[5803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:50:37 np0005548789.novalocal python3[5805]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:50:37 np0005548789.novalocal sudo[5803]: pam_unix(sudo:session): session closed for user root
Dec 06 06:51:16 np0005548789.novalocal systemd[4179]: Starting Mark boot as successful...
Dec 06 06:51:16 np0005548789.novalocal systemd[4179]: Finished Mark boot as successful.
Dec 06 06:51:37 np0005548789.novalocal sshd[4188]: Received disconnect from 38.102.83.114 port 42384:11: disconnected by user
Dec 06 06:51:37 np0005548789.novalocal sshd[4188]: Disconnected from user zuul 38.102.83.114 port 42384
Dec 06 06:51:37 np0005548789.novalocal sshd[4175]: pam_unix(sshd:session): session closed for user zuul
Dec 06 06:51:37 np0005548789.novalocal systemd-logind[766]: Session 1 logged out. Waiting for processes to exit.
Dec 06 06:51:48 np0005548789.novalocal systemd[1]: Unmounting EFI System Partition Automount...
Dec 06 06:51:48 np0005548789.novalocal systemd[1]: efi.mount: Deactivated successfully.
Dec 06 06:51:48 np0005548789.novalocal systemd[1]: Unmounted EFI System Partition Automount.
Dec 06 06:52:21 np0005548789.novalocal sshd[5788]: fatal: Timeout before authentication for 14.103.117.88 port 59842
Dec 06 06:53:09 np0005548789.novalocal sshd[5811]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:53:24 np0005548789.novalocal sshd[5811]: Connection closed by 167.94.138.180 port 11792 [preauth]
Dec 06 06:54:16 np0005548789.novalocal systemd[4179]: Created slice User Background Tasks Slice.
Dec 06 06:54:16 np0005548789.novalocal systemd[4179]: Starting Cleanup of User's Temporary Files and Directories...
Dec 06 06:54:16 np0005548789.novalocal systemd[4179]: Finished Cleanup of User's Temporary Files and Directories.
Dec 06 06:54:19 np0005548789.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Dec 06 06:54:19 np0005548789.novalocal kernel: pci 0000:00:07.0: reg 0x10: [io  0x0000-0x003f]
Dec 06 06:54:19 np0005548789.novalocal kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Dec 06 06:54:19 np0005548789.novalocal kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Dec 06 06:54:19 np0005548789.novalocal kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Dec 06 06:54:19 np0005548789.novalocal kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Dec 06 06:54:19 np0005548789.novalocal kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Dec 06 06:54:19 np0005548789.novalocal kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Dec 06 06:54:19 np0005548789.novalocal kernel: pci 0000:00:07.0: BAR 0: assigned [io  0x1000-0x103f]
Dec 06 06:54:19 np0005548789.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 06 06:54:19 np0005548789.novalocal NetworkManager[790]: <info>  [1765004059.2424] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 06 06:54:19 np0005548789.novalocal systemd-udevd[5815]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 06:54:19 np0005548789.novalocal NetworkManager[790]: <info>  [1765004059.2528] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Dec 06 06:54:19 np0005548789.novalocal NetworkManager[790]: <info>  [1765004059.2550] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 06 06:54:19 np0005548789.novalocal NetworkManager[790]: <info>  [1765004059.2552] device (eth1): carrier: link connected
Dec 06 06:54:19 np0005548789.novalocal NetworkManager[790]: <info>  [1765004059.2553] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Dec 06 06:54:19 np0005548789.novalocal NetworkManager[790]: <info>  [1765004059.2556] policy: auto-activating connection 'Wired connection 1' (c69a1528-e75c-3f2e-b5a3-724110d3f450)
Dec 06 06:54:19 np0005548789.novalocal NetworkManager[790]: <info>  [1765004059.2559] device (eth1): Activation: starting connection 'Wired connection 1' (c69a1528-e75c-3f2e-b5a3-724110d3f450)
Dec 06 06:54:19 np0005548789.novalocal NetworkManager[790]: <info>  [1765004059.2559] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Dec 06 06:54:19 np0005548789.novalocal NetworkManager[790]: <info>  [1765004059.2561] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Dec 06 06:54:19 np0005548789.novalocal NetworkManager[790]: <info>  [1765004059.2564] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Dec 06 06:54:19 np0005548789.novalocal NetworkManager[790]: <info>  [1765004059.2566] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 06 06:54:19 np0005548789.novalocal sshd[5818]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:54:20 np0005548789.novalocal sshd[5818]: Accepted publickey for zuul from 38.102.83.114 port 45586 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 06:54:20 np0005548789.novalocal systemd-logind[766]: New session 3 of user zuul.
Dec 06 06:54:20 np0005548789.novalocal systemd[1]: Started Session 3 of User zuul.
Dec 06 06:54:20 np0005548789.novalocal sshd[5818]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 06:54:20 np0005548789.novalocal kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Dec 06 06:54:20 np0005548789.novalocal python3[5835]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-1ece-0164-000000000408-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 06:54:33 np0005548789.novalocal sudo[5883]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-goweloukfvmwztxpwzmrxtssqcahidvd ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 06 06:54:33 np0005548789.novalocal sudo[5883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:54:33 np0005548789.novalocal python3[5885]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:54:33 np0005548789.novalocal sudo[5883]: pam_unix(sudo:session): session closed for user root
Dec 06 06:54:33 np0005548789.novalocal sudo[5926]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfnirexsbjiemvfrmldnqlmqcsmwtneq ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 06 06:54:33 np0005548789.novalocal sudo[5926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:54:33 np0005548789.novalocal python3[5928]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765004073.3000777-486-197346107908609/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=ee1ff0d02b7f6e5013c40075618b5eb9b72f06b2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:54:33 np0005548789.novalocal sudo[5926]: pam_unix(sudo:session): session closed for user root
Dec 06 06:54:34 np0005548789.novalocal sudo[5956]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mefmzglrxmnpfqikxojhrvajijxjyzrm ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 06 06:54:34 np0005548789.novalocal sudo[5956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:54:34 np0005548789.novalocal python3[5958]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 06:54:34 np0005548789.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 06 06:54:34 np0005548789.novalocal systemd[1]: Stopped Network Manager Wait Online.
Dec 06 06:54:34 np0005548789.novalocal systemd[1]: Stopping Network Manager Wait Online...
Dec 06 06:54:34 np0005548789.novalocal systemd[1]: Stopping Network Manager...
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[790]: <info>  [1765004074.6038] caught SIGTERM, shutting down normally.
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[790]: <info>  [1765004074.6178] dhcp4 (eth0): canceled DHCP transaction
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[790]: <info>  [1765004074.6178] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[790]: <info>  [1765004074.6178] dhcp4 (eth0): state changed no lease
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[790]: <info>  [1765004074.6183] manager: NetworkManager state is now CONNECTING
Dec 06 06:54:34 np0005548789.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[790]: <info>  [1765004074.6368] dhcp4 (eth1): canceled DHCP transaction
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[790]: <info>  [1765004074.6369] dhcp4 (eth1): state changed no lease
Dec 06 06:54:34 np0005548789.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[790]: <info>  [1765004074.6439] exiting (success)
Dec 06 06:54:34 np0005548789.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 06 06:54:34 np0005548789.novalocal systemd[1]: Stopped Network Manager.
Dec 06 06:54:34 np0005548789.novalocal systemd[1]: NetworkManager.service: Consumed 2.627s CPU time.
Dec 06 06:54:34 np0005548789.novalocal systemd[1]: Starting Network Manager...
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.6950] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:a2c5bf5a-4be9-4ef7-a12e-aeb290b897cb)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.6951] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Dec 06 06:54:34 np0005548789.novalocal systemd[1]: Started Network Manager.
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.6983] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 06 06:54:34 np0005548789.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7044] manager[0x561cd06d7090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 06 06:54:34 np0005548789.novalocal systemd[1]: Starting Hostname Service...
Dec 06 06:54:34 np0005548789.novalocal sudo[5956]: pam_unix(sudo:session): session closed for user root
Dec 06 06:54:34 np0005548789.novalocal systemd[1]: Started Hostname Service.
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7642] hostname: hostname: using hostnamed
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7643] hostname: static hostname changed from (none) to "np0005548789.novalocal"
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7649] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7655] manager[0x561cd06d7090]: rfkill: Wi-Fi hardware radio set enabled
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7656] manager[0x561cd06d7090]: rfkill: WWAN hardware radio set enabled
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7706] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7707] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7709] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7710] manager: Networking is enabled by state file
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7723] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7724] settings: Loaded settings plugin: keyfile (internal)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7773] dhcp: init: Using DHCP client 'internal'
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7778] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7789] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7795] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7809] device (lo): Activation: starting connection 'lo' (1c0ca10a-4a5b-41dd-9a55-58f9b21f8cc0)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7819] device (eth0): carrier: link connected
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7826] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7833] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7834] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7844] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7855] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7863] device (eth1): carrier: link connected
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7870] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7878] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (c69a1528-e75c-3f2e-b5a3-724110d3f450) (indicated)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7878] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7885] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7896] device (eth1): Activation: starting connection 'Wired connection 1' (c69a1528-e75c-3f2e-b5a3-724110d3f450)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7925] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7931] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7945] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7948] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7952] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7954] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7957] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.7998] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8002] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8006] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8014] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8016] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8029] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8034] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8039] device (lo): Activation: successful, device activated.
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8096] dhcp4 (eth0): state changed new lease, address=38.102.83.150
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8101] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8171] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8193] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8196] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8201] manager: NetworkManager state is now CONNECTED_SITE
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8205] device (eth0): Activation: successful, device activated.
Dec 06 06:54:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004074.8211] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 06 06:54:35 np0005548789.novalocal python3[6039]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-1ece-0164-00000000012b-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 06:54:44 np0005548789.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 06:55:04 np0005548789.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 06 06:55:19 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004119.7517] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Dec 06 06:55:19 np0005548789.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 06:55:19 np0005548789.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 06:55:19 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004119.7741] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Dec 06 06:55:19 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004119.7743] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Dec 06 06:55:19 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004119.7750] device (eth1): Activation: successful, device activated.
Dec 06 06:55:19 np0005548789.novalocal NetworkManager[5973]: <info>  [1765004119.7755] manager: startup complete
Dec 06 06:55:19 np0005548789.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 06 06:55:29 np0005548789.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 06:55:35 np0005548789.novalocal sshd[5821]: Received disconnect from 38.102.83.114 port 45586:11: disconnected by user
Dec 06 06:55:35 np0005548789.novalocal sshd[5821]: Disconnected from user zuul 38.102.83.114 port 45586
Dec 06 06:55:35 np0005548789.novalocal sshd[5818]: pam_unix(sshd:session): session closed for user zuul
Dec 06 06:55:35 np0005548789.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Dec 06 06:55:35 np0005548789.novalocal systemd[1]: session-3.scope: Consumed 1.515s CPU time.
Dec 06 06:55:35 np0005548789.novalocal systemd-logind[766]: Session 3 logged out. Waiting for processes to exit.
Dec 06 06:55:35 np0005548789.novalocal systemd-logind[766]: Removed session 3.
Dec 06 06:56:17 np0005548789.novalocal sshd[6058]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:56:18 np0005548789.novalocal sshd[6060]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:56:18 np0005548789.novalocal sshd[6060]: Accepted publickey for zuul from 38.102.83.114 port 40918 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 06:56:18 np0005548789.novalocal systemd-logind[766]: New session 4 of user zuul.
Dec 06 06:56:18 np0005548789.novalocal systemd[1]: Started Session 4 of User zuul.
Dec 06 06:56:18 np0005548789.novalocal sshd[6060]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 06:56:19 np0005548789.novalocal sudo[6109]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyamszvdlxyfnxwsvcinybhlubhqolca ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 06 06:56:19 np0005548789.novalocal sudo[6109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:56:19 np0005548789.novalocal python3[6111]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 06:56:19 np0005548789.novalocal sudo[6109]: pam_unix(sudo:session): session closed for user root
Dec 06 06:56:19 np0005548789.novalocal sudo[6152]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwcgnsmujovvhsznihpfftlygdwbpyho ; OS_CLOUD=vexxhost /usr/bin/python3
Dec 06 06:56:19 np0005548789.novalocal sudo[6152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 06:56:19 np0005548789.novalocal python3[6154]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765004178.949045-628-261181115769909/source _original_basename=tmp2c4451xj follow=False checksum=301833a7e04d955921816dd6c79e775f1a8a19aa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 06:56:19 np0005548789.novalocal sudo[6152]: pam_unix(sudo:session): session closed for user root
Dec 06 06:56:19 np0005548789.novalocal sshd[6058]: Connection reset by authenticating user root 91.202.233.33 port 45516 [preauth]
Dec 06 06:56:20 np0005548789.novalocal sshd[6169]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:56:22 np0005548789.novalocal sshd[6169]: Connection reset by authenticating user root 91.202.233.33 port 45518 [preauth]
Dec 06 06:56:22 np0005548789.novalocal sshd[6171]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:56:22 np0005548789.novalocal sshd[6060]: pam_unix(sshd:session): session closed for user zuul
Dec 06 06:56:22 np0005548789.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Dec 06 06:56:22 np0005548789.novalocal systemd-logind[766]: Session 4 logged out. Waiting for processes to exit.
Dec 06 06:56:22 np0005548789.novalocal systemd-logind[766]: Removed session 4.
Dec 06 06:56:24 np0005548789.novalocal sshd[6171]: Invalid user ftpuser from 91.202.233.33 port 27818
Dec 06 06:56:24 np0005548789.novalocal sshd[6171]: Connection reset by invalid user ftpuser 91.202.233.33 port 27818 [preauth]
Dec 06 06:56:25 np0005548789.novalocal sshd[6174]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:56:27 np0005548789.novalocal sshd[6174]: Connection reset by authenticating user root 91.202.233.33 port 27840 [preauth]
Dec 06 06:56:27 np0005548789.novalocal sshd[6176]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 06:56:30 np0005548789.novalocal sshd[6176]: Connection reset by authenticating user root 91.202.233.33 port 27846 [preauth]
Dec 06 07:00:16 np0005548789.novalocal systemd[1]: Starting dnf makecache...
Dec 06 07:00:17 np0005548789.novalocal dnf[6178]: Failed determining last makecache time.
Dec 06 07:00:17 np0005548789.novalocal dnf[6178]: There are no enabled repositories in "/etc/yum.repos.d", "/etc/yum/repos.d", "/etc/distro.repos.d".
Dec 06 07:00:17 np0005548789.novalocal systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 06 07:00:17 np0005548789.novalocal systemd[1]: Finished dnf makecache.
Dec 06 07:01:01 np0005548789.novalocal CROND[6181]: (root) CMD (run-parts /etc/cron.hourly)
Dec 06 07:01:01 np0005548789.novalocal run-parts[6184]: (/etc/cron.hourly) starting 0anacron
Dec 06 07:01:01 np0005548789.novalocal anacron[6192]: Anacron started on 2025-12-06
Dec 06 07:01:01 np0005548789.novalocal anacron[6192]: Will run job `cron.daily' in 17 min.
Dec 06 07:01:01 np0005548789.novalocal anacron[6192]: Will run job `cron.weekly' in 37 min.
Dec 06 07:01:01 np0005548789.novalocal anacron[6192]: Will run job `cron.monthly' in 57 min.
Dec 06 07:01:01 np0005548789.novalocal anacron[6192]: Jobs will be executed sequentially
Dec 06 07:01:01 np0005548789.novalocal run-parts[6194]: (/etc/cron.hourly) finished 0anacron
Dec 06 07:01:01 np0005548789.novalocal CROND[6180]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 06 07:01:43 np0005548789.novalocal sshd[6195]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:01:43 np0005548789.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Dec 06 07:01:43 np0005548789.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 06 07:01:43 np0005548789.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Dec 06 07:01:43 np0005548789.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 06 07:01:46 np0005548789.novalocal sshd[6195]: Connection reset by authenticating user root 45.140.17.124 port 51138 [preauth]
Dec 06 07:01:46 np0005548789.novalocal sshd[6199]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:01:48 np0005548789.novalocal sshd[6199]: Connection reset by authenticating user root 45.140.17.124 port 51150 [preauth]
Dec 06 07:01:48 np0005548789.novalocal sshd[6201]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:01:50 np0005548789.novalocal sshd[6201]: Connection reset by authenticating user root 45.140.17.124 port 51160 [preauth]
Dec 06 07:01:50 np0005548789.novalocal sshd[6203]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:01:52 np0005548789.novalocal sshd[6203]: Connection reset by authenticating user root 45.140.17.124 port 51170 [preauth]
Dec 06 07:01:52 np0005548789.novalocal sshd[6205]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:01:55 np0005548789.novalocal sshd[6205]: Connection reset by authenticating user root 45.140.17.124 port 51180 [preauth]
Dec 06 07:04:39 np0005548789.novalocal sshd[6209]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:04:39 np0005548789.novalocal sshd[6209]: Accepted publickey for zuul from 38.102.83.114 port 56864 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:04:39 np0005548789.novalocal systemd-logind[766]: New session 5 of user zuul.
Dec 06 07:04:39 np0005548789.novalocal systemd[1]: Started Session 5 of User zuul.
Dec 06 07:04:39 np0005548789.novalocal sshd[6209]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:04:39 np0005548789.novalocal sudo[6226]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnuoqelcuirffrgkfiaxguspukzgltmh ; /usr/bin/python3
Dec 06 07:04:39 np0005548789.novalocal sudo[6226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:40 np0005548789.novalocal python3[6228]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163ef9-e89a-e5b2-9de0-000000001d10-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:04:40 np0005548789.novalocal sudo[6226]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:41 np0005548789.novalocal sudo[6244]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfxkgleocdsnkdzjpszesvrsitalnpbm ; /usr/bin/python3
Dec 06 07:04:41 np0005548789.novalocal sudo[6244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:41 np0005548789.novalocal python3[6246]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:04:41 np0005548789.novalocal sudo[6244]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:41 np0005548789.novalocal sudo[6260]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjxlesopijpgqieewellevzrneabektl ; /usr/bin/python3
Dec 06 07:04:41 np0005548789.novalocal sudo[6260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:41 np0005548789.novalocal python3[6262]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:04:41 np0005548789.novalocal sudo[6260]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:41 np0005548789.novalocal sudo[6276]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkzwcbasvzeglhtfiimezygprzhkwefu ; /usr/bin/python3
Dec 06 07:04:41 np0005548789.novalocal sudo[6276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:41 np0005548789.novalocal python3[6278]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:04:41 np0005548789.novalocal sudo[6276]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:42 np0005548789.novalocal sudo[6292]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnitzzqofmyrroihwuhtjauoftyevgoj ; /usr/bin/python3
Dec 06 07:04:42 np0005548789.novalocal sudo[6292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:42 np0005548789.novalocal python3[6294]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:04:42 np0005548789.novalocal sudo[6292]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:42 np0005548789.novalocal sudo[6308]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxrmvettzseglxwrxyetwlkybkkqmyfo ; /usr/bin/python3
Dec 06 07:04:42 np0005548789.novalocal sudo[6308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:43 np0005548789.novalocal python3[6310]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:04:43 np0005548789.novalocal sudo[6308]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:44 np0005548789.novalocal sudo[6356]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukxwakynyhefstxwalrgchlsipilaumz ; /usr/bin/python3
Dec 06 07:04:44 np0005548789.novalocal sudo[6356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:44 np0005548789.novalocal python3[6358]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:04:44 np0005548789.novalocal sudo[6356]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:44 np0005548789.novalocal sudo[6399]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czquvxrlmccyvemijnrmwouukfeltady ; /usr/bin/python3
Dec 06 07:04:44 np0005548789.novalocal sudo[6399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:44 np0005548789.novalocal python3[6401]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765004684.0802486-648-270810667483841/source _original_basename=tmp1g0xjkfa follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:04:44 np0005548789.novalocal sudo[6399]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:46 np0005548789.novalocal sudo[6429]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yggorfnvycbwtthxzjfbwwqmowuxzvqf ; /usr/bin/python3
Dec 06 07:04:46 np0005548789.novalocal sudo[6429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:46 np0005548789.novalocal python3[6431]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 07:04:46 np0005548789.novalocal systemd[1]: Reloading.
Dec 06 07:04:46 np0005548789.novalocal systemd-rc-local-generator[6449]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:04:46 np0005548789.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:04:46 np0005548789.novalocal sudo[6429]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:47 np0005548789.novalocal sudo[6475]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfttssubcnbevytyljvxvejxwwfxaojp ; /usr/bin/python3
Dec 06 07:04:47 np0005548789.novalocal sudo[6475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:48 np0005548789.novalocal python3[6477]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 06 07:04:48 np0005548789.novalocal sudo[6475]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:49 np0005548789.novalocal sudo[6491]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffmijzajihifhxdlagmffqrcdulunvno ; /usr/bin/python3
Dec 06 07:04:49 np0005548789.novalocal sudo[6491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:49 np0005548789.novalocal python3[6493]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:04:49 np0005548789.novalocal sudo[6491]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:49 np0005548789.novalocal sudo[6509]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtyvatrlrrzqevjhzbnemymxcjwqdsiu ; /usr/bin/python3
Dec 06 07:04:49 np0005548789.novalocal sudo[6509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:49 np0005548789.novalocal python3[6511]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:04:49 np0005548789.novalocal sudo[6509]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:50 np0005548789.novalocal sudo[6527]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljppvjhnujtrgshaarklliiegyuxstwt ; /usr/bin/python3
Dec 06 07:04:50 np0005548789.novalocal sudo[6527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:50 np0005548789.novalocal python3[6529]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:04:50 np0005548789.novalocal sudo[6527]: pam_unix(sudo:session): session closed for user root
Dec 06 07:04:50 np0005548789.novalocal sudo[6545]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kerdbpuvpxaywlqjomxgfkbupgtcvzfv ; /usr/bin/python3
Dec 06 07:04:50 np0005548789.novalocal sudo[6545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:04:50 np0005548789.novalocal python3[6547]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:04:50 np0005548789.novalocal sudo[6545]: pam_unix(sudo:session): session closed for user root
Dec 06 07:05:01 np0005548789.novalocal python3[6565]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-e5b2-9de0-000000001d17-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:05:02 np0005548789.novalocal python3[6584]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 07:05:05 np0005548789.novalocal sshd[6209]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:05:05 np0005548789.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Dec 06 07:05:05 np0005548789.novalocal systemd[1]: session-5.scope: Consumed 3.908s CPU time.
Dec 06 07:05:05 np0005548789.novalocal systemd-logind[766]: Session 5 logged out. Waiting for processes to exit.
Dec 06 07:05:05 np0005548789.novalocal systemd-logind[766]: Removed session 5.
Dec 06 07:06:59 np0005548789.novalocal sshd[6593]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:06:59 np0005548789.novalocal sshd[6593]: Accepted publickey for zuul from 38.102.83.114 port 34880 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:06:59 np0005548789.novalocal systemd-logind[766]: New session 6 of user zuul.
Dec 06 07:06:59 np0005548789.novalocal systemd[1]: Started Session 6 of User zuul.
Dec 06 07:06:59 np0005548789.novalocal sshd[6593]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:06:59 np0005548789.novalocal sudo[6610]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uefxvtnhvfbpnzkjcyxccadrlvbimpxa ; /usr/bin/python3
Dec 06 07:06:59 np0005548789.novalocal sudo[6610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:06:59 np0005548789.novalocal systemd[1]: Starting RHSM dbus service...
Dec 06 07:07:00 np0005548789.novalocal systemd[1]: Started RHSM dbus service.
Dec 06 07:07:00 np0005548789.novalocal rhsm-service[6617]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 06 07:07:00 np0005548789.novalocal rhsm-service[6617]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 06 07:07:00 np0005548789.novalocal rhsm-service[6617]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 06 07:07:00 np0005548789.novalocal rhsm-service[6617]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 06 07:07:01 np0005548789.novalocal rhsm-service[6617]:  INFO [subscription_manager.managerlib:90] Consumer created: np0005548789.novalocal (49b9d3d6-359c-4738-9880-6751941cc8f8)
Dec 06 07:07:01 np0005548789.novalocal subscription-manager[6617]: Registered system with identity: 49b9d3d6-359c-4738-9880-6751941cc8f8
Dec 06 07:07:02 np0005548789.novalocal rhsm-service[6617]:  INFO [subscription_manager.entcertlib:131] certs updated:
Dec 06 07:07:02 np0005548789.novalocal rhsm-service[6617]: Total updates: 1
Dec 06 07:07:02 np0005548789.novalocal rhsm-service[6617]: Found (local) serial# []
Dec 06 07:07:02 np0005548789.novalocal rhsm-service[6617]: Expected (UEP) serial# [4524945705155541200]
Dec 06 07:07:02 np0005548789.novalocal rhsm-service[6617]: Added (new)
Dec 06 07:07:02 np0005548789.novalocal rhsm-service[6617]:   [sn:4524945705155541200 ( Content Access,) @ /etc/pki/entitlement/4524945705155541200.pem]
Dec 06 07:07:02 np0005548789.novalocal rhsm-service[6617]: Deleted (rogue):
Dec 06 07:07:02 np0005548789.novalocal rhsm-service[6617]:   <NONE>
Dec 06 07:07:02 np0005548789.novalocal subscription-manager[6617]: Added subscription for 'Content Access' contract 'None'
Dec 06 07:07:02 np0005548789.novalocal subscription-manager[6617]: Added subscription for product ' Content Access'
Dec 06 07:07:03 np0005548789.novalocal rhsm-service[6617]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 06 07:07:03 np0005548789.novalocal rhsm-service[6617]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 06 07:07:03 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:07:03 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:07:03 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:07:03 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:07:04 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:07:04 np0005548789.novalocal sudo[6610]: pam_unix(sudo:session): session closed for user root
Dec 06 07:07:11 np0005548789.novalocal python3[6708]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-ea42-bf82-00000000000d-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:07:13 np0005548789.novalocal sudo[6725]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvicowgnnrpjkkwcbzypjgbgrumwmawg ; /usr/bin/python3
Dec 06 07:07:13 np0005548789.novalocal sudo[6725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:07:13 np0005548789.novalocal python3[6727]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 07:07:44 np0005548789.novalocal setsebool[6802]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 06 07:07:44 np0005548789.novalocal setsebool[6802]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 06 07:07:52 np0005548789.novalocal kernel: SELinux:  Converting 409 SID table entries...
Dec 06 07:07:52 np0005548789.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 07:07:52 np0005548789.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 06 07:07:52 np0005548789.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 07:07:52 np0005548789.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 06 07:07:52 np0005548789.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 07:07:52 np0005548789.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 07:07:52 np0005548789.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 07:08:05 np0005548789.novalocal dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=3 res=1
Dec 06 07:08:05 np0005548789.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 07:08:05 np0005548789.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 06 07:08:05 np0005548789.novalocal systemd[1]: Reloading.
Dec 06 07:08:05 np0005548789.novalocal systemd-rc-local-generator[7669]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:08:06 np0005548789.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:08:06 np0005548789.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 07:08:07 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:08:07 np0005548789.novalocal sudo[6725]: pam_unix(sudo:session): session closed for user root
Dec 06 07:08:07 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:08:14 np0005548789.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 07:08:14 np0005548789.novalocal systemd[1]: Finished man-db-cache-update.service.
Dec 06 07:08:14 np0005548789.novalocal systemd[1]: man-db-cache-update.service: Consumed 10.578s CPU time.
Dec 06 07:08:14 np0005548789.novalocal systemd[1]: run-r009f101a74f34b9b987df03572949b1b.service: Deactivated successfully.
Dec 06 07:08:58 np0005548789.novalocal sudo[18392]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inkriyxvlkvccphdfhqrlhynexxgcniq ; /usr/bin/python3
Dec 06 07:08:58 np0005548789.novalocal sudo[18392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:08:59 np0005548789.novalocal systemd[1]: var-lib-containers-storage-overlay-opaque\x2dbug\x2dcheck3706556388-merged.mount: Deactivated successfully.
Dec 06 07:08:59 np0005548789.novalocal podman[18395]: 2025-12-06 07:08:59.238406899 +0000 UTC m=+0.125981404 system refresh
Dec 06 07:08:59 np0005548789.novalocal sudo[18392]: pam_unix(sudo:session): session closed for user root
Dec 06 07:09:00 np0005548789.novalocal systemd[4179]: Starting D-Bus User Message Bus...
Dec 06 07:09:00 np0005548789.novalocal dbus-broker-launch[18452]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 06 07:09:00 np0005548789.novalocal dbus-broker-launch[18452]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 06 07:09:00 np0005548789.novalocal systemd[4179]: Started D-Bus User Message Bus.
Dec 06 07:09:00 np0005548789.novalocal dbus-broker-lau[18452]: Ready
Dec 06 07:09:00 np0005548789.novalocal systemd[4179]: selinux: avc:  op=load_policy lsm=selinux seqno=3 res=1
Dec 06 07:09:00 np0005548789.novalocal systemd[4179]: Created slice Slice /user.
Dec 06 07:09:00 np0005548789.novalocal systemd[4179]: podman-18435.scope: unit configures an IP firewall, but not running as root.
Dec 06 07:09:00 np0005548789.novalocal systemd[4179]: (This warning is only shown for the first unit using IP firewalling.)
Dec 06 07:09:00 np0005548789.novalocal systemd[4179]: Started podman-18435.scope.
Dec 06 07:09:00 np0005548789.novalocal systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:09:00 np0005548789.novalocal systemd[4179]: Started podman-pause-927ce357.scope.
Dec 06 07:09:02 np0005548789.novalocal sshd[6593]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:09:02 np0005548789.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Dec 06 07:09:02 np0005548789.novalocal systemd[1]: session-6.scope: Consumed 50.791s CPU time.
Dec 06 07:09:02 np0005548789.novalocal systemd-logind[766]: Session 6 logged out. Waiting for processes to exit.
Dec 06 07:09:02 np0005548789.novalocal systemd-logind[766]: Removed session 6.
Dec 06 07:09:17 np0005548789.novalocal sshd[18455]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:09:17 np0005548789.novalocal sshd[18455]: Unable to negotiate with 38.102.83.83 port 53754: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 06 07:09:17 np0005548789.novalocal sshd[18456]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:09:17 np0005548789.novalocal sshd[18459]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:09:17 np0005548789.novalocal sshd[18457]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:09:17 np0005548789.novalocal sshd[18458]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:09:17 np0005548789.novalocal sshd[18456]: Connection closed by 38.102.83.83 port 53734 [preauth]
Dec 06 07:09:17 np0005548789.novalocal sshd[18459]: Connection closed by 38.102.83.83 port 53740 [preauth]
Dec 06 07:09:17 np0005548789.novalocal sshd[18457]: Unable to negotiate with 38.102.83.83 port 53758: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 06 07:09:17 np0005548789.novalocal sshd[18458]: Unable to negotiate with 38.102.83.83 port 53774: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 06 07:09:22 np0005548789.novalocal sshd[18465]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:09:22 np0005548789.novalocal sshd[18465]: Accepted publickey for zuul from 38.102.83.114 port 56990 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:09:22 np0005548789.novalocal systemd-logind[766]: New session 7 of user zuul.
Dec 06 07:09:22 np0005548789.novalocal systemd[1]: Started Session 7 of User zuul.
Dec 06 07:09:22 np0005548789.novalocal sshd[18465]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:09:22 np0005548789.novalocal python3[18482]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEYVtM235X0xWH2FKli0CUGpvCLQnDDtCI4yCYqNdWcGuxt1LThsgCBuwYYpkvH+K5VLRKMEyM949Yu6yQU/mgI= zuul@np0005548782.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 07:09:23 np0005548789.novalocal sudo[18496]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gybtpgogqvujxyecymroukgbxavistwj ; /usr/bin/python3
Dec 06 07:09:23 np0005548789.novalocal sudo[18496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:09:23 np0005548789.novalocal python3[18498]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEYVtM235X0xWH2FKli0CUGpvCLQnDDtCI4yCYqNdWcGuxt1LThsgCBuwYYpkvH+K5VLRKMEyM949Yu6yQU/mgI= zuul@np0005548782.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 07:09:23 np0005548789.novalocal sudo[18496]: pam_unix(sudo:session): session closed for user root
Dec 06 07:09:25 np0005548789.novalocal sshd[18465]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:09:25 np0005548789.novalocal systemd[1]: session-7.scope: Deactivated successfully.
Dec 06 07:09:25 np0005548789.novalocal systemd-logind[766]: Session 7 logged out. Waiting for processes to exit.
Dec 06 07:09:25 np0005548789.novalocal systemd-logind[766]: Removed session 7.
Dec 06 07:11:02 np0005548789.novalocal sshd[18500]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:11:02 np0005548789.novalocal sshd[18500]: Accepted publickey for zuul from 38.102.83.114 port 48248 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:11:02 np0005548789.novalocal systemd-logind[766]: New session 8 of user zuul.
Dec 06 07:11:02 np0005548789.novalocal systemd[1]: Started Session 8 of User zuul.
Dec 06 07:11:02 np0005548789.novalocal sshd[18500]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:11:02 np0005548789.novalocal sudo[18517]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrtoicknchcxwgoczfzvhchziytzfjcy ; /usr/bin/python3
Dec 06 07:11:02 np0005548789.novalocal sudo[18517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:11:03 np0005548789.novalocal python3[18519]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVgIoETU+ZMXzSQYJdf7tKLhQsLaB9easlDHbhHsBFXd1+Axjoyg338dVOvCx68r/a15lecdlSwbLqd4GXxUOdHnWLa1I9u6bd6azOwE0Dd6ZjnquN3BRq9dLJXMlKHhXMddL6WHNfxT/JOL+gKp0CM74naUBGqrzV05qlb19n7xZJtmxVohAGGeQdFwQJBVoQ6yZOjcJZ5CpbWCs4pFXZT/31fA0KIAJkrzAeUGRRkQEnzXY1riF0RHwvXaNJ0ZoAYfT7q263Pd5gnQEmpiBirUBH6CXJn4lIQyNMyVRbnKWemW9P1kyv2bjZUPg2b1xWBE7MBTs/wMt1RjdO9p+sxtwOd2IQMf1t3JLa2p3xqgxtGTMugpJUBr1TWwdLoHl+eAMuWZwAWofLWICHUlPzyTN8L8acu0im2eR60FEl9XdUjp8DYCBGxhhIVx+xZxj6nTnNc5T7GJpJlCTF+9YPlDVrLg8y/YXly0BoOqr7p+RaqMAJnoZymNDbuu9V3Vs= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 07:11:03 np0005548789.novalocal sudo[18517]: pam_unix(sudo:session): session closed for user root
Dec 06 07:11:03 np0005548789.novalocal sudo[18533]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdidlyxblocjmtayrbpfkpitszckkolm ; /usr/bin/python3
Dec 06 07:11:03 np0005548789.novalocal sudo[18533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:11:04 np0005548789.novalocal python3[18535]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548789.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 06 07:11:04 np0005548789.novalocal sudo[18533]: pam_unix(sudo:session): session closed for user root
Dec 06 07:11:05 np0005548789.novalocal sudo[18583]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jedvzwkgdfakwtcdmwveypzcpbqndreu ; /usr/bin/python3
Dec 06 07:11:05 np0005548789.novalocal sudo[18583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:11:05 np0005548789.novalocal python3[18585]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:11:05 np0005548789.novalocal sudo[18583]: pam_unix(sudo:session): session closed for user root
Dec 06 07:11:05 np0005548789.novalocal sudo[18626]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzvvxhyuqrffagkrgbzqywdbwynvybbf ; /usr/bin/python3
Dec 06 07:11:05 np0005548789.novalocal sudo[18626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:11:05 np0005548789.novalocal python3[18628]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765005065.3021338-136-108546485667979/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=00c07cb1874e45b180d5c151333e96b1_id_rsa follow=False checksum=59556e0a2f4b936183817041ae1f59f0f3c92dd9 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:11:05 np0005548789.novalocal sudo[18626]: pam_unix(sudo:session): session closed for user root
Dec 06 07:11:07 np0005548789.novalocal sudo[18688]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-waykwkgsebdbswmzdcqmzmtzpirckckt ; /usr/bin/python3
Dec 06 07:11:07 np0005548789.novalocal sudo[18688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:11:07 np0005548789.novalocal python3[18690]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:11:07 np0005548789.novalocal sudo[18688]: pam_unix(sudo:session): session closed for user root
Dec 06 07:11:07 np0005548789.novalocal sudo[18731]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adlnegttckhycvwssblcxffulsvpluns ; /usr/bin/python3
Dec 06 07:11:07 np0005548789.novalocal sudo[18731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:11:07 np0005548789.novalocal python3[18733]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765005066.9738903-222-9153435052993/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=00c07cb1874e45b180d5c151333e96b1_id_rsa.pub follow=False checksum=2b77fe3fb3441abe077d8d93b68745bd8f418f92 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:11:07 np0005548789.novalocal sudo[18731]: pam_unix(sudo:session): session closed for user root
Dec 06 07:11:09 np0005548789.novalocal sudo[18761]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxnydcqocwwwmoxalyrqatuzhfrnqnwq ; /usr/bin/python3
Dec 06 07:11:09 np0005548789.novalocal sudo[18761]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:11:09 np0005548789.novalocal python3[18763]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:11:09 np0005548789.novalocal sudo[18761]: pam_unix(sudo:session): session closed for user root
Dec 06 07:11:10 np0005548789.novalocal python3[18809]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:11:11 np0005548789.novalocal python3[18825]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmp9y3n8ftm recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:11:12 np0005548789.novalocal python3[18885]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:11:12 np0005548789.novalocal python3[18901]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpm7rf67np recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:11:14 np0005548789.novalocal python3[18961]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:11:14 np0005548789.novalocal python3[18977]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmpb9v6_722 recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:11:14 np0005548789.novalocal sshd[18500]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:11:14 np0005548789.novalocal systemd[1]: session-8.scope: Deactivated successfully.
Dec 06 07:11:14 np0005548789.novalocal systemd[1]: session-8.scope: Consumed 3.516s CPU time.
Dec 06 07:11:14 np0005548789.novalocal systemd-logind[766]: Session 8 logged out. Waiting for processes to exit.
Dec 06 07:11:14 np0005548789.novalocal systemd-logind[766]: Removed session 8.
Dec 06 07:12:50 np0005548789.novalocal sshd[18994]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:13:28 np0005548789.novalocal sshd[18996]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:13:28 np0005548789.novalocal sshd[18996]: Accepted publickey for zuul from 38.102.83.83 port 52286 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:13:28 np0005548789.novalocal systemd[1]: Started Session 9 of User zuul.
Dec 06 07:13:28 np0005548789.novalocal systemd-logind[766]: New session 9 of user zuul.
Dec 06 07:13:28 np0005548789.novalocal sshd[18996]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:13:28 np0005548789.novalocal python3[19042]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:14:50 np0005548789.novalocal sshd[18994]: fatal: Timeout before authentication for 43.224.126.107 port 8546
Dec 06 07:18:01 np0005548789.novalocal anacron[6192]: Job `cron.daily' started
Dec 06 07:18:01 np0005548789.novalocal anacron[6192]: Job `cron.daily' terminated
Dec 06 07:18:28 np0005548789.novalocal sshd[18999]: Received disconnect from 38.102.83.83 port 52286:11: disconnected by user
Dec 06 07:18:28 np0005548789.novalocal sshd[18999]: Disconnected from user zuul 38.102.83.83 port 52286
Dec 06 07:18:28 np0005548789.novalocal sshd[18996]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:18:28 np0005548789.novalocal systemd[1]: session-9.scope: Deactivated successfully.
Dec 06 07:18:28 np0005548789.novalocal systemd-logind[766]: Session 9 logged out. Waiting for processes to exit.
Dec 06 07:18:28 np0005548789.novalocal systemd-logind[766]: Removed session 9.
Dec 06 07:19:06 np0005548789.novalocal sshd[19048]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:19:08 np0005548789.novalocal sshd[19048]: Invalid user 1 from 91.202.233.33 port 21040
Dec 06 07:19:08 np0005548789.novalocal sshd[19048]: Connection reset by invalid user 1 91.202.233.33 port 21040 [preauth]
Dec 06 07:19:08 np0005548789.novalocal sshd[19050]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:19:10 np0005548789.novalocal sshd[19050]: Invalid user admin from 91.202.233.33 port 21046
Dec 06 07:19:11 np0005548789.novalocal sshd[19050]: Connection reset by invalid user admin 91.202.233.33 port 21046 [preauth]
Dec 06 07:19:11 np0005548789.novalocal sshd[19052]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:19:13 np0005548789.novalocal sshd[19052]: Invalid user www from 91.202.233.33 port 32842
Dec 06 07:19:13 np0005548789.novalocal sshd[19052]: Connection reset by invalid user www 91.202.233.33 port 32842 [preauth]
Dec 06 07:19:13 np0005548789.novalocal sshd[19054]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:19:14 np0005548789.novalocal sshd[19054]: Invalid user oracle from 91.202.233.33 port 32858
Dec 06 07:19:15 np0005548789.novalocal sshd[19054]: Connection reset by invalid user oracle 91.202.233.33 port 32858 [preauth]
Dec 06 07:19:15 np0005548789.novalocal sshd[19056]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:19:17 np0005548789.novalocal sshd[19056]: Connection reset by authenticating user root 91.202.233.33 port 32870 [preauth]
Dec 06 07:24:15 np0005548789.novalocal sshd[19059]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:24:17 np0005548789.novalocal sshd[19059]: Connection reset by authenticating user root 45.140.17.124 port 26204 [preauth]
Dec 06 07:24:17 np0005548789.novalocal sshd[19061]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:24:19 np0005548789.novalocal sshd[19061]: Invalid user kali from 45.140.17.124 port 26212
Dec 06 07:24:19 np0005548789.novalocal sshd[19061]: Connection reset by invalid user kali 45.140.17.124 port 26212 [preauth]
Dec 06 07:24:19 np0005548789.novalocal sshd[19063]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:24:22 np0005548789.novalocal sshd[19063]: Connection reset by authenticating user root 45.140.17.124 port 26228 [preauth]
Dec 06 07:24:22 np0005548789.novalocal sshd[19065]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:24:24 np0005548789.novalocal sshd[19065]: Connection reset by authenticating user root 45.140.17.124 port 26242 [preauth]
Dec 06 07:24:25 np0005548789.novalocal sshd[19067]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:24:27 np0005548789.novalocal sshd[19067]: Connection reset by authenticating user root 45.140.17.124 port 20888 [preauth]
Dec 06 07:29:18 np0005548789.novalocal sshd[19070]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:29:19 np0005548789.novalocal sshd[19070]: Received disconnect from 154.201.83.49 port 44728:11: Bye Bye [preauth]
Dec 06 07:29:19 np0005548789.novalocal sshd[19070]: Disconnected from authenticating user root 154.201.83.49 port 44728 [preauth]
Dec 06 07:29:41 np0005548789.novalocal sshd[19072]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:30:02 np0005548789.novalocal sshd[19074]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:30:03 np0005548789.novalocal sshd[19074]: Received disconnect from 162.241.87.197 port 41144:11: Bye Bye [preauth]
Dec 06 07:30:03 np0005548789.novalocal sshd[19074]: Disconnected from authenticating user root 162.241.87.197 port 41144 [preauth]
Dec 06 07:30:55 np0005548789.novalocal sshd[19078]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:06 np0005548789.novalocal sshd[19079]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:06 np0005548789.novalocal sshd[19079]: error: kex_exchange_identification: Connection closed by remote host
Dec 06 07:31:06 np0005548789.novalocal sshd[19079]: Connection closed by 223.108.74.219 port 38302
Dec 06 07:31:06 np0005548789.novalocal sshd[19080]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:07 np0005548789.novalocal sshd[19080]: Invalid user a from 223.108.74.219 port 38766
Dec 06 07:31:07 np0005548789.novalocal sshd[19080]: Connection closed by invalid user a 223.108.74.219 port 38766 [preauth]
Dec 06 07:31:31 np0005548789.novalocal sshd[19083]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:31 np0005548789.novalocal sshd[19083]: Accepted publickey for zuul from 38.102.83.114 port 35168 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:31:31 np0005548789.novalocal systemd-logind[766]: New session 10 of user zuul.
Dec 06 07:31:31 np0005548789.novalocal systemd[1]: Started Session 10 of User zuul.
Dec 06 07:31:31 np0005548789.novalocal sshd[19083]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:31:32 np0005548789.novalocal python3[19100]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-1bf3-6840-00000000000c-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:31:32 np0005548789.novalocal sshd[19103]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:33 np0005548789.novalocal sshd[19103]: Received disconnect from 195.250.72.168 port 37424:11: Bye Bye [preauth]
Dec 06 07:31:33 np0005548789.novalocal sshd[19103]: Disconnected from authenticating user root 195.250.72.168 port 37424 [preauth]
Dec 06 07:31:33 np0005548789.novalocal sudo[19120]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogcmrotcwpgqnppebafcqldfoquhjlmo ; /usr/bin/python3
Dec 06 07:31:33 np0005548789.novalocal sudo[19120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:31:33 np0005548789.novalocal python3[19122]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-1bf3-6840-00000000000d-1-overcloudnovacompute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:31:35 np0005548789.novalocal sudo[19120]: pam_unix(sudo:session): session closed for user root
Dec 06 07:31:38 np0005548789.novalocal sudo[19139]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muxsjdyhalwsjlrvoypuzzvhxgkirjbd ; /usr/bin/python3
Dec 06 07:31:38 np0005548789.novalocal sudo[19139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:31:38 np0005548789.novalocal python3[19141]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Dec 06 07:31:39 np0005548789.novalocal sshd[19143]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:41 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:31:41 np0005548789.novalocal sshd[19072]: fatal: Timeout before authentication for 180.76.146.235 port 60530
Dec 06 07:31:41 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:31:44 np0005548789.novalocal sshd[19143]: Connection reset by authenticating user root 45.135.232.92 port 61704 [preauth]
Dec 06 07:31:44 np0005548789.novalocal sshd[19273]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:46 np0005548789.novalocal sshd[19273]: Connection reset by authenticating user root 45.135.232.92 port 38096 [preauth]
Dec 06 07:31:46 np0005548789.novalocal sshd[19275]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:48 np0005548789.novalocal sshd[19275]: Connection reset by authenticating user root 45.135.232.92 port 38106 [preauth]
Dec 06 07:31:48 np0005548789.novalocal sshd[19277]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:50 np0005548789.novalocal sshd[19277]: Invalid user User from 45.135.232.92 port 38118
Dec 06 07:31:51 np0005548789.novalocal sshd[19283]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:51 np0005548789.novalocal sshd[19277]: Connection reset by invalid user User 45.135.232.92 port 38118 [preauth]
Dec 06 07:31:51 np0005548789.novalocal sshd[19286]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:52 np0005548789.novalocal sshd[19283]: Received disconnect from 77.222.100.142 port 41882:11: Bye Bye [preauth]
Dec 06 07:31:52 np0005548789.novalocal sshd[19283]: Disconnected from authenticating user root 77.222.100.142 port 41882 [preauth]
Dec 06 07:31:53 np0005548789.novalocal sshd[19291]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:31:53 np0005548789.novalocal sshd[19291]: Received disconnect from 74.94.234.151 port 48104:11: Bye Bye [preauth]
Dec 06 07:31:53 np0005548789.novalocal sshd[19291]: Disconnected from authenticating user root 74.94.234.151 port 48104 [preauth]
Dec 06 07:31:54 np0005548789.novalocal sshd[19286]: Connection reset by authenticating user sshd 45.135.232.92 port 38124 [preauth]
Dec 06 07:32:06 np0005548789.novalocal sudo[19139]: pam_unix(sudo:session): session closed for user root
Dec 06 07:32:36 np0005548789.novalocal sudo[19311]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrgnghadsbifsrzczqhmzbwdwwewowdj ; /usr/bin/python3
Dec 06 07:32:36 np0005548789.novalocal sudo[19311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:32:36 np0005548789.novalocal python3[19313]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Dec 06 07:32:39 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:32:39 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:32:42 np0005548789.novalocal sudo[19311]: pam_unix(sudo:session): session closed for user root
Dec 06 07:32:47 np0005548789.novalocal sudo[19452]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtekizgiipowwobteyyritaplehenzib ; /usr/bin/python3
Dec 06 07:32:47 np0005548789.novalocal sudo[19452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:32:48 np0005548789.novalocal python3[19454]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Dec 06 07:32:50 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:32:50 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:32:55 np0005548789.novalocal sshd[19078]: fatal: Timeout before authentication for 115.190.41.148 port 40698
Dec 06 07:32:55 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:32:55 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:33:03 np0005548789.novalocal sudo[19452]: pam_unix(sudo:session): session closed for user root
Dec 06 07:33:21 np0005548789.novalocal sudo[19728]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkybkzdrzdifkornzparfmwumwyupriu ; /usr/bin/python3
Dec 06 07:33:21 np0005548789.novalocal sudo[19728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:33:21 np0005548789.novalocal python3[19730]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Dec 06 07:33:23 np0005548789.novalocal sshd[19733]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:33:23 np0005548789.novalocal sshd[19733]: Received disconnect from 74.94.234.151 port 46924:11: Bye Bye [preauth]
Dec 06 07:33:23 np0005548789.novalocal sshd[19733]: Disconnected from authenticating user root 74.94.234.151 port 46924 [preauth]
Dec 06 07:33:24 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:33:24 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:33:24 np0005548789.novalocal sshd[19857]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:33:26 np0005548789.novalocal sshd[19857]: Received disconnect from 154.201.83.49 port 42522:11: Bye Bye [preauth]
Dec 06 07:33:26 np0005548789.novalocal sshd[19857]: Disconnected from authenticating user root 154.201.83.49 port 42522 [preauth]
Dec 06 07:33:29 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:33:30 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:33:36 np0005548789.novalocal sudo[19728]: pam_unix(sudo:session): session closed for user root
Dec 06 07:33:38 np0005548789.novalocal sshd[20115]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:33:38 np0005548789.novalocal sshd[20115]: Received disconnect from 162.241.87.197 port 47354:11: Bye Bye [preauth]
Dec 06 07:33:38 np0005548789.novalocal sshd[20115]: Disconnected from authenticating user root 162.241.87.197 port 47354 [preauth]
Dec 06 07:33:41 np0005548789.novalocal sshd[20117]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:33:42 np0005548789.novalocal sshd[20117]: Received disconnect from 195.250.72.168 port 44606:11: Bye Bye [preauth]
Dec 06 07:33:42 np0005548789.novalocal sshd[20117]: Disconnected from authenticating user root 195.250.72.168 port 44606 [preauth]
Dec 06 07:33:51 np0005548789.novalocal sudo[20132]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtcegglxnhakkagtvboicsygwockgbsj ; /usr/bin/python3
Dec 06 07:33:51 np0005548789.novalocal sudo[20132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:33:52 np0005548789.novalocal python3[20134]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Dec 06 07:33:54 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:33:55 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:34:00 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:34:00 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:34:08 np0005548789.novalocal sudo[20132]: pam_unix(sudo:session): session closed for user root
Dec 06 07:34:24 np0005548789.novalocal sudo[20470]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbcnlsyeunsyzcjlmpzhxukrajdkfzac ; /usr/bin/python3
Dec 06 07:34:24 np0005548789.novalocal sudo[20470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:34:24 np0005548789.novalocal python3[20472]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-1bf3-6840-000000000013-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:34:26 np0005548789.novalocal sudo[20470]: pam_unix(sudo:session): session closed for user root
Dec 06 07:34:27 np0005548789.novalocal sshd[20476]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:34:28 np0005548789.novalocal sshd[20476]: Received disconnect from 77.222.100.142 port 55318:11: Bye Bye [preauth]
Dec 06 07:34:28 np0005548789.novalocal sshd[20476]: Disconnected from authenticating user root 77.222.100.142 port 55318 [preauth]
Dec 06 07:34:29 np0005548789.novalocal sudo[20491]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btpsbgvxptuoatyrhnxjhgsvszpwowke ; /usr/bin/python3
Dec 06 07:34:29 np0005548789.novalocal sudo[20491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:34:29 np0005548789.novalocal python3[20493]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 07:34:48 np0005548789.novalocal sshd[20586]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:34:48 np0005548789.novalocal sshd[20589]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:34:49 np0005548789.novalocal kernel: SELinux:  Converting 490 SID table entries...
Dec 06 07:34:49 np0005548789.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 07:34:49 np0005548789.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 06 07:34:49 np0005548789.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 07:34:49 np0005548789.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 06 07:34:49 np0005548789.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 07:34:49 np0005548789.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 07:34:49 np0005548789.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 07:34:49 np0005548789.novalocal groupadd[20595]: group added to /etc/group: name=unbound, GID=987
Dec 06 07:34:49 np0005548789.novalocal groupadd[20595]: group added to /etc/gshadow: name=unbound
Dec 06 07:34:49 np0005548789.novalocal sshd[20586]: Received disconnect from 74.94.234.151 port 45336:11: Bye Bye [preauth]
Dec 06 07:34:49 np0005548789.novalocal sshd[20586]: Disconnected from authenticating user root 74.94.234.151 port 45336 [preauth]
Dec 06 07:34:49 np0005548789.novalocal groupadd[20595]: new group: name=unbound, GID=987
Dec 06 07:34:49 np0005548789.novalocal useradd[20602]: new user: name=unbound, UID=987, GID=987, home=/etc/unbound, shell=/sbin/nologin, from=none
Dec 06 07:34:49 np0005548789.novalocal dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Dec 06 07:34:49 np0005548789.novalocal systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 06 07:34:49 np0005548789.novalocal groupadd[20615]: group added to /etc/group: name=openvswitch, GID=986
Dec 06 07:34:49 np0005548789.novalocal groupadd[20615]: group added to /etc/gshadow: name=openvswitch
Dec 06 07:34:49 np0005548789.novalocal groupadd[20615]: new group: name=openvswitch, GID=986
Dec 06 07:34:49 np0005548789.novalocal useradd[20622]: new user: name=openvswitch, UID=986, GID=986, home=/, shell=/sbin/nologin, from=none
Dec 06 07:34:49 np0005548789.novalocal groupadd[20630]: group added to /etc/group: name=hugetlbfs, GID=985
Dec 06 07:34:49 np0005548789.novalocal groupadd[20630]: group added to /etc/gshadow: name=hugetlbfs
Dec 06 07:34:49 np0005548789.novalocal groupadd[20630]: new group: name=hugetlbfs, GID=985
Dec 06 07:34:49 np0005548789.novalocal usermod[20638]: add 'openvswitch' to group 'hugetlbfs'
Dec 06 07:34:49 np0005548789.novalocal usermod[20638]: add 'openvswitch' to shadow group 'hugetlbfs'
Dec 06 07:34:49 np0005548789.novalocal sshd[20646]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:34:50 np0005548789.novalocal sshd[20646]: Received disconnect from 162.241.87.197 port 60320:11: Bye Bye [preauth]
Dec 06 07:34:50 np0005548789.novalocal sshd[20646]: Disconnected from authenticating user root 162.241.87.197 port 60320 [preauth]
Dec 06 07:34:50 np0005548789.novalocal sshd[20589]: Invalid user admin from 78.128.112.74 port 36532
Dec 06 07:34:50 np0005548789.novalocal sshd[20589]: Connection closed by invalid user admin 78.128.112.74 port 36532 [preauth]
Dec 06 07:34:53 np0005548789.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 07:34:53 np0005548789.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 06 07:34:53 np0005548789.novalocal systemd[1]: Reloading.
Dec 06 07:34:53 np0005548789.novalocal systemd-rc-local-generator[21158]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:34:53 np0005548789.novalocal systemd-sysv-generator[21164]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:34:53 np0005548789.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:34:53 np0005548789.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 07:34:54 np0005548789.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 07:34:54 np0005548789.novalocal systemd[1]: Finished man-db-cache-update.service.
Dec 06 07:34:54 np0005548789.novalocal systemd[1]: run-r3049330e21e64e75be809c4d43891857.service: Deactivated successfully.
Dec 06 07:34:55 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:34:55 np0005548789.novalocal sudo[20491]: pam_unix(sudo:session): session closed for user root
Dec 06 07:34:55 np0005548789.novalocal rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 07:34:55 np0005548789.novalocal sshd[21694]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:34:57 np0005548789.novalocal sshd[21694]: Received disconnect from 154.201.83.49 port 47892:11: Bye Bye [preauth]
Dec 06 07:34:57 np0005548789.novalocal sshd[21694]: Disconnected from authenticating user root 154.201.83.49 port 47892 [preauth]
Dec 06 07:35:02 np0005548789.novalocal sshd[21696]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:35:03 np0005548789.novalocal sshd[21696]: Received disconnect from 195.250.72.168 port 46152:11: Bye Bye [preauth]
Dec 06 07:35:03 np0005548789.novalocal sshd[21696]: Disconnected from authenticating user root 195.250.72.168 port 46152 [preauth]
Dec 06 07:35:22 np0005548789.novalocal sudo[21712]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuqrmbjeuybwkusonofbivhgbobszuxz ; /usr/bin/python3
Dec 06 07:35:22 np0005548789.novalocal sudo[21712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:22 np0005548789.novalocal python3[21714]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-1bf3-6840-000000000015-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:35:35 np0005548789.novalocal sshd[21718]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:35:36 np0005548789.novalocal sshd[21718]: Received disconnect from 77.222.100.142 port 49568:11: Bye Bye [preauth]
Dec 06 07:35:36 np0005548789.novalocal sshd[21718]: Disconnected from authenticating user root 77.222.100.142 port 49568 [preauth]
Dec 06 07:35:39 np0005548789.novalocal sudo[21712]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:53 np0005548789.novalocal sudo[21734]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvnymmqptseohkizuytgmopeyqbncvrw ; /usr/bin/python3
Dec 06 07:35:53 np0005548789.novalocal sudo[21734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:53 np0005548789.novalocal python3[21736]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:35:53 np0005548789.novalocal sudo[21734]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:54 np0005548789.novalocal sudo[21782]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etukwffoacvtydxwsnjqznnxevgvllve ; /usr/bin/python3
Dec 06 07:35:54 np0005548789.novalocal sudo[21782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:54 np0005548789.novalocal python3[21784]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:35:54 np0005548789.novalocal sudo[21782]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:54 np0005548789.novalocal sudo[21825]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sunflrfggqlesahsbabixlstoatfewlf ; /usr/bin/python3
Dec 06 07:35:54 np0005548789.novalocal sudo[21825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:54 np0005548789.novalocal python3[21827]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765006553.982665-295-118966497809452/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=9333f42ac4b9baf349a5c32f7bcba3335b5912e0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:35:54 np0005548789.novalocal sudo[21825]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:55 np0005548789.novalocal sudo[21855]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxrukaykfhcybipntugedgtvfkfwppwk ; /usr/bin/python3
Dec 06 07:35:55 np0005548789.novalocal sudo[21855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:56 np0005548789.novalocal python3[21857]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network  state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 06 07:35:56 np0005548789.novalocal sudo[21855]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:56 np0005548789.novalocal systemd-journald[619]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Dec 06 07:35:56 np0005548789.novalocal systemd-journald[619]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 07:35:56 np0005548789.novalocal rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 07:35:56 np0005548789.novalocal rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 07:35:56 np0005548789.novalocal sudo[21876]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mvwybibcjvgpszkuissuynqawnqfnwvt ; /usr/bin/python3
Dec 06 07:35:56 np0005548789.novalocal sudo[21876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:56 np0005548789.novalocal python3[21878]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 06 07:35:56 np0005548789.novalocal sudo[21876]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:56 np0005548789.novalocal sudo[21896]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flwseszkdvkaqthdfmbfcvrytezybzvu ; /usr/bin/python3
Dec 06 07:35:56 np0005548789.novalocal sudo[21896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:56 np0005548789.novalocal python3[21898]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 06 07:35:56 np0005548789.novalocal sudo[21896]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:56 np0005548789.novalocal sudo[21916]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyztnetjnfsajlstirijwxiznmwppiok ; /usr/bin/python3
Dec 06 07:35:56 np0005548789.novalocal sudo[21916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:56 np0005548789.novalocal python3[21918]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 06 07:35:57 np0005548789.novalocal sudo[21916]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:57 np0005548789.novalocal sudo[21936]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldhveqdiizyoazdoqtsswhjkrjeehqcc ; /usr/bin/python3
Dec 06 07:35:57 np0005548789.novalocal sudo[21936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:35:57 np0005548789.novalocal python3[21938]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 06 07:35:57 np0005548789.novalocal sudo[21936]: pam_unix(sudo:session): session closed for user root
Dec 06 07:35:57 np0005548789.novalocal sshd[21943]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:35:58 np0005548789.novalocal sshd[21944]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:35:58 np0005548789.novalocal sshd[21945]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:35:58 np0005548789.novalocal sshd[21944]: error: kex_exchange_identification: read: Connection reset by peer
Dec 06 07:35:58 np0005548789.novalocal sshd[21944]: Connection reset by 45.140.17.97 port 27448
Dec 06 07:35:58 np0005548789.novalocal sshd[21945]: Received disconnect from 162.241.87.197 port 43018:11: Bye Bye [preauth]
Dec 06 07:35:58 np0005548789.novalocal sshd[21945]: Disconnected from authenticating user root 162.241.87.197 port 43018 [preauth]
Dec 06 07:36:00 np0005548789.novalocal sudo[21960]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yinjbymfisaqqwwupaniqdxphcdadwut ; /usr/bin/python3
Dec 06 07:36:00 np0005548789.novalocal sudo[21960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:36:00 np0005548789.novalocal python3[21962]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 07:36:01 np0005548789.novalocal systemd[1]: Starting LSB: Bring up/down networking...
Dec 06 07:36:01 np0005548789.novalocal network[21965]: WARN      : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:01 np0005548789.novalocal network[21976]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:01 np0005548789.novalocal network[21965]: WARN      : [network] 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:01 np0005548789.novalocal network[21977]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:01 np0005548789.novalocal network[21965]: WARN      : [network] It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 07:36:01 np0005548789.novalocal network[21978]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 07:36:01 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006561.6917] audit: op="connections-reload" pid=22006 uid=0 result="success"
Dec 06 07:36:01 np0005548789.novalocal network[21965]: Bringing up loopback interface:  [  OK  ]
Dec 06 07:36:01 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006561.8795] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22094 uid=0 result="success"
Dec 06 07:36:01 np0005548789.novalocal network[21965]: Bringing up interface eth0:  [  OK  ]
Dec 06 07:36:01 np0005548789.novalocal systemd[1]: Started LSB: Bring up/down networking.
Dec 06 07:36:01 np0005548789.novalocal sudo[21960]: pam_unix(sudo:session): session closed for user root
Dec 06 07:36:02 np0005548789.novalocal sudo[22133]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvftforfuiewxgvbevoaqibzglxtnehk ; /usr/bin/python3
Dec 06 07:36:02 np0005548789.novalocal sudo[22133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:36:02 np0005548789.novalocal python3[22135]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 07:36:02 np0005548789.novalocal systemd[1]: Starting Open vSwitch Database Unit...
Dec 06 07:36:02 np0005548789.novalocal chown[22139]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 06 07:36:02 np0005548789.novalocal ovs-ctl[22144]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 06 07:36:02 np0005548789.novalocal ovs-ctl[22144]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 06 07:36:02 np0005548789.novalocal ovs-ctl[22144]: Starting ovsdb-server [  OK  ]
Dec 06 07:36:02 np0005548789.novalocal ovs-vsctl[22193]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 06 07:36:02 np0005548789.novalocal ovs-vsctl[22213]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"b142a5ef-fbed-4e92-aa78-e3ad080c6370\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Dec 06 07:36:02 np0005548789.novalocal ovs-ctl[22144]: Configuring Open vSwitch system IDs [  OK  ]
Dec 06 07:36:02 np0005548789.novalocal ovs-ctl[22144]: Enabling remote OVSDB managers [  OK  ]
Dec 06 07:36:02 np0005548789.novalocal ovs-vsctl[22219]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005548789.novalocal
Dec 06 07:36:02 np0005548789.novalocal systemd[1]: Started Open vSwitch Database Unit.
Dec 06 07:36:02 np0005548789.novalocal systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 06 07:36:02 np0005548789.novalocal systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 06 07:36:02 np0005548789.novalocal systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 06 07:36:02 np0005548789.novalocal kernel: openvswitch: Open vSwitch switching datapath
Dec 06 07:36:02 np0005548789.novalocal ovs-ctl[22263]: Inserting openvswitch module [  OK  ]
Dec 06 07:36:02 np0005548789.novalocal ovs-ctl[22232]: Starting ovs-vswitchd [  OK  ]
Dec 06 07:36:02 np0005548789.novalocal ovs-ctl[22232]: Enabling remote OVSDB managers [  OK  ]
Dec 06 07:36:02 np0005548789.novalocal systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 06 07:36:02 np0005548789.novalocal ovs-vsctl[22282]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005548789.novalocal
Dec 06 07:36:02 np0005548789.novalocal systemd[1]: Starting Open vSwitch...
Dec 06 07:36:02 np0005548789.novalocal systemd[1]: Finished Open vSwitch.
Dec 06 07:36:02 np0005548789.novalocal sudo[22133]: pam_unix(sudo:session): session closed for user root
Dec 06 07:36:05 np0005548789.novalocal sudo[22298]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnywklevofgxrlassrtuvrnpbyspjtsp ; /usr/bin/python3
Dec 06 07:36:05 np0005548789.novalocal sudo[22298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:36:05 np0005548789.novalocal python3[22300]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml _uses_shell=True zuul_log_id=fa163ef9-e89a-1bf3-6840-00000000001a-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:36:06 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006566.3443] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22458 uid=0 result="success"
Dec 06 07:36:06 np0005548789.novalocal ifup[22459]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:06 np0005548789.novalocal ifup[22460]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:06 np0005548789.novalocal ifup[22461]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:06 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006566.3710] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22467 uid=0 result="success"
Dec 06 07:36:06 np0005548789.novalocal ovs-vsctl[22469]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:66:7f:12 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Dec 06 07:36:06 np0005548789.novalocal kernel: device ovs-system entered promiscuous mode
Dec 06 07:36:06 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006566.3981] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Dec 06 07:36:06 np0005548789.novalocal systemd-udevd[22471]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 07:36:06 np0005548789.novalocal kernel: Timeout policy base is empty
Dec 06 07:36:06 np0005548789.novalocal kernel: Failed to associated timeout policy `ovs_test_tp'
Dec 06 07:36:06 np0005548789.novalocal kernel: device br-ex entered promiscuous mode
Dec 06 07:36:06 np0005548789.novalocal systemd-udevd[22485]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 07:36:06 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006566.4412] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Dec 06 07:36:06 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006566.4648] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22497 uid=0 result="success"
Dec 06 07:36:06 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006566.4860] device (br-ex): carrier: link connected
Dec 06 07:36:09 np0005548789.novalocal sshd[22517]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:36:09 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006569.5386] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22528 uid=0 result="success"
Dec 06 07:36:09 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006569.5826] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22543 uid=0 result="success"
Dec 06 07:36:09 np0005548789.novalocal NET[22568]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Dec 06 07:36:09 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006569.6532] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Dec 06 07:36:09 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006569.6718] dhcp4 (eth1): canceled DHCP transaction
Dec 06 07:36:09 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006569.6719] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 06 07:36:09 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006569.6719] dhcp4 (eth1): state changed no lease
Dec 06 07:36:09 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006569.6746] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22577 uid=0 result="success"
Dec 06 07:36:09 np0005548789.novalocal ifup[22578]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:09 np0005548789.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 07:36:09 np0005548789.novalocal ifup[22580]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:09 np0005548789.novalocal ifup[22581]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:09 np0005548789.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 07:36:09 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006569.7050] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22594 uid=0 result="success"
Dec 06 07:36:09 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006569.7477] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22605 uid=0 result="success"
Dec 06 07:36:09 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006569.7541] device (eth1): carrier: link connected
Dec 06 07:36:09 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006569.7752] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22614 uid=0 result="success"
Dec 06 07:36:09 np0005548789.novalocal ipv6_wait_tentative[22626]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Dec 06 07:36:10 np0005548789.novalocal sshd[22517]: Received disconnect from 74.94.234.151 port 43754:11: Bye Bye [preauth]
Dec 06 07:36:10 np0005548789.novalocal sshd[22517]: Disconnected from authenticating user root 74.94.234.151 port 43754 [preauth]
Dec 06 07:36:10 np0005548789.novalocal ipv6_wait_tentative[22631]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Dec 06 07:36:11 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006571.8408] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22640 uid=0 result="success"
Dec 06 07:36:11 np0005548789.novalocal ovs-vsctl[22655]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Dec 06 07:36:11 np0005548789.novalocal kernel: device eth1 entered promiscuous mode
Dec 06 07:36:11 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006571.9124] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22663 uid=0 result="success"
Dec 06 07:36:11 np0005548789.novalocal ifup[22664]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:11 np0005548789.novalocal ifup[22665]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:11 np0005548789.novalocal ifup[22666]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:11 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006571.9434] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22672 uid=0 result="success"
Dec 06 07:36:11 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006571.9878] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22682 uid=0 result="success"
Dec 06 07:36:11 np0005548789.novalocal ifup[22683]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:11 np0005548789.novalocal ifup[22684]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:11 np0005548789.novalocal ifup[22685]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:12 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006572.0211] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22691 uid=0 result="success"
Dec 06 07:36:12 np0005548789.novalocal ovs-vsctl[22694]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Dec 06 07:36:12 np0005548789.novalocal kernel: device vlan23 entered promiscuous mode
Dec 06 07:36:12 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006572.0631] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Dec 06 07:36:12 np0005548789.novalocal systemd-udevd[22696]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 07:36:12 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006572.0939] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22705 uid=0 result="success"
Dec 06 07:36:12 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006572.1154] device (vlan23): carrier: link connected
Dec 06 07:36:15 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006575.1742] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22734 uid=0 result="success"
Dec 06 07:36:15 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006575.2199] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22749 uid=0 result="success"
Dec 06 07:36:15 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006575.2813] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22770 uid=0 result="success"
Dec 06 07:36:15 np0005548789.novalocal ifup[22771]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:15 np0005548789.novalocal ifup[22772]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:15 np0005548789.novalocal ifup[22773]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:15 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006575.3149] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22779 uid=0 result="success"
Dec 06 07:36:15 np0005548789.novalocal ovs-vsctl[22782]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Dec 06 07:36:15 np0005548789.novalocal kernel: device vlan20 entered promiscuous mode
Dec 06 07:36:15 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006575.3653] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Dec 06 07:36:15 np0005548789.novalocal systemd-udevd[22784]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 07:36:15 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006575.3915] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22794 uid=0 result="success"
Dec 06 07:36:15 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006575.4122] device (vlan20): carrier: link connected
Dec 06 07:36:18 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006578.4605] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22824 uid=0 result="success"
Dec 06 07:36:18 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006578.5093] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22839 uid=0 result="success"
Dec 06 07:36:18 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006578.5693] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22860 uid=0 result="success"
Dec 06 07:36:18 np0005548789.novalocal ifup[22861]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:18 np0005548789.novalocal ifup[22862]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:18 np0005548789.novalocal ifup[22863]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:18 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006578.6011] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22869 uid=0 result="success"
Dec 06 07:36:18 np0005548789.novalocal ovs-vsctl[22872]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Dec 06 07:36:18 np0005548789.novalocal kernel: device vlan22 entered promiscuous mode
Dec 06 07:36:18 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006578.6383] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Dec 06 07:36:18 np0005548789.novalocal systemd-udevd[22875]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 07:36:18 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006578.6618] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22884 uid=0 result="success"
Dec 06 07:36:18 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006578.6794] device (vlan22): carrier: link connected
Dec 06 07:36:19 np0005548789.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 07:36:20 np0005548789.novalocal sshd[22903]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:36:21 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006581.7267] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22916 uid=0 result="success"
Dec 06 07:36:21 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006581.7727] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22931 uid=0 result="success"
Dec 06 07:36:21 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006581.8227] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22952 uid=0 result="success"
Dec 06 07:36:21 np0005548789.novalocal ifup[22953]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:21 np0005548789.novalocal ifup[22954]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:21 np0005548789.novalocal ifup[22955]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:21 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006581.8549] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22961 uid=0 result="success"
Dec 06 07:36:21 np0005548789.novalocal ovs-vsctl[22964]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Dec 06 07:36:21 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006581.8975] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Dec 06 07:36:21 np0005548789.novalocal kernel: device vlan44 entered promiscuous mode
Dec 06 07:36:21 np0005548789.novalocal systemd-udevd[22966]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 07:36:21 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006581.9248] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22976 uid=0 result="success"
Dec 06 07:36:21 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006581.9474] device (vlan44): carrier: link connected
Dec 06 07:36:22 np0005548789.novalocal sshd[22994]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:36:22 np0005548789.novalocal sshd[22903]: Received disconnect from 154.201.83.49 port 47844:11: Bye Bye [preauth]
Dec 06 07:36:22 np0005548789.novalocal sshd[22903]: Disconnected from authenticating user root 154.201.83.49 port 47844 [preauth]
Dec 06 07:36:23 np0005548789.novalocal sshd[22994]: Received disconnect from 195.250.72.168 port 53874:11: Bye Bye [preauth]
Dec 06 07:36:23 np0005548789.novalocal sshd[22994]: Disconnected from authenticating user root 195.250.72.168 port 53874 [preauth]
Dec 06 07:36:25 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006585.0091] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23008 uid=0 result="success"
Dec 06 07:36:25 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006585.0550] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23023 uid=0 result="success"
Dec 06 07:36:25 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006585.1159] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23044 uid=0 result="success"
Dec 06 07:36:25 np0005548789.novalocal ifup[23045]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:25 np0005548789.novalocal ifup[23046]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:25 np0005548789.novalocal ifup[23047]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:25 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006585.1495] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23053 uid=0 result="success"
Dec 06 07:36:25 np0005548789.novalocal ovs-vsctl[23056]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Dec 06 07:36:25 np0005548789.novalocal kernel: device vlan21 entered promiscuous mode
Dec 06 07:36:25 np0005548789.novalocal systemd-udevd[23058]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 07:36:25 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006585.1914] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Dec 06 07:36:25 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006585.2155] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23068 uid=0 result="success"
Dec 06 07:36:25 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006585.2374] device (vlan21): carrier: link connected
Dec 06 07:36:28 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006588.3012] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23098 uid=0 result="success"
Dec 06 07:36:28 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006588.3525] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23113 uid=0 result="success"
Dec 06 07:36:28 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006588.4138] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23134 uid=0 result="success"
Dec 06 07:36:28 np0005548789.novalocal ifup[23135]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:28 np0005548789.novalocal ifup[23136]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:28 np0005548789.novalocal ifup[23137]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:28 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006588.4472] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23143 uid=0 result="success"
Dec 06 07:36:28 np0005548789.novalocal ovs-vsctl[23146]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Dec 06 07:36:28 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006588.5070] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23153 uid=0 result="success"
Dec 06 07:36:29 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006589.5688] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23180 uid=0 result="success"
Dec 06 07:36:29 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006589.6088] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23195 uid=0 result="success"
Dec 06 07:36:29 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006589.6674] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23216 uid=0 result="success"
Dec 06 07:36:29 np0005548789.novalocal ifup[23217]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:29 np0005548789.novalocal ifup[23218]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:29 np0005548789.novalocal ifup[23219]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:29 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006589.6982] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23225 uid=0 result="success"
Dec 06 07:36:29 np0005548789.novalocal ovs-vsctl[23228]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Dec 06 07:36:29 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006589.7454] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23235 uid=0 result="success"
Dec 06 07:36:30 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006590.8111] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23263 uid=0 result="success"
Dec 06 07:36:30 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006590.8602] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23278 uid=0 result="success"
Dec 06 07:36:30 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006590.9241] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23299 uid=0 result="success"
Dec 06 07:36:30 np0005548789.novalocal ifup[23300]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:30 np0005548789.novalocal ifup[23301]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:30 np0005548789.novalocal ifup[23302]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:30 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006590.9603] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23308 uid=0 result="success"
Dec 06 07:36:30 np0005548789.novalocal ovs-vsctl[23311]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Dec 06 07:36:31 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006591.0214] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23318 uid=0 result="success"
Dec 06 07:36:32 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006592.0815] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23346 uid=0 result="success"
Dec 06 07:36:32 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006592.1279] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23361 uid=0 result="success"
Dec 06 07:36:32 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006592.1870] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23382 uid=0 result="success"
Dec 06 07:36:32 np0005548789.novalocal ifup[23383]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:32 np0005548789.novalocal ifup[23384]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:32 np0005548789.novalocal ifup[23385]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:32 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006592.2203] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23391 uid=0 result="success"
Dec 06 07:36:32 np0005548789.novalocal ovs-vsctl[23394]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Dec 06 07:36:32 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006592.2798] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23401 uid=0 result="success"
Dec 06 07:36:33 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006593.3435] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23429 uid=0 result="success"
Dec 06 07:36:33 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006593.3979] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23444 uid=0 result="success"
Dec 06 07:36:33 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006593.4510] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23465 uid=0 result="success"
Dec 06 07:36:33 np0005548789.novalocal ifup[23466]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Dec 06 07:36:33 np0005548789.novalocal ifup[23467]: 'network-scripts' will be removed from distribution in near future.
Dec 06 07:36:33 np0005548789.novalocal ifup[23468]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Dec 06 07:36:33 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006593.4784] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23474 uid=0 result="success"
Dec 06 07:36:33 np0005548789.novalocal ovs-vsctl[23477]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Dec 06 07:36:33 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006593.5291] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23484 uid=0 result="success"
Dec 06 07:36:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006594.5922] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23512 uid=0 result="success"
Dec 06 07:36:34 np0005548789.novalocal NetworkManager[5973]: <info>  [1765006594.6385] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23527 uid=0 result="success"
Dec 06 07:36:34 np0005548789.novalocal sudo[22298]: pam_unix(sudo:session): session closed for user root
Dec 06 07:36:43 np0005548789.novalocal sshd[23545]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:36:44 np0005548789.novalocal sshd[23545]: Received disconnect from 77.222.100.142 port 45068:11: Bye Bye [preauth]
Dec 06 07:36:44 np0005548789.novalocal sshd[23545]: Disconnected from authenticating user root 77.222.100.142 port 45068 [preauth]
Dec 06 07:37:09 np0005548789.novalocal sshd[23547]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:37:09 np0005548789.novalocal sshd[23547]: Received disconnect from 162.241.87.197 port 36034:11: Bye Bye [preauth]
Dec 06 07:37:09 np0005548789.novalocal sshd[23547]: Disconnected from authenticating user root 162.241.87.197 port 36034 [preauth]
Dec 06 07:37:27 np0005548789.novalocal python3[23563]: ansible-ansible.legacy.command Invoked with _raw_params=ip a
                                                       ping -c 2 -W 2 192.168.122.10
                                                       ping -c 2 -W 2 192.168.122.11
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-1bf3-6840-00000000001b-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:37:32 np0005548789.novalocal sshd[23569]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:37:33 np0005548789.novalocal sshd[23569]: Received disconnect from 74.94.234.151 port 42164:11: Bye Bye [preauth]
Dec 06 07:37:33 np0005548789.novalocal sshd[23569]: Disconnected from authenticating user root 74.94.234.151 port 42164 [preauth]
Dec 06 07:37:33 np0005548789.novalocal python3[23584]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVgIoETU+ZMXzSQYJdf7tKLhQsLaB9easlDHbhHsBFXd1+Axjoyg338dVOvCx68r/a15lecdlSwbLqd4GXxUOdHnWLa1I9u6bd6azOwE0Dd6ZjnquN3BRq9dLJXMlKHhXMddL6WHNfxT/JOL+gKp0CM74naUBGqrzV05qlb19n7xZJtmxVohAGGeQdFwQJBVoQ6yZOjcJZ5CpbWCs4pFXZT/31fA0KIAJkrzAeUGRRkQEnzXY1riF0RHwvXaNJ0ZoAYfT7q263Pd5gnQEmpiBirUBH6CXJn4lIQyNMyVRbnKWemW9P1kyv2bjZUPg2b1xWBE7MBTs/wMt1RjdO9p+sxtwOd2IQMf1t3JLa2p3xqgxtGTMugpJUBr1TWwdLoHl+eAMuWZwAWofLWICHUlPzyTN8L8acu0im2eR60FEl9XdUjp8DYCBGxhhIVx+xZxj6nTnNc5T7GJpJlCTF+9YPlDVrLg8y/YXly0BoOqr7p+RaqMAJnoZymNDbuu9V3Vs= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 07:37:33 np0005548789.novalocal sudo[23598]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgqndffsvtsjleilbxemlyckmndlbqee ; /usr/bin/python3
Dec 06 07:37:33 np0005548789.novalocal sudo[23598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:37:34 np0005548789.novalocal python3[23600]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVgIoETU+ZMXzSQYJdf7tKLhQsLaB9easlDHbhHsBFXd1+Axjoyg338dVOvCx68r/a15lecdlSwbLqd4GXxUOdHnWLa1I9u6bd6azOwE0Dd6ZjnquN3BRq9dLJXMlKHhXMddL6WHNfxT/JOL+gKp0CM74naUBGqrzV05qlb19n7xZJtmxVohAGGeQdFwQJBVoQ6yZOjcJZ5CpbWCs4pFXZT/31fA0KIAJkrzAeUGRRkQEnzXY1riF0RHwvXaNJ0ZoAYfT7q263Pd5gnQEmpiBirUBH6CXJn4lIQyNMyVRbnKWemW9P1kyv2bjZUPg2b1xWBE7MBTs/wMt1RjdO9p+sxtwOd2IQMf1t3JLa2p3xqgxtGTMugpJUBr1TWwdLoHl+eAMuWZwAWofLWICHUlPzyTN8L8acu0im2eR60FEl9XdUjp8DYCBGxhhIVx+xZxj6nTnNc5T7GJpJlCTF+9YPlDVrLg8y/YXly0BoOqr7p+RaqMAJnoZymNDbuu9V3Vs= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 07:37:34 np0005548789.novalocal sudo[23598]: pam_unix(sudo:session): session closed for user root
Dec 06 07:37:35 np0005548789.novalocal python3[23614]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVgIoETU+ZMXzSQYJdf7tKLhQsLaB9easlDHbhHsBFXd1+Axjoyg338dVOvCx68r/a15lecdlSwbLqd4GXxUOdHnWLa1I9u6bd6azOwE0Dd6ZjnquN3BRq9dLJXMlKHhXMddL6WHNfxT/JOL+gKp0CM74naUBGqrzV05qlb19n7xZJtmxVohAGGeQdFwQJBVoQ6yZOjcJZ5CpbWCs4pFXZT/31fA0KIAJkrzAeUGRRkQEnzXY1riF0RHwvXaNJ0ZoAYfT7q263Pd5gnQEmpiBirUBH6CXJn4lIQyNMyVRbnKWemW9P1kyv2bjZUPg2b1xWBE7MBTs/wMt1RjdO9p+sxtwOd2IQMf1t3JLa2p3xqgxtGTMugpJUBr1TWwdLoHl+eAMuWZwAWofLWICHUlPzyTN8L8acu0im2eR60FEl9XdUjp8DYCBGxhhIVx+xZxj6nTnNc5T7GJpJlCTF+9YPlDVrLg8y/YXly0BoOqr7p+RaqMAJnoZymNDbuu9V3Vs= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 07:37:35 np0005548789.novalocal sudo[23628]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnwkwtifwmlemxpxdpghqvzmiuokxltq ; /usr/bin/python3
Dec 06 07:37:35 np0005548789.novalocal sudo[23628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:37:36 np0005548789.novalocal python3[23630]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVgIoETU+ZMXzSQYJdf7tKLhQsLaB9easlDHbhHsBFXd1+Axjoyg338dVOvCx68r/a15lecdlSwbLqd4GXxUOdHnWLa1I9u6bd6azOwE0Dd6ZjnquN3BRq9dLJXMlKHhXMddL6WHNfxT/JOL+gKp0CM74naUBGqrzV05qlb19n7xZJtmxVohAGGeQdFwQJBVoQ6yZOjcJZ5CpbWCs4pFXZT/31fA0KIAJkrzAeUGRRkQEnzXY1riF0RHwvXaNJ0ZoAYfT7q263Pd5gnQEmpiBirUBH6CXJn4lIQyNMyVRbnKWemW9P1kyv2bjZUPg2b1xWBE7MBTs/wMt1RjdO9p+sxtwOd2IQMf1t3JLa2p3xqgxtGTMugpJUBr1TWwdLoHl+eAMuWZwAWofLWICHUlPzyTN8L8acu0im2eR60FEl9XdUjp8DYCBGxhhIVx+xZxj6nTnNc5T7GJpJlCTF+9YPlDVrLg8y/YXly0BoOqr7p+RaqMAJnoZymNDbuu9V3Vs= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 07:37:36 np0005548789.novalocal sudo[23628]: pam_unix(sudo:session): session closed for user root
Dec 06 07:37:37 np0005548789.novalocal python3[23644]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Dec 06 07:37:37 np0005548789.novalocal python3[23659]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005548789.novalocal"
                                                       hostname_str_array=(${hostname//./ })
                                                       echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-1bf3-6840-000000000022-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:37:38 np0005548789.novalocal sudo[23677]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktsdgedxsdbvlcdrrfsjvjjxhyagxywc ; /usr/bin/python3
Dec 06 07:37:38 np0005548789.novalocal sudo[23677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:37:38 np0005548789.novalocal python3[23679]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)
                                                       hostnamectl hostname "$hostname.localdomain"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-1bf3-6840-000000000023-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:37:38 np0005548789.novalocal systemd[1]: Starting Hostname Service...
Dec 06 07:37:38 np0005548789.novalocal systemd[1]: Started Hostname Service.
Dec 06 07:37:38 np0005548789.localdomain systemd-hostnamed[23683]: Hostname set to <np0005548789.localdomain> (static)
Dec 06 07:37:38 np0005548789.localdomain NetworkManager[5973]: <info>  [1765006658.7404] hostname: static hostname changed from "np0005548789.novalocal" to "np0005548789.localdomain"
Dec 06 07:37:38 np0005548789.localdomain systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 07:37:38 np0005548789.localdomain systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 07:37:38 np0005548789.localdomain sudo[23677]: pam_unix(sudo:session): session closed for user root
Dec 06 07:37:40 np0005548789.localdomain sshd[19083]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:37:40 np0005548789.localdomain systemd[1]: session-10.scope: Deactivated successfully.
Dec 06 07:37:40 np0005548789.localdomain systemd[1]: session-10.scope: Consumed 1min 44.156s CPU time.
Dec 06 07:37:40 np0005548789.localdomain systemd-logind[766]: Session 10 logged out. Waiting for processes to exit.
Dec 06 07:37:40 np0005548789.localdomain systemd-logind[766]: Removed session 10.
Dec 06 07:37:43 np0005548789.localdomain sshd[23694]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:37:43 np0005548789.localdomain sshd[23694]: Accepted publickey for zuul from 38.102.83.114 port 35254 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:37:43 np0005548789.localdomain systemd-logind[766]: New session 11 of user zuul.
Dec 06 07:37:43 np0005548789.localdomain systemd[1]: Started Session 11 of User zuul.
Dec 06 07:37:43 np0005548789.localdomain sshd[23694]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:37:43 np0005548789.localdomain python3[23711]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Dec 06 07:37:45 np0005548789.localdomain sshd[23694]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:37:45 np0005548789.localdomain systemd[1]: session-11.scope: Deactivated successfully.
Dec 06 07:37:45 np0005548789.localdomain systemd-logind[766]: Session 11 logged out. Waiting for processes to exit.
Dec 06 07:37:45 np0005548789.localdomain systemd-logind[766]: Removed session 11.
Dec 06 07:37:47 np0005548789.localdomain sshd[23713]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:37:48 np0005548789.localdomain sshd[23715]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:37:48 np0005548789.localdomain sshd[23713]: Received disconnect from 195.250.72.168 port 35686:11: Bye Bye [preauth]
Dec 06 07:37:48 np0005548789.localdomain sshd[23713]: Disconnected from authenticating user root 195.250.72.168 port 35686 [preauth]
Dec 06 07:37:48 np0005548789.localdomain systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 07:37:49 np0005548789.localdomain sshd[23715]: Received disconnect from 154.201.83.49 port 57060:11: Bye Bye [preauth]
Dec 06 07:37:49 np0005548789.localdomain sshd[23715]: Disconnected from authenticating user root 154.201.83.49 port 57060 [preauth]
Dec 06 07:37:52 np0005548789.localdomain sshd[23717]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:37:53 np0005548789.localdomain sshd[23717]: Received disconnect from 77.222.100.142 port 37620:11: Bye Bye [preauth]
Dec 06 07:37:53 np0005548789.localdomain sshd[23717]: Disconnected from authenticating user root 77.222.100.142 port 37620 [preauth]
Dec 06 07:38:01 np0005548789.localdomain anacron[6192]: Job `cron.weekly' started
Dec 06 07:38:01 np0005548789.localdomain anacron[6192]: Job `cron.weekly' terminated
Dec 06 07:38:08 np0005548789.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 06 07:38:21 np0005548789.localdomain sshd[23725]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:38:21 np0005548789.localdomain sshd[23725]: Received disconnect from 162.241.87.197 port 38642:11: Bye Bye [preauth]
Dec 06 07:38:21 np0005548789.localdomain sshd[23725]: Disconnected from authenticating user root 162.241.87.197 port 38642 [preauth]
Dec 06 07:38:31 np0005548789.localdomain sshd[23727]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:38:31 np0005548789.localdomain sshd[23727]: Accepted publickey for zuul from 38.102.83.114 port 34520 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:38:31 np0005548789.localdomain systemd-logind[766]: New session 12 of user zuul.
Dec 06 07:38:31 np0005548789.localdomain systemd[1]: Started Session 12 of User zuul.
Dec 06 07:38:31 np0005548789.localdomain sshd[23727]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:38:31 np0005548789.localdomain sudo[23744]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khhrqowyvimtpofkawysxajtcfjfhafp ; /usr/bin/python3
Dec 06 07:38:31 np0005548789.localdomain sudo[23744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:38:31 np0005548789.localdomain python3[23746]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 07:38:35 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:38:35 np0005548789.localdomain systemd-rc-local-generator[23784]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:38:35 np0005548789.localdomain systemd-sysv-generator[23788]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:38:35 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:38:35 np0005548789.localdomain systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 06 07:38:35 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:38:35 np0005548789.localdomain systemd-sysv-generator[23831]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:38:35 np0005548789.localdomain systemd-rc-local-generator[23827]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:38:35 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:38:35 np0005548789.localdomain systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 06 07:38:35 np0005548789.localdomain systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 06 07:38:35 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:38:35 np0005548789.localdomain systemd-rc-local-generator[23865]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:38:35 np0005548789.localdomain systemd-sysv-generator[23869]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:38:35 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:38:36 np0005548789.localdomain systemd[1]: Listening on LVM2 poll daemon socket.
Dec 06 07:38:36 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 07:38:36 np0005548789.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 07:38:36 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:38:36 np0005548789.localdomain systemd-rc-local-generator[23931]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:38:36 np0005548789.localdomain systemd-sysv-generator[23934]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:38:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:38:36 np0005548789.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 07:38:36 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 07:38:37 np0005548789.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 07:38:37 np0005548789.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 07:38:37 np0005548789.localdomain systemd[1]: run-r87c78052be9c4c00b5254abdbd491c77.service: Deactivated successfully.
Dec 06 07:38:37 np0005548789.localdomain systemd[1]: run-r180c600c3766474fa0509bd24a3f2262.service: Deactivated successfully.
Dec 06 07:38:37 np0005548789.localdomain sudo[23744]: pam_unix(sudo:session): session closed for user root
Dec 06 07:38:54 np0005548789.localdomain sshd[24519]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:38:54 np0005548789.localdomain sshd[24519]: Received disconnect from 74.94.234.151 port 40582:11: Bye Bye [preauth]
Dec 06 07:38:54 np0005548789.localdomain sshd[24519]: Disconnected from authenticating user root 74.94.234.151 port 40582 [preauth]
Dec 06 07:39:01 np0005548789.localdomain sshd[24521]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:39:02 np0005548789.localdomain sshd[24521]: Received disconnect from 77.222.100.142 port 36710:11: Bye Bye [preauth]
Dec 06 07:39:02 np0005548789.localdomain sshd[24521]: Disconnected from authenticating user root 77.222.100.142 port 36710 [preauth]
Dec 06 07:39:20 np0005548789.localdomain sshd[24523]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:39:21 np0005548789.localdomain sshd[24523]: Received disconnect from 195.250.72.168 port 47118:11: Bye Bye [preauth]
Dec 06 07:39:21 np0005548789.localdomain sshd[24523]: Disconnected from authenticating user root 195.250.72.168 port 47118 [preauth]
Dec 06 07:39:26 np0005548789.localdomain sshd[24525]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:39:27 np0005548789.localdomain sshd[24525]: Received disconnect from 154.201.83.49 port 38870:11: Bye Bye [preauth]
Dec 06 07:39:27 np0005548789.localdomain sshd[24525]: Disconnected from authenticating user root 154.201.83.49 port 38870 [preauth]
Dec 06 07:39:30 np0005548789.localdomain sshd[24527]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:39:30 np0005548789.localdomain sshd[24527]: Received disconnect from 162.241.87.197 port 43066:11: Bye Bye [preauth]
Dec 06 07:39:30 np0005548789.localdomain sshd[24527]: Disconnected from authenticating user root 162.241.87.197 port 43066 [preauth]
Dec 06 07:39:37 np0005548789.localdomain sshd[23730]: Received disconnect from 38.102.83.114 port 34520:11: disconnected by user
Dec 06 07:39:37 np0005548789.localdomain sshd[23730]: Disconnected from user zuul 38.102.83.114 port 34520
Dec 06 07:39:37 np0005548789.localdomain sshd[23727]: pam_unix(sshd:session): session closed for user zuul
Dec 06 07:39:37 np0005548789.localdomain systemd[1]: session-12.scope: Deactivated successfully.
Dec 06 07:39:37 np0005548789.localdomain systemd[1]: session-12.scope: Consumed 4.664s CPU time.
Dec 06 07:39:37 np0005548789.localdomain systemd-logind[766]: Session 12 logged out. Waiting for processes to exit.
Dec 06 07:39:37 np0005548789.localdomain systemd-logind[766]: Removed session 12.
Dec 06 07:40:08 np0005548789.localdomain sshd[24529]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:40:09 np0005548789.localdomain sshd[24529]: Received disconnect from 77.222.100.142 port 54464:11: Bye Bye [preauth]
Dec 06 07:40:09 np0005548789.localdomain sshd[24529]: Disconnected from authenticating user root 77.222.100.142 port 54464 [preauth]
Dec 06 07:40:13 np0005548789.localdomain sshd[24531]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:40:13 np0005548789.localdomain sshd[24531]: Received disconnect from 74.94.234.151 port 38998:11: Bye Bye [preauth]
Dec 06 07:40:13 np0005548789.localdomain sshd[24531]: Disconnected from authenticating user root 74.94.234.151 port 38998 [preauth]
Dec 06 07:40:39 np0005548789.localdomain sshd[24533]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:40:39 np0005548789.localdomain sshd[24533]: Received disconnect from 162.241.87.197 port 52612:11: Bye Bye [preauth]
Dec 06 07:40:39 np0005548789.localdomain sshd[24533]: Disconnected from authenticating user root 162.241.87.197 port 52612 [preauth]
Dec 06 07:40:48 np0005548789.localdomain sshd[24535]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:40:49 np0005548789.localdomain sshd[24535]: Received disconnect from 195.250.72.168 port 50684:11: Bye Bye [preauth]
Dec 06 07:40:49 np0005548789.localdomain sshd[24535]: Disconnected from authenticating user root 195.250.72.168 port 50684 [preauth]
Dec 06 07:40:56 np0005548789.localdomain sshd[24537]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:40:58 np0005548789.localdomain sshd[24537]: Received disconnect from 154.201.83.49 port 50384:11: Bye Bye [preauth]
Dec 06 07:40:58 np0005548789.localdomain sshd[24537]: Disconnected from authenticating user root 154.201.83.49 port 50384 [preauth]
Dec 06 07:41:14 np0005548789.localdomain sshd[24539]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:41:15 np0005548789.localdomain sshd[24539]: Received disconnect from 77.222.100.142 port 41052:11: Bye Bye [preauth]
Dec 06 07:41:15 np0005548789.localdomain sshd[24539]: Disconnected from authenticating user root 77.222.100.142 port 41052 [preauth]
Dec 06 07:41:34 np0005548789.localdomain sshd[24541]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:41:35 np0005548789.localdomain sshd[24541]: Received disconnect from 74.94.234.151 port 37414:11: Bye Bye [preauth]
Dec 06 07:41:35 np0005548789.localdomain sshd[24541]: Disconnected from authenticating user root 74.94.234.151 port 37414 [preauth]
Dec 06 07:41:45 np0005548789.localdomain sshd[24543]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:41:45 np0005548789.localdomain sshd[24543]: Received disconnect from 162.241.87.197 port 52448:11: Bye Bye [preauth]
Dec 06 07:41:45 np0005548789.localdomain sshd[24543]: Disconnected from authenticating user root 162.241.87.197 port 52448 [preauth]
Dec 06 07:42:15 np0005548789.localdomain sshd[24545]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:42:16 np0005548789.localdomain sshd[24545]: Received disconnect from 195.250.72.168 port 52062:11: Bye Bye [preauth]
Dec 06 07:42:16 np0005548789.localdomain sshd[24545]: Disconnected from authenticating user root 195.250.72.168 port 52062 [preauth]
Dec 06 07:42:18 np0005548789.localdomain sshd[24547]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:42:19 np0005548789.localdomain sshd[24547]: Received disconnect from 77.222.100.142 port 44776:11: Bye Bye [preauth]
Dec 06 07:42:19 np0005548789.localdomain sshd[24547]: Disconnected from authenticating user root 77.222.100.142 port 44776 [preauth]
Dec 06 07:42:26 np0005548789.localdomain sshd[24549]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:42:27 np0005548789.localdomain sshd[24549]: Received disconnect from 154.201.83.49 port 41322:11: Bye Bye [preauth]
Dec 06 07:42:27 np0005548789.localdomain sshd[24549]: Disconnected from authenticating user root 154.201.83.49 port 41322 [preauth]
Dec 06 07:42:30 np0005548789.localdomain sshd[24551]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:42:32 np0005548789.localdomain sshd[24551]: Connection reset by authenticating user root 91.202.233.33 port 43692 [preauth]
Dec 06 07:42:32 np0005548789.localdomain sshd[24553]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:42:35 np0005548789.localdomain sshd[24553]: Connection reset by authenticating user root 91.202.233.33 port 58098 [preauth]
Dec 06 07:42:35 np0005548789.localdomain sshd[24555]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:42:37 np0005548789.localdomain sshd[24555]: Connection reset by authenticating user root 91.202.233.33 port 58104 [preauth]
Dec 06 07:42:38 np0005548789.localdomain sshd[24557]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:42:41 np0005548789.localdomain sshd[24557]: Connection reset by authenticating user root 91.202.233.33 port 58120 [preauth]
Dec 06 07:42:41 np0005548789.localdomain sshd[24559]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:42:44 np0005548789.localdomain sshd[24559]: Connection reset by authenticating user root 91.202.233.33 port 50336 [preauth]
Dec 06 07:42:50 np0005548789.localdomain sshd[24561]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:42:50 np0005548789.localdomain sshd[24561]: Received disconnect from 162.241.87.197 port 56832:11: Bye Bye [preauth]
Dec 06 07:42:50 np0005548789.localdomain sshd[24561]: Disconnected from authenticating user root 162.241.87.197 port 56832 [preauth]
Dec 06 07:42:51 np0005548789.localdomain sshd[24563]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:42:52 np0005548789.localdomain sshd[24563]: Received disconnect from 74.94.234.151 port 35830:11: Bye Bye [preauth]
Dec 06 07:42:52 np0005548789.localdomain sshd[24563]: Disconnected from authenticating user root 74.94.234.151 port 35830 [preauth]
Dec 06 07:43:24 np0005548789.localdomain sshd[24565]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:43:25 np0005548789.localdomain sshd[24565]: Received disconnect from 77.222.100.142 port 35126:11: Bye Bye [preauth]
Dec 06 07:43:25 np0005548789.localdomain sshd[24565]: Disconnected from authenticating user root 77.222.100.142 port 35126 [preauth]
Dec 06 07:43:42 np0005548789.localdomain sshd[24567]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:43:43 np0005548789.localdomain sshd[24567]: Received disconnect from 195.250.72.168 port 45834:11: Bye Bye [preauth]
Dec 06 07:43:43 np0005548789.localdomain sshd[24567]: Disconnected from authenticating user root 195.250.72.168 port 45834 [preauth]
Dec 06 07:43:55 np0005548789.localdomain sshd[24569]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:43:56 np0005548789.localdomain sshd[24569]: Received disconnect from 154.201.83.49 port 35010:11: Bye Bye [preauth]
Dec 06 07:43:56 np0005548789.localdomain sshd[24569]: Disconnected from authenticating user root 154.201.83.49 port 35010 [preauth]
Dec 06 07:43:58 np0005548789.localdomain sshd[24571]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:43:59 np0005548789.localdomain sshd[24571]: Received disconnect from 162.241.87.197 port 38470:11: Bye Bye [preauth]
Dec 06 07:43:59 np0005548789.localdomain sshd[24571]: Disconnected from authenticating user root 162.241.87.197 port 38470 [preauth]
Dec 06 07:44:11 np0005548789.localdomain sshd[24573]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:44:12 np0005548789.localdomain sshd[24573]: Received disconnect from 74.94.234.151 port 34242:11: Bye Bye [preauth]
Dec 06 07:44:12 np0005548789.localdomain sshd[24573]: Disconnected from authenticating user root 74.94.234.151 port 34242 [preauth]
Dec 06 07:44:32 np0005548789.localdomain sshd[24576]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:44:33 np0005548789.localdomain sshd[24576]: Received disconnect from 77.222.100.142 port 48666:11: Bye Bye [preauth]
Dec 06 07:44:33 np0005548789.localdomain sshd[24576]: Disconnected from authenticating user root 77.222.100.142 port 48666 [preauth]
Dec 06 07:45:11 np0005548789.localdomain sshd[24578]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:45:11 np0005548789.localdomain sshd[24578]: Received disconnect from 162.241.87.197 port 54808:11: Bye Bye [preauth]
Dec 06 07:45:11 np0005548789.localdomain sshd[24578]: Disconnected from authenticating user root 162.241.87.197 port 54808 [preauth]
Dec 06 07:45:14 np0005548789.localdomain sshd[24580]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:45:15 np0005548789.localdomain sshd[24580]: Received disconnect from 195.250.72.168 port 59898:11: Bye Bye [preauth]
Dec 06 07:45:15 np0005548789.localdomain sshd[24580]: Disconnected from authenticating user root 195.250.72.168 port 59898 [preauth]
Dec 06 07:45:29 np0005548789.localdomain sshd[24583]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:45:30 np0005548789.localdomain sshd[24583]: Received disconnect from 154.201.83.49 port 39076:11: Bye Bye [preauth]
Dec 06 07:45:30 np0005548789.localdomain sshd[24583]: Disconnected from authenticating user root 154.201.83.49 port 39076 [preauth]
Dec 06 07:45:34 np0005548789.localdomain sshd[24585]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:45:35 np0005548789.localdomain sshd[24585]: Received disconnect from 74.94.234.151 port 60890:11: Bye Bye [preauth]
Dec 06 07:45:35 np0005548789.localdomain sshd[24585]: Disconnected from authenticating user root 74.94.234.151 port 60890 [preauth]
Dec 06 07:45:42 np0005548789.localdomain sshd[24587]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:45:43 np0005548789.localdomain sshd[24587]: Received disconnect from 77.222.100.142 port 47276:11: Bye Bye [preauth]
Dec 06 07:45:43 np0005548789.localdomain sshd[24587]: Disconnected from authenticating user root 77.222.100.142 port 47276 [preauth]
Dec 06 07:46:22 np0005548789.localdomain sshd[24589]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:46:23 np0005548789.localdomain sshd[24589]: Received disconnect from 162.241.87.197 port 59218:11: Bye Bye [preauth]
Dec 06 07:46:23 np0005548789.localdomain sshd[24589]: Disconnected from authenticating user root 162.241.87.197 port 59218 [preauth]
Dec 06 07:46:43 np0005548789.localdomain sshd[24591]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:46:45 np0005548789.localdomain sshd[24593]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:46:45 np0005548789.localdomain sshd[24591]: Connection reset by authenticating user root 45.140.17.124 port 61320 [preauth]
Dec 06 07:46:45 np0005548789.localdomain sshd[24595]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:46:46 np0005548789.localdomain sshd[24593]: Received disconnect from 195.250.72.168 port 42132:11: Bye Bye [preauth]
Dec 06 07:46:46 np0005548789.localdomain sshd[24593]: Disconnected from authenticating user root 195.250.72.168 port 42132 [preauth]
Dec 06 07:46:47 np0005548789.localdomain sshd[24595]: Connection reset by authenticating user root 45.140.17.124 port 61334 [preauth]
Dec 06 07:46:47 np0005548789.localdomain sshd[24597]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:46:49 np0005548789.localdomain sshd[24597]: Connection reset by authenticating user root 45.140.17.124 port 61356 [preauth]
Dec 06 07:46:49 np0005548789.localdomain sshd[24599]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:46:50 np0005548789.localdomain sshd[24601]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:46:51 np0005548789.localdomain sshd[24601]: Received disconnect from 77.222.100.142 port 35494:11: Bye Bye [preauth]
Dec 06 07:46:51 np0005548789.localdomain sshd[24601]: Disconnected from authenticating user root 77.222.100.142 port 35494 [preauth]
Dec 06 07:46:51 np0005548789.localdomain sshd[24599]: Connection reset by authenticating user root 45.140.17.124 port 61370 [preauth]
Dec 06 07:46:51 np0005548789.localdomain sshd[24603]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:46:54 np0005548789.localdomain sshd[24603]: Connection reset by authenticating user root 45.140.17.124 port 61386 [preauth]
Dec 06 07:46:56 np0005548789.localdomain sshd[24606]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:46:57 np0005548789.localdomain sshd[24606]: Received disconnect from 74.94.234.151 port 59310:11: Bye Bye [preauth]
Dec 06 07:46:57 np0005548789.localdomain sshd[24606]: Disconnected from authenticating user root 74.94.234.151 port 59310 [preauth]
Dec 06 07:47:01 np0005548789.localdomain sshd[24608]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:47:02 np0005548789.localdomain sshd[24608]: Received disconnect from 154.201.83.49 port 45172:11: Bye Bye [preauth]
Dec 06 07:47:02 np0005548789.localdomain sshd[24608]: Disconnected from authenticating user root 154.201.83.49 port 45172 [preauth]
Dec 06 07:47:32 np0005548789.localdomain sshd[24610]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:47:32 np0005548789.localdomain sshd[24610]: Received disconnect from 162.241.87.197 port 46948:11: Bye Bye [preauth]
Dec 06 07:47:32 np0005548789.localdomain sshd[24610]: Disconnected from authenticating user root 162.241.87.197 port 46948 [preauth]
Dec 06 07:47:57 np0005548789.localdomain sshd[24612]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:47:58 np0005548789.localdomain sshd[24612]: Received disconnect from 77.222.100.142 port 59810:11: Bye Bye [preauth]
Dec 06 07:47:58 np0005548789.localdomain sshd[24612]: Disconnected from authenticating user root 77.222.100.142 port 59810 [preauth]
Dec 06 07:48:16 np0005548789.localdomain sshd[24614]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:48:17 np0005548789.localdomain sshd[24614]: Received disconnect from 195.250.72.168 port 56786:11: Bye Bye [preauth]
Dec 06 07:48:17 np0005548789.localdomain sshd[24614]: Disconnected from authenticating user root 195.250.72.168 port 56786 [preauth]
Dec 06 07:48:18 np0005548789.localdomain sshd[24616]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:48:19 np0005548789.localdomain sshd[24616]: Received disconnect from 74.94.234.151 port 57722:11: Bye Bye [preauth]
Dec 06 07:48:19 np0005548789.localdomain sshd[24616]: Disconnected from authenticating user root 74.94.234.151 port 57722 [preauth]
Dec 06 07:48:36 np0005548789.localdomain sshd[24619]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:48:38 np0005548789.localdomain sshd[24619]: Received disconnect from 154.201.83.49 port 58928:11: Bye Bye [preauth]
Dec 06 07:48:38 np0005548789.localdomain sshd[24619]: Disconnected from authenticating user root 154.201.83.49 port 58928 [preauth]
Dec 06 07:48:43 np0005548789.localdomain sshd[24621]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:48:43 np0005548789.localdomain sshd[24621]: Received disconnect from 162.241.87.197 port 54938:11: Bye Bye [preauth]
Dec 06 07:48:43 np0005548789.localdomain sshd[24621]: Disconnected from authenticating user root 162.241.87.197 port 54938 [preauth]
Dec 06 07:49:04 np0005548789.localdomain sshd[24623]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:49:05 np0005548789.localdomain sshd[24623]: Received disconnect from 77.222.100.142 port 54554:11: Bye Bye [preauth]
Dec 06 07:49:05 np0005548789.localdomain sshd[24623]: Disconnected from authenticating user root 77.222.100.142 port 54554 [preauth]
Dec 06 07:49:38 np0005548789.localdomain sshd[24625]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:49:39 np0005548789.localdomain sshd[24625]: Received disconnect from 74.94.234.151 port 56142:11: Bye Bye [preauth]
Dec 06 07:49:39 np0005548789.localdomain sshd[24625]: Disconnected from authenticating user root 74.94.234.151 port 56142 [preauth]
Dec 06 07:49:46 np0005548789.localdomain sshd[24627]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:49:48 np0005548789.localdomain sshd[24627]: Received disconnect from 195.250.72.168 port 59192:11: Bye Bye [preauth]
Dec 06 07:49:48 np0005548789.localdomain sshd[24627]: Disconnected from authenticating user root 195.250.72.168 port 59192 [preauth]
Dec 06 07:49:53 np0005548789.localdomain sshd[24630]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:49:53 np0005548789.localdomain sshd[24630]: Received disconnect from 162.241.87.197 port 50808:11: Bye Bye [preauth]
Dec 06 07:49:53 np0005548789.localdomain sshd[24630]: Disconnected from authenticating user root 162.241.87.197 port 50808 [preauth]
Dec 06 07:50:10 np0005548789.localdomain sshd[24632]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:50:12 np0005548789.localdomain sshd[24632]: Received disconnect from 154.201.83.49 port 57240:11: Bye Bye [preauth]
Dec 06 07:50:12 np0005548789.localdomain sshd[24632]: Disconnected from authenticating user root 154.201.83.49 port 57240 [preauth]
Dec 06 07:50:13 np0005548789.localdomain sshd[24634]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:50:14 np0005548789.localdomain sshd[24634]: Received disconnect from 77.222.100.142 port 52500:11: Bye Bye [preauth]
Dec 06 07:50:14 np0005548789.localdomain sshd[24634]: Disconnected from authenticating user root 77.222.100.142 port 52500 [preauth]
Dec 06 07:51:04 np0005548789.localdomain sshd[24636]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:51:04 np0005548789.localdomain sshd[24636]: Received disconnect from 74.94.234.151 port 54554:11: Bye Bye [preauth]
Dec 06 07:51:04 np0005548789.localdomain sshd[24636]: Disconnected from authenticating user root 74.94.234.151 port 54554 [preauth]
Dec 06 07:51:05 np0005548789.localdomain sshd[24638]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:51:05 np0005548789.localdomain sshd[24638]: Received disconnect from 162.241.87.197 port 37330:11: Bye Bye [preauth]
Dec 06 07:51:05 np0005548789.localdomain sshd[24638]: Disconnected from authenticating user root 162.241.87.197 port 37330 [preauth]
Dec 06 07:51:19 np0005548789.localdomain sshd[24640]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:51:20 np0005548789.localdomain sshd[24640]: Received disconnect from 195.250.72.168 port 38222:11: Bye Bye [preauth]
Dec 06 07:51:20 np0005548789.localdomain sshd[24640]: Disconnected from authenticating user root 195.250.72.168 port 38222 [preauth]
Dec 06 07:51:24 np0005548789.localdomain sshd[24642]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:51:25 np0005548789.localdomain sshd[24642]: Received disconnect from 77.222.100.142 port 34756:11: Bye Bye [preauth]
Dec 06 07:51:25 np0005548789.localdomain sshd[24642]: Disconnected from authenticating user root 77.222.100.142 port 34756 [preauth]
Dec 06 07:51:45 np0005548789.localdomain sshd[24644]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:51:47 np0005548789.localdomain sshd[24644]: Received disconnect from 154.201.83.49 port 39200:11: Bye Bye [preauth]
Dec 06 07:51:47 np0005548789.localdomain sshd[24644]: Disconnected from authenticating user root 154.201.83.49 port 39200 [preauth]
Dec 06 07:52:18 np0005548789.localdomain sshd[24646]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:52:18 np0005548789.localdomain sshd[24646]: Received disconnect from 162.241.87.197 port 53952:11: Bye Bye [preauth]
Dec 06 07:52:18 np0005548789.localdomain sshd[24646]: Disconnected from authenticating user root 162.241.87.197 port 53952 [preauth]
Dec 06 07:52:30 np0005548789.localdomain sshd[24648]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:52:31 np0005548789.localdomain sshd[24648]: Received disconnect from 74.94.234.151 port 52968:11: Bye Bye [preauth]
Dec 06 07:52:31 np0005548789.localdomain sshd[24648]: Disconnected from authenticating user root 74.94.234.151 port 52968 [preauth]
Dec 06 07:52:34 np0005548789.localdomain sshd[24650]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:52:35 np0005548789.localdomain sshd[24650]: Received disconnect from 77.222.100.142 port 45138:11: Bye Bye [preauth]
Dec 06 07:52:35 np0005548789.localdomain sshd[24650]: Disconnected from authenticating user root 77.222.100.142 port 45138 [preauth]
Dec 06 07:52:53 np0005548789.localdomain sshd[24653]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:52:54 np0005548789.localdomain sshd[24653]: Received disconnect from 195.250.72.168 port 35906:11: Bye Bye [preauth]
Dec 06 07:52:54 np0005548789.localdomain sshd[24653]: Disconnected from authenticating user root 195.250.72.168 port 35906 [preauth]
Dec 06 07:53:22 np0005548789.localdomain sshd[24655]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:53:24 np0005548789.localdomain sshd[24655]: Received disconnect from 154.201.83.49 port 39180:11: Bye Bye [preauth]
Dec 06 07:53:24 np0005548789.localdomain sshd[24655]: Disconnected from authenticating user root 154.201.83.49 port 39180 [preauth]
Dec 06 07:53:31 np0005548789.localdomain sshd[24657]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:53:31 np0005548789.localdomain sshd[24657]: Received disconnect from 162.241.87.197 port 54156:11: Bye Bye [preauth]
Dec 06 07:53:31 np0005548789.localdomain sshd[24657]: Disconnected from authenticating user root 162.241.87.197 port 54156 [preauth]
Dec 06 07:53:46 np0005548789.localdomain sshd[24659]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:53:47 np0005548789.localdomain sshd[24659]: Received disconnect from 77.222.100.142 port 33708:11: Bye Bye [preauth]
Dec 06 07:53:47 np0005548789.localdomain sshd[24659]: Disconnected from authenticating user root 77.222.100.142 port 33708 [preauth]
Dec 06 07:53:58 np0005548789.localdomain sshd[24661]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:53:59 np0005548789.localdomain sshd[24661]: Received disconnect from 74.94.234.151 port 51388:11: Bye Bye [preauth]
Dec 06 07:53:59 np0005548789.localdomain sshd[24661]: Disconnected from authenticating user root 74.94.234.151 port 51388 [preauth]
Dec 06 07:54:27 np0005548789.localdomain sshd[24663]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:54:28 np0005548789.localdomain sshd[24663]: Received disconnect from 195.250.72.168 port 49560:11: Bye Bye [preauth]
Dec 06 07:54:28 np0005548789.localdomain sshd[24663]: Disconnected from authenticating user root 195.250.72.168 port 49560 [preauth]
Dec 06 07:54:39 np0005548789.localdomain sshd[24665]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:54:39 np0005548789.localdomain sshd[24665]: Received disconnect from 162.241.87.197 port 59612:11: Bye Bye [preauth]
Dec 06 07:54:39 np0005548789.localdomain sshd[24665]: Disconnected from authenticating user root 162.241.87.197 port 59612 [preauth]
Dec 06 07:54:51 np0005548789.localdomain sshd[24667]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:54:52 np0005548789.localdomain sshd[24667]: Received disconnect from 77.222.100.142 port 39146:11: Bye Bye [preauth]
Dec 06 07:54:52 np0005548789.localdomain sshd[24667]: Disconnected from authenticating user root 77.222.100.142 port 39146 [preauth]
Dec 06 07:54:56 np0005548789.localdomain sshd[24669]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:54:58 np0005548789.localdomain sshd[24669]: Received disconnect from 154.201.83.49 port 37840:11: Bye Bye [preauth]
Dec 06 07:54:58 np0005548789.localdomain sshd[24669]: Disconnected from authenticating user root 154.201.83.49 port 37840 [preauth]
Dec 06 07:55:00 np0005548789.localdomain sshd[24671]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:55:02 np0005548789.localdomain sshd[24671]: Connection reset by authenticating user root 45.135.232.92 port 64656 [preauth]
Dec 06 07:55:02 np0005548789.localdomain sshd[24673]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:55:04 np0005548789.localdomain sshd[24673]: Connection reset by authenticating user root 45.135.232.92 port 64658 [preauth]
Dec 06 07:55:04 np0005548789.localdomain sshd[24675]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:55:06 np0005548789.localdomain sshd[24675]: Invalid user admin from 45.135.232.92 port 37048
Dec 06 07:55:07 np0005548789.localdomain sshd[24675]: Connection reset by invalid user admin 45.135.232.92 port 37048 [preauth]
Dec 06 07:55:07 np0005548789.localdomain sshd[24677]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:55:08 np0005548789.localdomain sshd[24677]: Invalid user admin from 45.135.232.92 port 37066
Dec 06 07:55:09 np0005548789.localdomain sshd[24677]: Connection reset by invalid user admin 45.135.232.92 port 37066 [preauth]
Dec 06 07:55:09 np0005548789.localdomain sshd[24679]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:55:12 np0005548789.localdomain sshd[24679]: Connection reset by authenticating user root 45.135.232.92 port 37096 [preauth]
Dec 06 07:55:17 np0005548789.localdomain sshd[24682]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:55:17 np0005548789.localdomain sshd[24682]: Accepted publickey for zuul from 192.168.122.100 port 57464 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 07:55:17 np0005548789.localdomain systemd-logind[766]: New session 13 of user zuul.
Dec 06 07:55:17 np0005548789.localdomain systemd[1]: Started Session 13 of User zuul.
Dec 06 07:55:17 np0005548789.localdomain sshd[24682]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 07:55:17 np0005548789.localdomain sudo[24728]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzbegdwxraniqeiwkougsofhtjqhvnvy ; /usr/bin/python3
Dec 06 07:55:17 np0005548789.localdomain sudo[24728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:18 np0005548789.localdomain python3[24730]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 07:55:18 np0005548789.localdomain sudo[24728]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:19 np0005548789.localdomain sudo[24815]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcdwcubovzmsvfumapenulemjggenftc ; /usr/bin/python3
Dec 06 07:55:19 np0005548789.localdomain sudo[24815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:19 np0005548789.localdomain sshd[24818]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:55:19 np0005548789.localdomain python3[24817]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 07:55:20 np0005548789.localdomain sshd[24818]: Received disconnect from 74.94.234.151 port 49800:11: Bye Bye [preauth]
Dec 06 07:55:20 np0005548789.localdomain sshd[24818]: Disconnected from authenticating user root 74.94.234.151 port 49800 [preauth]
Dec 06 07:55:22 np0005548789.localdomain sudo[24815]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:22 np0005548789.localdomain sudo[24834]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbhvsbodwmulhryeniobitolpumbznak ; /usr/bin/python3
Dec 06 07:55:22 np0005548789.localdomain sudo[24834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:23 np0005548789.localdomain python3[24836]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 07:55:23 np0005548789.localdomain sudo[24834]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:23 np0005548789.localdomain sudo[24850]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xiedgrrvwfycgjfmzjhmqwnbjxjixcbc ; /usr/bin/python3
Dec 06 07:55:23 np0005548789.localdomain sudo[24850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:23 np0005548789.localdomain python3[24852]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:55:23 np0005548789.localdomain kernel: loop: module loaded
Dec 06 07:55:23 np0005548789.localdomain kernel: loop3: detected capacity change from 0 to 14680064
Dec 06 07:55:23 np0005548789.localdomain sudo[24850]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:24 np0005548789.localdomain sudo[24875]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxyeeooeuosqjufsxffbufcfijmkougl ; /usr/bin/python3
Dec 06 07:55:24 np0005548789.localdomain sudo[24875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:24 np0005548789.localdomain python3[24877]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                                         vgcreate ceph_vg0 /dev/loop3
                                                         lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:55:24 np0005548789.localdomain lvm[24880]: PV /dev/loop3 not used.
Dec 06 07:55:24 np0005548789.localdomain lvm[24882]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 07:55:24 np0005548789.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Dec 06 07:55:24 np0005548789.localdomain lvm[24885]:   1 logical volume(s) in volume group "ceph_vg0" now active
Dec 06 07:55:24 np0005548789.localdomain lvm[24892]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 07:55:24 np0005548789.localdomain lvm[24892]: VG ceph_vg0 finished
Dec 06 07:55:24 np0005548789.localdomain systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Dec 06 07:55:24 np0005548789.localdomain sudo[24875]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:24 np0005548789.localdomain sudo[24938]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gaeelnxutbpnuyghygkjjukrncksbwzg ; /usr/bin/python3
Dec 06 07:55:24 np0005548789.localdomain sudo[24938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:25 np0005548789.localdomain python3[24940]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:55:25 np0005548789.localdomain sudo[24938]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:25 np0005548789.localdomain sudo[24981]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qeitmrmlihvjpjtcwvqfqwvybbjpjkwf ; /usr/bin/python3
Dec 06 07:55:25 np0005548789.localdomain sudo[24981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:25 np0005548789.localdomain python3[24984]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765007724.768987-54544-221435459495844/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:55:25 np0005548789.localdomain sudo[24981]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:26 np0005548789.localdomain sudo[25012]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwptredogrhrdsygeasdwbaungvptpqo ; /usr/bin/python3
Dec 06 07:55:26 np0005548789.localdomain sudo[25012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:26 np0005548789.localdomain python3[25014]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 07:55:26 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:55:26 np0005548789.localdomain systemd-sysv-generator[25040]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:55:26 np0005548789.localdomain systemd-rc-local-generator[25036]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:55:26 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:55:26 np0005548789.localdomain systemd[1]: Starting Ceph OSD losetup...
Dec 06 07:55:26 np0005548789.localdomain bash[25055]: /dev/loop3: [64516]:8400144 (/var/lib/ceph-osd-0.img)
Dec 06 07:55:26 np0005548789.localdomain systemd[1]: Finished Ceph OSD losetup.
Dec 06 07:55:26 np0005548789.localdomain lvm[25056]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 07:55:26 np0005548789.localdomain lvm[25056]: VG ceph_vg0 finished
Dec 06 07:55:26 np0005548789.localdomain sudo[25012]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:27 np0005548789.localdomain sudo[25070]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtobjfgdvhvulbvdsyotmsfyelrodaok ; /usr/bin/python3
Dec 06 07:55:27 np0005548789.localdomain sudo[25070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:27 np0005548789.localdomain python3[25072]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 07:55:29 np0005548789.localdomain sudo[25070]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:30 np0005548789.localdomain sudo[25087]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdqjzzwyuvtmdmhenmrylbentjrvrnol ; /usr/bin/python3
Dec 06 07:55:30 np0005548789.localdomain sudo[25087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:30 np0005548789.localdomain python3[25089]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 07:55:30 np0005548789.localdomain sudo[25087]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:30 np0005548789.localdomain sudo[25103]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyqjluodcpeaxoetyetoitbdipjznsob ; /usr/bin/python3
Dec 06 07:55:30 np0005548789.localdomain sudo[25103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:30 np0005548789.localdomain python3[25105]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop4 /var/lib/ceph-osd-1.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:55:30 np0005548789.localdomain kernel: loop4: detected capacity change from 0 to 14680064
Dec 06 07:55:30 np0005548789.localdomain sudo[25103]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:31 np0005548789.localdomain sudo[25125]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzqhetedekeisqdndvultodiprqdfyhv ; /usr/bin/python3
Dec 06 07:55:31 np0005548789.localdomain sudo[25125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:31 np0005548789.localdomain python3[25127]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4
                                                         vgcreate ceph_vg1 /dev/loop4
                                                         lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:55:31 np0005548789.localdomain lvm[25130]: PV /dev/loop4 not used.
Dec 06 07:55:31 np0005548789.localdomain lvm[25140]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 06 07:55:31 np0005548789.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Dec 06 07:55:31 np0005548789.localdomain sudo[25125]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:31 np0005548789.localdomain lvm[25142]:   1 logical volume(s) in volume group "ceph_vg1" now active
Dec 06 07:55:31 np0005548789.localdomain systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Dec 06 07:55:32 np0005548789.localdomain sudo[25188]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-undzaheximrjjqkllkqyxdonxrmpnuvr ; /usr/bin/python3
Dec 06 07:55:32 np0005548789.localdomain sudo[25188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:32 np0005548789.localdomain python3[25190]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:55:32 np0005548789.localdomain sudo[25188]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:32 np0005548789.localdomain sudo[25231]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovqpblqkravcyfwmocfdpebnbzllkvba ; /usr/bin/python3
Dec 06 07:55:32 np0005548789.localdomain sudo[25231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:32 np0005548789.localdomain python3[25233]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765007731.8777587-54738-165649410462717/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:55:32 np0005548789.localdomain sudo[25231]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:33 np0005548789.localdomain sudo[25261]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyctftzxcatsvetduuserufbutrpbbwu ; /usr/bin/python3
Dec 06 07:55:33 np0005548789.localdomain sudo[25261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:33 np0005548789.localdomain python3[25263]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 07:55:33 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:55:33 np0005548789.localdomain systemd-sysv-generator[25290]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:55:33 np0005548789.localdomain systemd-rc-local-generator[25287]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:55:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:55:33 np0005548789.localdomain systemd[1]: Starting Ceph OSD losetup...
Dec 06 07:55:33 np0005548789.localdomain bash[25303]: /dev/loop4: [64516]:8399529 (/var/lib/ceph-osd-1.img)
Dec 06 07:55:33 np0005548789.localdomain systemd[1]: Finished Ceph OSD losetup.
Dec 06 07:55:33 np0005548789.localdomain lvm[25304]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 06 07:55:33 np0005548789.localdomain lvm[25304]: VG ceph_vg1 finished
Dec 06 07:55:33 np0005548789.localdomain sudo[25261]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:42 np0005548789.localdomain sudo[25348]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eeiddvzskerqvbliwzlfcclmpptdfhsl ; /usr/bin/python3
Dec 06 07:55:42 np0005548789.localdomain sudo[25348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:42 np0005548789.localdomain python3[25350]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 07:55:42 np0005548789.localdomain sudo[25348]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:43 np0005548789.localdomain sudo[25368]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwmshgvewbrrzbnorboeewlrprydckhf ; /usr/bin/python3
Dec 06 07:55:43 np0005548789.localdomain sudo[25368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:43 np0005548789.localdomain python3[25370]: ansible-hostname Invoked with name=np0005548789.localdomain use=None
Dec 06 07:55:43 np0005548789.localdomain systemd[1]: Starting Hostname Service...
Dec 06 07:55:43 np0005548789.localdomain systemd[1]: Started Hostname Service.
Dec 06 07:55:43 np0005548789.localdomain sudo[25368]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:45 np0005548789.localdomain sudo[25391]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-neubcszjphcmytpyvtqjqpebyrirnodm ; /usr/bin/python3
Dec 06 07:55:45 np0005548789.localdomain sudo[25391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:45 np0005548789.localdomain python3[25393]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Dec 06 07:55:45 np0005548789.localdomain sudo[25391]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:45 np0005548789.localdomain sshd[25426]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:55:46 np0005548789.localdomain sudo[25441]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hngacrxwcpmutrcnfefiqpavsjvjhrzk ; /usr/bin/python3
Dec 06 07:55:46 np0005548789.localdomain sudo[25441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:46 np0005548789.localdomain python3[25443]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.0gnzpzdmtmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:55:46 np0005548789.localdomain sshd[25426]: Received disconnect from 162.241.87.197 port 53252:11: Bye Bye [preauth]
Dec 06 07:55:46 np0005548789.localdomain sshd[25426]: Disconnected from authenticating user root 162.241.87.197 port 53252 [preauth]
Dec 06 07:55:46 np0005548789.localdomain sudo[25441]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:46 np0005548789.localdomain sudo[25471]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzrcbmpjfpubbpjmfdaqvnazrrqeqkfp ; /usr/bin/python3
Dec 06 07:55:46 np0005548789.localdomain sudo[25471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:46 np0005548789.localdomain python3[25473]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.0gnzpzdmtmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:55:46 np0005548789.localdomain sudo[25471]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:47 np0005548789.localdomain sudo[25487]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqnwiwjroulrmxahsrftwejzcsuijqfa ; /usr/bin/python3
Dec 06 07:55:47 np0005548789.localdomain sudo[25487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:47 np0005548789.localdomain python3[25489]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.0gnzpzdmtmphosts insertbefore=BOF block=192.168.122.106 np0005548788.localdomain np0005548788
                                                         192.168.122.106 np0005548788.ctlplane.localdomain np0005548788.ctlplane
                                                         192.168.122.107 np0005548789.localdomain np0005548789
                                                         192.168.122.107 np0005548789.ctlplane.localdomain np0005548789.ctlplane
                                                         192.168.122.108 np0005548790.localdomain np0005548790
                                                         192.168.122.108 np0005548790.ctlplane.localdomain np0005548790.ctlplane
                                                         192.168.122.103 np0005548785.localdomain np0005548785
                                                         192.168.122.103 np0005548785.ctlplane.localdomain np0005548785.ctlplane
                                                         192.168.122.104 np0005548786.localdomain np0005548786
                                                         192.168.122.104 np0005548786.ctlplane.localdomain np0005548786.ctlplane
                                                         192.168.122.105 np0005548787.localdomain np0005548787
                                                         192.168.122.105 np0005548787.ctlplane.localdomain np0005548787.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:55:47 np0005548789.localdomain sudo[25487]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:47 np0005548789.localdomain sudo[25503]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etihmdjrssedcdcsetudjhjtxilgduma ; /usr/bin/python3
Dec 06 07:55:47 np0005548789.localdomain sudo[25503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:47 np0005548789.localdomain python3[25505]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.0gnzpzdmtmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:55:48 np0005548789.localdomain sudo[25503]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:48 np0005548789.localdomain sudo[25520]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmcobshdpgmrvsedqnieobfffeerttja ; /usr/bin/python3
Dec 06 07:55:48 np0005548789.localdomain sudo[25520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:48 np0005548789.localdomain python3[25522]: ansible-file Invoked with path=/tmp/ansible.0gnzpzdmtmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:55:48 np0005548789.localdomain sudo[25520]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:50 np0005548789.localdomain sudo[25536]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjbjutqfpiplfzvejcnnurqosropnqgw ; /usr/bin/python3
Dec 06 07:55:50 np0005548789.localdomain sudo[25536]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:50 np0005548789.localdomain python3[25538]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:55:50 np0005548789.localdomain sudo[25536]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:51 np0005548789.localdomain sudo[25554]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amhbfmsgvyeaslwkqsmzjbjrqzekoizj ; /usr/bin/python3
Dec 06 07:55:51 np0005548789.localdomain sudo[25554]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:51 np0005548789.localdomain python3[25556]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 07:55:53 np0005548789.localdomain sudo[25554]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:55 np0005548789.localdomain sudo[25604]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-astjkdjqzdkyfiihcwnxbjqkqyoqfywj ; /usr/bin/python3
Dec 06 07:55:55 np0005548789.localdomain sudo[25604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:55 np0005548789.localdomain python3[25606]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:55:55 np0005548789.localdomain sudo[25604]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:55 np0005548789.localdomain sudo[25649]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgubmqmiwtfqinifnogkcedsbyeuerlk ; /usr/bin/python3
Dec 06 07:55:55 np0005548789.localdomain sudo[25649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:56 np0005548789.localdomain python3[25651]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765007755.1418512-55569-180179163738915/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:55:56 np0005548789.localdomain sudo[25649]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:56 np0005548789.localdomain sshd[25666]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:55:57 np0005548789.localdomain sshd[25668]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:55:57 np0005548789.localdomain sudo[25683]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lycmjxucjzxkxmyrizzdkhrsfidfiqxj ; /usr/bin/python3
Dec 06 07:55:57 np0005548789.localdomain sudo[25683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:57 np0005548789.localdomain sshd[25666]: Received disconnect from 195.250.72.168 port 37428:11: Bye Bye [preauth]
Dec 06 07:55:57 np0005548789.localdomain sshd[25666]: Disconnected from authenticating user root 195.250.72.168 port 37428 [preauth]
Dec 06 07:55:57 np0005548789.localdomain python3[25685]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 07:55:57 np0005548789.localdomain sudo[25683]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:58 np0005548789.localdomain sshd[25668]: Received disconnect from 77.222.100.142 port 34246:11: Bye Bye [preauth]
Dec 06 07:55:58 np0005548789.localdomain sshd[25668]: Disconnected from authenticating user root 77.222.100.142 port 34246 [preauth]
Dec 06 07:55:58 np0005548789.localdomain sudo[25701]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fllvxyfdwwqfkbaqhviwjumsqvljelqz ; /usr/bin/python3
Dec 06 07:55:58 np0005548789.localdomain sudo[25701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:58 np0005548789.localdomain python3[25703]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 07:55:58 np0005548789.localdomain chronyd[762]: chronyd exiting
Dec 06 07:55:58 np0005548789.localdomain systemd[1]: Stopping NTP client/server...
Dec 06 07:55:58 np0005548789.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Dec 06 07:55:58 np0005548789.localdomain systemd[1]: Stopped NTP client/server.
Dec 06 07:55:58 np0005548789.localdomain systemd[1]: chronyd.service: Consumed 97ms CPU time, read 1.9M from disk, written 4.0K to disk.
Dec 06 07:55:58 np0005548789.localdomain systemd[1]: Starting NTP client/server...
Dec 06 07:55:59 np0005548789.localdomain chronyd[25710]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 06 07:55:59 np0005548789.localdomain chronyd[25710]: Frequency -30.154 +/- 0.056 ppm read from /var/lib/chrony/drift
Dec 06 07:55:59 np0005548789.localdomain chronyd[25710]: Loaded seccomp filter (level 2)
Dec 06 07:55:59 np0005548789.localdomain systemd[1]: Started NTP client/server.
Dec 06 07:55:59 np0005548789.localdomain sudo[25701]: pam_unix(sudo:session): session closed for user root
Dec 06 07:55:59 np0005548789.localdomain sudo[25757]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fclrzhrvqxeklitmaogszyqzoajbixii ; /usr/bin/python3
Dec 06 07:55:59 np0005548789.localdomain sudo[25757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:55:59 np0005548789.localdomain python3[25759]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 07:55:59 np0005548789.localdomain sudo[25757]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:00 np0005548789.localdomain sudo[25800]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqhawuezxwojvqxhhmvgzcalymxldfyr ; /usr/bin/python3
Dec 06 07:56:00 np0005548789.localdomain sudo[25800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:00 np0005548789.localdomain python3[25802]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765007759.5466952-55759-222539856691277/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 07:56:00 np0005548789.localdomain sudo[25800]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:00 np0005548789.localdomain sudo[25830]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fiekuhehnoywitrizawsimvczoeavxre ; /usr/bin/python3
Dec 06 07:56:00 np0005548789.localdomain sudo[25830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:00 np0005548789.localdomain python3[25832]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 07:56:00 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:56:00 np0005548789.localdomain systemd-sysv-generator[25859]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:56:00 np0005548789.localdomain systemd-rc-local-generator[25856]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:56:00 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:56:01 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:56:01 np0005548789.localdomain systemd-rc-local-generator[25893]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:56:01 np0005548789.localdomain systemd-sysv-generator[25898]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:56:01 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:56:01 np0005548789.localdomain systemd[1]: Starting chronyd online sources service...
Dec 06 07:56:01 np0005548789.localdomain chronyc[25908]: 200 OK
Dec 06 07:56:01 np0005548789.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Dec 06 07:56:01 np0005548789.localdomain systemd[1]: Finished chronyd online sources service.
Dec 06 07:56:01 np0005548789.localdomain sudo[25830]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:01 np0005548789.localdomain sudo[25922]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fiphkfoyhynioubvhhsmjphwwhcehrab ; /usr/bin/python3
Dec 06 07:56:01 np0005548789.localdomain sudo[25922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:02 np0005548789.localdomain python3[25924]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:56:02 np0005548789.localdomain chronyd[25710]: System clock was stepped by 0.000000 seconds
Dec 06 07:56:02 np0005548789.localdomain sudo[25922]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:02 np0005548789.localdomain sudo[25939]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skshaavgszbyvfctallmxggbiujupvcl ; /usr/bin/python3
Dec 06 07:56:02 np0005548789.localdomain sudo[25939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:02 np0005548789.localdomain python3[25941]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 07:56:03 np0005548789.localdomain chronyd[25710]: Selected source 51.222.111.13 (pool.ntp.org)
Dec 06 07:56:12 np0005548789.localdomain sudo[25939]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:12 np0005548789.localdomain sudo[25956]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xowykqdtzfxdorlubbyhodfqikpomvql ; /usr/bin/python3
Dec 06 07:56:12 np0005548789.localdomain sudo[25956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:13 np0005548789.localdomain python3[25958]: ansible-timezone Invoked with name=UTC hwclock=None
Dec 06 07:56:13 np0005548789.localdomain systemd[1]: Starting Time & Date Service...
Dec 06 07:56:13 np0005548789.localdomain systemd[1]: Started Time & Date Service.
Dec 06 07:56:13 np0005548789.localdomain sudo[25956]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:13 np0005548789.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 06 07:56:13 np0005548789.localdomain sudo[25979]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmadoffjmayyfmwrdetmaaohroobjnjs ; /usr/bin/python3
Dec 06 07:56:13 np0005548789.localdomain sudo[25979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:14 np0005548789.localdomain python3[25981]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 07:56:14 np0005548789.localdomain chronyd[25710]: chronyd exiting
Dec 06 07:56:14 np0005548789.localdomain systemd[1]: Stopping NTP client/server...
Dec 06 07:56:14 np0005548789.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Dec 06 07:56:14 np0005548789.localdomain systemd[1]: Stopped NTP client/server.
Dec 06 07:56:14 np0005548789.localdomain systemd[1]: Starting NTP client/server...
Dec 06 07:56:14 np0005548789.localdomain chronyd[25988]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 06 07:56:14 np0005548789.localdomain chronyd[25988]: Frequency -30.154 +/- 0.056 ppm read from /var/lib/chrony/drift
Dec 06 07:56:14 np0005548789.localdomain chronyd[25988]: Loaded seccomp filter (level 2)
Dec 06 07:56:14 np0005548789.localdomain systemd[1]: Started NTP client/server.
Dec 06 07:56:14 np0005548789.localdomain sudo[25979]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:18 np0005548789.localdomain chronyd[25988]: Selected source 192.95.27.155 (pool.ntp.org)
Dec 06 07:56:30 np0005548789.localdomain sudo[26003]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckeqfmeqbnypqkuvvzjurilqxydhmppo ; /usr/bin/python3
Dec 06 07:56:30 np0005548789.localdomain sudo[26003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:31 np0005548789.localdomain useradd[26007]: new group: name=ceph-admin, GID=1002
Dec 06 07:56:31 np0005548789.localdomain useradd[26007]: new user: name=ceph-admin, UID=1002, GID=1002, home=/home/ceph-admin, shell=/bin/bash, from=none
Dec 06 07:56:31 np0005548789.localdomain sudo[26003]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:31 np0005548789.localdomain sudo[26059]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvcvstwjubbpawgpwxgghmenyvqodjwf ; /usr/bin/python3
Dec 06 07:56:31 np0005548789.localdomain sudo[26059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:31 np0005548789.localdomain sudo[26059]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:32 np0005548789.localdomain sudo[26102]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqedgwvnwxelwzwtamtpgyxghjjsmzcs ; /usr/bin/python3
Dec 06 07:56:32 np0005548789.localdomain sudo[26102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:32 np0005548789.localdomain sudo[26102]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:32 np0005548789.localdomain sudo[26132]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imhcoephtaajrjjhaqyokfidccatwykm ; /usr/bin/python3
Dec 06 07:56:32 np0005548789.localdomain sudo[26132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:32 np0005548789.localdomain sudo[26132]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:32 np0005548789.localdomain sudo[26148]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sofqwihzjpimdomqswhjkivyzpquuujm ; /usr/bin/python3
Dec 06 07:56:32 np0005548789.localdomain sudo[26148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:33 np0005548789.localdomain sudo[26148]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:33 np0005548789.localdomain sudo[26164]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrhohaldglwoynrpiganaaxqrdjverud ; /usr/bin/python3
Dec 06 07:56:33 np0005548789.localdomain sudo[26164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:33 np0005548789.localdomain sudo[26164]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:33 np0005548789.localdomain sudo[26180]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcizpntrttzdkwscbupzqpopqqrsrvyp ; /usr/bin/python3
Dec 06 07:56:33 np0005548789.localdomain sudo[26180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 07:56:34 np0005548789.localdomain sudo[26180]: pam_unix(sudo:session): session closed for user root
Dec 06 07:56:35 np0005548789.localdomain sshd[26183]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:56:36 np0005548789.localdomain sshd[26183]: Received disconnect from 154.201.83.49 port 53652:11: Bye Bye [preauth]
Dec 06 07:56:36 np0005548789.localdomain sshd[26183]: Disconnected from authenticating user root 154.201.83.49 port 53652 [preauth]
Dec 06 07:56:41 np0005548789.localdomain sshd[26185]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:56:41 np0005548789.localdomain sshd[26185]: Received disconnect from 74.94.234.151 port 48216:11: Bye Bye [preauth]
Dec 06 07:56:41 np0005548789.localdomain sshd[26185]: Disconnected from authenticating user root 74.94.234.151 port 48216 [preauth]
Dec 06 07:56:43 np0005548789.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 06 07:56:57 np0005548789.localdomain sshd[26189]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:56:58 np0005548789.localdomain sshd[26189]: Received disconnect from 162.241.87.197 port 55196:11: Bye Bye [preauth]
Dec 06 07:56:58 np0005548789.localdomain sshd[26189]: Disconnected from authenticating user root 162.241.87.197 port 55196 [preauth]
Dec 06 07:57:04 np0005548789.localdomain sshd[26191]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:57:05 np0005548789.localdomain sshd[26191]: Received disconnect from 77.222.100.142 port 36932:11: Bye Bye [preauth]
Dec 06 07:57:05 np0005548789.localdomain sshd[26191]: Disconnected from authenticating user root 77.222.100.142 port 36932 [preauth]
Dec 06 07:57:28 np0005548789.localdomain sshd[26193]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:57:29 np0005548789.localdomain sshd[26193]: Received disconnect from 195.250.72.168 port 49332:11: Bye Bye [preauth]
Dec 06 07:57:29 np0005548789.localdomain sshd[26193]: Disconnected from authenticating user root 195.250.72.168 port 49332 [preauth]
Dec 06 07:58:01 np0005548789.localdomain anacron[6192]: Job `cron.monthly' started
Dec 06 07:58:01 np0005548789.localdomain anacron[6192]: Job `cron.monthly' terminated
Dec 06 07:58:01 np0005548789.localdomain anacron[6192]: Normal exit (3 jobs run)
Dec 06 07:58:03 np0005548789.localdomain sshd[26197]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:04 np0005548789.localdomain sshd[26197]: Received disconnect from 74.94.234.151 port 46634:11: Bye Bye [preauth]
Dec 06 07:58:04 np0005548789.localdomain sshd[26197]: Disconnected from authenticating user root 74.94.234.151 port 46634 [preauth]
Dec 06 07:58:07 np0005548789.localdomain sshd[26199]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:09 np0005548789.localdomain sshd[26201]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:09 np0005548789.localdomain sshd[26201]: Received disconnect from 162.241.87.197 port 35422:11: Bye Bye [preauth]
Dec 06 07:58:09 np0005548789.localdomain sshd[26201]: Disconnected from authenticating user root 162.241.87.197 port 35422 [preauth]
Dec 06 07:58:10 np0005548789.localdomain sshd[26199]: Received disconnect from 154.201.83.49 port 60704:11: Bye Bye [preauth]
Dec 06 07:58:10 np0005548789.localdomain sshd[26199]: Disconnected from authenticating user root 154.201.83.49 port 60704 [preauth]
Dec 06 07:58:13 np0005548789.localdomain sshd[26203]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:14 np0005548789.localdomain sshd[26203]: Received disconnect from 77.222.100.142 port 44034:11: Bye Bye [preauth]
Dec 06 07:58:14 np0005548789.localdomain sshd[26203]: Disconnected from authenticating user root 77.222.100.142 port 44034 [preauth]
Dec 06 07:58:24 np0005548789.localdomain sshd[26205]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:24 np0005548789.localdomain sshd[26205]: Accepted publickey for ceph-admin from 192.168.122.103 port 58840 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:24 np0005548789.localdomain systemd-logind[766]: New session 14 of user ceph-admin.
Dec 06 07:58:24 np0005548789.localdomain systemd[1]: Created slice User Slice of UID 1002.
Dec 06 07:58:24 np0005548789.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Dec 06 07:58:24 np0005548789.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Dec 06 07:58:24 np0005548789.localdomain systemd[1]: Starting User Manager for UID 1002...
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:24 np0005548789.localdomain sshd[26222]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: Queued start job for default target Main User Target.
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: Created slice User Application Slice.
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: Reached target Paths.
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: Reached target Timers.
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: Starting D-Bus User Message Bus Socket...
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: Starting Create User's Volatile Files and Directories...
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: Listening on D-Bus User Message Bus Socket.
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: Reached target Sockets.
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: Finished Create User's Volatile Files and Directories.
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: Reached target Basic System.
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: Reached target Main User Target.
Dec 06 07:58:24 np0005548789.localdomain systemd[26209]: Startup finished in 114ms.
Dec 06 07:58:24 np0005548789.localdomain systemd[1]: Started User Manager for UID 1002.
Dec 06 07:58:24 np0005548789.localdomain systemd[1]: Started Session 14 of User ceph-admin.
Dec 06 07:58:24 np0005548789.localdomain sshd[26205]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:24 np0005548789.localdomain sshd[26222]: Accepted publickey for ceph-admin from 192.168.122.103 port 58852 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:24 np0005548789.localdomain systemd-logind[766]: New session 16 of user ceph-admin.
Dec 06 07:58:24 np0005548789.localdomain systemd[1]: Started Session 16 of User ceph-admin.
Dec 06 07:58:24 np0005548789.localdomain sshd[26222]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:24 np0005548789.localdomain sudo[26229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:58:24 np0005548789.localdomain sudo[26229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:24 np0005548789.localdomain sudo[26229]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:24 np0005548789.localdomain sshd[26244]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:24 np0005548789.localdomain sshd[26244]: Accepted publickey for ceph-admin from 192.168.122.103 port 58860 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:24 np0005548789.localdomain systemd-logind[766]: New session 17 of user ceph-admin.
Dec 06 07:58:24 np0005548789.localdomain systemd[1]: Started Session 17 of User ceph-admin.
Dec 06 07:58:24 np0005548789.localdomain sshd[26244]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:24 np0005548789.localdomain sudo[26248]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host --expect-hostname np0005548789.localdomain
Dec 06 07:58:24 np0005548789.localdomain sudo[26248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:24 np0005548789.localdomain sudo[26248]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:25 np0005548789.localdomain sshd[26263]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:25 np0005548789.localdomain sshd[26263]: Accepted publickey for ceph-admin from 192.168.122.103 port 58866 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:25 np0005548789.localdomain systemd-logind[766]: New session 18 of user ceph-admin.
Dec 06 07:58:25 np0005548789.localdomain systemd[1]: Started Session 18 of User ceph-admin.
Dec 06 07:58:25 np0005548789.localdomain sshd[26263]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:25 np0005548789.localdomain sudo[26267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3
Dec 06 07:58:25 np0005548789.localdomain sudo[26267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:25 np0005548789.localdomain sudo[26267]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:25 np0005548789.localdomain sshd[26282]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:25 np0005548789.localdomain sshd[26282]: Accepted publickey for ceph-admin from 192.168.122.103 port 58876 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:25 np0005548789.localdomain systemd-logind[766]: New session 19 of user ceph-admin.
Dec 06 07:58:25 np0005548789.localdomain systemd[1]: Started Session 19 of User ceph-admin.
Dec 06 07:58:25 np0005548789.localdomain sshd[26282]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:25 np0005548789.localdomain sudo[26286]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 07:58:25 np0005548789.localdomain sudo[26286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:25 np0005548789.localdomain sudo[26286]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:25 np0005548789.localdomain sshd[26301]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:25 np0005548789.localdomain sshd[26301]: Accepted publickey for ceph-admin from 192.168.122.103 port 58886 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:26 np0005548789.localdomain systemd-logind[766]: New session 20 of user ceph-admin.
Dec 06 07:58:26 np0005548789.localdomain systemd[1]: Started Session 20 of User ceph-admin.
Dec 06 07:58:26 np0005548789.localdomain sshd[26301]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:26 np0005548789.localdomain sudo[26305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 07:58:26 np0005548789.localdomain sudo[26305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:26 np0005548789.localdomain sudo[26305]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:26 np0005548789.localdomain sshd[26320]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:26 np0005548789.localdomain sshd[26320]: Accepted publickey for ceph-admin from 192.168.122.103 port 58888 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:26 np0005548789.localdomain systemd-logind[766]: New session 21 of user ceph-admin.
Dec 06 07:58:26 np0005548789.localdomain systemd[1]: Started Session 21 of User ceph-admin.
Dec 06 07:58:26 np0005548789.localdomain sshd[26320]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:26 np0005548789.localdomain sudo[26324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new
Dec 06 07:58:26 np0005548789.localdomain sudo[26324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:26 np0005548789.localdomain sudo[26324]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:26 np0005548789.localdomain sshd[26339]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:26 np0005548789.localdomain sshd[26339]: Accepted publickey for ceph-admin from 192.168.122.103 port 53430 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:26 np0005548789.localdomain systemd-logind[766]: New session 22 of user ceph-admin.
Dec 06 07:58:26 np0005548789.localdomain systemd[1]: Started Session 22 of User ceph-admin.
Dec 06 07:58:26 np0005548789.localdomain sshd[26339]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:26 np0005548789.localdomain sudo[26343]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 07:58:26 np0005548789.localdomain sudo[26343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:26 np0005548789.localdomain sudo[26343]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:27 np0005548789.localdomain sshd[26358]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:27 np0005548789.localdomain sshd[26358]: Accepted publickey for ceph-admin from 192.168.122.103 port 53434 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:27 np0005548789.localdomain systemd-logind[766]: New session 23 of user ceph-admin.
Dec 06 07:58:27 np0005548789.localdomain systemd[1]: Started Session 23 of User ceph-admin.
Dec 06 07:58:27 np0005548789.localdomain sshd[26358]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:27 np0005548789.localdomain sudo[26362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new
Dec 06 07:58:27 np0005548789.localdomain sudo[26362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:27 np0005548789.localdomain sudo[26362]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:27 np0005548789.localdomain sshd[26377]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:27 np0005548789.localdomain sshd[26377]: Accepted publickey for ceph-admin from 192.168.122.103 port 53436 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:27 np0005548789.localdomain systemd-logind[766]: New session 24 of user ceph-admin.
Dec 06 07:58:27 np0005548789.localdomain systemd[1]: Started Session 24 of User ceph-admin.
Dec 06 07:58:27 np0005548789.localdomain sshd[26377]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:28 np0005548789.localdomain sshd[26394]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:28 np0005548789.localdomain sshd[26394]: Accepted publickey for ceph-admin from 192.168.122.103 port 53450 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:28 np0005548789.localdomain systemd-logind[766]: New session 25 of user ceph-admin.
Dec 06 07:58:28 np0005548789.localdomain systemd[1]: Started Session 25 of User ceph-admin.
Dec 06 07:58:28 np0005548789.localdomain sshd[26394]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:28 np0005548789.localdomain sudo[26398]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3
Dec 06 07:58:28 np0005548789.localdomain sudo[26398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:28 np0005548789.localdomain sudo[26398]: pam_unix(sudo:session): session closed for user root
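The sudo commands from 07:58:25 to 07:58:28 trace how the orchestrator ships the cephadm binary to this host: `mkdir -p` the target directory, stage a `.new` copy under a `/tmp/cephadm-<fsid>` tree, `chown`/`chmod 644` it, then `mv` it over the final path so readers never observe a partially written file. A Python sketch of that staged-rename pattern (the helper name and staging layout are illustrative, not cephadm's actual code; unlike the log, it stages in the destination directory so the final rename is guaranteed atomic, whereas a `mv` from `/tmp` across filesystems falls back to copy-then-rename):

```python
import os
import tempfile

def install_file_atomically(data: bytes, dest: str, mode: int = 0o644) -> None:
    """Write `data` to a staging file beside `dest`, then rename it into
    place. os.replace() is atomic when source and target share a
    filesystem, so readers see either the old file or the complete new
    one, never a torn write."""
    os.makedirs(os.path.dirname(dest), exist_ok=True)       # cf. the `mkdir -p` steps
    fd, staging = tempfile.mkstemp(dir=os.path.dirname(dest), suffix=".new")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
        os.chmod(staging, mode)                             # cf. the `chmod 644 ...new` step
        os.replace(staging, dest)                           # cf. the final `mv ...new <dest>` step
    except BaseException:
        os.unlink(staging)                                  # clean up the staging copy on failure
        raise
```

The immediate re-run of `check-host --expect-hostname` at 07:58:28 is the orchestrator verifying the freshly installed copy works before using it for real operations.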
Dec 06 07:58:28 np0005548789.localdomain sshd[26413]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:58:28 np0005548789.localdomain sshd[26413]: Accepted publickey for ceph-admin from 192.168.122.103 port 53454 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 07:58:28 np0005548789.localdomain systemd-logind[766]: New session 26 of user ceph-admin.
Dec 06 07:58:28 np0005548789.localdomain systemd[1]: Started Session 26 of User ceph-admin.
Dec 06 07:58:28 np0005548789.localdomain sshd[26413]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 07:58:28 np0005548789.localdomain sudo[26417]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host --expect-hostname np0005548789.localdomain
Dec 06 07:58:28 np0005548789.localdomain sudo[26417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:29 np0005548789.localdomain sudo[26417]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:47 np0005548789.localdomain sudo[26453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 07:58:47 np0005548789.localdomain sudo[26453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:47 np0005548789.localdomain sudo[26453]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:47 np0005548789.localdomain sudo[26468]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:58:47 np0005548789.localdomain sudo[26468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:47 np0005548789.localdomain sudo[26468]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:47 np0005548789.localdomain sudo[26483]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 07:58:47 np0005548789.localdomain sudo[26483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:47 np0005548789.localdomain sudo[26483]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:47 np0005548789.localdomain sudo[26520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:58:47 np0005548789.localdomain sudo[26520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:47 np0005548789.localdomain sudo[26520]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:47 np0005548789.localdomain sudo[26535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 07:58:47 np0005548789.localdomain sudo[26535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:48 np0005548789.localdomain sudo[26535]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:48 np0005548789.localdomain sudo[26587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:58:48 np0005548789.localdomain sudo[26587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:48 np0005548789.localdomain sudo[26587]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:48 np0005548789.localdomain sudo[26602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 07:58:48 np0005548789.localdomain sudo[26602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:48 np0005548789.localdomain systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 26629 (sysctl)
Dec 06 07:58:48 np0005548789.localdomain systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 06 07:58:48 np0005548789.localdomain systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 06 07:58:49 np0005548789.localdomain sudo[26602]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:49 np0005548789.localdomain sudo[26651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:58:49 np0005548789.localdomain sudo[26651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:49 np0005548789.localdomain sudo[26651]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:49 np0005548789.localdomain sudo[26666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 07:58:49 np0005548789.localdomain sudo[26666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:49 np0005548789.localdomain sudo[26666]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:49 np0005548789.localdomain sudo[26700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:58:49 np0005548789.localdomain sudo[26700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:49 np0005548789.localdomain sudo[26700]: pam_unix(sudo:session): session closed for user root
Dec 06 07:58:49 np0005548789.localdomain sudo[26715]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- inventory --format=json-pretty --filter-for-batch
Dec 06 07:58:49 np0005548789.localdomain sudo[26715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:58:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:58:53 np0005548789.localdomain kernel: VFS: idmapped mount is not enabled.
Dec 06 07:59:02 np0005548789.localdomain sshd[26853]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:59:03 np0005548789.localdomain sshd[26853]: Received disconnect from 195.250.72.168 port 49572:11: Bye Bye [preauth]
Dec 06 07:59:03 np0005548789.localdomain sshd[26853]: Disconnected from authenticating user root 195.250.72.168 port 49572 [preauth]
Dec 06 07:59:13 np0005548789.localdomain podman[26770]: 
Dec 06 07:59:13 np0005548789.localdomain podman[26770]: 2025-12-06 07:59:13.531811098 +0000 UTC m=+23.102943100 container create 7350105fe9b4d3bcc45729d2644680810591819ba664a3ae0750045a241db623 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_lewin, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, release=1763362218, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 07:59:13 np0005548789.localdomain podman[26770]: 2025-12-06 07:58:50.470185507 +0000 UTC m=+0.041317539 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 07:59:13 np0005548789.localdomain systemd[1]: Created slice Slice /machine.
Dec 06 07:59:13 np0005548789.localdomain systemd[1]: Started libpod-conmon-7350105fe9b4d3bcc45729d2644680810591819ba664a3ae0750045a241db623.scope.
Dec 06 07:59:13 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 07:59:13 np0005548789.localdomain podman[26770]: 2025-12-06 07:59:13.619856271 +0000 UTC m=+23.190988313 container init 7350105fe9b4d3bcc45729d2644680810591819ba664a3ae0750045a241db623 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_lewin, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1763362218, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 07:59:13 np0005548789.localdomain podman[26770]: 2025-12-06 07:59:13.629673162 +0000 UTC m=+23.200805174 container start 7350105fe9b4d3bcc45729d2644680810591819ba664a3ae0750045a241db623 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_lewin, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vcs-type=git, GIT_CLEAN=True, GIT_BRANCH=main, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, RELEASE=main, architecture=x86_64, build-date=2025-11-26T19:44:28Z, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Dec 06 07:59:13 np0005548789.localdomain podman[26770]: 2025-12-06 07:59:13.629894739 +0000 UTC m=+23.201026831 container attach 7350105fe9b4d3bcc45729d2644680810591819ba664a3ae0750045a241db623 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_lewin, architecture=x86_64, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, RELEASE=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph)
Dec 06 07:59:13 np0005548789.localdomain distracted_lewin[26872]: 167 167
Dec 06 07:59:13 np0005548789.localdomain systemd[1]: libpod-7350105fe9b4d3bcc45729d2644680810591819ba664a3ae0750045a241db623.scope: Deactivated successfully.
Dec 06 07:59:13 np0005548789.localdomain podman[26770]: 2025-12-06 07:59:13.634156 +0000 UTC m=+23.205288042 container died 7350105fe9b4d3bcc45729d2644680810591819ba664a3ae0750045a241db623 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_lewin, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, version=7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main)
Dec 06 07:59:13 np0005548789.localdomain podman[26877]: 2025-12-06 07:59:13.721854951 +0000 UTC m=+0.075957472 container remove 7350105fe9b4d3bcc45729d2644680810591819ba664a3ae0750045a241db623 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_lewin, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-type=git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, RELEASE=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 07:59:13 np0005548789.localdomain systemd[1]: libpod-conmon-7350105fe9b4d3bcc45729d2644680810591819ba664a3ae0750045a241db623.scope: Deactivated successfully.
Dec 06 07:59:13 np0005548789.localdomain podman[26897]: 
Dec 06 07:59:14 np0005548789.localdomain podman[26897]: 2025-12-06 07:59:13.972943278 +0000 UTC m=+0.074718195 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 07:59:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-b5ce0689e8642a6d5ec9da0f1e03028fe29ddd2780e4a00484a4b95724ab05f5-merged.mount: Deactivated successfully.
Dec 06 07:59:17 np0005548789.localdomain podman[26897]: 2025-12-06 07:59:17.695444377 +0000 UTC m=+3.797219294 container create fb83a09b9cfd2438e292df7e2c48345cadffd8f1eda05dae7ec2baae9e21431b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_franklin, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Dec 06 07:59:17 np0005548789.localdomain systemd[1]: Started libpod-conmon-fb83a09b9cfd2438e292df7e2c48345cadffd8f1eda05dae7ec2baae9e21431b.scope.
Dec 06 07:59:17 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 07:59:17 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d347132d95c91b44e65918585cb82ec910914e068081c068616733ad518cb1c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:17 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d347132d95c91b44e65918585cb82ec910914e068081c068616733ad518cb1c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:17 np0005548789.localdomain podman[26897]: 2025-12-06 07:59:17.782661733 +0000 UTC m=+3.884436650 container init fb83a09b9cfd2438e292df7e2c48345cadffd8f1eda05dae7ec2baae9e21431b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_franklin, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 07:59:17 np0005548789.localdomain podman[26897]: 2025-12-06 07:59:17.794786175 +0000 UTC m=+3.896561092 container start fb83a09b9cfd2438e292df7e2c48345cadffd8f1eda05dae7ec2baae9e21431b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_franklin, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, GIT_BRANCH=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7)
Dec 06 07:59:17 np0005548789.localdomain podman[26897]: 2025-12-06 07:59:17.798867751 +0000 UTC m=+3.900642718 container attach fb83a09b9cfd2438e292df7e2c48345cadffd8f1eda05dae7ec2baae9e21431b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_franklin, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, ceph=True, name=rhceph, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 07:59:17 np0005548789.localdomain sshd[27172]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:59:18 np0005548789.localdomain sshd[27172]: Received disconnect from 162.241.87.197 port 57484:11: Bye Bye [preauth]
Dec 06 07:59:18 np0005548789.localdomain sshd[27172]: Disconnected from authenticating user root 162.241.87.197 port 57484 [preauth]
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]: [
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:     {
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:         "available": false,
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:         "ceph_device": false,
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:         "lsm_data": {},
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:         "lvs": [],
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:         "path": "/dev/sr0",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:         "rejected_reasons": [
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "Has a FileSystem",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "Insufficient space (<5GB)"
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:         ],
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:         "sys_api": {
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "actuators": null,
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "device_nodes": "sr0",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "human_readable_size": "482.00 KB",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "id_bus": "ata",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "model": "QEMU DVD-ROM",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "nr_requests": "2",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "partitions": {},
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "path": "/dev/sr0",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "removable": "1",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "rev": "2.5+",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "ro": "0",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "rotational": "1",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "sas_address": "",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "sas_device_handle": "",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "scheduler_mode": "mq-deadline",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "sectors": 0,
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "sectorsize": "2048",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "size": 493568.0,
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "support_discard": "0",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "type": "disk",
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:             "vendor": "QEMU"
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:         }
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]:     }
Dec 06 07:59:18 np0005548789.localdomain pedantic_franklin[27167]: ]
Dec 06 07:59:18 np0005548789.localdomain systemd[1]: libpod-fb83a09b9cfd2438e292df7e2c48345cadffd8f1eda05dae7ec2baae9e21431b.scope: Deactivated successfully.
Dec 06 07:59:18 np0005548789.localdomain podman[26897]: 2025-12-06 07:59:18.638878601 +0000 UTC m=+4.740653548 container died fb83a09b9cfd2438e292df7e2c48345cadffd8f1eda05dae7ec2baae9e21431b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_franklin, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, release=1763362218, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc.)
Dec 06 07:59:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-7d347132d95c91b44e65918585cb82ec910914e068081c068616733ad518cb1c-merged.mount: Deactivated successfully.
Dec 06 07:59:18 np0005548789.localdomain podman[28474]: 2025-12-06 07:59:18.746885766 +0000 UTC m=+0.093099488 container remove fb83a09b9cfd2438e292df7e2c48345cadffd8f1eda05dae7ec2baae9e21431b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_franklin, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, ceph=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=1763362218, vcs-type=git, build-date=2025-11-26T19:44:28Z)
Dec 06 07:59:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:59:18 np0005548789.localdomain systemd[1]: libpod-conmon-fb83a09b9cfd2438e292df7e2c48345cadffd8f1eda05dae7ec2baae9e21431b.scope: Deactivated successfully.
Dec 06 07:59:18 np0005548789.localdomain sudo[26715]: pam_unix(sudo:session): session closed for user root
Dec 06 07:59:19 np0005548789.localdomain sudo[28488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:59:19 np0005548789.localdomain sudo[28488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:59:19 np0005548789.localdomain sudo[28488]: pam_unix(sudo:session): session closed for user root
Dec 06 07:59:19 np0005548789.localdomain sudo[28503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 _orch set-coredump-overrides --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 --coredump-max-size=32G
Dec 06 07:59:19 np0005548789.localdomain sudo[28503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:59:19 np0005548789.localdomain systemd[1]: systemd-coredump.socket: Deactivated successfully.
Dec 06 07:59:19 np0005548789.localdomain systemd[1]: Closed Process Core Dump Socket.
Dec 06 07:59:19 np0005548789.localdomain systemd[1]: Stopping Process Core Dump Socket...
Dec 06 07:59:19 np0005548789.localdomain systemd[1]: Listening on Process Core Dump Socket.
Dec 06 07:59:19 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:59:19 np0005548789.localdomain systemd-sysv-generator[28560]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:59:19 np0005548789.localdomain systemd-rc-local-generator[28557]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:59:19 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:59:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:59:19 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:59:19 np0005548789.localdomain systemd-rc-local-generator[28596]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:59:19 np0005548789.localdomain systemd-sysv-generator[28599]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:59:19 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:59:20 np0005548789.localdomain sudo[28503]: pam_unix(sudo:session): session closed for user root
Dec 06 07:59:21 np0005548789.localdomain sshd[28605]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:59:22 np0005548789.localdomain sshd[28605]: Received disconnect from 77.222.100.142 port 35616:11: Bye Bye [preauth]
Dec 06 07:59:22 np0005548789.localdomain sshd[28605]: Disconnected from authenticating user root 77.222.100.142 port 35616 [preauth]
Dec 06 07:59:25 np0005548789.localdomain sshd[28607]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:59:26 np0005548789.localdomain sshd[28607]: Received disconnect from 74.94.234.151 port 45046:11: Bye Bye [preauth]
Dec 06 07:59:26 np0005548789.localdomain sshd[28607]: Disconnected from authenticating user root 74.94.234.151 port 45046 [preauth]
Dec 06 07:59:42 np0005548789.localdomain sshd[28609]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 07:59:43 np0005548789.localdomain sshd[28609]: Received disconnect from 154.201.83.49 port 45850:11: Bye Bye [preauth]
Dec 06 07:59:43 np0005548789.localdomain sshd[28609]: Disconnected from authenticating user root 154.201.83.49 port 45850 [preauth]
Dec 06 07:59:49 np0005548789.localdomain sudo[28611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:59:49 np0005548789.localdomain sudo[28611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:59:49 np0005548789.localdomain sudo[28611]: pam_unix(sudo:session): session closed for user root
Dec 06 07:59:49 np0005548789.localdomain sudo[28626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 07:59:49 np0005548789.localdomain sudo[28626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:59:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:59:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:59:50 np0005548789.localdomain podman[28683]: 
Dec 06 07:59:50 np0005548789.localdomain podman[28683]: 2025-12-06 07:59:50.236578506 +0000 UTC m=+0.109937237 container create 143a75030e53358180320dd3b7e07301732dd53d6350cf4ab50c5c8c7136e96d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_wilson, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, version=7, GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph)
Dec 06 07:59:50 np0005548789.localdomain podman[28683]: 2025-12-06 07:59:50.168906675 +0000 UTC m=+0.042265396 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 07:59:50 np0005548789.localdomain systemd[1]: Started libpod-conmon-143a75030e53358180320dd3b7e07301732dd53d6350cf4ab50c5c8c7136e96d.scope.
Dec 06 07:59:50 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 07:59:50 np0005548789.localdomain podman[28683]: 2025-12-06 07:59:50.3043168 +0000 UTC m=+0.177675521 container init 143a75030e53358180320dd3b7e07301732dd53d6350cf4ab50c5c8c7136e96d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_wilson, release=1763362218, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_BRANCH=main, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.)
Dec 06 07:59:50 np0005548789.localdomain podman[28683]: 2025-12-06 07:59:50.315590151 +0000 UTC m=+0.188948872 container start 143a75030e53358180320dd3b7e07301732dd53d6350cf4ab50c5c8c7136e96d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_wilson, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.expose-services=, GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, com.redhat.component=rhceph-container, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Dec 06 07:59:50 np0005548789.localdomain podman[28683]: 2025-12-06 07:59:50.315904614 +0000 UTC m=+0.189263385 container attach 143a75030e53358180320dd3b7e07301732dd53d6350cf4ab50c5c8c7136e96d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_wilson, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, GIT_BRANCH=main, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, release=1763362218, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 07:59:50 np0005548789.localdomain angry_wilson[28698]: 167 167
Dec 06 07:59:50 np0005548789.localdomain systemd[1]: libpod-143a75030e53358180320dd3b7e07301732dd53d6350cf4ab50c5c8c7136e96d.scope: Deactivated successfully.
Dec 06 07:59:50 np0005548789.localdomain podman[28683]: 2025-12-06 07:59:50.319983963 +0000 UTC m=+0.193342684 container died 143a75030e53358180320dd3b7e07301732dd53d6350cf4ab50c5c8c7136e96d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_wilson, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, architecture=x86_64, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 07:59:50 np0005548789.localdomain podman[28703]: 2025-12-06 07:59:50.40669703 +0000 UTC m=+0.075423646 container remove 143a75030e53358180320dd3b7e07301732dd53d6350cf4ab50c5c8c7136e96d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_wilson, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, release=1763362218, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-11-26T19:44:28Z, distribution-scope=public, RELEASE=main, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 07:59:50 np0005548789.localdomain systemd[1]: libpod-conmon-143a75030e53358180320dd3b7e07301732dd53d6350cf4ab50c5c8c7136e96d.scope: Deactivated successfully.
Dec 06 07:59:50 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:59:50 np0005548789.localdomain systemd-rc-local-generator[28742]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:59:50 np0005548789.localdomain systemd-sysv-generator[28747]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:59:50 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:59:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:59:50 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:59:50 np0005548789.localdomain systemd-rc-local-generator[28782]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:59:50 np0005548789.localdomain systemd-sysv-generator[28785]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:59:50 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:59:51 np0005548789.localdomain systemd[1]: Reached target All Ceph clusters and services.
Dec 06 07:59:51 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:59:51 np0005548789.localdomain systemd-rc-local-generator[28821]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:59:51 np0005548789.localdomain systemd-sysv-generator[28826]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:59:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:59:51 np0005548789.localdomain systemd[1]: Reached target Ceph cluster 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 07:59:51 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:59:51 np0005548789.localdomain systemd-sysv-generator[28862]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:59:51 np0005548789.localdomain systemd-rc-local-generator[28859]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:59:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:59:51 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 07:59:51 np0005548789.localdomain systemd-rc-local-generator[28900]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 07:59:51 np0005548789.localdomain systemd-sysv-generator[28906]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 07:59:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 07:59:51 np0005548789.localdomain systemd[1]: Created slice Slice /system/ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 07:59:51 np0005548789.localdomain systemd[1]: Reached target System Time Set.
Dec 06 07:59:51 np0005548789.localdomain systemd[1]: Reached target System Time Synchronized.
Dec 06 07:59:51 np0005548789.localdomain systemd[1]: Starting Ceph crash.np0005548789 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8...
Dec 06 07:59:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:59:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 07:59:52 np0005548789.localdomain podman[28965]: 
Dec 06 07:59:52 np0005548789.localdomain podman[28965]: 2025-12-06 07:59:52.077199824 +0000 UTC m=+0.063493538 container create ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 07:59:52 np0005548789.localdomain podman[28965]: 2025-12-06 07:59:52.048423397 +0000 UTC m=+0.034717121 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 07:59:52 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f75e1c5dc68106d13b590be81c783884e46aa279b6185456f5563774cd3c14da/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:52 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f75e1c5dc68106d13b590be81c783884e46aa279b6185456f5563774cd3c14da/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:52 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f75e1c5dc68106d13b590be81c783884e46aa279b6185456f5563774cd3c14da/merged/etc/ceph/ceph.client.crash.np0005548789.keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:52 np0005548789.localdomain podman[28965]: 2025-12-06 07:59:52.191496151 +0000 UTC m=+0.177789855 container init ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, release=1763362218, ceph=True, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, description=Red Hat Ceph Storage 7)
Dec 06 07:59:52 np0005548789.localdomain podman[28965]: 2025-12-06 07:59:52.201891488 +0000 UTC m=+0.188185192 container start ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, name=rhceph, com.redhat.component=rhceph-container, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, version=7, distribution-scope=public)
Dec 06 07:59:52 np0005548789.localdomain bash[28965]: ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06
Dec 06 07:59:52 np0005548789.localdomain systemd[1]: Started Ceph crash.np0005548789 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 07:59:52 np0005548789.localdomain sudo[28626]: pam_unix(sudo:session): session closed for user root
Dec 06 07:59:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789[28980]: INFO:ceph-crash:pinging cluster to exercise our key
Dec 06 07:59:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789[28980]: 2025-12-06T07:59:52.381+0000 7f7965a25640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 06 07:59:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789[28980]: 2025-12-06T07:59:52.381+0000 7f7965a25640 -1 AuthRegistry(0x7f79600680d0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 06 07:59:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789[28980]: 2025-12-06T07:59:52.381+0000 7f7965a25640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 06 07:59:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789[28980]: 2025-12-06T07:59:52.381+0000 7f7965a25640 -1 AuthRegistry(0x7f7965a24000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 06 07:59:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789[28980]: 2025-12-06T07:59:52.393+0000 7f795ffff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 06 07:59:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789[28980]: 2025-12-06T07:59:52.395+0000 7f7964a23640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 06 07:59:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789[28980]: 2025-12-06T07:59:52.395+0000 7f7965224640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 06 07:59:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789[28980]: 2025-12-06T07:59:52.395+0000 7f7965a25640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Dec 06 07:59:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789[28980]: [errno 13] RADOS permission denied (error connecting to the cluster)
Dec 06 07:59:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789[28980]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Dec 06 07:59:52 np0005548789.localdomain sudo[28988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 07:59:52 np0005548789.localdomain sudo[28988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:59:52 np0005548789.localdomain sudo[28988]: pam_unix(sudo:session): session closed for user root
Dec 06 07:59:52 np0005548789.localdomain sudo[29012]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 --yes --no-systemd
Dec 06 07:59:52 np0005548789.localdomain sudo[29012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 07:59:53 np0005548789.localdomain podman[29067]: 
Dec 06 07:59:53 np0005548789.localdomain podman[29067]: 2025-12-06 07:59:53.012377205 +0000 UTC m=+0.075934775 container create 5ed0100e3c2733c3eb992470839044245fc813f312d73820944c47774a776eb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_davinci, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 07:59:53 np0005548789.localdomain systemd[1]: Started libpod-conmon-5ed0100e3c2733c3eb992470839044245fc813f312d73820944c47774a776eb8.scope.
Dec 06 07:59:53 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 07:59:53 np0005548789.localdomain podman[29067]: 2025-12-06 07:59:53.079608488 +0000 UTC m=+0.143166058 container init 5ed0100e3c2733c3eb992470839044245fc813f312d73820944c47774a776eb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_davinci, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, RELEASE=main, name=rhceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, release=1763362218, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 07:59:53 np0005548789.localdomain podman[29067]: 2025-12-06 07:59:52.983080018 +0000 UTC m=+0.046637578 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 07:59:53 np0005548789.localdomain systemd[1]: tmp-crun.fZ2w30.mount: Deactivated successfully.
Dec 06 07:59:53 np0005548789.localdomain podman[29067]: 2025-12-06 07:59:53.090523426 +0000 UTC m=+0.154080996 container start 5ed0100e3c2733c3eb992470839044245fc813f312d73820944c47774a776eb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_davinci, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, ceph=True, io.openshift.expose-services=, vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, com.redhat.component=rhceph-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7)
Dec 06 07:59:53 np0005548789.localdomain podman[29067]: 2025-12-06 07:59:53.090974654 +0000 UTC m=+0.154532254 container attach 5ed0100e3c2733c3eb992470839044245fc813f312d73820944c47774a776eb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_davinci, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, ceph=True, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, distribution-scope=public, io.buildah.version=1.41.4, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7)
Dec 06 07:59:53 np0005548789.localdomain nice_davinci[29082]: 167 167
Dec 06 07:59:53 np0005548789.localdomain systemd[1]: libpod-5ed0100e3c2733c3eb992470839044245fc813f312d73820944c47774a776eb8.scope: Deactivated successfully.
Dec 06 07:59:53 np0005548789.localdomain podman[29067]: 2025-12-06 07:59:53.09520936 +0000 UTC m=+0.158766930 container died 5ed0100e3c2733c3eb992470839044245fc813f312d73820944c47774a776eb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_davinci, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, release=1763362218, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc.)
Dec 06 07:59:53 np0005548789.localdomain podman[29087]: 2025-12-06 07:59:53.180059413 +0000 UTC m=+0.074093483 container remove 5ed0100e3c2733c3eb992470839044245fc813f312d73820944c47774a776eb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_davinci, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, release=1763362218, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=)
Dec 06 07:59:53 np0005548789.localdomain systemd[1]: libpod-conmon-5ed0100e3c2733c3eb992470839044245fc813f312d73820944c47774a776eb8.scope: Deactivated successfully.
Dec 06 07:59:53 np0005548789.localdomain podman[29106]: 
Dec 06 07:59:53 np0005548789.localdomain podman[29106]: 2025-12-06 07:59:53.380311428 +0000 UTC m=+0.069341088 container create 38a13d0e460f417fe530dd46a18274b8389e4682afbb9b75997400958b78b3d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_greider, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=rhceph, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, version=7)
Dec 06 07:59:53 np0005548789.localdomain systemd[1]: Started libpod-conmon-38a13d0e460f417fe530dd46a18274b8389e4682afbb9b75997400958b78b3d1.scope.
Dec 06 07:59:53 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 07:59:53 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2dec8f4be64f7107f934f752e8830e402d5e21097387621af1ed9f64e6da86cc/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:53 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2dec8f4be64f7107f934f752e8830e402d5e21097387621af1ed9f64e6da86cc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:53 np0005548789.localdomain podman[29106]: 2025-12-06 07:59:53.352529759 +0000 UTC m=+0.041559429 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 07:59:53 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2dec8f4be64f7107f934f752e8830e402d5e21097387621af1ed9f64e6da86cc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:53 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2dec8f4be64f7107f934f752e8830e402d5e21097387621af1ed9f64e6da86cc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:53 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2dec8f4be64f7107f934f752e8830e402d5e21097387621af1ed9f64e6da86cc/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 07:59:53 np0005548789.localdomain podman[29106]: 2025-12-06 07:59:53.498478756 +0000 UTC m=+0.187508396 container init 38a13d0e460f417fe530dd46a18274b8389e4682afbb9b75997400958b78b3d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_greider, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4)
Dec 06 07:59:53 np0005548789.localdomain podman[29106]: 2025-12-06 07:59:53.508439997 +0000 UTC m=+0.197469647 container start 38a13d0e460f417fe530dd46a18274b8389e4682afbb9b75997400958b78b3d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_greider, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, release=1763362218, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, version=7, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 07:59:53 np0005548789.localdomain podman[29106]: 2025-12-06 07:59:53.508708997 +0000 UTC m=+0.197738667 container attach 38a13d0e460f417fe530dd46a18274b8389e4682afbb9b75997400958b78b3d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_greider, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, name=rhceph, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=)
Dec 06 07:59:53 np0005548789.localdomain angry_greider[29122]: --> passed data devices: 0 physical, 2 LVM
Dec 06 07:59:53 np0005548789.localdomain angry_greider[29122]: --> relative data size: 1.0
Dec 06 07:59:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c5a62fb99d29291a28bbfc397e985ab005ba51a5933a7594dd4a4809fd49c8b1-merged.mount: Deactivated successfully.
Dec 06 07:59:54 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 07:59:54 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 1f710487-3a3c-4f3d-8622-d6fac6224470
Dec 06 07:59:54 np0005548789.localdomain lvm[29176]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 07:59:54 np0005548789.localdomain lvm[29176]: VG ceph_vg0 finished
Dec 06 07:59:54 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 07:59:54 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-1
Dec 06 07:59:54 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Dec 06 07:59:54 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 06 07:59:54 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Dec 06 07:59:54 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-1/activate.monmap
Dec 06 07:59:55 np0005548789.localdomain angry_greider[29122]:  stderr: got monmap epoch 3
Dec 06 07:59:55 np0005548789.localdomain angry_greider[29122]: --> Creating keyring file for osd.1
Dec 06 07:59:55 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/keyring
Dec 06 07:59:55 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/
Dec 06 07:59:55 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 1 --monmap /var/lib/ceph/osd/ceph-1/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-1/ --osd-uuid 1f710487-3a3c-4f3d-8622-d6fac6224470 --setuser ceph --setgroup ceph
Dec 06 07:59:57 np0005548789.localdomain angry_greider[29122]:  stderr: 2025-12-06T07:59:55.210+0000 7f71cd170a80 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 06 07:59:57 np0005548789.localdomain angry_greider[29122]:  stderr: 2025-12-06T07:59:55.210+0000 7f71cd170a80 -1 bluestore(/var/lib/ceph/osd/ceph-1/) _read_fsid unparsable uuid
Dec 06 07:59:57 np0005548789.localdomain angry_greider[29122]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Dec 06 07:59:57 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 06 07:59:57 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Dec 06 07:59:57 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Dec 06 07:59:57 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Dec 06 07:59:57 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 06 07:59:57 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 06 07:59:57 np0005548789.localdomain angry_greider[29122]: --> ceph-volume lvm activate successful for osd ID: 1
Dec 06 07:59:57 np0005548789.localdomain angry_greider[29122]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Dec 06 07:59:57 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 07:59:57 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 876fe068-f1aa-42bd-a56b-91d35874dd8e
Dec 06 07:59:58 np0005548789.localdomain lvm[30120]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 06 07:59:58 np0005548789.localdomain lvm[30120]: VG ceph_vg1 finished
Dec 06 07:59:58 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 07:59:58 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-4
Dec 06 07:59:58 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Dec 06 07:59:58 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 06 07:59:58 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-4/block
Dec 06 07:59:58 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-4/activate.monmap
Dec 06 07:59:58 np0005548789.localdomain angry_greider[29122]:  stderr: got monmap epoch 3
Dec 06 07:59:58 np0005548789.localdomain angry_greider[29122]: --> Creating keyring file for osd.4
Dec 06 07:59:58 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4/keyring
Dec 06 07:59:58 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4/
Dec 06 07:59:58 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 4 --monmap /var/lib/ceph/osd/ceph-4/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-4/ --osd-uuid 876fe068-f1aa-42bd-a56b-91d35874dd8e --setuser ceph --setgroup ceph
Dec 06 08:00:01 np0005548789.localdomain angry_greider[29122]:  stderr: 2025-12-06T07:59:59.043+0000 7fa55524ca80 -1 bluestore(/var/lib/ceph/osd/ceph-4//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 06 08:00:01 np0005548789.localdomain angry_greider[29122]:  stderr: 2025-12-06T07:59:59.043+0000 7fa55524ca80 -1 bluestore(/var/lib/ceph/osd/ceph-4/) _read_fsid unparsable uuid
Dec 06 08:00:01 np0005548789.localdomain angry_greider[29122]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Dec 06 08:00:01 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Dec 06 08:00:01 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-4 --no-mon-config
Dec 06 08:00:01 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-4/block
Dec 06 08:00:01 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block
Dec 06 08:00:01 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 06 08:00:01 np0005548789.localdomain angry_greider[29122]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Dec 06 08:00:01 np0005548789.localdomain angry_greider[29122]: --> ceph-volume lvm activate successful for osd ID: 4
Dec 06 08:00:01 np0005548789.localdomain angry_greider[29122]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Dec 06 08:00:01 np0005548789.localdomain systemd[1]: libpod-38a13d0e460f417fe530dd46a18274b8389e4682afbb9b75997400958b78b3d1.scope: Deactivated successfully.
Dec 06 08:00:01 np0005548789.localdomain systemd[1]: libpod-38a13d0e460f417fe530dd46a18274b8389e4682afbb9b75997400958b78b3d1.scope: Consumed 3.719s CPU time.
Dec 06 08:00:01 np0005548789.localdomain podman[29106]: 2025-12-06 08:00:01.878714914 +0000 UTC m=+8.567744594 container died 38a13d0e460f417fe530dd46a18274b8389e4682afbb9b75997400958b78b3d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_greider, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., RELEASE=main, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, name=rhceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, com.redhat.component=rhceph-container)
Dec 06 08:00:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2dec8f4be64f7107f934f752e8830e402d5e21097387621af1ed9f64e6da86cc-merged.mount: Deactivated successfully.
Dec 06 08:00:01 np0005548789.localdomain podman[31036]: 2025-12-06 08:00:01.952673211 +0000 UTC m=+0.066956233 container remove 38a13d0e460f417fe530dd46a18274b8389e4682afbb9b75997400958b78b3d1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_greider, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_CLEAN=True, io.openshift.expose-services=, vcs-type=git, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, name=rhceph, distribution-scope=public)
Dec 06 08:00:01 np0005548789.localdomain systemd[1]: libpod-conmon-38a13d0e460f417fe530dd46a18274b8389e4682afbb9b75997400958b78b3d1.scope: Deactivated successfully.
Dec 06 08:00:01 np0005548789.localdomain sudo[29012]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:02 np0005548789.localdomain sudo[31050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:02 np0005548789.localdomain sudo[31050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:02 np0005548789.localdomain sudo[31050]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:02 np0005548789.localdomain sudo[31065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- lvm list --format json
Dec 06 08:00:02 np0005548789.localdomain sudo[31065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:02 np0005548789.localdomain podman[31119]: 2025-12-06 08:00:02.638183943 +0000 UTC m=+0.072520481 container create a631457bcfd3b6fe04c2edf5730c5696b322d44436f9d12eaabbb7030ad24b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_cori, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_BRANCH=main, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph)
Dec 06 08:00:02 np0005548789.localdomain systemd[1]: Started libpod-conmon-a631457bcfd3b6fe04c2edf5730c5696b322d44436f9d12eaabbb7030ad24b53.scope.
Dec 06 08:00:02 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:02 np0005548789.localdomain podman[31119]: 2025-12-06 08:00:02.606507332 +0000 UTC m=+0.040843890 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:02 np0005548789.localdomain podman[31119]: 2025-12-06 08:00:02.707246588 +0000 UTC m=+0.141583126 container init a631457bcfd3b6fe04c2edf5730c5696b322d44436f9d12eaabbb7030ad24b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_cori, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, RELEASE=main, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, version=7, name=rhceph, GIT_CLEAN=True, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 08:00:02 np0005548789.localdomain podman[31119]: 2025-12-06 08:00:02.716135226 +0000 UTC m=+0.150471754 container start a631457bcfd3b6fe04c2edf5730c5696b322d44436f9d12eaabbb7030ad24b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_cori, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=)
Dec 06 08:00:02 np0005548789.localdomain podman[31119]: 2025-12-06 08:00:02.71724304 +0000 UTC m=+0.151579628 container attach a631457bcfd3b6fe04c2edf5730c5696b322d44436f9d12eaabbb7030ad24b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_cori, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, release=1763362218)
Dec 06 08:00:02 np0005548789.localdomain amazing_cori[31134]: 167 167
Dec 06 08:00:02 np0005548789.localdomain systemd[1]: libpod-a631457bcfd3b6fe04c2edf5730c5696b322d44436f9d12eaabbb7030ad24b53.scope: Deactivated successfully.
Dec 06 08:00:02 np0005548789.localdomain podman[31119]: 2025-12-06 08:00:02.721497706 +0000 UTC m=+0.155834734 container died a631457bcfd3b6fe04c2edf5730c5696b322d44436f9d12eaabbb7030ad24b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_cori, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, release=1763362218, vendor=Red Hat, Inc., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vcs-type=git, com.redhat.component=rhceph-container, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.openshift.tags=rhceph ceph)
Dec 06 08:00:02 np0005548789.localdomain podman[31139]: 2025-12-06 08:00:02.809130409 +0000 UTC m=+0.075535959 container remove a631457bcfd3b6fe04c2edf5730c5696b322d44436f9d12eaabbb7030ad24b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_cori, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, com.redhat.component=rhceph-container, vendor=Red Hat, Inc.)
Dec 06 08:00:02 np0005548789.localdomain systemd[1]: libpod-conmon-a631457bcfd3b6fe04c2edf5730c5696b322d44436f9d12eaabbb7030ad24b53.scope: Deactivated successfully.
Dec 06 08:00:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-475b25ef132999e27939deb7fa4afcdff68a8ceb5311a9c4de79b332733b9ac0-merged.mount: Deactivated successfully.
Dec 06 08:00:03 np0005548789.localdomain podman[31160]: 2025-12-06 08:00:03.01977939 +0000 UTC m=+0.068872029 container create 8e539d9b44cc3e4b9b91830dfe33909cac74a401560f3ec34187289a7ead2b21 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_sinoussi, release=1763362218, description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, architecture=x86_64, GIT_CLEAN=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container)
Dec 06 08:00:03 np0005548789.localdomain systemd[1]: Started libpod-conmon-8e539d9b44cc3e4b9b91830dfe33909cac74a401560f3ec34187289a7ead2b21.scope.
Dec 06 08:00:03 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:03 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e4e551b6994d78ba73ccd0eb46cfbea5f6021c8361b5d3699f7f0bb35bd30f4/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:03 np0005548789.localdomain podman[31160]: 2025-12-06 08:00:02.994163027 +0000 UTC m=+0.043255696 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:03 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e4e551b6994d78ba73ccd0eb46cfbea5f6021c8361b5d3699f7f0bb35bd30f4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:03 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e4e551b6994d78ba73ccd0eb46cfbea5f6021c8361b5d3699f7f0bb35bd30f4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:03 np0005548789.localdomain podman[31160]: 2025-12-06 08:00:03.112087676 +0000 UTC m=+0.161180335 container init 8e539d9b44cc3e4b9b91830dfe33909cac74a401560f3ec34187289a7ead2b21 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_sinoussi, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, release=1763362218, RELEASE=main, io.openshift.expose-services=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_CLEAN=True)
Dec 06 08:00:03 np0005548789.localdomain podman[31160]: 2025-12-06 08:00:03.122026775 +0000 UTC m=+0.171119444 container start 8e539d9b44cc3e4b9b91830dfe33909cac74a401560f3ec34187289a7ead2b21 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_sinoussi, build-date=2025-11-26T19:44:28Z, RELEASE=main, version=7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, name=rhceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:00:03 np0005548789.localdomain podman[31160]: 2025-12-06 08:00:03.122363988 +0000 UTC m=+0.171456717 container attach 8e539d9b44cc3e4b9b91830dfe33909cac74a401560f3ec34187289a7ead2b21 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_sinoussi, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]: {
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:     "1": [
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:         {
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "devices": [
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "/dev/loop3"
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             ],
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "lv_name": "ceph_lv0",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "lv_size": "7511998464",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=jxsEmx-Am0R-cg61-xEnr-yg8f-zYJK-uQKeR2,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=1939e851-b10c-5c3b-9bb7-8e7f380233e8,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=1f710487-3a3c-4f3d-8622-d6fac6224470,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "lv_uuid": "jxsEmx-Am0R-cg61-xEnr-yg8f-zYJK-uQKeR2",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "name": "ceph_lv0",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "tags": {
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.block_uuid": "jxsEmx-Am0R-cg61-xEnr-yg8f-zYJK-uQKeR2",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.cephx_lockbox_secret": "",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.cluster_fsid": "1939e851-b10c-5c3b-9bb7-8e7f380233e8",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.cluster_name": "ceph",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.crush_device_class": "",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.encrypted": "0",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.osd_fsid": "1f710487-3a3c-4f3d-8622-d6fac6224470",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.osd_id": "1",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.type": "block",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.vdo": "0"
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             },
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "type": "block",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "vg_name": "ceph_vg0"
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:         }
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:     ],
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:     "4": [
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:         {
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "devices": [
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "/dev/loop4"
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             ],
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "lv_name": "ceph_lv1",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "lv_size": "7511998464",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=HIwJoG-9m23-NWng-tx4I-d4U0-RA89-Raie0n,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=1939e851-b10c-5c3b-9bb7-8e7f380233e8,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=876fe068-f1aa-42bd-a56b-91d35874dd8e,ceph.osd_id=4,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "lv_uuid": "HIwJoG-9m23-NWng-tx4I-d4U0-RA89-Raie0n",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "name": "ceph_lv1",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "tags": {
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.block_uuid": "HIwJoG-9m23-NWng-tx4I-d4U0-RA89-Raie0n",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.cephx_lockbox_secret": "",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.cluster_fsid": "1939e851-b10c-5c3b-9bb7-8e7f380233e8",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.cluster_name": "ceph",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.crush_device_class": "",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.encrypted": "0",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.osd_fsid": "876fe068-f1aa-42bd-a56b-91d35874dd8e",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.osd_id": "4",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.type": "block",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:                 "ceph.vdo": "0"
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             },
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "type": "block",
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:             "vg_name": "ceph_vg1"
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:         }
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]:     ]
Dec 06 08:00:03 np0005548789.localdomain optimistic_sinoussi[31175]: }
Dec 06 08:00:03 np0005548789.localdomain systemd[1]: libpod-8e539d9b44cc3e4b9b91830dfe33909cac74a401560f3ec34187289a7ead2b21.scope: Deactivated successfully.
Dec 06 08:00:03 np0005548789.localdomain podman[31160]: 2025-12-06 08:00:03.457667492 +0000 UTC m=+0.506760191 container died 8e539d9b44cc3e4b9b91830dfe33909cac74a401560f3ec34187289a7ead2b21 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_sinoussi, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, distribution-scope=public, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, build-date=2025-11-26T19:44:28Z, vcs-type=git, io.openshift.expose-services=, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Dec 06 08:00:03 np0005548789.localdomain podman[31184]: 2025-12-06 08:00:03.549213399 +0000 UTC m=+0.083034774 container remove 8e539d9b44cc3e4b9b91830dfe33909cac74a401560f3ec34187289a7ead2b21 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_sinoussi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, release=1763362218, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7)
Dec 06 08:00:03 np0005548789.localdomain systemd[1]: libpod-conmon-8e539d9b44cc3e4b9b91830dfe33909cac74a401560f3ec34187289a7ead2b21.scope: Deactivated successfully.
Dec 06 08:00:03 np0005548789.localdomain sudo[31065]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:03 np0005548789.localdomain sudo[31199]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:03 np0005548789.localdomain sudo[31199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:03 np0005548789.localdomain sudo[31199]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:03 np0005548789.localdomain sudo[31214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 08:00:03 np0005548789.localdomain sudo[31214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:03 np0005548789.localdomain systemd[1]: tmp-crun.uplUrX.mount: Deactivated successfully.
Dec 06 08:00:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5e4e551b6994d78ba73ccd0eb46cfbea5f6021c8361b5d3699f7f0bb35bd30f4-merged.mount: Deactivated successfully.
Dec 06 08:00:04 np0005548789.localdomain podman[31271]: 
Dec 06 08:00:04 np0005548789.localdomain podman[31271]: 2025-12-06 08:00:04.305574315 +0000 UTC m=+0.076495948 container create 82cbcc4fedc793ef32a1fe7ddba6188791bd9aaca1b2bf6bd03aac7253f50ec0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wozniak, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vcs-type=git, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, version=7, name=rhceph, architecture=x86_64, GIT_CLEAN=True, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 08:00:04 np0005548789.localdomain systemd[1]: Started libpod-conmon-82cbcc4fedc793ef32a1fe7ddba6188791bd9aaca1b2bf6bd03aac7253f50ec0.scope.
Dec 06 08:00:04 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:04 np0005548789.localdomain podman[31271]: 2025-12-06 08:00:04.355703829 +0000 UTC m=+0.126625502 container init 82cbcc4fedc793ef32a1fe7ddba6188791bd9aaca1b2bf6bd03aac7253f50ec0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wozniak, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, ceph=True, version=7, GIT_BRANCH=main, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main)
Dec 06 08:00:04 np0005548789.localdomain podman[31271]: 2025-12-06 08:00:04.365290574 +0000 UTC m=+0.136212227 container start 82cbcc4fedc793ef32a1fe7ddba6188791bd9aaca1b2bf6bd03aac7253f50ec0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wozniak, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., RELEASE=main, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, ceph=True, architecture=x86_64, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7)
Dec 06 08:00:04 np0005548789.localdomain podman[31271]: 2025-12-06 08:00:04.365516523 +0000 UTC m=+0.136438176 container attach 82cbcc4fedc793ef32a1fe7ddba6188791bd9aaca1b2bf6bd03aac7253f50ec0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wozniak, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, CEPH_POINT_RELEASE=, vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 08:00:04 np0005548789.localdomain musing_wozniak[31287]: 167 167
Dec 06 08:00:04 np0005548789.localdomain systemd[1]: libpod-82cbcc4fedc793ef32a1fe7ddba6188791bd9aaca1b2bf6bd03aac7253f50ec0.scope: Deactivated successfully.
Dec 06 08:00:04 np0005548789.localdomain podman[31271]: 2025-12-06 08:00:04.367713309 +0000 UTC m=+0.138634972 container died 82cbcc4fedc793ef32a1fe7ddba6188791bd9aaca1b2bf6bd03aac7253f50ec0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wozniak, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, distribution-scope=public, description=Red Hat Ceph Storage 7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 08:00:04 np0005548789.localdomain podman[31271]: 2025-12-06 08:00:04.269501712 +0000 UTC m=+0.040423365 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:04 np0005548789.localdomain podman[31292]: 2025-12-06 08:00:04.455804279 +0000 UTC m=+0.074635303 container remove 82cbcc4fedc793ef32a1fe7ddba6188791bd9aaca1b2bf6bd03aac7253f50ec0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wozniak, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, version=7, ceph=True, vendor=Red Hat, Inc., RELEASE=main, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=)
Dec 06 08:00:04 np0005548789.localdomain systemd[1]: libpod-conmon-82cbcc4fedc793ef32a1fe7ddba6188791bd9aaca1b2bf6bd03aac7253f50ec0.scope: Deactivated successfully.
Dec 06 08:00:04 np0005548789.localdomain podman[31319]: 
Dec 06 08:00:04 np0005548789.localdomain podman[31319]: 2025-12-06 08:00:04.790557792 +0000 UTC m=+0.081722332 container create af83484c350c05d44ef4f982d1e4a01668bba121a73a2450b299db2ab8aec446 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate-test, CEPH_POINT_RELEASE=, ceph=True, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, build-date=2025-11-26T19:44:28Z, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Dec 06 08:00:04 np0005548789.localdomain systemd[1]: Started libpod-conmon-af83484c350c05d44ef4f982d1e4a01668bba121a73a2450b299db2ab8aec446.scope.
Dec 06 08:00:04 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:04 np0005548789.localdomain podman[31319]: 2025-12-06 08:00:04.762357708 +0000 UTC m=+0.053522248 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:04 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbe5d02d81f41d6af86961773c0bd6b6b7e53f323918505650ca531e08ed8274/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:04 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbe5d02d81f41d6af86961773c0bd6b6b7e53f323918505650ca531e08ed8274/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:04 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbe5d02d81f41d6af86961773c0bd6b6b7e53f323918505650ca531e08ed8274/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:04 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbe5d02d81f41d6af86961773c0bd6b6b7e53f323918505650ca531e08ed8274/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:04 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbe5d02d81f41d6af86961773c0bd6b6b7e53f323918505650ca531e08ed8274/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:04 np0005548789.localdomain podman[31319]: 2025-12-06 08:00:04.909036683 +0000 UTC m=+0.200201223 container init af83484c350c05d44ef4f982d1e4a01668bba121a73a2450b299db2ab8aec446 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate-test, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, ceph=True, GIT_CLEAN=True, vcs-type=git, description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.expose-services=)
Dec 06 08:00:04 np0005548789.localdomain podman[31319]: 2025-12-06 08:00:04.919520004 +0000 UTC m=+0.210684574 container start af83484c350c05d44ef4f982d1e4a01668bba121a73a2450b299db2ab8aec446 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate-test, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, version=7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, description=Red Hat Ceph Storage 7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, RELEASE=main, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.openshift.tags=rhceph ceph, ceph=True)
Dec 06 08:00:04 np0005548789.localdomain podman[31319]: 2025-12-06 08:00:04.919883758 +0000 UTC m=+0.211048378 container attach af83484c350c05d44ef4f982d1e4a01668bba121a73a2450b299db2ab8aec446 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate-test, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vcs-type=git, description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public)
Dec 06 08:00:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e107116fe6780fe23b2e832ceddab708ea4d834cfb946d87980d14f7cb6dc17c-merged.mount: Deactivated successfully.
Dec 06 08:00:05 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate-test[31335]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Dec 06 08:00:05 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate-test[31335]:                             [--no-systemd] [--no-tmpfs]
Dec 06 08:00:05 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate-test[31335]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec 06 08:00:05 np0005548789.localdomain systemd[1]: libpod-af83484c350c05d44ef4f982d1e4a01668bba121a73a2450b299db2ab8aec446.scope: Deactivated successfully.
Dec 06 08:00:05 np0005548789.localdomain podman[31319]: 2025-12-06 08:00:05.144158783 +0000 UTC m=+0.435323353 container died af83484c350c05d44ef4f982d1e4a01668bba121a73a2450b299db2ab8aec446 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate-test, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=)
Dec 06 08:00:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-dbe5d02d81f41d6af86961773c0bd6b6b7e53f323918505650ca531e08ed8274-merged.mount: Deactivated successfully.
Dec 06 08:00:05 np0005548789.localdomain systemd-journald[619]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Dec 06 08:00:05 np0005548789.localdomain systemd-journald[619]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 08:00:05 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 08:00:05 np0005548789.localdomain podman[31340]: 2025-12-06 08:00:05.210047384 +0000 UTC m=+0.059378998 container remove af83484c350c05d44ef4f982d1e4a01668bba121a73a2450b299db2ab8aec446 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate-test, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, version=7, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 08:00:05 np0005548789.localdomain systemd[1]: libpod-conmon-af83484c350c05d44ef4f982d1e4a01668bba121a73a2450b299db2ab8aec446.scope: Deactivated successfully.
Dec 06 08:00:05 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 08:00:05 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:00:05 np0005548789.localdomain systemd-rc-local-generator[31396]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:00:05 np0005548789.localdomain systemd-sysv-generator[31401]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:00:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:00:05 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:00:05 np0005548789.localdomain systemd-sysv-generator[31445]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:00:05 np0005548789.localdomain systemd-rc-local-generator[31442]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:00:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:00:05 np0005548789.localdomain systemd[1]: Starting Ceph osd.1 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8...
Dec 06 08:00:06 np0005548789.localdomain podman[31504]: 
Dec 06 08:00:06 np0005548789.localdomain podman[31504]: 2025-12-06 08:00:06.223117526 +0000 UTC m=+0.073936347 container create 597c74aa715fc44def4998c1135bac2b866e166c5a0712bcbc80ba247f8c8eab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, architecture=x86_64, version=7, GIT_BRANCH=main, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, CEPH_POINT_RELEASE=)
Dec 06 08:00:06 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:06 np0005548789.localdomain podman[31504]: 2025-12-06 08:00:06.193146871 +0000 UTC m=+0.043965692 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:06 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36e6fdcc8fbd0db6e6b9bebfdebafeabb7e03697f9c1110028f80fb3a60bc94b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:06 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36e6fdcc8fbd0db6e6b9bebfdebafeabb7e03697f9c1110028f80fb3a60bc94b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:06 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36e6fdcc8fbd0db6e6b9bebfdebafeabb7e03697f9c1110028f80fb3a60bc94b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:06 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36e6fdcc8fbd0db6e6b9bebfdebafeabb7e03697f9c1110028f80fb3a60bc94b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:06 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36e6fdcc8fbd0db6e6b9bebfdebafeabb7e03697f9c1110028f80fb3a60bc94b/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:06 np0005548789.localdomain podman[31504]: 2025-12-06 08:00:06.358667265 +0000 UTC m=+0.209486076 container init 597c74aa715fc44def4998c1135bac2b866e166c5a0712bcbc80ba247f8c8eab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, distribution-scope=public, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc.)
Dec 06 08:00:06 np0005548789.localdomain systemd[1]: tmp-crun.ZpMZlz.mount: Deactivated successfully.
Dec 06 08:00:06 np0005548789.localdomain podman[31504]: 2025-12-06 08:00:06.372863351 +0000 UTC m=+0.223682162 container start 597c74aa715fc44def4998c1135bac2b866e166c5a0712bcbc80ba247f8c8eab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, name=rhceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, release=1763362218, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph)
Dec 06 08:00:06 np0005548789.localdomain podman[31504]: 2025-12-06 08:00:06.373126271 +0000 UTC m=+0.223945082 container attach 597c74aa715fc44def4998c1135bac2b866e166c5a0712bcbc80ba247f8c8eab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate, build-date=2025-11-26T19:44:28Z, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=1763362218, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:00:06 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate[31518]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 06 08:00:06 np0005548789.localdomain bash[31504]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 06 08:00:06 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate[31518]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Dec 06 08:00:06 np0005548789.localdomain bash[31504]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Dec 06 08:00:06 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate[31518]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Dec 06 08:00:07 np0005548789.localdomain bash[31504]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Dec 06 08:00:07 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate[31518]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 06 08:00:07 np0005548789.localdomain bash[31504]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 06 08:00:07 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate[31518]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Dec 06 08:00:07 np0005548789.localdomain bash[31504]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Dec 06 08:00:07 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate[31518]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 06 08:00:07 np0005548789.localdomain bash[31504]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 06 08:00:07 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate[31518]: --> ceph-volume raw activate successful for osd ID: 1
Dec 06 08:00:07 np0005548789.localdomain bash[31504]: --> ceph-volume raw activate successful for osd ID: 1
Dec 06 08:00:07 np0005548789.localdomain podman[31504]: 2025-12-06 08:00:07.056370074 +0000 UTC m=+0.907188865 container died 597c74aa715fc44def4998c1135bac2b866e166c5a0712bcbc80ba247f8c8eab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, release=1763362218, io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.component=rhceph-container)
Dec 06 08:00:07 np0005548789.localdomain systemd[1]: libpod-597c74aa715fc44def4998c1135bac2b866e166c5a0712bcbc80ba247f8c8eab.scope: Deactivated successfully.
Dec 06 08:00:07 np0005548789.localdomain podman[31648]: 2025-12-06 08:00:07.140624474 +0000 UTC m=+0.067894131 container remove 597c74aa715fc44def4998c1135bac2b866e166c5a0712bcbc80ba247f8c8eab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1-activate, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container)
Dec 06 08:00:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-36e6fdcc8fbd0db6e6b9bebfdebafeabb7e03697f9c1110028f80fb3a60bc94b-merged.mount: Deactivated successfully.
Dec 06 08:00:07 np0005548789.localdomain podman[31708]: 2025-12-06 08:00:07.441979518 +0000 UTC m=+0.068981783 container create f0af0a8a2c0d4c2d38536a2fae1423e2ee2dcdd544ad6a01502152adae4fd66a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, distribution-scope=public, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., architecture=x86_64, RELEASE=main, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git)
Dec 06 08:00:07 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/626593515b4b4f1d9cc9f07e015e464e5aa32faa3ff81c2051697af9e4db66cd/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:07 np0005548789.localdomain podman[31708]: 2025-12-06 08:00:07.416053453 +0000 UTC m=+0.043055728 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:07 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/626593515b4b4f1d9cc9f07e015e464e5aa32faa3ff81c2051697af9e4db66cd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:07 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/626593515b4b4f1d9cc9f07e015e464e5aa32faa3ff81c2051697af9e4db66cd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:07 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/626593515b4b4f1d9cc9f07e015e464e5aa32faa3ff81c2051697af9e4db66cd/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:07 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/626593515b4b4f1d9cc9f07e015e464e5aa32faa3ff81c2051697af9e4db66cd/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:07 np0005548789.localdomain podman[31708]: 2025-12-06 08:00:07.553677653 +0000 UTC m=+0.180679918 container init f0af0a8a2c0d4c2d38536a2fae1423e2ee2dcdd544ad6a01502152adae4fd66a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1, io.openshift.expose-services=, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vendor=Red Hat, Inc., name=rhceph, release=1763362218, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 08:00:07 np0005548789.localdomain podman[31708]: 2025-12-06 08:00:07.562481558 +0000 UTC m=+0.189483823 container start f0af0a8a2c0d4c2d38536a2fae1423e2ee2dcdd544ad6a01502152adae4fd66a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, release=1763362218, GIT_CLEAN=True, version=7, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, name=rhceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_BRANCH=main)
Dec 06 08:00:07 np0005548789.localdomain bash[31708]: f0af0a8a2c0d4c2d38536a2fae1423e2ee2dcdd544ad6a01502152adae4fd66a
Dec 06 08:00:07 np0005548789.localdomain systemd[1]: Started Ceph osd.1 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 08:00:07 np0005548789.localdomain ceph-osd[31726]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 08:00:07 np0005548789.localdomain ceph-osd[31726]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Dec 06 08:00:07 np0005548789.localdomain ceph-osd[31726]: pidfile_write: ignore empty --pid-file
Dec 06 08:00:07 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 06 08:00:07 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 06 08:00:07 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:07 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:07 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180df180 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 06 08:00:07 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180df180 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 06 08:00:07 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180df180 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:07 np0005548789.localdomain ceph-osd[31726]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Dec 06 08:00:07 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180df180 /var/lib/ceph/osd/ceph-1/block) close
Dec 06 08:00:07 np0005548789.localdomain sudo[31214]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:07 np0005548789.localdomain sudo[31739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:07 np0005548789.localdomain sudo[31739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:07 np0005548789.localdomain sudo[31739]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:07 np0005548789.localdomain sudo[31754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 08:00:07 np0005548789.localdomain sudo[31754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:07 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) close
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: starting osd.1 osd_data /var/lib/ceph/osd/ceph-1 /var/lib/ceph/osd/ceph-1/journal
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: load: jerasure load: lrc 
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) close
Dec 06 08:00:08 np0005548789.localdomain podman[31819]: 2025-12-06 08:00:08.381850414 +0000 UTC m=+0.064726617 container create 6c336cb3e5387a7b758f25f809560528f6a37e213826d47eed9754d05d8d14c9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_cannon, CEPH_POINT_RELEASE=, architecture=x86_64, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vcs-type=git, vendor=Red Hat, Inc., version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) close
Dec 06 08:00:08 np0005548789.localdomain systemd[1]: Started libpod-conmon-6c336cb3e5387a7b758f25f809560528f6a37e213826d47eed9754d05d8d14c9.scope.
Dec 06 08:00:08 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:08 np0005548789.localdomain podman[31819]: 2025-12-06 08:00:08.352359948 +0000 UTC m=+0.035236151 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:08 np0005548789.localdomain podman[31819]: 2025-12-06 08:00:08.461875098 +0000 UTC m=+0.144751301 container init 6c336cb3e5387a7b758f25f809560528f6a37e213826d47eed9754d05d8d14c9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_cannon, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, RELEASE=main, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 08:00:08 np0005548789.localdomain systemd[1]: tmp-crun.yuQ3KR.mount: Deactivated successfully.
Dec 06 08:00:08 np0005548789.localdomain interesting_cannon[31839]: 167 167
Dec 06 08:00:08 np0005548789.localdomain systemd[1]: libpod-6c336cb3e5387a7b758f25f809560528f6a37e213826d47eed9754d05d8d14c9.scope: Deactivated successfully.
Dec 06 08:00:08 np0005548789.localdomain podman[31819]: 2025-12-06 08:00:08.479019659 +0000 UTC m=+0.161895852 container start 6c336cb3e5387a7b758f25f809560528f6a37e213826d47eed9754d05d8d14c9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_cannon, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_BRANCH=main, name=rhceph, build-date=2025-11-26T19:44:28Z, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:00:08 np0005548789.localdomain podman[31819]: 2025-12-06 08:00:08.479291031 +0000 UTC m=+0.162167274 container attach 6c336cb3e5387a7b758f25f809560528f6a37e213826d47eed9754d05d8d14c9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_cannon, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc.)
Dec 06 08:00:08 np0005548789.localdomain podman[31819]: 2025-12-06 08:00:08.481249717 +0000 UTC m=+0.164125910 container died 6c336cb3e5387a7b758f25f809560528f6a37e213826d47eed9754d05d8d14c9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_cannon, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, release=1763362218, vcs-type=git, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main)
Dec 06 08:00:08 np0005548789.localdomain podman[31846]: 2025-12-06 08:00:08.562217928 +0000 UTC m=+0.075226928 container remove 6c336cb3e5387a7b758f25f809560528f6a37e213826d47eed9754d05d8d14c9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_cannon, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=rhceph ceph, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, version=7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, ceph=True, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 08:00:08 np0005548789.localdomain systemd[1]: libpod-conmon-6c336cb3e5387a7b758f25f809560528f6a37e213826d47eed9754d05d8d14c9.scope: Deactivated successfully.
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: osd.1:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180dee00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180df180 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180df180 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180df180 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluefs mount
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluefs mount shared_bdev_used = 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: RocksDB version: 7.9.2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Git sha 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: DB SUMMARY
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: DB Session ID:  94CK83XUDEHNZM6YXUMG
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: CURRENT file:  CURRENT
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                         Options.error_if_exists: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.create_if_missing: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                                     Options.env: 0x55d118372cb0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                                Options.info_log: 0x55d119086b80
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                              Options.statistics: (nil)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.use_fsync: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                              Options.db_log_dir: 
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                                 Options.wal_dir: db.wal
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.write_buffer_manager: 0x55d1180c8140
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.unordered_write: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.row_cache: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                              Options.wal_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.two_write_queues: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.wal_compression: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.atomic_flush: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.max_background_jobs: 4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.max_background_compactions: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.max_subcompactions: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.max_open_files: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Compression algorithms supported:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kZSTD supported: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kXpressCompression supported: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kBZip2Compression supported: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kLZ4Compression supported: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kZlibCompression supported: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kSnappyCompression supported: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d119086d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b6850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: 
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d119086d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b6850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d119086d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b6850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d119086d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b6850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d119086d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b6850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d119086d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b6850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d119086d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b6850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d119086f60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b62d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d119086f60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b62d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d119086f60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b62d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: a8776748-4cbd-414c-ba30-4c8b866ee02f
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008008721082, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008008721779, "job": 1, "event": "recovery_finished"}
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old nid_max 1025
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old blobid_max 10240
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta min_alloc_size 0x1000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: freelist init
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: freelist _read_cfg
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluefs umount
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180df180 /var/lib/ceph/osd/ceph-1/block) close
Dec 06 08:00:08 np0005548789.localdomain podman[32070]: 
Dec 06 08:00:08 np0005548789.localdomain podman[32070]: 2025-12-06 08:00:08.854346001 +0000 UTC m=+0.058999642 container create c060757ee533e0eeaee36da875f269e6a9be2f9d4f8f6f19a4c4c083c8503eb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate-test, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, build-date=2025-11-26T19:44:28Z, vcs-type=git, ceph=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.tags=rhceph ceph)
Dec 06 08:00:08 np0005548789.localdomain systemd[1]: Started libpod-conmon-c060757ee533e0eeaee36da875f269e6a9be2f9d4f8f6f19a4c4c083c8503eb8.scope.
Dec 06 08:00:08 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:08 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2a1bb509f16d22ba1841ed29c5a3f6f38c01b4714016f09e3c4d5a545c16b96/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:08 np0005548789.localdomain podman[32070]: 2025-12-06 08:00:08.828171606 +0000 UTC m=+0.032825247 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:08 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2a1bb509f16d22ba1841ed29c5a3f6f38c01b4714016f09e3c4d5a545c16b96/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:08 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2a1bb509f16d22ba1841ed29c5a3f6f38c01b4714016f09e3c4d5a545c16b96/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:08 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2a1bb509f16d22ba1841ed29c5a3f6f38c01b4714016f09e3c4d5a545c16b96/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:08 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2a1bb509f16d22ba1841ed29c5a3f6f38c01b4714016f09e3c4d5a545c16b96/merged/var/lib/ceph/osd/ceph-4 supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180df180 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180df180 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bdev(0x55d1180df180 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluefs mount
Dec 06 08:00:08 np0005548789.localdomain podman[32070]: 2025-12-06 08:00:08.981027614 +0000 UTC m=+0.185681245 container init c060757ee533e0eeaee36da875f269e6a9be2f9d4f8f6f19a4c4c083c8503eb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate-test, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, release=1763362218, distribution-scope=public)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluefs mount shared_bdev_used = 4718592
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: RocksDB version: 7.9.2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Git sha 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: DB SUMMARY
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: DB Session ID:  94CK83XUDEHNZM6YXUMH
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: CURRENT file:  CURRENT
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                         Options.error_if_exists: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.create_if_missing: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                                     Options.env: 0x55d1182044d0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                                Options.info_log: 0x55d1190eb7c0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                              Options.statistics: (nil)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.use_fsync: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                              Options.db_log_dir: 
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                                 Options.wal_dir: db.wal
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.write_buffer_manager: 0x55d1180c95e0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.unordered_write: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.row_cache: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                              Options.wal_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.two_write_queues: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.wal_compression: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.atomic_flush: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.max_background_jobs: 4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.max_background_compactions: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.max_subcompactions: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.max_open_files: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Compression algorithms supported:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kZSTD supported: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kXpressCompression supported: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kBZip2Compression supported: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kLZ4Compression supported: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kZlibCompression supported: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         kSnappyCompression supported: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d1190eab00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b62d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: 
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d1190eab00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b62d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d1190eab00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b62d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d1190eab00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b62d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d1190eab00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b62d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain podman[32070]: 2025-12-06 08:00:08.992337677 +0000 UTC m=+0.196991288 container start c060757ee533e0eeaee36da875f269e6a9be2f9d4f8f6f19a4c4c083c8503eb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate-test, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, release=1763362218, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, ceph=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc.)
Dec 06 08:00:08 np0005548789.localdomain podman[32070]: 2025-12-06 08:00:08.992493323 +0000 UTC m=+0.197146934 container attach c060757ee533e0eeaee36da875f269e6a9be2f9d4f8f6f19a4c4c083c8503eb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate-test, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d1190eab00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b62d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d1190eab00)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b62d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:08 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d1190eb080)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b7610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d1190eb080)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b7610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55d1190eb080)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55d1180b7610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: a8776748-4cbd-414c-ba30-4c8b866ee02f
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008009002171, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008009007555, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765008009, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a8776748-4cbd-414c-ba30-4c8b866ee02f", "db_session_id": "94CK83XUDEHNZM6YXUMH", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008009011635, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765008009, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a8776748-4cbd-414c-ba30-4c8b866ee02f", "db_session_id": "94CK83XUDEHNZM6YXUMH", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008009015516, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765008009, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "a8776748-4cbd-414c-ba30-4c8b866ee02f", "db_session_id": "94CK83XUDEHNZM6YXUMH", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008009019424, "job": 1, "event": "recovery_finished"}
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55d119120380
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: DB pointer 0x55d118fe5a00
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super from 4, latest 4
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super done
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b7610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b7610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b7610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: _get_class not permitted to load lua
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: _get_class not permitted to load sdk
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: _get_class not permitted to load test_remote_reads
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: osd.1 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: osd.1 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: osd.1 0 load_pgs
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: osd.1 0 load_pgs opened 0 pgs
Dec 06 08:00:09 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1[31722]: 2025-12-06T08:00:09.054+0000 7fea544e1a80 -1 osd.1 0 log_to_monitors true
Dec 06 08:00:09 np0005548789.localdomain ceph-osd[31726]: osd.1 0 log_to_monitors true
Dec 06 08:00:09 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate-test[32084]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Dec 06 08:00:09 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate-test[32084]:                             [--no-systemd] [--no-tmpfs]
Dec 06 08:00:09 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate-test[32084]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec 06 08:00:09 np0005548789.localdomain systemd[1]: libpod-c060757ee533e0eeaee36da875f269e6a9be2f9d4f8f6f19a4c4c083c8503eb8.scope: Deactivated successfully.
Dec 06 08:00:09 np0005548789.localdomain podman[32070]: 2025-12-06 08:00:09.223530262 +0000 UTC m=+0.428183943 container died c060757ee533e0eeaee36da875f269e6a9be2f9d4f8f6f19a4c4c083c8503eb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate-test, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, RELEASE=main, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, GIT_BRANCH=main, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, version=7, architecture=x86_64, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 08:00:09 np0005548789.localdomain podman[32304]: 2025-12-06 08:00:09.295409268 +0000 UTC m=+0.063550171 container remove c060757ee533e0eeaee36da875f269e6a9be2f9d4f8f6f19a4c4c083c8503eb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate-test, io.openshift.expose-services=, release=1763362218, vcs-type=git, GIT_BRANCH=main, name=rhceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, version=7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 08:00:09 np0005548789.localdomain systemd[1]: libpod-conmon-c060757ee533e0eeaee36da875f269e6a9be2f9d4f8f6f19a4c4c083c8503eb8.scope: Deactivated successfully.
Dec 06 08:00:09 np0005548789.localdomain systemd[1]: tmp-crun.qu9pSr.mount: Deactivated successfully.
Dec 06 08:00:09 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c460c71d6d316c31195dbc0386d81e0bda44fb65a44f4b710533bcd0b6a13523-merged.mount: Deactivated successfully.
Dec 06 08:00:09 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:00:09 np0005548789.localdomain systemd-rc-local-generator[32356]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:00:09 np0005548789.localdomain systemd-sysv-generator[32361]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:00:09 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:00:09 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:00:10 np0005548789.localdomain systemd-sysv-generator[32403]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:00:10 np0005548789.localdomain systemd-rc-local-generator[32398]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:00:10 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 06 08:00:10 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 06 08:00:10 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:00:10 np0005548789.localdomain systemd[1]: Starting Ceph osd.4 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8...
Dec 06 08:00:10 np0005548789.localdomain podman[32461]: 
Dec 06 08:00:10 np0005548789.localdomain podman[32461]: 2025-12-06 08:00:10.50941114 +0000 UTC m=+0.073796801 container create 7928993af342d158682f3e2479011935f4d1c74cdb27150d01f8d3055cf76c26 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate, GIT_BRANCH=main, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container, distribution-scope=public, RELEASE=main, vcs-type=git, io.buildah.version=1.41.4, GIT_CLEAN=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:00:10 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:10 np0005548789.localdomain podman[32461]: 2025-12-06 08:00:10.479649965 +0000 UTC m=+0.044035626 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:10 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10d3e03b0082edccbb0473fc8c800342c53a718e7ccaf25d40e921884621a8eb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:10 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10d3e03b0082edccbb0473fc8c800342c53a718e7ccaf25d40e921884621a8eb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:10 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10d3e03b0082edccbb0473fc8c800342c53a718e7ccaf25d40e921884621a8eb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:10 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10d3e03b0082edccbb0473fc8c800342c53a718e7ccaf25d40e921884621a8eb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:10 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10d3e03b0082edccbb0473fc8c800342c53a718e7ccaf25d40e921884621a8eb/merged/var/lib/ceph/osd/ceph-4 supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:10 np0005548789.localdomain podman[32461]: 2025-12-06 08:00:10.63552945 +0000 UTC m=+0.199915081 container init 7928993af342d158682f3e2479011935f4d1c74cdb27150d01f8d3055cf76c26 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, RELEASE=main, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, name=rhceph, io.openshift.expose-services=, release=1763362218, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git)
Dec 06 08:00:10 np0005548789.localdomain podman[32461]: 2025-12-06 08:00:10.644538913 +0000 UTC m=+0.208924544 container start 7928993af342d158682f3e2479011935f4d1c74cdb27150d01f8d3055cf76c26 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:00:10 np0005548789.localdomain podman[32461]: 2025-12-06 08:00:10.644856076 +0000 UTC m=+0.209241757 container attach 7928993af342d158682f3e2479011935f4d1c74cdb27150d01f8d3055cf76c26 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate, com.redhat.component=rhceph-container, architecture=x86_64, RELEASE=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, distribution-scope=public, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 08:00:10 np0005548789.localdomain ceph-osd[31726]: osd.1 0 done with init, starting boot process
Dec 06 08:00:10 np0005548789.localdomain ceph-osd[31726]: osd.1 0 start_boot
Dec 06 08:00:10 np0005548789.localdomain ceph-osd[31726]: osd.1 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 06 08:00:10 np0005548789.localdomain ceph-osd[31726]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 06 08:00:10 np0005548789.localdomain ceph-osd[31726]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 06 08:00:10 np0005548789.localdomain ceph-osd[31726]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 06 08:00:10 np0005548789.localdomain ceph-osd[31726]: osd.1 0  bench count 12288000 bsize 4 KiB
Dec 06 08:00:11 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate[32476]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Dec 06 08:00:11 np0005548789.localdomain bash[32461]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Dec 06 08:00:11 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate[32476]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-4 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Dec 06 08:00:11 np0005548789.localdomain bash[32461]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-4 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Dec 06 08:00:11 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate[32476]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Dec 06 08:00:11 np0005548789.localdomain bash[32461]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Dec 06 08:00:11 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate[32476]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 06 08:00:11 np0005548789.localdomain bash[32461]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 06 08:00:11 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate[32476]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-4/block
Dec 06 08:00:11 np0005548789.localdomain bash[32461]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-4/block
Dec 06 08:00:11 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate[32476]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Dec 06 08:00:11 np0005548789.localdomain bash[32461]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Dec 06 08:00:11 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate[32476]: --> ceph-volume raw activate successful for osd ID: 4
Dec 06 08:00:11 np0005548789.localdomain bash[32461]: --> ceph-volume raw activate successful for osd ID: 4
Dec 06 08:00:11 np0005548789.localdomain systemd[1]: libpod-7928993af342d158682f3e2479011935f4d1c74cdb27150d01f8d3055cf76c26.scope: Deactivated successfully.
Dec 06 08:00:11 np0005548789.localdomain podman[32461]: 2025-12-06 08:00:11.268198972 +0000 UTC m=+0.832584603 container died 7928993af342d158682f3e2479011935f4d1c74cdb27150d01f8d3055cf76c26 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate, io.openshift.expose-services=, release=1763362218, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, ceph=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, name=rhceph, RELEASE=main, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=7)
Dec 06 08:00:11 np0005548789.localdomain systemd[1]: tmp-crun.MW5epm.mount: Deactivated successfully.
Dec 06 08:00:11 np0005548789.localdomain podman[32592]: 2025-12-06 08:00:11.390733052 +0000 UTC m=+0.112208776 container remove 7928993af342d158682f3e2479011935f4d1c74cdb27150d01f8d3055cf76c26 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4-activate, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, ceph=True, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, release=1763362218, architecture=x86_64)
Dec 06 08:00:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-10d3e03b0082edccbb0473fc8c800342c53a718e7ccaf25d40e921884621a8eb-merged.mount: Deactivated successfully.
Dec 06 08:00:11 np0005548789.localdomain podman[32648]: 
Dec 06 08:00:11 np0005548789.localdomain podman[32648]: 2025-12-06 08:00:11.709609543 +0000 UTC m=+0.072306083 container create 26aaf6c09d77c2d4415791ab52c1b187900752ba32ce368e3441bd0a51c6a7d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, version=7, name=rhceph, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_BRANCH=main, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 08:00:11 np0005548789.localdomain podman[32648]: 2025-12-06 08:00:11.682694418 +0000 UTC m=+0.045390968 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:11 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d835c0e84d0d73d25a0e895cf870af98669736b533feb63c312a535935e7da2b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:11 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d835c0e84d0d73d25a0e895cf870af98669736b533feb63c312a535935e7da2b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:11 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d835c0e84d0d73d25a0e895cf870af98669736b533feb63c312a535935e7da2b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:11 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d835c0e84d0d73d25a0e895cf870af98669736b533feb63c312a535935e7da2b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:11 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d835c0e84d0d73d25a0e895cf870af98669736b533feb63c312a535935e7da2b/merged/var/lib/ceph/osd/ceph-4 supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:11 np0005548789.localdomain podman[32648]: 2025-12-06 08:00:11.866305191 +0000 UTC m=+0.229001731 container init 26aaf6c09d77c2d4415791ab52c1b187900752ba32ce368e3441bd0a51c6a7d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, release=1763362218, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, version=7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 08:00:11 np0005548789.localdomain podman[32648]: 2025-12-06 08:00:11.899508301 +0000 UTC m=+0.262204841 container start 26aaf6c09d77c2d4415791ab52c1b187900752ba32ce368e3441bd0a51c6a7d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4, version=7, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, io.buildah.version=1.41.4, name=rhceph, vcs-type=git, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 08:00:11 np0005548789.localdomain bash[32648]: 26aaf6c09d77c2d4415791ab52c1b187900752ba32ce368e3441bd0a51c6a7d9
Dec 06 08:00:11 np0005548789.localdomain systemd[1]: Started Ceph osd.4 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 08:00:11 np0005548789.localdomain ceph-osd[32665]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 08:00:11 np0005548789.localdomain ceph-osd[32665]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Dec 06 08:00:11 np0005548789.localdomain ceph-osd[32665]: pidfile_write: ignore empty --pid-file
Dec 06 08:00:11 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Dec 06 08:00:11 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Dec 06 08:00:11 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:11 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:11 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146913180 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Dec 06 08:00:11 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146913180 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Dec 06 08:00:11 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146913180 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:11 np0005548789.localdomain ceph-osd[32665]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-4/block size 7.0 GiB
Dec 06 08:00:11 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146913180 /var/lib/ceph/osd/ceph-4/block) close
Dec 06 08:00:11 np0005548789.localdomain sudo[31754]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:12 np0005548789.localdomain sudo[32678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:12 np0005548789.localdomain sudo[32678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:12 np0005548789.localdomain sudo[32678]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:12 np0005548789.localdomain sudo[32693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- raw list --format json
Dec 06 08:00:12 np0005548789.localdomain sudo[32693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) close
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: starting osd.4 osd_data /var/lib/ceph/osd/ceph-4 /var/lib/ceph/osd/ceph-4/journal
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: load: jerasure load: lrc 
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) close
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) close
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: osd.4:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146912e00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146913180 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146913180 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146913180 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-4/block size 7.0 GiB
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluefs mount
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluefs mount shared_bdev_used = 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: RocksDB version: 7.9.2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Git sha 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: DB SUMMARY
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: DB Session ID:  3EP43754JMQKP9Z6PGMN
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: CURRENT file:  CURRENT
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                         Options.error_if_exists: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.create_if_missing: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                                     Options.env: 0x55e146ba6cb0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                                Options.info_log: 0x55e1478a4b80
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                              Options.statistics: (nil)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.use_fsync: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                              Options.db_log_dir: 
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                                 Options.wal_dir: db.wal
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.write_buffer_manager: 0x55e1468fc140
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.unordered_write: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.row_cache: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                              Options.wal_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.two_write_queues: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.wal_compression: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.atomic_flush: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.max_background_jobs: 4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.max_background_compactions: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.max_subcompactions: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.max_open_files: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Compression algorithms supported:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kZSTD supported: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kXpressCompression supported: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kBZip2Compression supported: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kLZ4Compression supported: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kZlibCompression supported: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kSnappyCompression supported: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478a4d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: 
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478a4d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478a4d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478a4d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478a4d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478a4d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478a4d40)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478a4f60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478a4f60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478a4f60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2be23737-9346-4332-89ed-d1c619a8d7ac
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008012523491, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008012523642, "job": 1, "event": "recovery_finished"}
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta old nid_max 1025
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta old blobid_max 10240
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta min_alloc_size 0x1000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: freelist init
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: freelist _read_cfg
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluefs umount
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146913180 /var/lib/ceph/osd/ceph-4/block) close
Dec 06 08:00:12 np0005548789.localdomain podman[32951]: 
Dec 06 08:00:12 np0005548789.localdomain podman[32951]: 2025-12-06 08:00:12.744274781 +0000 UTC m=+0.086159086 container create 062a7570aa00d3c5da84e3785e99b86cf5c20a4926571b10855a1c3817f75650 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_montalcini, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, com.redhat.component=rhceph-container, name=rhceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_BRANCH=main, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vcs-type=git, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146913180 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146913180 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bdev(0x55e146913180 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-4/block size 7.0 GiB
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluefs mount
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluefs mount shared_bdev_used = 4718592
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Dec 06 08:00:12 np0005548789.localdomain systemd[1]: Started libpod-conmon-062a7570aa00d3c5da84e3785e99b86cf5c20a4926571b10855a1c3817f75650.scope.
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: RocksDB version: 7.9.2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Git sha 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: DB SUMMARY
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: DB Session ID:  3EP43754JMQKP9Z6PGMM
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: CURRENT file:  CURRENT
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                         Options.error_if_exists: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.create_if_missing: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                                     Options.env: 0x55e146956310
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                                Options.info_log: 0x55e1478a5c80
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                              Options.statistics: (nil)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.use_fsync: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                              Options.db_log_dir: 
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                                 Options.wal_dir: db.wal
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.write_buffer_manager: 0x55e1468fd540
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.unordered_write: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.row_cache: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                              Options.wal_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.two_write_queues: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.wal_compression: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.atomic_flush: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.max_background_jobs: 4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.max_background_compactions: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.max_subcompactions: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.max_open_files: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Compression algorithms supported:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kZSTD supported: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kXpressCompression supported: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kBZip2Compression supported: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kLZ4Compression supported: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kZlibCompression supported: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         kSnappyCompression supported: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478e5600)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: 
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478e5600)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478e5600)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478e5600)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478e5600)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain podman[32951]: 2025-12-06 08:00:12.700724635 +0000 UTC m=+0.042608940 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478e5600)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478e5600)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468ea2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478e53c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468eb610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Dec 06 08:00:12 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478e53c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468eb610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:           Options.merge_operator: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e1478e53c0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55e1468eb610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.compression: LZ4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.num_levels: 7
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.bloom_locality: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                               Options.ttl: 2592000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                       Options.enable_blob_files: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                           Options.min_blob_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2be23737-9346-4332-89ed-d1c619a8d7ac
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008012804943, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008012811285, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765008012, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2be23737-9346-4332-89ed-d1c619a8d7ac", "db_session_id": "3EP43754JMQKP9Z6PGMM", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 06 08:00:12 np0005548789.localdomain podman[32951]: 2025-12-06 08:00:12.830700026 +0000 UTC m=+0.172584331 container init 062a7570aa00d3c5da84e3785e99b86cf5c20a4926571b10855a1c3817f75650 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_montalcini, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_CLEAN=True, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, vcs-type=git, release=1763362218, distribution-scope=public, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008012833966, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765008012, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2be23737-9346-4332-89ed-d1c619a8d7ac", "db_session_id": "3EP43754JMQKP9Z6PGMM", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 06 08:00:12 np0005548789.localdomain elated_montalcini[32983]: 167 167
Dec 06 08:00:12 np0005548789.localdomain systemd[1]: libpod-062a7570aa00d3c5da84e3785e99b86cf5c20a4926571b10855a1c3817f75650.scope: Deactivated successfully.
Dec 06 08:00:12 np0005548789.localdomain podman[32951]: 2025-12-06 08:00:12.863347496 +0000 UTC m=+0.205231781 container start 062a7570aa00d3c5da84e3785e99b86cf5c20a4926571b10855a1c3817f75650 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_montalcini, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, distribution-scope=public, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, RELEASE=main)
Dec 06 08:00:12 np0005548789.localdomain podman[32951]: 2025-12-06 08:00:12.863555444 +0000 UTC m=+0.205439799 container attach 062a7570aa00d3c5da84e3785e99b86cf5c20a4926571b10855a1c3817f75650 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_montalcini, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, version=7, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008012864151, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765008012, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2be23737-9346-4332-89ed-d1c619a8d7ac", "db_session_id": "3EP43754JMQKP9Z6PGMM", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 06 08:00:12 np0005548789.localdomain podman[32951]: 2025-12-06 08:00:12.865073433 +0000 UTC m=+0.206957728 container died 062a7570aa00d3c5da84e3785e99b86cf5c20a4926571b10855a1c3817f75650 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_montalcini, name=rhceph, ceph=True, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=1763362218, distribution-scope=public, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765008012869926, "job": 1, "event": "recovery_finished"}
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec 06 08:00:12 np0005548789.localdomain podman[33153]: 2025-12-06 08:00:12.951830031 +0000 UTC m=+0.100713396 container remove 062a7570aa00d3c5da84e3785e99b86cf5c20a4926571b10855a1c3817f75650 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_montalcini, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, architecture=x86_64)
Dec 06 08:00:12 np0005548789.localdomain systemd[1]: libpod-conmon-062a7570aa00d3c5da84e3785e99b86cf5c20a4926571b10855a1c3817f75650.scope: Deactivated successfully.
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55e14773e380
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: DB pointer 0x55e147803a00
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _upgrade_super from 4, latest 4
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _upgrade_super done
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468eb610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468eb610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.030       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.030       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.030       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.030       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468eb610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.2 total, 0.2 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: _get_class not permitted to load lua
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: _get_class not permitted to load sdk
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: _get_class not permitted to load test_remote_reads
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: osd.4 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: osd.4 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: osd.4 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: osd.4 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: osd.4 0 load_pgs
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: osd.4 0 load_pgs opened 0 pgs
Dec 06 08:00:12 np0005548789.localdomain ceph-osd[32665]: osd.4 0 log_to_monitors true
Dec 06 08:00:12 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4[32661]: 2025-12-06T08:00:12.968+0000 7f2ecde47a80 -1 osd.4 0 log_to_monitors true
Dec 06 08:00:13 np0005548789.localdomain podman[33208]: 
Dec 06 08:00:13 np0005548789.localdomain podman[33208]: 2025-12-06 08:00:13.152122957 +0000 UTC m=+0.076124924 container create a61415243badf521a125f5418ef4676d450c16091564e5e0c063250cb26e33c7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_ishizaka, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc., release=1763362218, version=7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 08:00:13 np0005548789.localdomain systemd[1]: Started libpod-conmon-a61415243badf521a125f5418ef4676d450c16091564e5e0c063250cb26e33c7.scope.
Dec 06 08:00:13 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:13 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56c7bb213dc1e6a106017d35c011284484d6206a185226214dd6db3c880dc55d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:13 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56c7bb213dc1e6a106017d35c011284484d6206a185226214dd6db3c880dc55d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:13 np0005548789.localdomain podman[33208]: 2025-12-06 08:00:13.120209106 +0000 UTC m=+0.044211093 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:13 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56c7bb213dc1e6a106017d35c011284484d6206a185226214dd6db3c880dc55d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:13 np0005548789.localdomain podman[33208]: 2025-12-06 08:00:13.241672615 +0000 UTC m=+0.165674582 container init a61415243badf521a125f5418ef4676d450c16091564e5e0c063250cb26e33c7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_ishizaka, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, GIT_BRANCH=main, name=rhceph, ceph=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z)
Dec 06 08:00:13 np0005548789.localdomain podman[33208]: 2025-12-06 08:00:13.247865957 +0000 UTC m=+0.171867894 container start a61415243badf521a125f5418ef4676d450c16091564e5e0c063250cb26e33c7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_ishizaka, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, architecture=x86_64, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, com.redhat.component=rhceph-container, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., ceph=True, io.openshift.expose-services=)
Dec 06 08:00:13 np0005548789.localdomain podman[33208]: 2025-12-06 08:00:13.248030374 +0000 UTC m=+0.172032381 container attach a61415243badf521a125f5418ef4676d450c16091564e5e0c063250cb26e33c7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_ishizaka, architecture=x86_64, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, name=rhceph, release=1763362218, version=7, GIT_CLEAN=True, vcs-type=git, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:00:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-be233b13925594996adf79477244c69cf3ae9854042743bf672fc2db1adf5227-merged.mount: Deactivated successfully.
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]: {
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]:     "1f710487-3a3c-4f3d-8622-d6fac6224470": {
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]:         "ceph_fsid": "1939e851-b10c-5c3b-9bb7-8e7f380233e8",
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]:         "osd_id": 1,
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]:         "osd_uuid": "1f710487-3a3c-4f3d-8622-d6fac6224470",
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]:         "type": "bluestore"
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]:     },
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]:     "876fe068-f1aa-42bd-a56b-91d35874dd8e": {
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]:         "ceph_fsid": "1939e851-b10c-5c3b-9bb7-8e7f380233e8",
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]:         "osd_id": 4,
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]:         "osd_uuid": "876fe068-f1aa-42bd-a56b-91d35874dd8e",
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]:         "type": "bluestore"
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]:     }
Dec 06 08:00:13 np0005548789.localdomain trusting_ishizaka[33223]: }
Dec 06 08:00:13 np0005548789.localdomain systemd[1]: libpod-a61415243badf521a125f5418ef4676d450c16091564e5e0c063250cb26e33c7.scope: Deactivated successfully.
Dec 06 08:00:13 np0005548789.localdomain podman[33208]: 2025-12-06 08:00:13.792419997 +0000 UTC m=+0.716421944 container died a61415243badf521a125f5418ef4676d450c16091564e5e0c063250cb26e33c7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_ishizaka, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, CEPH_POINT_RELEASE=, GIT_CLEAN=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, RELEASE=main, GIT_BRANCH=main, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.tags=rhceph ceph, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, release=1763362218, description=Red Hat Ceph Storage 7)
Dec 06 08:00:13 np0005548789.localdomain systemd[1]: tmp-crun.K2VloQ.mount: Deactivated successfully.
Dec 06 08:00:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-56c7bb213dc1e6a106017d35c011284484d6206a185226214dd6db3c880dc55d-merged.mount: Deactivated successfully.
Dec 06 08:00:13 np0005548789.localdomain podman[33259]: 2025-12-06 08:00:13.874097866 +0000 UTC m=+0.074613603 container remove a61415243badf521a125f5418ef4676d450c16091564e5e0c063250cb26e33c7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_ishizaka, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 08:00:13 np0005548789.localdomain systemd[1]: libpod-conmon-a61415243badf521a125f5418ef4676d450c16091564e5e0c063250cb26e33c7.scope: Deactivated successfully.
Dec 06 08:00:13 np0005548789.localdomain sudo[32693]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:13 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 06 08:00:13 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[31726]: osd.1 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 24.918 iops: 6379.037 elapsed_sec: 0.470
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [WRN] : OSD bench result of 6379.037428 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 06 08:00:14 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1[31722]: 2025-12-06T08:00:14.279+0000 7fea50c75640 -1 osd.1 0 waiting for initial osdmap
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[31726]: osd.1 0 waiting for initial osdmap
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[31726]: osd.1 11 crush map has features 288514050185494528, adjusting msgr requires for clients
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[31726]: osd.1 11 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[31726]: osd.1 11 crush map has features 3314932999778484224, adjusting msgr requires for osds
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[31726]: osd.1 11 check_osdmap_features require_osd_release unknown -> reef
Dec 06 08:00:14 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-1[31722]: 2025-12-06T08:00:14.295+0000 7fea4ba8a640 -1 osd.1 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[31726]: osd.1 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[31726]: osd.1 11 set_numa_affinity not setting numa affinity
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[31726]: osd.1 11 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[32665]: osd.4 0 done with init, starting boot process
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[32665]: osd.4 0 start_boot
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[32665]: osd.4 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[32665]: osd.4 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[32665]: osd.4 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[32665]: osd.4 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[32665]: osd.4 0  bench count 12288000 bsize 4 KiB
Dec 06 08:00:14 np0005548789.localdomain ceph-osd[31726]: osd.1 12 state: booting -> active
Dec 06 08:00:15 np0005548789.localdomain sudo[33272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:00:15 np0005548789.localdomain sudo[33272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:15 np0005548789.localdomain sudo[33272]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:15 np0005548789.localdomain sudo[33287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:15 np0005548789.localdomain sudo[33287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:15 np0005548789.localdomain sudo[33287]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:15 np0005548789.localdomain sudo[33302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:00:15 np0005548789.localdomain sudo[33302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:16 np0005548789.localdomain podman[33387]: 2025-12-06 08:00:16.135811679 +0000 UTC m=+0.085257431 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 08:00:16 np0005548789.localdomain podman[33387]: 2025-12-06 08:00:16.26606844 +0000 UTC m=+0.215514182 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., version=7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True)
Dec 06 08:00:16 np0005548789.localdomain sudo[33302]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:16 np0005548789.localdomain sudo[33453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:16 np0005548789.localdomain sudo[33453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:16 np0005548789.localdomain sudo[33453]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:16 np0005548789.localdomain sudo[33468]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:00:16 np0005548789.localdomain sudo[33468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:16 np0005548789.localdomain ceph-osd[31726]: osd.1 14 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec 06 08:00:16 np0005548789.localdomain ceph-osd[31726]: osd.1 14 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Dec 06 08:00:16 np0005548789.localdomain ceph-osd[31726]: osd.1 14 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec 06 08:00:17 np0005548789.localdomain sudo[33468]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:17 np0005548789.localdomain sudo[33515]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:17 np0005548789.localdomain sudo[33515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:17 np0005548789.localdomain sudo[33515]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:17 np0005548789.localdomain sudo[33530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- inventory --format=json-pretty --filter-for-batch
Dec 06 08:00:17 np0005548789.localdomain sudo[33530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:17 np0005548789.localdomain podman[33582]: 
Dec 06 08:00:17 np0005548789.localdomain podman[33582]: 2025-12-06 08:00:17.959261024 +0000 UTC m=+0.080463543 container create 2814111e8d4784651e220743759d702252829c7367776880be297c040ecb485e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_raman, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, ceph=True, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z)
Dec 06 08:00:18 np0005548789.localdomain systemd[1]: Started libpod-conmon-2814111e8d4784651e220743759d702252829c7367776880be297c040ecb485e.scope.
Dec 06 08:00:18 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:18 np0005548789.localdomain podman[33582]: 2025-12-06 08:00:17.9244452 +0000 UTC m=+0.045647709 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:18 np0005548789.localdomain podman[33582]: 2025-12-06 08:00:18.034391396 +0000 UTC m=+0.155593895 container init 2814111e8d4784651e220743759d702252829c7367776880be297c040ecb485e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_raman, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.component=rhceph-container, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, release=1763362218, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc.)
Dec 06 08:00:18 np0005548789.localdomain systemd[1]: tmp-crun.agFO7Q.mount: Deactivated successfully.
Dec 06 08:00:18 np0005548789.localdomain podman[33582]: 2025-12-06 08:00:18.048813352 +0000 UTC m=+0.170015841 container start 2814111e8d4784651e220743759d702252829c7367776880be297c040ecb485e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_raman, GIT_CLEAN=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, RELEASE=main, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, architecture=x86_64)
Dec 06 08:00:18 np0005548789.localdomain podman[33582]: 2025-12-06 08:00:18.049220438 +0000 UTC m=+0.170422927 container attach 2814111e8d4784651e220743759d702252829c7367776880be297c040ecb485e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_raman, RELEASE=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., release=1763362218, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, name=rhceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, CEPH_POINT_RELEASE=)
Dec 06 08:00:18 np0005548789.localdomain optimistic_raman[33597]: 167 167
Dec 06 08:00:18 np0005548789.localdomain systemd[1]: libpod-2814111e8d4784651e220743759d702252829c7367776880be297c040ecb485e.scope: Deactivated successfully.
Dec 06 08:00:18 np0005548789.localdomain podman[33582]: 2025-12-06 08:00:18.055586987 +0000 UTC m=+0.176789536 container died 2814111e8d4784651e220743759d702252829c7367776880be297c040ecb485e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_raman, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, architecture=x86_64, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 08:00:18 np0005548789.localdomain podman[33602]: 2025-12-06 08:00:18.146099002 +0000 UTC m=+0.076785929 container remove 2814111e8d4784651e220743759d702252829c7367776880be297c040ecb485e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_raman, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1763362218, ceph=True, RELEASE=main, architecture=x86_64, version=7, build-date=2025-11-26T19:44:28Z, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 08:00:18 np0005548789.localdomain systemd[1]: libpod-conmon-2814111e8d4784651e220743759d702252829c7367776880be297c040ecb485e.scope: Deactivated successfully.
Dec 06 08:00:18 np0005548789.localdomain podman[33621]: 
Dec 06 08:00:18 np0005548789.localdomain podman[33621]: 2025-12-06 08:00:18.344889289 +0000 UTC m=+0.079977744 container create cd4cef547ad957d0bca97edb5522192192af225d69c0eb731063f49be89ec836 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_pasteur, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph)
Dec 06 08:00:18 np0005548789.localdomain systemd[1]: Started libpod-conmon-cd4cef547ad957d0bca97edb5522192192af225d69c0eb731063f49be89ec836.scope.
Dec 06 08:00:18 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:00:18 np0005548789.localdomain podman[33621]: 2025-12-06 08:00:18.310388067 +0000 UTC m=+0.045476502 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:00:18 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5336e63345ed72e68d84ec37fd1934d59ba978f80334196b9943f6eb0118b02c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:18 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5336e63345ed72e68d84ec37fd1934d59ba978f80334196b9943f6eb0118b02c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:18 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5336e63345ed72e68d84ec37fd1934d59ba978f80334196b9943f6eb0118b02c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:00:18 np0005548789.localdomain podman[33621]: 2025-12-06 08:00:18.437280888 +0000 UTC m=+0.172369313 container init cd4cef547ad957d0bca97edb5522192192af225d69c0eb731063f49be89ec836 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_pasteur, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, name=rhceph, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main)
Dec 06 08:00:18 np0005548789.localdomain podman[33621]: 2025-12-06 08:00:18.447613123 +0000 UTC m=+0.182701568 container start cd4cef547ad957d0bca97edb5522192192af225d69c0eb731063f49be89ec836 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_pasteur, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, ceph=True, vcs-type=git, distribution-scope=public, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main)
Dec 06 08:00:18 np0005548789.localdomain podman[33621]: 2025-12-06 08:00:18.447841591 +0000 UTC m=+0.182930016 container attach cd4cef547ad957d0bca97edb5522192192af225d69c0eb731063f49be89ec836 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_pasteur, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, version=7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, ceph=True, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 08:00:18 np0005548789.localdomain ceph-osd[32665]: osd.4 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 23.704 iops: 6068.149 elapsed_sec: 0.494
Dec 06 08:00:18 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [WRN] : OSD bench result of 6068.149371 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.4. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 06 08:00:18 np0005548789.localdomain ceph-osd[32665]: osd.4 0 waiting for initial osdmap
Dec 06 08:00:18 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4[32661]: 2025-12-06T08:00:18.510+0000 7f2ec9dc6640 -1 osd.4 0 waiting for initial osdmap
Dec 06 08:00:18 np0005548789.localdomain ceph-osd[32665]: osd.4 15 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec 06 08:00:18 np0005548789.localdomain ceph-osd[32665]: osd.4 15 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Dec 06 08:00:18 np0005548789.localdomain ceph-osd[32665]: osd.4 15 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec 06 08:00:18 np0005548789.localdomain ceph-osd[32665]: osd.4 15 check_osdmap_features require_osd_release unknown -> reef
Dec 06 08:00:18 np0005548789.localdomain ceph-osd[32665]: osd.4 15 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 06 08:00:18 np0005548789.localdomain ceph-osd[32665]: osd.4 15 set_numa_affinity not setting numa affinity
Dec 06 08:00:18 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-osd-4[32661]: 2025-12-06T08:00:18.538+0000 7f2ec53f0640 -1 osd.4 15 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 06 08:00:18 np0005548789.localdomain ceph-osd[32665]: osd.4 15 _collect_metadata loop4:  no unique device id for loop4: fallback method has no model nor serial
Dec 06 08:00:18 np0005548789.localdomain ceph-osd[32665]: osd.4 16 state: booting -> active
Dec 06 08:00:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f8be4287a234b7198115c856160ae4c831433b3f562b3214a42524bb2137b5ca-merged.mount: Deactivated successfully.
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]: [
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:     {
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:         "available": false,
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:         "ceph_device": false,
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:         "lsm_data": {},
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:         "lvs": [],
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:         "path": "/dev/sr0",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:         "rejected_reasons": [
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "Has a FileSystem",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "Insufficient space (<5GB)"
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:         ],
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:         "sys_api": {
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "actuators": null,
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "device_nodes": "sr0",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "human_readable_size": "482.00 KB",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "id_bus": "ata",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "model": "QEMU DVD-ROM",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "nr_requests": "2",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "partitions": {},
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "path": "/dev/sr0",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "removable": "1",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "rev": "2.5+",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "ro": "0",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "rotational": "1",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "sas_address": "",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "sas_device_handle": "",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "scheduler_mode": "mq-deadline",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "sectors": 0,
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "sectorsize": "2048",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "size": 493568.0,
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "support_discard": "0",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "type": "disk",
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:             "vendor": "QEMU"
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:         }
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]:     }
Dec 06 08:00:19 np0005548789.localdomain trusting_pasteur[33636]: ]
Dec 06 08:00:19 np0005548789.localdomain systemd[1]: libpod-cd4cef547ad957d0bca97edb5522192192af225d69c0eb731063f49be89ec836.scope: Deactivated successfully.
Dec 06 08:00:19 np0005548789.localdomain podman[33621]: 2025-12-06 08:00:19.234701733 +0000 UTC m=+0.969790178 container died cd4cef547ad957d0bca97edb5522192192af225d69c0eb731063f49be89ec836 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_pasteur, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, name=rhceph, vendor=Red Hat, Inc., vcs-type=git, version=7, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, ceph=True, GIT_CLEAN=True, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 08:00:19 np0005548789.localdomain systemd[1]: tmp-crun.zsDxso.mount: Deactivated successfully.
Dec 06 08:00:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5336e63345ed72e68d84ec37fd1934d59ba978f80334196b9943f6eb0118b02c-merged.mount: Deactivated successfully.
Dec 06 08:00:19 np0005548789.localdomain podman[34855]: 2025-12-06 08:00:19.325535311 +0000 UTC m=+0.079943962 container remove cd4cef547ad957d0bca97edb5522192192af225d69c0eb731063f49be89ec836 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_pasteur, vendor=Red Hat, Inc., name=rhceph, release=1763362218, CEPH_POINT_RELEASE=, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, version=7, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 08:00:19 np0005548789.localdomain systemd[1]: libpod-conmon-cd4cef547ad957d0bca97edb5522192192af225d69c0eb731063f49be89ec836.scope: Deactivated successfully.
Dec 06 08:00:19 np0005548789.localdomain sudo[33530]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:19 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 16 pg[1.0( empty local-lis/les=0/0 n=0 ec=14/14 lis/c=0/0 les/c/f=0/0/0 sis=16) [3,4,2] r=1 lpr=16 pi=[14,16)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:00:22 np0005548789.localdomain sudo[34869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:00:22 np0005548789.localdomain sudo[34869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:22 np0005548789.localdomain sudo[34869]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:26 np0005548789.localdomain sshd[34884]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:00:27 np0005548789.localdomain sshd[34886]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:00:27 np0005548789.localdomain sshd[34886]: Received disconnect from 162.241.87.197 port 40750:11: Bye Bye [preauth]
Dec 06 08:00:27 np0005548789.localdomain sshd[34886]: Disconnected from authenticating user root 162.241.87.197 port 40750 [preauth]
Dec 06 08:00:27 np0005548789.localdomain sshd[34884]: Received disconnect from 77.222.100.142 port 37678:11: Bye Bye [preauth]
Dec 06 08:00:27 np0005548789.localdomain sshd[34884]: Disconnected from authenticating user root 77.222.100.142 port 37678 [preauth]
Dec 06 08:00:27 np0005548789.localdomain sudo[34888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:00:27 np0005548789.localdomain sudo[34888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:27 np0005548789.localdomain sudo[34888]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:28 np0005548789.localdomain sudo[34903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:00:28 np0005548789.localdomain sudo[34903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:28 np0005548789.localdomain systemd[26209]: Starting Mark boot as successful...
Dec 06 08:00:28 np0005548789.localdomain systemd[26209]: Finished Mark boot as successful.
Dec 06 08:00:28 np0005548789.localdomain podman[34985]: 2025-12-06 08:00:28.733521866 +0000 UTC m=+0.100884923 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, release=1763362218, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, name=rhceph, com.redhat.component=rhceph-container, distribution-scope=public, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7)
Dec 06 08:00:28 np0005548789.localdomain podman[34985]: 2025-12-06 08:00:28.84112271 +0000 UTC m=+0.208485757 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, vcs-type=git, release=1763362218, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=rhceph-container)
Dec 06 08:00:29 np0005548789.localdomain sudo[34903]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:29 np0005548789.localdomain sudo[35047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:00:29 np0005548789.localdomain sudo[35047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:00:29 np0005548789.localdomain sudo[35047]: pam_unix(sudo:session): session closed for user root
Dec 06 08:00:31 np0005548789.localdomain sshd[35062]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:00:32 np0005548789.localdomain sshd[35062]: Received disconnect from 195.250.72.168 port 60050:11: Bye Bye [preauth]
Dec 06 08:00:32 np0005548789.localdomain sshd[35062]: Disconnected from authenticating user root 195.250.72.168 port 60050 [preauth]
Dec 06 08:00:45 np0005548789.localdomain sshd[35065]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:00:46 np0005548789.localdomain sshd[35065]: Received disconnect from 74.94.234.151 port 43464:11: Bye Bye [preauth]
Dec 06 08:00:46 np0005548789.localdomain sshd[35065]: Disconnected from authenticating user root 74.94.234.151 port 43464 [preauth]
Dec 06 08:01:01 np0005548789.localdomain CROND[35068]: (root) CMD (run-parts /etc/cron.hourly)
Dec 06 08:01:01 np0005548789.localdomain run-parts[35071]: (/etc/cron.hourly) starting 0anacron
Dec 06 08:01:01 np0005548789.localdomain run-parts[35077]: (/etc/cron.hourly) finished 0anacron
Dec 06 08:01:01 np0005548789.localdomain CROND[35067]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 06 08:01:15 np0005548789.localdomain sshd[35078]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:01:17 np0005548789.localdomain sshd[35078]: Received disconnect from 154.201.83.49 port 48392:11: Bye Bye [preauth]
Dec 06 08:01:17 np0005548789.localdomain sshd[35078]: Disconnected from authenticating user root 154.201.83.49 port 48392 [preauth]
Dec 06 08:01:29 np0005548789.localdomain sudo[35080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:01:29 np0005548789.localdomain sudo[35080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:01:29 np0005548789.localdomain sudo[35080]: pam_unix(sudo:session): session closed for user root
Dec 06 08:01:29 np0005548789.localdomain sudo[35095]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:01:29 np0005548789.localdomain sudo[35095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:01:30 np0005548789.localdomain podman[35182]: 2025-12-06 08:01:30.546054814 +0000 UTC m=+0.081783801 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.openshift.expose-services=, version=7, io.buildah.version=1.41.4, GIT_CLEAN=True, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 08:01:30 np0005548789.localdomain podman[35182]: 2025-12-06 08:01:30.654631292 +0000 UTC m=+0.190360239 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, version=7, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:01:30 np0005548789.localdomain sudo[35095]: pam_unix(sudo:session): session closed for user root
Dec 06 08:01:31 np0005548789.localdomain sudo[35249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:01:31 np0005548789.localdomain sudo[35249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:01:31 np0005548789.localdomain sudo[35249]: pam_unix(sudo:session): session closed for user root
Dec 06 08:01:31 np0005548789.localdomain sudo[35264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:01:31 np0005548789.localdomain sudo[35264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:01:31 np0005548789.localdomain sshd[35278]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:01:31 np0005548789.localdomain sudo[35264]: pam_unix(sudo:session): session closed for user root
Dec 06 08:01:32 np0005548789.localdomain sshd[35278]: Received disconnect from 77.222.100.142 port 49318:11: Bye Bye [preauth]
Dec 06 08:01:32 np0005548789.localdomain sshd[35278]: Disconnected from authenticating user root 77.222.100.142 port 49318 [preauth]
Dec 06 08:01:32 np0005548789.localdomain sudo[35314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:01:32 np0005548789.localdomain sudo[35314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:01:32 np0005548789.localdomain sudo[35314]: pam_unix(sudo:session): session closed for user root
Dec 06 08:01:34 np0005548789.localdomain sshd[24685]: Received disconnect from 192.168.122.100 port 57464:11: disconnected by user
Dec 06 08:01:34 np0005548789.localdomain sshd[24685]: Disconnected from user zuul 192.168.122.100 port 57464
Dec 06 08:01:34 np0005548789.localdomain sshd[24682]: pam_unix(sshd:session): session closed for user zuul
Dec 06 08:01:34 np0005548789.localdomain systemd-logind[766]: Session 13 logged out. Waiting for processes to exit.
Dec 06 08:01:34 np0005548789.localdomain systemd[1]: session-13.scope: Deactivated successfully.
Dec 06 08:01:34 np0005548789.localdomain systemd[1]: session-13.scope: Consumed 20.640s CPU time.
Dec 06 08:01:34 np0005548789.localdomain systemd-logind[766]: Removed session 13.
Dec 06 08:01:35 np0005548789.localdomain sshd[35329]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:01:35 np0005548789.localdomain sshd[35329]: Received disconnect from 162.241.87.197 port 42174:11: Bye Bye [preauth]
Dec 06 08:01:35 np0005548789.localdomain sshd[35329]: Disconnected from authenticating user root 162.241.87.197 port 42174 [preauth]
Dec 06 08:01:58 np0005548789.localdomain sshd[35331]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:01:59 np0005548789.localdomain sshd[35331]: Received disconnect from 195.250.72.168 port 55732:11: Bye Bye [preauth]
Dec 06 08:01:59 np0005548789.localdomain sshd[35331]: Disconnected from authenticating user root 195.250.72.168 port 55732 [preauth]
Dec 06 08:02:05 np0005548789.localdomain sshd[35333]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:02:06 np0005548789.localdomain sshd[35333]: Received disconnect from 74.94.234.151 port 41878:11: Bye Bye [preauth]
Dec 06 08:02:06 np0005548789.localdomain sshd[35333]: Disconnected from authenticating user root 74.94.234.151 port 41878 [preauth]
Dec 06 08:02:32 np0005548789.localdomain sudo[35335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:02:32 np0005548789.localdomain sudo[35335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:02:32 np0005548789.localdomain sudo[35335]: pam_unix(sudo:session): session closed for user root
Dec 06 08:02:32 np0005548789.localdomain sudo[35350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:02:32 np0005548789.localdomain sudo[35350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:02:33 np0005548789.localdomain sudo[35350]: pam_unix(sudo:session): session closed for user root
Dec 06 08:02:34 np0005548789.localdomain sudo[35397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:02:34 np0005548789.localdomain sudo[35397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:02:34 np0005548789.localdomain sudo[35397]: pam_unix(sudo:session): session closed for user root
Dec 06 08:02:37 np0005548789.localdomain sshd[35412]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:02:38 np0005548789.localdomain sshd[35412]: Received disconnect from 77.222.100.142 port 53356:11: Bye Bye [preauth]
Dec 06 08:02:38 np0005548789.localdomain sshd[35412]: Disconnected from authenticating user root 77.222.100.142 port 53356 [preauth]
Dec 06 08:02:47 np0005548789.localdomain sshd[35414]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:02:47 np0005548789.localdomain sshd[35414]: Received disconnect from 162.241.87.197 port 52108:11: Bye Bye [preauth]
Dec 06 08:02:47 np0005548789.localdomain sshd[35414]: Disconnected from authenticating user root 162.241.87.197 port 52108 [preauth]
Dec 06 08:02:49 np0005548789.localdomain sshd[35416]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:02:51 np0005548789.localdomain sshd[35416]: Received disconnect from 154.201.83.49 port 44930:11: Bye Bye [preauth]
Dec 06 08:02:51 np0005548789.localdomain sshd[35416]: Disconnected from authenticating user root 154.201.83.49 port 44930 [preauth]
Dec 06 08:03:25 np0005548789.localdomain sshd[35418]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:03:26 np0005548789.localdomain sshd[35418]: Received disconnect from 74.94.234.151 port 40292:11: Bye Bye [preauth]
Dec 06 08:03:26 np0005548789.localdomain sshd[35418]: Disconnected from authenticating user root 74.94.234.151 port 40292 [preauth]
Dec 06 08:03:27 np0005548789.localdomain sshd[35420]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:03:28 np0005548789.localdomain sshd[35420]: Received disconnect from 195.250.72.168 port 57290:11: Bye Bye [preauth]
Dec 06 08:03:28 np0005548789.localdomain sshd[35420]: Disconnected from authenticating user root 195.250.72.168 port 57290 [preauth]
Dec 06 08:03:34 np0005548789.localdomain sudo[35422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:03:34 np0005548789.localdomain sudo[35422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:03:34 np0005548789.localdomain sudo[35422]: pam_unix(sudo:session): session closed for user root
Dec 06 08:03:34 np0005548789.localdomain sudo[35437]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:03:34 np0005548789.localdomain sudo[35437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:03:34 np0005548789.localdomain sudo[35437]: pam_unix(sudo:session): session closed for user root
Dec 06 08:03:35 np0005548789.localdomain sudo[35484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:03:35 np0005548789.localdomain sudo[35484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:03:35 np0005548789.localdomain sudo[35484]: pam_unix(sudo:session): session closed for user root
Dec 06 08:03:46 np0005548789.localdomain sshd[35499]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:03:47 np0005548789.localdomain sshd[35499]: Received disconnect from 77.222.100.142 port 37778:11: Bye Bye [preauth]
Dec 06 08:03:47 np0005548789.localdomain sshd[35499]: Disconnected from authenticating user root 77.222.100.142 port 37778 [preauth]
Dec 06 08:03:59 np0005548789.localdomain sshd[35501]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:03:59 np0005548789.localdomain sshd[35501]: Received disconnect from 162.241.87.197 port 56376:11: Bye Bye [preauth]
Dec 06 08:03:59 np0005548789.localdomain sshd[35501]: Disconnected from authenticating user root 162.241.87.197 port 56376 [preauth]
Dec 06 08:04:16 np0005548789.localdomain systemd[26209]: Created slice User Background Tasks Slice.
Dec 06 08:04:16 np0005548789.localdomain systemd[26209]: Starting Cleanup of User's Temporary Files and Directories...
Dec 06 08:04:16 np0005548789.localdomain systemd[26209]: Finished Cleanup of User's Temporary Files and Directories.
Dec 06 08:04:32 np0005548789.localdomain sshd[35504]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:04:34 np0005548789.localdomain sshd[35504]: Received disconnect from 154.201.83.49 port 42274:11: Bye Bye [preauth]
Dec 06 08:04:34 np0005548789.localdomain sshd[35504]: Disconnected from authenticating user root 154.201.83.49 port 42274 [preauth]
Dec 06 08:04:35 np0005548789.localdomain sudo[35506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:04:35 np0005548789.localdomain sudo[35506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:04:35 np0005548789.localdomain sudo[35506]: pam_unix(sudo:session): session closed for user root
Dec 06 08:04:35 np0005548789.localdomain sudo[35521]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:04:35 np0005548789.localdomain sudo[35521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:04:36 np0005548789.localdomain sudo[35521]: pam_unix(sudo:session): session closed for user root
Dec 06 08:04:37 np0005548789.localdomain sudo[35569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:04:37 np0005548789.localdomain sudo[35569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:04:37 np0005548789.localdomain sudo[35569]: pam_unix(sudo:session): session closed for user root
Dec 06 08:04:51 np0005548789.localdomain sshd[35584]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:04:52 np0005548789.localdomain sshd[35584]: Received disconnect from 74.94.234.151 port 38708:11: Bye Bye [preauth]
Dec 06 08:04:52 np0005548789.localdomain sshd[35584]: Disconnected from authenticating user root 74.94.234.151 port 38708 [preauth]
Dec 06 08:04:57 np0005548789.localdomain sshd[35586]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:04:58 np0005548789.localdomain sshd[35586]: Received disconnect from 77.222.100.142 port 34912:11: Bye Bye [preauth]
Dec 06 08:04:58 np0005548789.localdomain sshd[35586]: Disconnected from authenticating user root 77.222.100.142 port 34912 [preauth]
Dec 06 08:05:02 np0005548789.localdomain sshd[35588]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:05:03 np0005548789.localdomain sshd[35588]: Received disconnect from 195.250.72.168 port 49648:11: Bye Bye [preauth]
Dec 06 08:05:03 np0005548789.localdomain sshd[35588]: Disconnected from authenticating user root 195.250.72.168 port 49648 [preauth]
Dec 06 08:05:12 np0005548789.localdomain sshd[35590]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:05:13 np0005548789.localdomain sshd[35590]: Received disconnect from 162.241.87.197 port 36894:11: Bye Bye [preauth]
Dec 06 08:05:13 np0005548789.localdomain sshd[35590]: Disconnected from authenticating user root 162.241.87.197 port 36894 [preauth]
Dec 06 08:05:14 np0005548789.localdomain sshd[35592]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:05:14 np0005548789.localdomain sshd[35592]: Accepted publickey for zuul from 192.168.122.100 port 36432 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 08:05:14 np0005548789.localdomain systemd-logind[766]: New session 27 of user zuul.
Dec 06 08:05:14 np0005548789.localdomain systemd[1]: Started Session 27 of User zuul.
Dec 06 08:05:14 np0005548789.localdomain sshd[35592]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 08:05:14 np0005548789.localdomain sudo[35638]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kecivhypstvbdpmtltdsvsxqzlwmlduw ; /usr/bin/python3
Dec 06 08:05:14 np0005548789.localdomain sudo[35638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:14 np0005548789.localdomain python3[35640]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 06 08:05:14 np0005548789.localdomain sudo[35638]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:15 np0005548789.localdomain sudo[35683]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnkkcvvffvktwqwskbsvbsxakvbsppez ; /usr/bin/python3
Dec 06 08:05:15 np0005548789.localdomain sudo[35683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:15 np0005548789.localdomain python3[35685]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 08:05:15 np0005548789.localdomain sudo[35683]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:15 np0005548789.localdomain sudo[35703]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvnldjclwutargtcririzonqkqsyatqw ; /usr/bin/python3
Dec 06 08:05:15 np0005548789.localdomain sudo[35703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:15 np0005548789.localdomain python3[35705]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548789.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 06 08:05:16 np0005548789.localdomain useradd[35707]: new group: name=tripleo-admin, GID=1003
Dec 06 08:05:16 np0005548789.localdomain useradd[35707]: new user: name=tripleo-admin, UID=1003, GID=1003, home=/home/tripleo-admin, shell=/bin/bash, from=none
Dec 06 08:05:16 np0005548789.localdomain sudo[35703]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:16 np0005548789.localdomain sudo[35759]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywpjfhamiohzrbjfspypmduwcldrkdug ; /usr/bin/python3
Dec 06 08:05:16 np0005548789.localdomain sudo[35759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:16 np0005548789.localdomain python3[35761]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:05:16 np0005548789.localdomain sudo[35759]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:16 np0005548789.localdomain sudo[35802]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrgtebuxgxpclydmsesqpgstnlzlmotv ; /usr/bin/python3
Dec 06 08:05:16 np0005548789.localdomain sudo[35802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:16 np0005548789.localdomain python3[35804]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765008316.2960074-66339-160939292098300/source _original_basename=tmpqfh2mgc1 follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:17 np0005548789.localdomain sudo[35802]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:17 np0005548789.localdomain sudo[35832]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftkrkejmxlhphympgpbhfdnngzlwgsxd ; /usr/bin/python3
Dec 06 08:05:17 np0005548789.localdomain sudo[35832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:17 np0005548789.localdomain python3[35834]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:17 np0005548789.localdomain sudo[35832]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:17 np0005548789.localdomain sudo[35848]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecworazqkjkovnuazhcokfjvjwhxbznk ; /usr/bin/python3
Dec 06 08:05:17 np0005548789.localdomain sudo[35848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:17 np0005548789.localdomain python3[35850]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:17 np0005548789.localdomain sudo[35848]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:18 np0005548789.localdomain sudo[35864]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efwdqgwpapetgokbtdbayeiuhzjrkvnr ; /usr/bin/python3
Dec 06 08:05:18 np0005548789.localdomain sudo[35864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:18 np0005548789.localdomain python3[35866]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:18 np0005548789.localdomain sudo[35864]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:18 np0005548789.localdomain sudo[35880]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyjylmhqbrobtvrsrymvuphdfjkrzqpk ; /usr/bin/python3
Dec 06 08:05:18 np0005548789.localdomain sudo[35880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:05:18 np0005548789.localdomain python3[35882]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVgIoETU+ZMXzSQYJdf7tKLhQsLaB9easlDHbhHsBFXd1+Axjoyg338dVOvCx68r/a15lecdlSwbLqd4GXxUOdHnWLa1I9u6bd6azOwE0Dd6ZjnquN3BRq9dLJXMlKHhXMddL6WHNfxT/JOL+gKp0CM74naUBGqrzV05qlb19n7xZJtmxVohAGGeQdFwQJBVoQ6yZOjcJZ5CpbWCs4pFXZT/31fA0KIAJkrzAeUGRRkQEnzXY1riF0RHwvXaNJ0ZoAYfT7q263Pd5gnQEmpiBirUBH6CXJn4lIQyNMyVRbnKWemW9P1kyv2bjZUPg2b1xWBE7MBTs/wMt1RjdO9p+sxtwOd2IQMf1t3JLa2p3xqgxtGTMugpJUBr1TWwdLoHl+eAMuWZwAWofLWICHUlPzyTN8L8acu0im2eR60FEl9XdUjp8DYCBGxhhIVx+xZxj6nTnNc5T7GJpJlCTF+9YPlDVrLg8y/YXly0BoOqr7p+RaqMAJnoZymNDbuu9V3Vs= zuul-build-sshkey
                                                          regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:18 np0005548789.localdomain sudo[35880]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:19 np0005548789.localdomain python3[35896]: ansible-ping Invoked with data=pong
Dec 06 08:05:27 np0005548789.localdomain sshd[35897]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:05:29 np0005548789.localdomain sshd[35897]: Connection closed by authenticating user root 47.237.163.130 port 34168 [preauth]
Dec 06 08:05:30 np0005548789.localdomain sshd[35900]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:05:30 np0005548789.localdomain sshd[35900]: Accepted publickey for tripleo-admin from 192.168.122.100 port 58206 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 08:05:30 np0005548789.localdomain systemd[1]: Created slice User Slice of UID 1003.
Dec 06 08:05:30 np0005548789.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Dec 06 08:05:30 np0005548789.localdomain systemd-logind[766]: New session 28 of user tripleo-admin.
Dec 06 08:05:30 np0005548789.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Dec 06 08:05:30 np0005548789.localdomain systemd[1]: Starting User Manager for UID 1003...
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: Queued start job for default target Main User Target.
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: Created slice User Application Slice.
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: Reached target Paths.
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: Reached target Timers.
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: Starting D-Bus User Message Bus Socket...
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: Starting Create User's Volatile Files and Directories...
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: Finished Create User's Volatile Files and Directories.
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: Listening on D-Bus User Message Bus Socket.
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: Reached target Sockets.
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: Reached target Basic System.
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: Reached target Main User Target.
Dec 06 08:05:30 np0005548789.localdomain systemd[35904]: Startup finished in 126ms.
Dec 06 08:05:30 np0005548789.localdomain systemd[1]: Started User Manager for UID 1003.
Dec 06 08:05:30 np0005548789.localdomain systemd[1]: Started Session 28 of User tripleo-admin.
Dec 06 08:05:30 np0005548789.localdomain sshd[35900]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 06 08:05:31 np0005548789.localdomain sudo[35964]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzfsizmwoseyvgscbpuwjhnrxtgmuqta ; /usr/bin/python3
Dec 06 08:05:31 np0005548789.localdomain sudo[35964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:31 np0005548789.localdomain python3[35966]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 08:05:31 np0005548789.localdomain sudo[35964]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:36 np0005548789.localdomain sudo[35984]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqcpaqzpjlseuggnakondchalykraeut ; /usr/bin/python3
Dec 06 08:05:36 np0005548789.localdomain sudo[35984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:36 np0005548789.localdomain python3[35986]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config
Dec 06 08:05:36 np0005548789.localdomain sudo[35984]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:37 np0005548789.localdomain sudo[35987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:05:37 np0005548789.localdomain sudo[35987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:05:37 np0005548789.localdomain sudo[35987]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:37 np0005548789.localdomain sudo[36016]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djhmrojsjltsxnhifuwnzrxfcfqasgke ; /usr/bin/python3
Dec 06 08:05:37 np0005548789.localdomain sudo[36016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:37 np0005548789.localdomain sudo[36015]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:05:37 np0005548789.localdomain sudo[36015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:05:37 np0005548789.localdomain python3[36031]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Dec 06 08:05:37 np0005548789.localdomain sudo[36016]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:37 np0005548789.localdomain sudo[36093]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rapfokbpcbazdvfkybrrrbmbpjezamzb ; /usr/bin/python3
Dec 06 08:05:37 np0005548789.localdomain sudo[36093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:37 np0005548789.localdomain sudo[36015]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:37 np0005548789.localdomain python3[36097]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.zk0iba52tmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:37 np0005548789.localdomain sudo[36093]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:38 np0005548789.localdomain sudo[36139]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqrbskoiijfznxlkigttwdjpbdbucewl ; /usr/bin/python3
Dec 06 08:05:38 np0005548789.localdomain sudo[36139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:38 np0005548789.localdomain python3[36141]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.zk0iba52tmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:38 np0005548789.localdomain sudo[36139]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:38 np0005548789.localdomain sudo[36142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:05:38 np0005548789.localdomain sudo[36142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:05:38 np0005548789.localdomain sudo[36142]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:38 np0005548789.localdomain sshd[36157]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:05:39 np0005548789.localdomain sudo[36171]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsuxefpaptvoqkbqtxybqrgwqcgfwfay ; /usr/bin/python3
Dec 06 08:05:39 np0005548789.localdomain sudo[36171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:39 np0005548789.localdomain python3[36173]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.zk0iba52tmphosts insertbefore=BOF block=172.17.0.106 np0005548788.localdomain np0005548788
                                                         172.18.0.106 np0005548788.storage.localdomain np0005548788.storage
                                                         172.20.0.106 np0005548788.storagemgmt.localdomain np0005548788.storagemgmt
                                                         172.17.0.106 np0005548788.internalapi.localdomain np0005548788.internalapi
                                                         172.19.0.106 np0005548788.tenant.localdomain np0005548788.tenant
                                                         192.168.122.106 np0005548788.ctlplane.localdomain np0005548788.ctlplane
                                                         172.17.0.107 np0005548789.localdomain np0005548789
                                                         172.18.0.107 np0005548789.storage.localdomain np0005548789.storage
                                                         172.20.0.107 np0005548789.storagemgmt.localdomain np0005548789.storagemgmt
                                                         172.17.0.107 np0005548789.internalapi.localdomain np0005548789.internalapi
                                                         172.19.0.107 np0005548789.tenant.localdomain np0005548789.tenant
                                                         192.168.122.107 np0005548789.ctlplane.localdomain np0005548789.ctlplane
                                                         172.17.0.108 np0005548790.localdomain np0005548790
                                                         172.18.0.108 np0005548790.storage.localdomain np0005548790.storage
                                                         172.20.0.108 np0005548790.storagemgmt.localdomain np0005548790.storagemgmt
                                                         172.17.0.108 np0005548790.internalapi.localdomain np0005548790.internalapi
                                                         172.19.0.108 np0005548790.tenant.localdomain np0005548790.tenant
                                                         192.168.122.108 np0005548790.ctlplane.localdomain np0005548790.ctlplane
                                                         172.17.0.103 np0005548785.localdomain np0005548785
                                                         172.18.0.103 np0005548785.storage.localdomain np0005548785.storage
                                                         172.20.0.103 np0005548785.storagemgmt.localdomain np0005548785.storagemgmt
                                                         172.17.0.103 np0005548785.internalapi.localdomain np0005548785.internalapi
                                                         172.19.0.103 np0005548785.tenant.localdomain np0005548785.tenant
                                                         192.168.122.103 np0005548785.ctlplane.localdomain np0005548785.ctlplane
                                                         172.17.0.104 np0005548786.localdomain np0005548786
                                                         172.18.0.104 np0005548786.storage.localdomain np0005548786.storage
                                                         172.20.0.104 np0005548786.storagemgmt.localdomain np0005548786.storagemgmt
                                                         172.17.0.104 np0005548786.internalapi.localdomain np0005548786.internalapi
                                                         172.19.0.104 np0005548786.tenant.localdomain np0005548786.tenant
                                                         192.168.122.104 np0005548786.ctlplane.localdomain np0005548786.ctlplane
                                                         172.17.0.105 np0005548787.localdomain np0005548787
                                                         172.18.0.105 np0005548787.storage.localdomain np0005548787.storage
                                                         172.20.0.105 np0005548787.storagemgmt.localdomain np0005548787.storagemgmt
                                                         172.17.0.105 np0005548787.internalapi.localdomain np0005548787.internalapi
                                                         172.19.0.105 np0005548787.tenant.localdomain np0005548787.tenant
                                                         192.168.122.105 np0005548787.ctlplane.localdomain np0005548787.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                         192.168.122.99  overcloud.ctlplane.localdomain
                                                         172.18.0.250  overcloud.storage.localdomain
                                                         172.20.0.140  overcloud.storagemgmt.localdomain
                                                         172.17.0.168  overcloud.internalapi.localdomain
                                                         172.21.0.196  overcloud.localdomain
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:39 np0005548789.localdomain sudo[36171]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:39 np0005548789.localdomain sudo[36188]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkxkgzcybbowjstokrepbbsrfzlrevwy ; /usr/bin/python3
Dec 06 08:05:39 np0005548789.localdomain sudo[36188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:39 np0005548789.localdomain python3[36190]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.zk0iba52tmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:05:39 np0005548789.localdomain sudo[36188]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:40 np0005548789.localdomain sudo[36205]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzobzrsgilogjorhrmrfxapjgmvpovzi ; /usr/bin/python3
Dec 06 08:05:40 np0005548789.localdomain sudo[36205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:40 np0005548789.localdomain python3[36207]: ansible-file Invoked with path=/tmp/ansible.zk0iba52tmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:05:40 np0005548789.localdomain sudo[36205]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:41 np0005548789.localdomain sudo[36221]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbuyfbzfhjuigvroytvcrenfhxmfjpwy ; /usr/bin/python3
Dec 06 08:05:41 np0005548789.localdomain sudo[36221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:41 np0005548789.localdomain python3[36223]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:05:41 np0005548789.localdomain sudo[36221]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:41 np0005548789.localdomain sudo[36238]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scgzeielkhftwlbwxueoomnnfnhyldhd ; /usr/bin/python3
Dec 06 08:05:41 np0005548789.localdomain sudo[36238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:42 np0005548789.localdomain python3[36240]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:05:42 np0005548789.localdomain sshd[36157]: Connection reset by authenticating user root 91.202.233.33 port 57528 [preauth]
Dec 06 08:05:42 np0005548789.localdomain sshd[36242]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:05:44 np0005548789.localdomain sshd[36242]: Connection reset by authenticating user root 91.202.233.33 port 35910 [preauth]
Dec 06 08:05:44 np0005548789.localdomain sshd[36245]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:05:46 np0005548789.localdomain sudo[36238]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:46 np0005548789.localdomain sshd[36245]: Invalid user ubuntu from 91.202.233.33 port 35916
Dec 06 08:05:47 np0005548789.localdomain sshd[36245]: Connection reset by invalid user ubuntu 91.202.233.33 port 35916 [preauth]
Dec 06 08:05:47 np0005548789.localdomain sshd[36248]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:05:47 np0005548789.localdomain sudo[36262]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tetcbnvcadnuqtwmwccaojowdmawqnrv ; /usr/bin/python3
Dec 06 08:05:47 np0005548789.localdomain sudo[36262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:47 np0005548789.localdomain python3[36264]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:05:47 np0005548789.localdomain sudo[36262]: pam_unix(sudo:session): session closed for user root
Dec 06 08:05:48 np0005548789.localdomain sudo[36280]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khyxbpblkfgfpgozwmcpkbqfuaporpvr ; /usr/bin/python3
Dec 06 08:05:48 np0005548789.localdomain sudo[36280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:05:48 np0005548789.localdomain python3[36282]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:05:50 np0005548789.localdomain sshd[36248]: Connection reset by authenticating user root 91.202.233.33 port 35920 [preauth]
Dec 06 08:05:50 np0005548789.localdomain sshd[36284]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:05:52 np0005548789.localdomain sshd[36284]: Invalid user test1 from 91.202.233.33 port 35924
Dec 06 08:05:53 np0005548789.localdomain sshd[36284]: Connection reset by invalid user test1 91.202.233.33 port 35924 [preauth]
Dec 06 08:06:04 np0005548789.localdomain groupadd[36453]: group added to /etc/group: name=puppet, GID=52
Dec 06 08:06:04 np0005548789.localdomain groupadd[36453]: group added to /etc/gshadow: name=puppet
Dec 06 08:06:04 np0005548789.localdomain groupadd[36453]: new group: name=puppet, GID=52
Dec 06 08:06:04 np0005548789.localdomain useradd[36460]: new user: name=puppet, UID=52, GID=52, home=/var/lib/puppet, shell=/sbin/nologin, from=none
Dec 06 08:06:07 np0005548789.localdomain sshd[36470]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:06:08 np0005548789.localdomain sshd[36470]: Received disconnect from 77.222.100.142 port 59642:11: Bye Bye [preauth]
Dec 06 08:06:08 np0005548789.localdomain sshd[36470]: Disconnected from authenticating user root 77.222.100.142 port 59642 [preauth]
Dec 06 08:06:09 np0005548789.localdomain sshd[36472]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:06:11 np0005548789.localdomain sshd[36472]: Received disconnect from 154.201.83.49 port 52822:11: Bye Bye [preauth]
Dec 06 08:06:11 np0005548789.localdomain sshd[36472]: Disconnected from authenticating user root 154.201.83.49 port 52822 [preauth]
Dec 06 08:06:14 np0005548789.localdomain sshd[36752]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:06:14 np0005548789.localdomain sshd[36752]: Received disconnect from 74.94.234.151 port 37118:11: Bye Bye [preauth]
Dec 06 08:06:14 np0005548789.localdomain sshd[36752]: Disconnected from authenticating user root 74.94.234.151 port 37118 [preauth]
Dec 06 08:06:24 np0005548789.localdomain sshd[36811]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:06:24 np0005548789.localdomain sshd[36811]: Received disconnect from 162.241.87.197 port 52226:11: Bye Bye [preauth]
Dec 06 08:06:24 np0005548789.localdomain sshd[36811]: Disconnected from authenticating user root 162.241.87.197 port 52226 [preauth]
Dec 06 08:06:34 np0005548789.localdomain sshd[36867]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:06:35 np0005548789.localdomain sshd[36867]: Received disconnect from 195.250.72.168 port 38712:11: Bye Bye [preauth]
Dec 06 08:06:35 np0005548789.localdomain sshd[36867]: Disconnected from authenticating user root 195.250.72.168 port 38712 [preauth]
Dec 06 08:06:38 np0005548789.localdomain sudo[36893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:06:38 np0005548789.localdomain sudo[36893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:06:38 np0005548789.localdomain sudo[36893]: pam_unix(sudo:session): session closed for user root
Dec 06 08:06:38 np0005548789.localdomain sudo[36909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:06:38 np0005548789.localdomain sudo[36909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:06:39 np0005548789.localdomain sudo[36909]: pam_unix(sudo:session): session closed for user root
Dec 06 08:06:39 np0005548789.localdomain sudo[37113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:06:39 np0005548789.localdomain sudo[37113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:06:39 np0005548789.localdomain sudo[37113]: pam_unix(sudo:session): session closed for user root
Dec 06 08:06:59 np0005548789.localdomain kernel: SELinux:  Converting 2699 SID table entries...
Dec 06 08:06:59 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 08:06:59 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 08:06:59 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 08:06:59 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 08:06:59 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 08:06:59 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 08:06:59 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 08:06:59 np0005548789.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 06 08:06:59 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:06:59 np0005548789.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 08:06:59 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:06:59 np0005548789.localdomain systemd-rc-local-generator[37337]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:06:59 np0005548789.localdomain systemd-sysv-generator[37341]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:06:59 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:06:59 np0005548789.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 08:06:59 np0005548789.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 08:06:59 np0005548789.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 08:06:59 np0005548789.localdomain systemd[1]: run-rf5d9828cd0b34aefadac40a870236621.service: Deactivated successfully.
Dec 06 08:07:00 np0005548789.localdomain sudo[36280]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:01 np0005548789.localdomain sudo[37735]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khcqheoajwzulxgbngurnhjvagkwjxis ; /usr/bin/python3
Dec 06 08:07:01 np0005548789.localdomain sudo[37735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:01 np0005548789.localdomain python3[37737]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:02 np0005548789.localdomain sudo[37735]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:03 np0005548789.localdomain sudo[37874]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfzstipwzttzbmqlsucrygeobxchuazj ; /usr/bin/python3
Dec 06 08:07:03 np0005548789.localdomain sudo[37874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:03 np0005548789.localdomain python3[37876]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:07:04 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:07:04 np0005548789.localdomain systemd-rc-local-generator[37901]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:07:04 np0005548789.localdomain systemd-sysv-generator[37907]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:07:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:07:04 np0005548789.localdomain sudo[37874]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:05 np0005548789.localdomain sudo[37928]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvubvpeatzbxuaenkhpejgwscphgnksc ; /usr/bin/python3
Dec 06 08:07:05 np0005548789.localdomain sudo[37928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:05 np0005548789.localdomain python3[37930]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:05 np0005548789.localdomain sudo[37928]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:05 np0005548789.localdomain sudo[37944]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stzpojbskaulliyeuekawvaoshiorxib ; /usr/bin/python3
Dec 06 08:07:05 np0005548789.localdomain sudo[37944]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:05 np0005548789.localdomain python3[37946]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:05 np0005548789.localdomain sudo[37944]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:06 np0005548789.localdomain sudo[37961]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtkxzbtssbhaxnibmgfkdutpfamuxqpx ; /usr/bin/python3
Dec 06 08:07:06 np0005548789.localdomain sudo[37961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:06 np0005548789.localdomain python3[37963]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 08:07:06 np0005548789.localdomain sudo[37961]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:06 np0005548789.localdomain sudo[37979]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmpueanhmhurpwbwfptzhfhvlfaouldu ; /usr/bin/python3
Dec 06 08:07:06 np0005548789.localdomain sudo[37979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:07 np0005548789.localdomain python3[37981]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:07 np0005548789.localdomain sudo[37979]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:07 np0005548789.localdomain sudo[37997]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxcwudwijsjaklypotulhzoxnbxsgtaj ; /usr/bin/python3
Dec 06 08:07:07 np0005548789.localdomain sudo[37997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:07 np0005548789.localdomain python3[37999]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:07 np0005548789.localdomain sudo[37997]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:07 np0005548789.localdomain sudo[38015]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxlahtxfjbfngkpfqmvkghvgfhlqhess ; /usr/bin/python3
Dec 06 08:07:07 np0005548789.localdomain sudo[38015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:08 np0005548789.localdomain python3[38017]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 08:07:08 np0005548789.localdomain systemd[1]: Reloading Network Manager...
Dec 06 08:07:08 np0005548789.localdomain NetworkManager[5973]: <info>  [1765008428.1407] audit: op="reload" arg="0" pid=38020 uid=0 result="success"
Dec 06 08:07:08 np0005548789.localdomain NetworkManager[5973]: <info>  [1765008428.1416] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf))
Dec 06 08:07:08 np0005548789.localdomain NetworkManager[5973]: <info>  [1765008428.1417] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Dec 06 08:07:08 np0005548789.localdomain systemd[1]: Reloaded Network Manager.
Dec 06 08:07:08 np0005548789.localdomain sudo[38015]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:08 np0005548789.localdomain sudo[38034]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hziiexixclgfnrrwdgqxrevsyucuezau ; /usr/bin/python3
Dec 06 08:07:08 np0005548789.localdomain sudo[38034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:08 np0005548789.localdomain python3[38036]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:08 np0005548789.localdomain sudo[38034]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:08 np0005548789.localdomain sudo[38051]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbbshzgisomxryueljurdiolxcrkivtp ; /usr/bin/python3
Dec 06 08:07:08 np0005548789.localdomain sudo[38051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:09 np0005548789.localdomain python3[38053]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:07:09 np0005548789.localdomain sudo[38051]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:09 np0005548789.localdomain sudo[38069]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yslrcdaiybjpjjbcrwelyiwgdcgxlujg ; /usr/bin/python3
Dec 06 08:07:09 np0005548789.localdomain sudo[38069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:09 np0005548789.localdomain python3[38071]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:07:09 np0005548789.localdomain sudo[38069]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:09 np0005548789.localdomain sudo[38085]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klvyxvzkgprlioaswislpwtotjyvfscu ; /usr/bin/python3
Dec 06 08:07:09 np0005548789.localdomain sudo[38085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:09 np0005548789.localdomain python3[38087]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:09 np0005548789.localdomain sudo[38085]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:10 np0005548789.localdomain sudo[38101]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zaodaobiotxdsdnslxgocnaireydhaik ; /usr/bin/python3
Dec 06 08:07:10 np0005548789.localdomain sudo[38101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:10 np0005548789.localdomain python3[38103]: ansible-tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 06 08:07:10 np0005548789.localdomain sudo[38101]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:10 np0005548789.localdomain sudo[38117]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqmxmkellotsbnusjyrbxijtbvvgmarm ; /usr/bin/python3
Dec 06 08:07:10 np0005548789.localdomain sudo[38117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:11 np0005548789.localdomain python3[38119]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:07:11 np0005548789.localdomain sudo[38117]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:11 np0005548789.localdomain sudo[38133]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipuzshtcpwnyxcieztlnsgenqjxqfiot ; /usr/bin/python3
Dec 06 08:07:11 np0005548789.localdomain sudo[38133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:11 np0005548789.localdomain python3[38135]: ansible-blockinfile Invoked with path=/tmp/ansible.sop2m8_l block=[192.168.122.106]*,[np0005548788.ctlplane.localdomain]*,[172.17.0.106]*,[np0005548788.internalapi.localdomain]*,[172.18.0.106]*,[np0005548788.storage.localdomain]*,[172.20.0.106]*,[np0005548788.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005548788.tenant.localdomain]*,[np0005548788.localdomain]*,[np0005548788]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCxIoAQH9YZnGrAxYR5prFQwo6HY5mwdDjndb+bp2pwvtVLM4ABIdCi+K1wpbhOpoO7BsYOf/tdBqemvSDleNo/ZLh3v3MmoVtoTtQZqLWsAQWFgJCjcGUGB+H3CHhtbp706coVQMlGD+UQqpCBy8WamMB/Ldy+hSHbLHwzuMzj8tO90vUbEyuKgOuu/X3ZFa+Yjo/asQ+PTrVfirh1QvRQ9aK22xH89KbThA/1an4OjnNGLCP752auSQ894B21QLKfqaMGPlpbjU8Wr6MP4zKV9lUzpQiFr6IU6cd4CeIsJDj7FnAZuBSmi8ewgm/r4ZWkmCSlqw8OpMC5soJnm8Q4PJTIFvT9eyyFCh9xmQkMhzE8P332LtYjZ+vXhYFU14e04mOQx5UrtHN8uWJVbOAwtLNAcenHyRtCQGkAZ6f9q0OvSuYr+o3FhHhN5ABu32AKAD8YpkjLypi+PbaiKNQW8XzPAHHbV8CGZ4B09ZWeQY49VA0bPxIYBXd1mEBlXSE=
                                                         [192.168.122.107]*,[np0005548789.ctlplane.localdomain]*,[172.17.0.107]*,[np0005548789.internalapi.localdomain]*,[172.18.0.107]*,[np0005548789.storage.localdomain]*,[172.20.0.107]*,[np0005548789.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005548789.tenant.localdomain]*,[np0005548789.localdomain]*,[np0005548789]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCwH3rhRTvOINLmLdbeRXeXOiMzz+IXEuW2cXYAe50Wcc3ikH2RVGirWQrwLc8hAoA7UFCXADqEMxPg6/fLsQkbP7kLOpUtam8nuXvgt8VHM4RFl5wh9EOgZ7DWgjA7s3r2eQMcBhv82CjVMLY/YjnLuRNXCsJAqeG32qcKedKH/huEFvkb49U/UnNlxi5BfNrMlY9n5UQXE2rd6EKwP58aP/qQ1ie3p8nwHc36/MJcfEIABlLaoHK/LxnadOFTh93OkqVi7A0VQsKSmKD64nABiN7ML0NReoyRIQI5r3Dawe8v2K9jCBh5jY88TVsYUJqgwoZSSU73sYGHX4uF+PY8wL7qwn6mCzA17GGYeB8Dy0N8qwDqah6kUjpcLwGp7YaKf0FIZPBKcLVMrX6Tnwxer1j3kOIt3tgLZoz3mMfstWfCyvt9t+GEW5MCE+MBkY4Eree3uK7pI+wJ3vFQS9XVP00hjNiLWYmoaaW6rl8xtw7QtGhzmjcWbOxaZvHWE5E=
                                                         [192.168.122.108]*,[np0005548790.ctlplane.localdomain]*,[172.17.0.108]*,[np0005548790.internalapi.localdomain]*,[172.18.0.108]*,[np0005548790.storage.localdomain]*,[172.20.0.108]*,[np0005548790.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005548790.tenant.localdomain]*,[np0005548790.localdomain]*,[np0005548790]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDmdMCy44p73Ui+o09YQitqR9FILqoJ6AGYYutFVH6wn5m1j6oEoI4XgVFPR3UpG3SXdoiG7m0DRxC/WZZMpZbaQ3ZHbJJioRh1hV5uQtK5k2gtmS8uePng5UprbLncMXf+HIxNRvirU3r6zdgNGAroK0rN0nWESi/FNb2flu9Aw9JAsgIAAouW4IUoeyMGZ1AflhRhsWsQMstM9UEeGU+iTqV7al1URVCSq1finY99m+QC+Pftpd2C/+agboOIiVa63+D/RqqfYqh4C/PYfDbssYjcZzk3P90+HQ6uMKexX3HRnFbyje4eLSBHC0pjr/4pNfk/eSpdHeyMAPsP+QlBztdcPj9OnjcmT9ymeJRKF7GwNIWg3Pn9L2yY50d8l9Zu6rNIDW786XNcbm88yHdCHA5FE1A8XTWQRQ3eUSUsmsvf03pExAouRM4Fj8dvCu6wzG2SuyWqmdT5yCNrUG0e1CeE6PcfTLBeS5CJAwn5HM8aUndQQldWmaUbMPL5Jis=
                                                         [192.168.122.103]*,[np0005548785.ctlplane.localdomain]*,[172.17.0.103]*,[np0005548785.internalapi.localdomain]*,[172.18.0.103]*,[np0005548785.storage.localdomain]*,[172.20.0.103]*,[np0005548785.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005548785.tenant.localdomain]*,[np0005548785.localdomain]*,[np0005548785]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC89JzJHuRLDUgmU66VPdPVwYLrvslBwa5i2QfiUzrnpt1lKz8ayq6QMRy5y5GgfjQQhX/YZiAjUSoogVsYDkoDaImXdtfQHFlFMLTlJPiYcA/cGAwMAE/vifpWoztBHUXkJ5YWUojkXzGoR8d7ESx/tTLG/9QrQDsW6JcV18mcFCQZdeWYWGWdLn6ynmQOZ0N4U6mYK1FqE+GKgP6L9PEjkC1ePo81AnYcdQ5Z1IETdcCcJytdvvxH/Zie1PiAaMAgMYhsqu7+DZRRTvg+cEMw3mRVuodIyQEbpZs8MjR3itViRfZ+UqYi6uKDnz1viLL0aACaYhOLzrE7bQ6Sl4j1MnMrWncUOv3Sq2fus+Y6oYmed84E6HUNljte7vVP9jwPclbCAmj5WuC/Av9dSqqHEpPRbKJ4tAuBrO2LBKS7J62FjRYiY807V1viyxUgjK5FmsQyfVr3/YOirluSx54e4XwxxDrAjtrd0x68H7/Mt6HP/79cWKaVbC7XUckYRmE=
                                                         [192.168.122.104]*,[np0005548786.ctlplane.localdomain]*,[172.17.0.104]*,[np0005548786.internalapi.localdomain]*,[172.18.0.104]*,[np0005548786.storage.localdomain]*,[172.20.0.104]*,[np0005548786.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005548786.tenant.localdomain]*,[np0005548786.localdomain]*,[np0005548786]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDURzBA/aIGrwPgaIApy0UCTi4wdQhfDEx0QfkSAIn0ZptZcOkaR8BWtl9GijRPEp++Ep4qU04JcwHO1ZULd2UnCdDeg1Imwnf7x9HQBjAr0mH+tE0t4MBLtBbrk8Ep5ggyKATK1CvEl3NuGIS4gSSUWxzkR74Iju/GtrEMuVnMSsOw+auBofiv1ne4zyXqQWZORiK32DSolw1KyXGLyqG+JOpl3Kza5o79S1KUghfRzskZMm/AxFYciPmg4EQK/jL9Izj7qq3v8MaL8baeyqNlPaaRKCh+pkZlYtoPzDhe+vn/jwnDmQgqC1Bh+dkNiKEVlWz3mxoiMoeLY3jP/tMF2M4M8puGakPc2sqJxk1++Tv/lFRO3zBS+V2kECKI5DtQI6XThfLYXxIQl5SHr4yGEoxhMNt6YNQPLp6lg30kHO24YyNNA7LPFYYoOGUCaq5ZVUCF9lagMxcgkN0Bs+ZZqeni+53RqxoutiRZ0m9pIiqxGjrJjbNFXmofgfDBcUE=
                                                         [192.168.122.105]*,[np0005548787.ctlplane.localdomain]*,[172.17.0.105]*,[np0005548787.internalapi.localdomain]*,[172.18.0.105]*,[np0005548787.storage.localdomain]*,[172.20.0.105]*,[np0005548787.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005548787.tenant.localdomain]*,[np0005548787.localdomain]*,[np0005548787]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDXe0UZ2kJKcvYaHSnjIOf3QqkGhArLo32nvDm8Pl8ZVNWfdRV8R+e17etAicDq//fxWC+U9jiHp4qI6/0Jm64rPocmJKaA+r79sNpv+598NlGtVUfTYQ34Ze9bgaPkjAwKfPNrzjSDChyfkys4Hm0J7ttog5rvMcuRelxkFmoonOcuzBC+9ufI6qld7br5w4WDookwamkefbMCiwAZxrw2bSjoTu7/TEFbt7SM0lUIdqP5WvxpWK52OkjnakQ0BL4QHdRYz1kBx/vS0TFxXb2pMO291dfkxDl3H2oXXZZYK/LWy3nZyJEX+mD5J6WOEs5HC5GQQ+CNEV0wa2e/gJA7KBsyL5T6RBtH8id22sBHZkzcaDhUz1ZABGAiOx4rdrr4YFFFy/u00nX3ZCuRBPXYh37Pafl7GXcSKyhTmkCZI0591RdNmb1duh9ZIObRmPVp2+WIheAFvS7EU4B0+ZjAEbDJgiSa9VlUrlRFX0ajcFHR8FnwNRcoERO3A3h4/Tc=
                                                          create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:11 np0005548789.localdomain sudo[38133]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:12 np0005548789.localdomain sudo[38149]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tepkayrqsytygxatbuuzhljomhrwwlnu ; /usr/bin/python3
Dec 06 08:07:12 np0005548789.localdomain sudo[38149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:12 np0005548789.localdomain python3[38151]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.sop2m8_l' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:12 np0005548789.localdomain sudo[38149]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:12 np0005548789.localdomain sudo[38167]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itvqbhnxgydgretaymjcqkapuettlnqe ; /usr/bin/python3
Dec 06 08:07:12 np0005548789.localdomain sudo[38167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:12 np0005548789.localdomain python3[38169]: ansible-file Invoked with path=/tmp/ansible.sop2m8_l state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:12 np0005548789.localdomain sudo[38167]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:13 np0005548789.localdomain sudo[38183]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifzhsdbpowrxcazqfmvtfruejeplcgll ; /usr/bin/python3
Dec 06 08:07:13 np0005548789.localdomain sudo[38183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:13 np0005548789.localdomain python3[38185]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:07:13 np0005548789.localdomain sudo[38183]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:13 np0005548789.localdomain sudo[38199]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjsscheadcgzamqppisnevyalycshpgw ; /usr/bin/python3
Dec 06 08:07:13 np0005548789.localdomain sudo[38199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:13 np0005548789.localdomain python3[38201]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:13 np0005548789.localdomain sudo[38199]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:14 np0005548789.localdomain sudo[38217]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilhpfuvwcmniiatwzgxdjyzwsudcyuta ; /usr/bin/python3
Dec 06 08:07:14 np0005548789.localdomain sudo[38217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:14 np0005548789.localdomain python3[38219]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:14 np0005548789.localdomain sudo[38217]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:14 np0005548789.localdomain sudo[38236]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsevxnezolfhxfkjzjumoijzlwvpmmdu ; /usr/bin/python3
Dec 06 08:07:14 np0005548789.localdomain sudo[38236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:14 np0005548789.localdomain python3[38238]: ansible-community.general.cloud_init_data_facts Invoked with filter=status
Dec 06 08:07:14 np0005548789.localdomain sudo[38236]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:14 np0005548789.localdomain sudo[38252]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zynasuxulryhfuajumockeodsoxhrudn ; /usr/bin/python3
Dec 06 08:07:14 np0005548789.localdomain sudo[38252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:15 np0005548789.localdomain sudo[38252]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:15 np0005548789.localdomain sudo[38300]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csjtwfgmfxnhldhqgfgzmnznsnjcfeoq ; /usr/bin/python3
Dec 06 08:07:15 np0005548789.localdomain sudo[38300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:15 np0005548789.localdomain sudo[38300]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:15 np0005548789.localdomain sudo[38343]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmbrofhxggqtckldfgvoegugfzqoeyrf ; /usr/bin/python3
Dec 06 08:07:15 np0005548789.localdomain sudo[38343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:15 np0005548789.localdomain sudo[38343]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:17 np0005548789.localdomain sudo[38373]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkqyldepmahxqgcosbdobijwvvfnkazt ; /usr/bin/python3
Dec 06 08:07:17 np0005548789.localdomain sudo[38373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:17 np0005548789.localdomain python3[38375]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:17 np0005548789.localdomain sudo[38373]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:17 np0005548789.localdomain sudo[38390]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgygaxuvagjroivqsieitppzwriolfnw ; /usr/bin/python3
Dec 06 08:07:17 np0005548789.localdomain sudo[38390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:17 np0005548789.localdomain python3[38392]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:07:20 np0005548789.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Dec 06 08:07:20 np0005548789.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Dec 06 08:07:21 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:07:21 np0005548789.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 08:07:21 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:07:21 np0005548789.localdomain systemd-sysv-generator[38465]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:07:21 np0005548789.localdomain systemd-rc-local-generator[38457]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:07:21 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:07:21 np0005548789.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 08:07:21 np0005548789.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 06 08:07:21 np0005548789.localdomain systemd[1]: tuned.service: Deactivated successfully.
Dec 06 08:07:21 np0005548789.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 06 08:07:21 np0005548789.localdomain systemd[1]: tuned.service: Consumed 1.780s CPU time.
Dec 06 08:07:21 np0005548789.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 06 08:07:21 np0005548789.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 08:07:21 np0005548789.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 08:07:21 np0005548789.localdomain systemd[1]: run-raabb5943b14e4279a74ef3a8ef70db1b.service: Deactivated successfully.
Dec 06 08:07:22 np0005548789.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Dec 06 08:07:22 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:07:22 np0005548789.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 08:07:23 np0005548789.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 08:07:23 np0005548789.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 08:07:23 np0005548789.localdomain systemd[1]: run-r674fb6b9d2bb4d4f87c7edf8ecbfa6ec.service: Deactivated successfully.
Dec 06 08:07:23 np0005548789.localdomain sudo[38390]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:23 np0005548789.localdomain sudo[38827]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbhegqhalulmwkkeurqianvwxviskvuq ; /usr/bin/python3
Dec 06 08:07:23 np0005548789.localdomain sudo[38827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:24 np0005548789.localdomain python3[38829]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:07:24 np0005548789.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 06 08:07:24 np0005548789.localdomain systemd[1]: tuned.service: Deactivated successfully.
Dec 06 08:07:24 np0005548789.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 06 08:07:24 np0005548789.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 06 08:07:25 np0005548789.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Dec 06 08:07:25 np0005548789.localdomain sudo[38827]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:25 np0005548789.localdomain sudo[39022]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpvpvfrxfuxswwnrfspbolaplwzgdrml ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Dec 06 08:07:25 np0005548789.localdomain sudo[39022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:26 np0005548789.localdomain python3[39024]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:26 np0005548789.localdomain sudo[39022]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:26 np0005548789.localdomain sudo[39039]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prcbekxligmvsbkubeibjlxtquurisic ; /usr/bin/python3
Dec 06 08:07:26 np0005548789.localdomain sudo[39039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:26 np0005548789.localdomain python3[39041]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Dec 06 08:07:26 np0005548789.localdomain sudo[39039]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:27 np0005548789.localdomain sudo[39055]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yugrdmndfvdnmrnmdisnxzfthyeqnqnf ; /usr/bin/python3
Dec 06 08:07:27 np0005548789.localdomain sudo[39055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:27 np0005548789.localdomain python3[39057]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:07:27 np0005548789.localdomain sudo[39055]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:27 np0005548789.localdomain sudo[39071]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihuglivuxsboitdxislgkebeoeywpcmr ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Dec 06 08:07:27 np0005548789.localdomain sudo[39071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:27 np0005548789.localdomain python3[39073]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:29 np0005548789.localdomain sudo[39071]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:29 np0005548789.localdomain sudo[39091]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmxpsarrihactbbarxurkseeurdujndm ; /usr/bin/python3
Dec 06 08:07:29 np0005548789.localdomain sudo[39091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:29 np0005548789.localdomain python3[39093]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:29 np0005548789.localdomain sudo[39091]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:30 np0005548789.localdomain sudo[39108]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tthympkignvqgrmthajyqeraqbjgzyhz ; /usr/bin/python3
Dec 06 08:07:30 np0005548789.localdomain sudo[39108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:30 np0005548789.localdomain python3[39110]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:07:30 np0005548789.localdomain sudo[39108]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:32 np0005548789.localdomain sudo[39124]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcrcayoaktqavzxgorqtkmntqmxqjicv ; /usr/bin/python3
Dec 06 08:07:32 np0005548789.localdomain sudo[39124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:32 np0005548789.localdomain python3[39126]: ansible-replace Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:32 np0005548789.localdomain sudo[39124]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:37 np0005548789.localdomain sudo[39140]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxcskypvpyhbgmqjekcwsraxkllrhrwn ; /usr/bin/python3
Dec 06 08:07:37 np0005548789.localdomain sudo[39140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:37 np0005548789.localdomain python3[39142]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:37 np0005548789.localdomain sudo[39140]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:37 np0005548789.localdomain sshd[39143]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:07:38 np0005548789.localdomain sudo[39190]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwiivmcgwhkhcsslvfotjngasitxkbpl ; /usr/bin/python3
Dec 06 08:07:38 np0005548789.localdomain sudo[39190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:38 np0005548789.localdomain python3[39192]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:38 np0005548789.localdomain sudo[39190]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:38 np0005548789.localdomain sshd[39143]: Received disconnect from 74.94.234.151 port 35536:11: Bye Bye [preauth]
Dec 06 08:07:38 np0005548789.localdomain sshd[39143]: Disconnected from authenticating user root 74.94.234.151 port 35536 [preauth]
Dec 06 08:07:38 np0005548789.localdomain sudo[39235]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dksvotphlaplavofmnbvdbgknyunnjyt ; /usr/bin/python3
Dec 06 08:07:38 np0005548789.localdomain sudo[39235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:38 np0005548789.localdomain python3[39237]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008457.875438-71089-216537416560304/source _original_basename=tmpr3ua795s follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:38 np0005548789.localdomain sudo[39235]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:38 np0005548789.localdomain sudo[39265]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccbepmqmiwjumxqtwucykbmrytfincla ; /usr/bin/python3
Dec 06 08:07:38 np0005548789.localdomain sudo[39265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:38 np0005548789.localdomain python3[39267]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:38 np0005548789.localdomain sudo[39265]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:39 np0005548789.localdomain sudo[39313]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpyvtchoekusqnvudzugylyrmbgszrst ; /usr/bin/python3
Dec 06 08:07:39 np0005548789.localdomain sudo[39313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:39 np0005548789.localdomain python3[39315]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:39 np0005548789.localdomain sudo[39313]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:39 np0005548789.localdomain sudo[39356]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oagxbilgsolvgoizuxwvhuprbkiuqmco ; /usr/bin/python3
Dec 06 08:07:39 np0005548789.localdomain sudo[39356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:39 np0005548789.localdomain python3[39358]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008459.3138149-71179-167077286353394/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=c7cc1670a1e268d7901b4353362279cc1f651214 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:39 np0005548789.localdomain sudo[39356]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:40 np0005548789.localdomain sudo[39359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:07:40 np0005548789.localdomain sudo[39359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:07:40 np0005548789.localdomain sudo[39359]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:40 np0005548789.localdomain sudo[39388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:07:40 np0005548789.localdomain sudo[39388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:07:40 np0005548789.localdomain sudo[39455]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjflpzxyrqhiamlfmtrbajquwcmbfazb ; /usr/bin/python3
Dec 06 08:07:40 np0005548789.localdomain sudo[39455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:40 np0005548789.localdomain python3[39465]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:40 np0005548789.localdomain sudo[39455]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:40 np0005548789.localdomain sudo[39388]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:40 np0005548789.localdomain sudo[39523]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldbddbkquhkwxareklvqnzecvzvicuzp ; /usr/bin/python3
Dec 06 08:07:40 np0005548789.localdomain sudo[39523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:40 np0005548789.localdomain python3[39525]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008460.2215257-71240-53134192445290/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=8c98a1379d65c02b867387467a21d26fe82a1c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:40 np0005548789.localdomain sudo[39523]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:41 np0005548789.localdomain sudo[39572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:07:41 np0005548789.localdomain sudo[39572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:07:41 np0005548789.localdomain sudo[39572]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:41 np0005548789.localdomain sudo[39600]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubqlcfgjeqcokasyhnqnndhhxdczgnlz ; /usr/bin/python3
Dec 06 08:07:41 np0005548789.localdomain sudo[39600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:41 np0005548789.localdomain python3[39602]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:41 np0005548789.localdomain sudo[39600]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:41 np0005548789.localdomain sudo[39643]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwllprkdcfgwffgyamcfnqbwcwfarfzl ; /usr/bin/python3
Dec 06 08:07:41 np0005548789.localdomain sudo[39643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:41 np0005548789.localdomain python3[39645]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008461.0989947-71240-191536469027122/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=2906872dac8eb33feea0b6fc0243b65109687e47 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:41 np0005548789.localdomain sudo[39643]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:42 np0005548789.localdomain sudo[39705]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phhyntfkyppiqdfsshsgrwfchywvyypw ; /usr/bin/python3
Dec 06 08:07:42 np0005548789.localdomain sudo[39705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:42 np0005548789.localdomain python3[39707]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:42 np0005548789.localdomain sudo[39705]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:42 np0005548789.localdomain sudo[39748]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkujrptkjdmpibszcxuwogexxfsblptj ; /usr/bin/python3
Dec 06 08:07:42 np0005548789.localdomain sudo[39748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:42 np0005548789.localdomain python3[39750]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008462.029945-71240-177597842225641/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=175c760950d63a47f443f25b58088dba962f090b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:42 np0005548789.localdomain sudo[39748]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:43 np0005548789.localdomain sudo[39810]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndcoydjusheayzsguktelsxcjfxaodql ; /usr/bin/python3
Dec 06 08:07:43 np0005548789.localdomain sudo[39810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:43 np0005548789.localdomain python3[39812]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:43 np0005548789.localdomain sudo[39810]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:43 np0005548789.localdomain sudo[39853]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxlvbofdwhgxwxlpllojlxexugwfkqmo ; /usr/bin/python3
Dec 06 08:07:43 np0005548789.localdomain sudo[39853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:43 np0005548789.localdomain python3[39855]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008462.997107-71240-160589447451761/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:43 np0005548789.localdomain sudo[39853]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:43 np0005548789.localdomain sudo[39915]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhvksckjjorklfjaqycvwbvjtjijutzn ; /usr/bin/python3
Dec 06 08:07:43 np0005548789.localdomain sudo[39915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:44 np0005548789.localdomain python3[39917]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:44 np0005548789.localdomain sudo[39915]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:44 np0005548789.localdomain sudo[39958]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwojqiodzcahyurimwqmgssrnfkwuasr ; /usr/bin/python3
Dec 06 08:07:44 np0005548789.localdomain sudo[39958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:44 np0005548789.localdomain python3[39960]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008463.8333857-71240-118440307322738/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=51d477f907146168895fd1905f3827c38c3a4658 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:44 np0005548789.localdomain sudo[39958]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:44 np0005548789.localdomain sudo[40020]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luswdgkqzbvxfcayuzsqdbbawhxmbexv ; /usr/bin/python3
Dec 06 08:07:44 np0005548789.localdomain sudo[40020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:44 np0005548789.localdomain python3[40022]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:45 np0005548789.localdomain sudo[40020]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:45 np0005548789.localdomain sudo[40063]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eezqgfvjrvqshgsaqyafvhkjhnbzvalr ; /usr/bin/python3
Dec 06 08:07:45 np0005548789.localdomain sudo[40063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:45 np0005548789.localdomain python3[40065]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008464.6760578-71240-156811703805453/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:45 np0005548789.localdomain sudo[40063]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:45 np0005548789.localdomain sudo[40125]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyzuvfwgrbksncqgzsqyrraikachexqe ; /usr/bin/python3
Dec 06 08:07:45 np0005548789.localdomain sudo[40125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:45 np0005548789.localdomain python3[40127]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:45 np0005548789.localdomain sudo[40125]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:46 np0005548789.localdomain sshd[40170]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:07:46 np0005548789.localdomain sudo[40168]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkakqzhjlbhxanejzahatypmuirgjppb ; /usr/bin/python3
Dec 06 08:07:46 np0005548789.localdomain sudo[40168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:46 np0005548789.localdomain python3[40172]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008465.5660124-71240-40301974843610/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=955531133cc86a259eb018c78aadbdeb821782e0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:46 np0005548789.localdomain sudo[40168]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:46 np0005548789.localdomain sudo[40232]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljyxzovqbuwmdtzjyqjpsyldpdegxgiw ; /usr/bin/python3
Dec 06 08:07:46 np0005548789.localdomain sudo[40232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:46 np0005548789.localdomain python3[40234]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:46 np0005548789.localdomain sudo[40232]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:46 np0005548789.localdomain sudo[40275]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stuynxjxfzjkxdjovppdztvuialeyihf ; /usr/bin/python3
Dec 06 08:07:46 np0005548789.localdomain sudo[40275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:47 np0005548789.localdomain python3[40277]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008466.441579-71240-198437280903914/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:47 np0005548789.localdomain sudo[40275]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:47 np0005548789.localdomain sshd[40170]: Received disconnect from 154.201.83.49 port 53648:11: Bye Bye [preauth]
Dec 06 08:07:47 np0005548789.localdomain sshd[40170]: Disconnected from authenticating user root 154.201.83.49 port 53648 [preauth]
Dec 06 08:07:47 np0005548789.localdomain sudo[40337]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flqdsigxotsuwjhhakctkmitxrwnxmvh ; /usr/bin/python3
Dec 06 08:07:47 np0005548789.localdomain sudo[40337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:47 np0005548789.localdomain python3[40339]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:47 np0005548789.localdomain sudo[40337]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:47 np0005548789.localdomain sudo[40380]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmbudlmkxkyyjxyjumfmrlkbrlkghjlk ; /usr/bin/python3
Dec 06 08:07:47 np0005548789.localdomain sudo[40380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:47 np0005548789.localdomain python3[40382]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008467.2952452-71240-196122356471459/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:48 np0005548789.localdomain sudo[40380]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:48 np0005548789.localdomain sudo[40442]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfvbpdnzsclbhzzfehkgvgqwtkgwefar ; /usr/bin/python3
Dec 06 08:07:48 np0005548789.localdomain sudo[40442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:48 np0005548789.localdomain python3[40444]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:48 np0005548789.localdomain sudo[40442]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:48 np0005548789.localdomain sudo[40485]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyxjgczrbddluxgbevldbdjwfaoqeggx ; /usr/bin/python3
Dec 06 08:07:48 np0005548789.localdomain sudo[40485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:48 np0005548789.localdomain python3[40487]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008468.1584904-71240-275681376168253/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=e22fb087c209a147f48a5b0777daca8567166409 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:48 np0005548789.localdomain sudo[40485]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:49 np0005548789.localdomain sudo[40515]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnttgkwpgeojhhvvpmfggkofyyoecpcr ; /usr/bin/python3
Dec 06 08:07:49 np0005548789.localdomain sudo[40515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:49 np0005548789.localdomain python3[40517]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:07:49 np0005548789.localdomain sudo[40515]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:50 np0005548789.localdomain sudo[40563]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acfcqqmoriarplxlaglmmdqhoyprwgib ; /usr/bin/python3
Dec 06 08:07:50 np0005548789.localdomain sudo[40563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:50 np0005548789.localdomain python3[40565]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:07:50 np0005548789.localdomain sudo[40563]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:50 np0005548789.localdomain sudo[40606]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muxgojipdsombcncrkoixondxoljluyn ; /usr/bin/python3
Dec 06 08:07:50 np0005548789.localdomain sudo[40606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:50 np0005548789.localdomain python3[40608]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008470.0010912-71883-178525614025385/source _original_basename=tmpav3x0jxx follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:07:50 np0005548789.localdomain sudo[40606]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:54 np0005548789.localdomain sudo[40636]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfqpjgkrgdfqjdvspbfacuxptycdxtox ; /usr/bin/python3
Dec 06 08:07:54 np0005548789.localdomain sudo[40636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:55 np0005548789.localdomain python3[40638]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 08:07:55 np0005548789.localdomain sudo[40636]: pam_unix(sudo:session): session closed for user root
Dec 06 08:07:55 np0005548789.localdomain sudo[40697]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azryxrlfleqkgxhnwblcjltufnbfuqud ; /usr/bin/python3
Dec 06 08:07:55 np0005548789.localdomain sudo[40697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:07:55 np0005548789.localdomain python3[40699]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:07:59 np0005548789.localdomain sudo[40697]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:00 np0005548789.localdomain sudo[40714]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwjkrqpjddklpdxjcpevadmocoqkoazz ; /usr/bin/python3
Dec 06 08:08:00 np0005548789.localdomain sudo[40714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:00 np0005548789.localdomain python3[40716]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:04 np0005548789.localdomain sudo[40714]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:05 np0005548789.localdomain sudo[40731]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpirqivuglkntelzxvldtpovmwlumohg ; /usr/bin/python3
Dec 06 08:08:05 np0005548789.localdomain sudo[40731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:05 np0005548789.localdomain python3[40733]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:05 np0005548789.localdomain sudo[40731]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:05 np0005548789.localdomain sudo[40754]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmwtrfponcypwybvsuhodwtttogcjprb ; /usr/bin/python3
Dec 06 08:08:05 np0005548789.localdomain sudo[40754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:05 np0005548789.localdomain python3[40756]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:05 np0005548789.localdomain sshd[40758]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:08:07 np0005548789.localdomain sshd[40758]: Received disconnect from 195.250.72.168 port 41004:11: Bye Bye [preauth]
Dec 06 08:08:07 np0005548789.localdomain sshd[40758]: Disconnected from authenticating user root 195.250.72.168 port 41004 [preauth]
Dec 06 08:08:09 np0005548789.localdomain sudo[40754]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:10 np0005548789.localdomain sudo[40773]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkogljelwjqhhitsekzpihpzqkkobiii ; /usr/bin/python3
Dec 06 08:08:10 np0005548789.localdomain sudo[40773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:10 np0005548789.localdomain python3[40775]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:10 np0005548789.localdomain sudo[40773]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:10 np0005548789.localdomain sudo[40796]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyhuhvwemvfnpkyabxrbtnvjgcsoghyz ; /usr/bin/python3
Dec 06 08:08:10 np0005548789.localdomain sudo[40796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:10 np0005548789.localdomain python3[40798]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:14 np0005548789.localdomain sudo[40796]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:15 np0005548789.localdomain sudo[40813]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqoxmupubkbzjhvmoojfpnnyotaxdgiy ; /usr/bin/python3
Dec 06 08:08:15 np0005548789.localdomain sudo[40813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:15 np0005548789.localdomain python3[40815]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:16 np0005548789.localdomain systemd[35904]: Starting Mark boot as successful...
Dec 06 08:08:16 np0005548789.localdomain systemd[35904]: Finished Mark boot as successful.
Dec 06 08:08:19 np0005548789.localdomain sudo[40813]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:19 np0005548789.localdomain sudo[40831]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khwsblitlvzwudrwjuweibwmgezhgviw ; /usr/bin/python3
Dec 06 08:08:19 np0005548789.localdomain sudo[40831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:19 np0005548789.localdomain python3[40833]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:19 np0005548789.localdomain sudo[40831]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:20 np0005548789.localdomain sudo[40854]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mflrqzarjqlceydhgxqxumxkzlvovfsm ; /usr/bin/python3
Dec 06 08:08:20 np0005548789.localdomain sudo[40854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:20 np0005548789.localdomain python3[40856]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:24 np0005548789.localdomain sudo[40854]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:24 np0005548789.localdomain sudo[40871]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdrwxadckepqfbdlkkfyygtkdaqugfon ; /usr/bin/python3
Dec 06 08:08:24 np0005548789.localdomain sudo[40871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:24 np0005548789.localdomain python3[40873]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:28 np0005548789.localdomain sudo[40871]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:29 np0005548789.localdomain sudo[40888]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dagnrndrqxibdqgfixasxbffryjcxzch ; /usr/bin/python3
Dec 06 08:08:29 np0005548789.localdomain sudo[40888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:29 np0005548789.localdomain python3[40890]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:29 np0005548789.localdomain sudo[40888]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:29 np0005548789.localdomain sudo[40911]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrgpobfmsimdgxmyutmospcekubjmwic ; /usr/bin/python3
Dec 06 08:08:29 np0005548789.localdomain sudo[40911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:29 np0005548789.localdomain python3[40913]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:33 np0005548789.localdomain sudo[40911]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:33 np0005548789.localdomain sudo[40928]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wemukpnbcbgyigynncxnbztyigfydndk ; /usr/bin/python3
Dec 06 08:08:33 np0005548789.localdomain sudo[40928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:34 np0005548789.localdomain python3[40930]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:38 np0005548789.localdomain sudo[40928]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:38 np0005548789.localdomain sudo[40945]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyjykrudmjkvyuhshfyhcsdvafhwaslb ; /usr/bin/python3
Dec 06 08:08:38 np0005548789.localdomain sudo[40945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:38 np0005548789.localdomain python3[40947]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:38 np0005548789.localdomain sudo[40945]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:38 np0005548789.localdomain sudo[40968]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzarjpahdmgtgiquwhnbrqtwhmhvefbl ; /usr/bin/python3
Dec 06 08:08:38 np0005548789.localdomain sudo[40968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:39 np0005548789.localdomain python3[40970]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:41 np0005548789.localdomain sudo[40972]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:08:41 np0005548789.localdomain sudo[40972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:08:41 np0005548789.localdomain sudo[40972]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:41 np0005548789.localdomain sudo[40987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:08:41 np0005548789.localdomain sudo[40987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:08:42 np0005548789.localdomain sudo[40987]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:43 np0005548789.localdomain sudo[40968]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:43 np0005548789.localdomain sudo[41046]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qexgmcxouriharbzvmcsdomtgczlozkp ; /usr/bin/python3
Dec 06 08:08:43 np0005548789.localdomain sudo[41046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:43 np0005548789.localdomain python3[41048]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:08:44 np0005548789.localdomain sudo[41050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:08:44 np0005548789.localdomain sudo[41050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:08:44 np0005548789.localdomain sudo[41050]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:47 np0005548789.localdomain sudo[41046]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:50 np0005548789.localdomain sudo[41078]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlqlzlqrpezrpkkcuhzxeqftvabmzcsv ; /usr/bin/python3
Dec 06 08:08:50 np0005548789.localdomain sudo[41078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:50 np0005548789.localdomain python3[41080]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:50 np0005548789.localdomain sudo[41078]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:50 np0005548789.localdomain sudo[41126]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umjkpnulmkciwytqgztbsnpzkhsasqzi ; /usr/bin/python3
Dec 06 08:08:50 np0005548789.localdomain sudo[41126]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:50 np0005548789.localdomain python3[41128]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:50 np0005548789.localdomain sudo[41126]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:51 np0005548789.localdomain sudo[41144]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axykiyfhjasfnewydryufzxoywamcskx ; /usr/bin/python3
Dec 06 08:08:51 np0005548789.localdomain sudo[41144]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:51 np0005548789.localdomain python3[41146]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmp_fefs1cu recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:51 np0005548789.localdomain sudo[41144]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:51 np0005548789.localdomain sudo[41174]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxokxitygeytdluzflgqzohyjodsoksd ; /usr/bin/python3
Dec 06 08:08:51 np0005548789.localdomain sudo[41174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:51 np0005548789.localdomain python3[41176]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:51 np0005548789.localdomain sudo[41174]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:52 np0005548789.localdomain sudo[41222]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkdqfzljpgusdnaetcegolwfebboscqk ; /usr/bin/python3
Dec 06 08:08:52 np0005548789.localdomain sudo[41222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:52 np0005548789.localdomain python3[41224]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:52 np0005548789.localdomain sudo[41222]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:52 np0005548789.localdomain sudo[41240]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hngrvvgdgqctyygwgnsivxfbtazkagkn ; /usr/bin/python3
Dec 06 08:08:52 np0005548789.localdomain sudo[41240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:52 np0005548789.localdomain python3[41242]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:52 np0005548789.localdomain sudo[41240]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:52 np0005548789.localdomain sudo[41302]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgdkweeqvpylahxguzqmplbrghgibmmn ; /usr/bin/python3
Dec 06 08:08:52 np0005548789.localdomain sudo[41302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:53 np0005548789.localdomain python3[41304]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:53 np0005548789.localdomain sudo[41302]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:53 np0005548789.localdomain sudo[41320]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uucixaqjadfcklcnzcxcdnortsvhwdam ; /usr/bin/python3
Dec 06 08:08:53 np0005548789.localdomain sudo[41320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:53 np0005548789.localdomain python3[41322]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:53 np0005548789.localdomain sudo[41320]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:53 np0005548789.localdomain sudo[41382]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujfweaqfyixcobdkufcuebajpbwcjtbl ; /usr/bin/python3
Dec 06 08:08:53 np0005548789.localdomain sudo[41382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:53 np0005548789.localdomain python3[41384]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:53 np0005548789.localdomain sudo[41382]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:54 np0005548789.localdomain sudo[41400]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjsoxjvslluxjvsvqkiqwdrcpgczvnqk ; /usr/bin/python3
Dec 06 08:08:54 np0005548789.localdomain sudo[41400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:54 np0005548789.localdomain python3[41402]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:54 np0005548789.localdomain sudo[41400]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:54 np0005548789.localdomain sudo[41462]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgfiibydsbzdvpubanwywkzasfpzfviu ; /usr/bin/python3
Dec 06 08:08:54 np0005548789.localdomain sudo[41462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:54 np0005548789.localdomain python3[41464]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:54 np0005548789.localdomain sudo[41462]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:54 np0005548789.localdomain sudo[41480]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnqludeocfxbgffwstoorknyznwlsycg ; /usr/bin/python3
Dec 06 08:08:54 np0005548789.localdomain sudo[41480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:55 np0005548789.localdomain python3[41482]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:55 np0005548789.localdomain sudo[41480]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:55 np0005548789.localdomain sudo[41542]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aplpqenhkvaupbeccouunnnxynpvxlvm ; /usr/bin/python3
Dec 06 08:08:55 np0005548789.localdomain sudo[41542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:55 np0005548789.localdomain python3[41544]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:55 np0005548789.localdomain sudo[41542]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:55 np0005548789.localdomain sudo[41560]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywrxffmrhcjnmnzmnqqohjfcdcqmmxte ; /usr/bin/python3
Dec 06 08:08:55 np0005548789.localdomain sudo[41560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:55 np0005548789.localdomain python3[41562]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:55 np0005548789.localdomain sudo[41560]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:56 np0005548789.localdomain sudo[41622]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwbozrzrhqjgtoiulrxgzuhxbrgzvvqk ; /usr/bin/python3
Dec 06 08:08:56 np0005548789.localdomain sudo[41622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:56 np0005548789.localdomain python3[41624]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:56 np0005548789.localdomain sudo[41622]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:56 np0005548789.localdomain sudo[41640]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lirlbvusktnxhgtmeybvpbbietarzear ; /usr/bin/python3
Dec 06 08:08:56 np0005548789.localdomain sudo[41640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:56 np0005548789.localdomain python3[41642]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:56 np0005548789.localdomain sudo[41640]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:56 np0005548789.localdomain sudo[41702]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgthirclahvfiswfhypezydtqnjcbfou ; /usr/bin/python3
Dec 06 08:08:56 np0005548789.localdomain sudo[41702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:57 np0005548789.localdomain python3[41704]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:57 np0005548789.localdomain sudo[41702]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:57 np0005548789.localdomain sudo[41720]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdqhzamtcudgvcqllwahpmxichssuneb ; /usr/bin/python3
Dec 06 08:08:57 np0005548789.localdomain sudo[41720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:57 np0005548789.localdomain python3[41722]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:57 np0005548789.localdomain sudo[41720]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:57 np0005548789.localdomain sudo[41782]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsktmuwnaovxqpuhgjoiyacqsjgnyuit ; /usr/bin/python3
Dec 06 08:08:57 np0005548789.localdomain sudo[41782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:57 np0005548789.localdomain python3[41784]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:57 np0005548789.localdomain sudo[41782]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:57 np0005548789.localdomain sudo[41800]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xavpwctkrbbwqosjrstdhadxdftrscud ; /usr/bin/python3
Dec 06 08:08:57 np0005548789.localdomain sudo[41800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:58 np0005548789.localdomain python3[41802]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:58 np0005548789.localdomain sudo[41800]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:58 np0005548789.localdomain sudo[41862]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abbyimxdvtprxrjdiuhpqvuwuwvpeonc ; /usr/bin/python3
Dec 06 08:08:58 np0005548789.localdomain sudo[41862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:58 np0005548789.localdomain python3[41864]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:58 np0005548789.localdomain sudo[41862]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:58 np0005548789.localdomain sudo[41880]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijhyfodqnnzceigvfddorudptisizqcz ; /usr/bin/python3
Dec 06 08:08:58 np0005548789.localdomain sudo[41880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:58 np0005548789.localdomain python3[41882]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:58 np0005548789.localdomain sudo[41880]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:59 np0005548789.localdomain sudo[41942]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zeetwezgzjfncrxorqnsnansfaiooltp ; /usr/bin/python3
Dec 06 08:08:59 np0005548789.localdomain sudo[41942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:59 np0005548789.localdomain python3[41944]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:08:59 np0005548789.localdomain sudo[41942]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:59 np0005548789.localdomain sudo[41960]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khyrqymnwopimeapotdfvbztkqitzasu ; /usr/bin/python3
Dec 06 08:08:59 np0005548789.localdomain sudo[41960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:08:59 np0005548789.localdomain python3[41962]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:08:59 np0005548789.localdomain sudo[41960]: pam_unix(sudo:session): session closed for user root
Dec 06 08:08:59 np0005548789.localdomain sudo[42022]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftjoxbdovkbzfricnupczdulylcfdeky ; /usr/bin/python3
Dec 06 08:08:59 np0005548789.localdomain sudo[42022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:00 np0005548789.localdomain python3[42024]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:00 np0005548789.localdomain sudo[42022]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:00 np0005548789.localdomain sudo[42040]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skqgorrewsxhthqqzmrqderhkiwagjjm ; /usr/bin/python3
Dec 06 08:09:00 np0005548789.localdomain sudo[42040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:00 np0005548789.localdomain python3[42042]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:00 np0005548789.localdomain sudo[42040]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:01 np0005548789.localdomain sudo[42070]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gewkyntkegqjychbadcnytktpajhctqf ; /usr/bin/python3
Dec 06 08:09:01 np0005548789.localdomain sudo[42070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:01 np0005548789.localdomain python3[42072]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:09:01 np0005548789.localdomain sudo[42070]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:01 np0005548789.localdomain sudo[42118]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qunuihdmudgkjhgeaxpqkctoqjpflotx ; /usr/bin/python3
Dec 06 08:09:01 np0005548789.localdomain sudo[42118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:01 np0005548789.localdomain sshd[42121]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:09:01 np0005548789.localdomain python3[42120]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:01 np0005548789.localdomain sudo[42118]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:02 np0005548789.localdomain sudo[42138]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cuesywfqzkdvsofnqwklzujpncdqulzz ; /usr/bin/python3
Dec 06 08:09:02 np0005548789.localdomain sudo[42138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:02 np0005548789.localdomain python3[42140]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmp29c5bsqq recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:02 np0005548789.localdomain sudo[42138]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:02 np0005548789.localdomain sshd[42121]: Received disconnect from 74.94.234.151 port 33952:11: Bye Bye [preauth]
Dec 06 08:09:02 np0005548789.localdomain sshd[42121]: Disconnected from authenticating user root 74.94.234.151 port 33952 [preauth]
Dec 06 08:09:04 np0005548789.localdomain sudo[42168]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdnkijobpiytceadmuehtjuygcfbybxk ; /usr/bin/python3
Dec 06 08:09:04 np0005548789.localdomain sudo[42168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:04 np0005548789.localdomain python3[42170]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:09:07 np0005548789.localdomain sudo[42168]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:09 np0005548789.localdomain sudo[42185]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjlwwjcjyxmesaixnlvgfhizkyycfrvg ; /usr/bin/python3
Dec 06 08:09:09 np0005548789.localdomain sudo[42185]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:09 np0005548789.localdomain python3[42187]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:09:09 np0005548789.localdomain sudo[42185]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:09 np0005548789.localdomain sudo[42203]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhyoyhrfqcnusvbszxuhrxwzezclavlu ; /usr/bin/python3
Dec 06 08:09:09 np0005548789.localdomain sudo[42203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:10 np0005548789.localdomain python3[42205]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:09:10 np0005548789.localdomain sudo[42203]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:10 np0005548789.localdomain sudo[42221]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okrqwjfhnxzlzhtkmhydohmsvxitzyvk ; /usr/bin/python3
Dec 06 08:09:10 np0005548789.localdomain sudo[42221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:10 np0005548789.localdomain python3[42223]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:09:10 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:09:10 np0005548789.localdomain systemd-rc-local-generator[42248]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:09:11 np0005548789.localdomain systemd-sysv-generator[42253]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:09:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:09:11 np0005548789.localdomain systemd[1]: Starting Netfilter Tables...
Dec 06 08:09:11 np0005548789.localdomain systemd[1]: Finished Netfilter Tables.
Dec 06 08:09:11 np0005548789.localdomain sudo[42221]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:11 np0005548789.localdomain sudo[42311]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzxfjinuipljgtofcvgdjwpskfvghsxx ; /usr/bin/python3
Dec 06 08:09:11 np0005548789.localdomain sudo[42311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:11 np0005548789.localdomain python3[42313]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:11 np0005548789.localdomain sudo[42311]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:12 np0005548789.localdomain sudo[42354]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxtyzzaqxdznhaypyuqslxnzzvgcfuee ; /usr/bin/python3
Dec 06 08:09:12 np0005548789.localdomain sudo[42354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:12 np0005548789.localdomain python3[42356]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008551.556916-74833-47967323570456/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:12 np0005548789.localdomain sudo[42354]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:12 np0005548789.localdomain sudo[42384]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekuudzbmoesdhenqghblqdfogttnvabt ; /usr/bin/python3
Dec 06 08:09:12 np0005548789.localdomain sudo[42384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:12 np0005548789.localdomain python3[42386]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:12 np0005548789.localdomain sudo[42384]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:12 np0005548789.localdomain sudo[42402]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njixxsnzzglmxbvqxqsjwxoosvpnjkmz ; /usr/bin/python3
Dec 06 08:09:12 np0005548789.localdomain sudo[42402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:13 np0005548789.localdomain python3[42404]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:13 np0005548789.localdomain sudo[42402]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:13 np0005548789.localdomain sudo[42451]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olqenmxejpzpigsvhphywdgqkgnedegz ; /usr/bin/python3
Dec 06 08:09:13 np0005548789.localdomain sudo[42451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:13 np0005548789.localdomain python3[42453]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:13 np0005548789.localdomain sudo[42451]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:13 np0005548789.localdomain sudo[42494]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykyzkfenywkqfwnlrnzrweknelghncnz ; /usr/bin/python3
Dec 06 08:09:13 np0005548789.localdomain sudo[42494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:14 np0005548789.localdomain python3[42496]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008553.2878768-74950-124411490590927/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:14 np0005548789.localdomain sudo[42494]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:14 np0005548789.localdomain sudo[42556]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvswfigcdmjhwtsngtlpvxnqrwzjewcf ; /usr/bin/python3
Dec 06 08:09:14 np0005548789.localdomain sudo[42556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:14 np0005548789.localdomain python3[42558]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:14 np0005548789.localdomain sudo[42556]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:14 np0005548789.localdomain sudo[42599]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqlwlcgvvadizrtonhqoveovridqyxjw ; /usr/bin/python3
Dec 06 08:09:14 np0005548789.localdomain sudo[42599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:15 np0005548789.localdomain python3[42601]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008554.224436-75009-207700804968311/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:15 np0005548789.localdomain sudo[42599]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:15 np0005548789.localdomain sudo[42661]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbsrybqzzyxzcryrzydvuvijscekdpgc ; /usr/bin/python3
Dec 06 08:09:15 np0005548789.localdomain sudo[42661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:15 np0005548789.localdomain python3[42663]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:15 np0005548789.localdomain sudo[42661]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:15 np0005548789.localdomain sudo[42704]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twmsthsiixbytocdgthccybrmpdftsvn ; /usr/bin/python3
Dec 06 08:09:15 np0005548789.localdomain sudo[42704]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:15 np0005548789.localdomain python3[42706]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008555.2181327-75072-117887151202228/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:15 np0005548789.localdomain sudo[42704]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:16 np0005548789.localdomain sudo[42766]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-howqymrfooamobujkfezjfikvonbkjik ; /usr/bin/python3
Dec 06 08:09:16 np0005548789.localdomain sudo[42766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:16 np0005548789.localdomain python3[42768]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:16 np0005548789.localdomain sudo[42766]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:16 np0005548789.localdomain sudo[42809]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inxddntfndhnuxqmbruuscfsjxoydnvd ; /usr/bin/python3
Dec 06 08:09:16 np0005548789.localdomain sudo[42809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:16 np0005548789.localdomain python3[42811]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008556.092601-75130-5901537059776/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:16 np0005548789.localdomain sudo[42809]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:17 np0005548789.localdomain sudo[42871]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqidiqpihjxnlhxypbxhezrwodfrhjbl ; /usr/bin/python3
Dec 06 08:09:17 np0005548789.localdomain sudo[42871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:17 np0005548789.localdomain python3[42873]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:17 np0005548789.localdomain sudo[42871]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:18 np0005548789.localdomain sudo[42914]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykmitcooquibowiisfguknsrkqhbgquk ; /usr/bin/python3
Dec 06 08:09:18 np0005548789.localdomain sudo[42914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:18 np0005548789.localdomain python3[42916]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008557.0158777-75214-124050043087569/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:18 np0005548789.localdomain sudo[42914]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:18 np0005548789.localdomain sudo[42944]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxmqfbxescwbnqptusqzqmimxalfvfva ; /usr/bin/python3
Dec 06 08:09:18 np0005548789.localdomain sudo[42944]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:18 np0005548789.localdomain python3[42946]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:18 np0005548789.localdomain sudo[42944]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:19 np0005548789.localdomain sudo[43009]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqbyfuvrjgoauryngkabmmyziuqkyhrk ; /usr/bin/python3
Dec 06 08:09:19 np0005548789.localdomain sudo[43009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:19 np0005548789.localdomain python3[43011]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"
                                                         include "/etc/nftables/tripleo-chains.nft"
                                                         include "/etc/nftables/tripleo-rules.nft"
                                                         include "/etc/nftables/tripleo-jumps.nft"
                                                          state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:19 np0005548789.localdomain sudo[43009]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:19 np0005548789.localdomain sudo[43026]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccsrqnaonefgpfjzylbouiykkgkzdpth ; /usr/bin/python3
Dec 06 08:09:19 np0005548789.localdomain sudo[43026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:19 np0005548789.localdomain python3[43028]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:19 np0005548789.localdomain sudo[43026]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:20 np0005548789.localdomain sudo[43043]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htcjhuzggddpavdxbsjgrlfhuhceozpi ; /usr/bin/python3
Dec 06 08:09:20 np0005548789.localdomain sudo[43043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:20 np0005548789.localdomain python3[43045]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:20 np0005548789.localdomain sudo[43043]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:20 np0005548789.localdomain sudo[43062]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmhwplirwgvfderavanfphqqnqeihqzy ; /usr/bin/python3
Dec 06 08:09:20 np0005548789.localdomain sudo[43062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:20 np0005548789.localdomain python3[43064]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:20 np0005548789.localdomain sudo[43062]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:20 np0005548789.localdomain sudo[43078]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cehteqpoiwmgngjljbiasgycktbuofcd ; /usr/bin/python3
Dec 06 08:09:20 np0005548789.localdomain sudo[43078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:20 np0005548789.localdomain python3[43080]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:21 np0005548789.localdomain sudo[43078]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:21 np0005548789.localdomain sudo[43094]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjzmvvxsbxjiqtoxewzfdiztfocvbsgy ; /usr/bin/python3
Dec 06 08:09:21 np0005548789.localdomain sudo[43094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:21 np0005548789.localdomain python3[43096]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:21 np0005548789.localdomain sudo[43094]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:21 np0005548789.localdomain sudo[43110]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgagxghkzjkvbjciwyiwovlkabkbwnij ; /usr/bin/python3
Dec 06 08:09:21 np0005548789.localdomain sudo[43110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:21 np0005548789.localdomain python3[43112]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 06 08:09:22 np0005548789.localdomain sudo[43110]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:22 np0005548789.localdomain sudo[43131]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcajxylsfmpqhqmkanfsoxhbgocwprbf ; /usr/bin/python3
Dec 06 08:09:22 np0005548789.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Dec 06 08:09:22 np0005548789.localdomain sudo[43131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:22 np0005548789.localdomain python3[43133]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 06 08:09:23 np0005548789.localdomain kernel: SELinux:  Converting 2703 SID table entries...
Dec 06 08:09:23 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 08:09:23 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 08:09:23 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 08:09:23 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 08:09:23 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 08:09:23 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 08:09:23 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 08:09:23 np0005548789.localdomain sudo[43131]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:23 np0005548789.localdomain sudo[43156]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdzjsrbijujvultpysxhuvvehdosgppx ; /usr/bin/python3
Dec 06 08:09:23 np0005548789.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 06 08:09:23 np0005548789.localdomain sudo[43156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:24 np0005548789.localdomain python3[43158]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 06 08:09:24 np0005548789.localdomain sshd[43159]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:09:24 np0005548789.localdomain kernel: SELinux:  Converting 2703 SID table entries...
Dec 06 08:09:25 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 08:09:25 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 08:09:25 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 08:09:25 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 08:09:25 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 08:09:25 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 08:09:25 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 08:09:25 np0005548789.localdomain sudo[43156]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:25 np0005548789.localdomain sudo[43179]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzkfqpbvepnvlewofkgcrnizlxdcjqcd ; /usr/bin/python3
Dec 06 08:09:25 np0005548789.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 06 08:09:25 np0005548789.localdomain sudo[43179]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:25 np0005548789.localdomain python3[43181]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 06 08:09:25 np0005548789.localdomain sshd[43159]: Received disconnect from 154.201.83.49 port 34896:11: Bye Bye [preauth]
Dec 06 08:09:25 np0005548789.localdomain sshd[43159]: Disconnected from authenticating user root 154.201.83.49 port 34896 [preauth]
Dec 06 08:09:26 np0005548789.localdomain kernel: SELinux:  Converting 2703 SID table entries...
Dec 06 08:09:26 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 08:09:26 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 08:09:26 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 08:09:26 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 08:09:26 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 08:09:26 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 08:09:26 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 08:09:26 np0005548789.localdomain sudo[43179]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:26 np0005548789.localdomain sudo[43257]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzxnrvnzlvnxgvuqvygccsezngmlldzj ; /usr/bin/python3
Dec 06 08:09:26 np0005548789.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 06 08:09:26 np0005548789.localdomain sudo[43257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:26 np0005548789.localdomain python3[43259]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:26 np0005548789.localdomain sudo[43257]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:26 np0005548789.localdomain sudo[43273]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqheephsmgktaazinycpacpjrsyesodh ; /usr/bin/python3
Dec 06 08:09:26 np0005548789.localdomain sudo[43273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:27 np0005548789.localdomain python3[43275]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:27 np0005548789.localdomain sudo[43273]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:27 np0005548789.localdomain sudo[43289]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jogaalvyjtaqmaenltwuijzlgptpozmm ; /usr/bin/python3
Dec 06 08:09:27 np0005548789.localdomain sudo[43289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:27 np0005548789.localdomain python3[43291]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:27 np0005548789.localdomain sudo[43289]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:27 np0005548789.localdomain sudo[43305]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwbxbfztlqcycjebxktebxbaumrcshlv ; /usr/bin/python3
Dec 06 08:09:27 np0005548789.localdomain sudo[43305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:27 np0005548789.localdomain python3[43307]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:09:27 np0005548789.localdomain sudo[43305]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:28 np0005548789.localdomain sudo[43321]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-witvjitjihnbfsoqgbdflwbcdjsnhgzx ; /usr/bin/python3
Dec 06 08:09:28 np0005548789.localdomain sudo[43321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:28 np0005548789.localdomain python3[43323]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:28 np0005548789.localdomain sudo[43321]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:28 np0005548789.localdomain sudo[43338]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypuubjpfjrgddvvnvukitdtbnzmrfhsb ; /usr/bin/python3
Dec 06 08:09:28 np0005548789.localdomain sudo[43338]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:29 np0005548789.localdomain python3[43340]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:09:32 np0005548789.localdomain sudo[43338]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:32 np0005548789.localdomain sudo[43355]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idkipkazhhgcxhnpvlaapglvkkkkrlia ; /usr/bin/python3
Dec 06 08:09:32 np0005548789.localdomain sudo[43355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:32 np0005548789.localdomain python3[43357]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:32 np0005548789.localdomain sudo[43355]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:33 np0005548789.localdomain sudo[43403]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tykyriajgfjsgumuswowzkssgxrdhnij ; /usr/bin/python3
Dec 06 08:09:33 np0005548789.localdomain sudo[43403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:33 np0005548789.localdomain python3[43405]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:33 np0005548789.localdomain sudo[43403]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:33 np0005548789.localdomain sudo[43446]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdwoamoybeksimgzertibndjbpzybjxq ; /usr/bin/python3
Dec 06 08:09:33 np0005548789.localdomain sudo[43446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:33 np0005548789.localdomain python3[43448]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008573.0895329-75984-238479677627678/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:33 np0005548789.localdomain sudo[43446]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:33 np0005548789.localdomain sudo[43476]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjpevhweybiaginbkgdfmwkvmbeqlndu ; /usr/bin/python3
Dec 06 08:09:33 np0005548789.localdomain sudo[43476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:34 np0005548789.localdomain python3[43478]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 08:09:34 np0005548789.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 06 08:09:34 np0005548789.localdomain systemd[1]: Stopped Load Kernel Modules.
Dec 06 08:09:34 np0005548789.localdomain systemd[1]: Stopping Load Kernel Modules...
Dec 06 08:09:34 np0005548789.localdomain systemd[1]: Starting Load Kernel Modules...
Dec 06 08:09:34 np0005548789.localdomain kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 06 08:09:34 np0005548789.localdomain kernel: Bridge firewalling registered
Dec 06 08:09:34 np0005548789.localdomain systemd-modules-load[43481]: Inserted module 'br_netfilter'
Dec 06 08:09:34 np0005548789.localdomain systemd-modules-load[43481]: Module 'msr' is built in
Dec 06 08:09:34 np0005548789.localdomain systemd[1]: Finished Load Kernel Modules.
Dec 06 08:09:34 np0005548789.localdomain sudo[43476]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:35 np0005548789.localdomain sshd[43485]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:09:35 np0005548789.localdomain sudo[43532]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkgdlsginlrxzplmmxiwhouospkpmnhz ; /usr/bin/python3
Dec 06 08:09:35 np0005548789.localdomain sudo[43532]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:35 np0005548789.localdomain python3[43534]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:35 np0005548789.localdomain sudo[43532]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:35 np0005548789.localdomain sudo[43575]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhfpfxdbkibozacniqdeuvnpgjvjircv ; /usr/bin/python3
Dec 06 08:09:35 np0005548789.localdomain sudo[43575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:36 np0005548789.localdomain python3[43577]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008575.3885028-76055-153341894485097/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:36 np0005548789.localdomain sudo[43575]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:36 np0005548789.localdomain sudo[43605]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpkfpianknddkspffdkyrywcizlwqyug ; /usr/bin/python3
Dec 06 08:09:36 np0005548789.localdomain sudo[43605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:36 np0005548789.localdomain python3[43607]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:36 np0005548789.localdomain sudo[43605]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:36 np0005548789.localdomain sudo[43622]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfpurzfzgdahwpvxirctfgkkvqkscsct ; /usr/bin/python3
Dec 06 08:09:36 np0005548789.localdomain sudo[43622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:36 np0005548789.localdomain sshd[43485]: Invalid user ftpuser from 45.140.17.124 port 43010
Dec 06 08:09:36 np0005548789.localdomain python3[43624]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:36 np0005548789.localdomain sudo[43622]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:37 np0005548789.localdomain sudo[43640]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pngvsxbjymbvdbymseexxavzskhndknm ; /usr/bin/python3
Dec 06 08:09:37 np0005548789.localdomain sudo[43640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:37 np0005548789.localdomain python3[43642]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:37 np0005548789.localdomain sudo[43640]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:37 np0005548789.localdomain sudo[43658]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyysvqztegarxwwjbrbfodflajmxysrq ; /usr/bin/python3
Dec 06 08:09:37 np0005548789.localdomain sudo[43658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:37 np0005548789.localdomain python3[43660]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:37 np0005548789.localdomain sudo[43658]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:37 np0005548789.localdomain sshd[43485]: Connection reset by invalid user ftpuser 45.140.17.124 port 43010 [preauth]
Dec 06 08:09:37 np0005548789.localdomain sudo[43675]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pabbwrdkxbjzkkvhqmvjpemqkcnkvywc ; /usr/bin/python3
Dec 06 08:09:37 np0005548789.localdomain sudo[43675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:37 np0005548789.localdomain sshd[43678]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:09:37 np0005548789.localdomain python3[43677]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:37 np0005548789.localdomain sudo[43675]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:37 np0005548789.localdomain sudo[43693]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knjllffjhxjizvgmqzbviqwuzamecufo ; /usr/bin/python3
Dec 06 08:09:38 np0005548789.localdomain sudo[43693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:38 np0005548789.localdomain python3[43695]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:38 np0005548789.localdomain sudo[43693]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:38 np0005548789.localdomain sudo[43711]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhvtewquyuufqllxioifxdslbbqsgmtl ; /usr/bin/python3
Dec 06 08:09:38 np0005548789.localdomain sudo[43711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:38 np0005548789.localdomain python3[43713]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:38 np0005548789.localdomain sudo[43711]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:38 np0005548789.localdomain sudo[43729]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tklhbtvpftzumzlcdnwfmvlmmbamergc ; /usr/bin/python3
Dec 06 08:09:38 np0005548789.localdomain sudo[43729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:38 np0005548789.localdomain python3[43731]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:38 np0005548789.localdomain sudo[43729]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:38 np0005548789.localdomain sshd[43734]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:09:38 np0005548789.localdomain sudo[43749]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjxmgxrclfnzhtnaswxlewjsubdzczya ; /usr/bin/python3
Dec 06 08:09:38 np0005548789.localdomain sudo[43749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:39 np0005548789.localdomain python3[43751]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:39 np0005548789.localdomain sudo[43749]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:39 np0005548789.localdomain sudo[43767]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfxmctkxjvgswgqsshoamktbjzbncumh ; /usr/bin/python3
Dec 06 08:09:39 np0005548789.localdomain sudo[43767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:39 np0005548789.localdomain python3[43769]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:39 np0005548789.localdomain sudo[43767]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:39 np0005548789.localdomain sudo[43785]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyhchtkybmgidvdlsxfmiyhtcyznmdjs ; /usr/bin/python3
Dec 06 08:09:39 np0005548789.localdomain sudo[43785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:39 np0005548789.localdomain python3[43787]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:39 np0005548789.localdomain sudo[43785]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:39 np0005548789.localdomain sshd[43734]: Received disconnect from 195.250.72.168 port 42470:11: Bye Bye [preauth]
Dec 06 08:09:39 np0005548789.localdomain sshd[43734]: Disconnected from authenticating user root 195.250.72.168 port 42470 [preauth]
Dec 06 08:09:39 np0005548789.localdomain sshd[43678]: Connection reset by authenticating user root 45.140.17.124 port 43018 [preauth]
Dec 06 08:09:39 np0005548789.localdomain sudo[43803]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zliflfbspowtvkjrmeetrazuyfghoccf ; /usr/bin/python3
Dec 06 08:09:39 np0005548789.localdomain sudo[43803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:39 np0005548789.localdomain sshd[43806]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:09:39 np0005548789.localdomain python3[43805]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:40 np0005548789.localdomain sudo[43803]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:40 np0005548789.localdomain sudo[43822]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdjlgupyhtktvjneqynheryfmnkwggpd ; /usr/bin/python3
Dec 06 08:09:40 np0005548789.localdomain sudo[43822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:40 np0005548789.localdomain python3[43824]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:40 np0005548789.localdomain sudo[43822]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:40 np0005548789.localdomain sudo[43841]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqcrkjxmgctzgfdequwynscbubpkuseh ; /usr/bin/python3
Dec 06 08:09:40 np0005548789.localdomain sudo[43841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:40 np0005548789.localdomain python3[43843]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:40 np0005548789.localdomain sudo[43841]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:40 np0005548789.localdomain sudo[43858]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvbzmgkenzntsryzaogyjkduwdvstlal ; /usr/bin/python3
Dec 06 08:09:40 np0005548789.localdomain sudo[43858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:40 np0005548789.localdomain python3[43860]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:40 np0005548789.localdomain sudo[43858]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:41 np0005548789.localdomain sudo[43875]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfxkusmjlmdjvcagtjhwpkbvjnuxohoj ; /usr/bin/python3
Dec 06 08:09:41 np0005548789.localdomain sudo[43875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:41 np0005548789.localdomain python3[43877]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:41 np0005548789.localdomain sudo[43875]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:41 np0005548789.localdomain sudo[43892]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svpcjxplmyabipmapnbjsxmrcvkrbezz ; /usr/bin/python3
Dec 06 08:09:41 np0005548789.localdomain sudo[43892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:41 np0005548789.localdomain python3[43894]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:41 np0005548789.localdomain sudo[43892]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:41 np0005548789.localdomain sudo[43909]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbpdsfcvwvayzjmqogmsykjrheiqjylw ; /usr/bin/python3
Dec 06 08:09:41 np0005548789.localdomain sudo[43909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:41 np0005548789.localdomain python3[43911]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 06 08:09:41 np0005548789.localdomain sudo[43909]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:41 np0005548789.localdomain sshd[43806]: Connection reset by authenticating user root 45.140.17.124 port 43022 [preauth]
Dec 06 08:09:41 np0005548789.localdomain sudo[43927]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udthdxnaipihuilwuoogcmrphvwljgnm ; /usr/bin/python3
Dec 06 08:09:41 np0005548789.localdomain sudo[43927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:42 np0005548789.localdomain sshd[43930]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:09:42 np0005548789.localdomain python3[43929]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 08:09:42 np0005548789.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 06 08:09:42 np0005548789.localdomain systemd[1]: Stopped Apply Kernel Variables.
Dec 06 08:09:42 np0005548789.localdomain systemd[1]: Stopping Apply Kernel Variables...
Dec 06 08:09:42 np0005548789.localdomain systemd[1]: Starting Apply Kernel Variables...
Dec 06 08:09:42 np0005548789.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 06 08:09:42 np0005548789.localdomain systemd[1]: Finished Apply Kernel Variables.
Dec 06 08:09:42 np0005548789.localdomain sudo[43927]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:42 np0005548789.localdomain sudo[43949]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovjkkpzkerdsrmjrrlahicnhjznyognf ; /usr/bin/python3
Dec 06 08:09:42 np0005548789.localdomain sudo[43949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:42 np0005548789.localdomain python3[43951]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:42 np0005548789.localdomain sudo[43949]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:42 np0005548789.localdomain sudo[43965]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hiboxiqqazyzlysgafjpjtshqwecmkfc ; /usr/bin/python3
Dec 06 08:09:42 np0005548789.localdomain sudo[43965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:42 np0005548789.localdomain python3[43967]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:42 np0005548789.localdomain sudo[43965]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:43 np0005548789.localdomain sudo[43981]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hiooocqjercleevoptsblhoissnkzunk ; /usr/bin/python3
Dec 06 08:09:43 np0005548789.localdomain sudo[43981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:43 np0005548789.localdomain python3[43983]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:43 np0005548789.localdomain sudo[43981]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:43 np0005548789.localdomain sudo[43997]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-naqrtznxsujobtbdcfqgyedgwwlyeyyo ; /usr/bin/python3
Dec 06 08:09:43 np0005548789.localdomain sudo[43997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:43 np0005548789.localdomain sshd[43930]: Invalid user 1 from 45.140.17.124 port 43032
Dec 06 08:09:43 np0005548789.localdomain python3[43999]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:09:43 np0005548789.localdomain sudo[43997]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:43 np0005548789.localdomain sudo[44013]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btdmwjyckvnfwpkxpbrzduoouzfqpdoc ; /usr/bin/python3
Dec 06 08:09:43 np0005548789.localdomain sudo[44013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:43 np0005548789.localdomain sshd[43930]: Connection reset by invalid user 1 45.140.17.124 port 43032 [preauth]
Dec 06 08:09:43 np0005548789.localdomain python3[44015]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:43 np0005548789.localdomain sudo[44013]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:44 np0005548789.localdomain sshd[44030]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:09:44 np0005548789.localdomain sudo[44029]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-euqovbjivprmvjnzdjhauzhsomgmoiub ; /usr/bin/python3
Dec 06 08:09:44 np0005548789.localdomain sudo[44029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:44 np0005548789.localdomain python3[44032]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:44 np0005548789.localdomain sudo[44029]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:44 np0005548789.localdomain sudo[44046]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xuujadalskwkxyhysrvqucpyaavnamwq ; /usr/bin/python3
Dec 06 08:09:44 np0005548789.localdomain sudo[44046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:44 np0005548789.localdomain python3[44049]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:44 np0005548789.localdomain sudo[44046]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:44 np0005548789.localdomain sudo[44063]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asixjaqkqwqejyadltblzczokrtgxrnt ; /usr/bin/python3
Dec 06 08:09:44 np0005548789.localdomain sudo[44063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:44 np0005548789.localdomain sudo[44066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:09:44 np0005548789.localdomain sudo[44066]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:09:44 np0005548789.localdomain sudo[44066]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:44 np0005548789.localdomain python3[44065]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:44 np0005548789.localdomain sudo[44063]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:44 np0005548789.localdomain sudo[44081]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 08:09:44 np0005548789.localdomain sudo[44081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:09:45 np0005548789.localdomain sudo[44109]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsnjdlpswnooireuipbicaylelgympfc ; /usr/bin/python3
Dec 06 08:09:45 np0005548789.localdomain sudo[44109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:45 np0005548789.localdomain python3[44111]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:45 np0005548789.localdomain sudo[44109]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:45 np0005548789.localdomain sudo[44081]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:45 np0005548789.localdomain sudo[44147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:09:45 np0005548789.localdomain sudo[44147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:09:45 np0005548789.localdomain sudo[44147]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:45 np0005548789.localdomain sudo[44181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:09:45 np0005548789.localdomain sudo[44181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:09:45 np0005548789.localdomain sudo[44208]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bebbvfcsbivixsemiceocloeikaeuksj ; /usr/bin/python3
Dec 06 08:09:45 np0005548789.localdomain sudo[44208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:45 np0005548789.localdomain sshd[44030]: Invalid user admin from 45.140.17.124 port 52520
Dec 06 08:09:45 np0005548789.localdomain python3[44211]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:45 np0005548789.localdomain sudo[44208]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:45 np0005548789.localdomain sudo[44266]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csjbfgrpnjtezurliejrbqpdopejtibn ; /usr/bin/python3
Dec 06 08:09:45 np0005548789.localdomain sudo[44266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:45 np0005548789.localdomain sshd[44030]: Connection reset by invalid user admin 45.140.17.124 port 52520 [preauth]
Dec 06 08:09:46 np0005548789.localdomain sudo[44181]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:46 np0005548789.localdomain python3[44271]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008585.3822806-76540-74399181822859/source _original_basename=tmpj09ev3nq follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:46 np0005548789.localdomain sudo[44266]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:46 np0005548789.localdomain sudo[44313]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eoiefnufbkcyyscbvaiszjsqmznvprxd ; /usr/bin/python3
Dec 06 08:09:46 np0005548789.localdomain sudo[44313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:46 np0005548789.localdomain python3[44315]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:46 np0005548789.localdomain sudo[44313]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:47 np0005548789.localdomain sudo[44336]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-giskoufkjivjeoporytdlubdhkmuxcsl ; /usr/bin/python3
Dec 06 08:09:47 np0005548789.localdomain sudo[44336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:47 np0005548789.localdomain sudo[44324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:09:47 np0005548789.localdomain sudo[44324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:09:47 np0005548789.localdomain sudo[44324]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:47 np0005548789.localdomain python3[44346]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:47 np0005548789.localdomain sudo[44336]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:48 np0005548789.localdomain sudo[44393]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqcxqszrptqimdjjwyggdmczdmnumrqo ; /usr/bin/python3
Dec 06 08:09:48 np0005548789.localdomain sudo[44393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:48 np0005548789.localdomain python3[44395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:48 np0005548789.localdomain sudo[44393]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:48 np0005548789.localdomain sudo[44436]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llanhcwxozbhreactwnzkdtkfzhdbxhk ; /usr/bin/python3
Dec 06 08:09:48 np0005548789.localdomain sudo[44436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:48 np0005548789.localdomain python3[44438]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008587.8587005-76702-98835160614507/source _original_basename=tmprnp7vuv0 follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:48 np0005548789.localdomain sudo[44436]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:48 np0005548789.localdomain sudo[44466]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npboxhqggbjxnchjuicrfyeysbdwvugt ; /usr/bin/python3
Dec 06 08:09:48 np0005548789.localdomain sudo[44466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:49 np0005548789.localdomain python3[44468]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:49 np0005548789.localdomain sudo[44466]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:49 np0005548789.localdomain sudo[44482]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjkzceqlkghqzfrfmlkqjxqduadkybhw ; /usr/bin/python3
Dec 06 08:09:49 np0005548789.localdomain sudo[44482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:49 np0005548789.localdomain python3[44484]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:49 np0005548789.localdomain sudo[44482]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:49 np0005548789.localdomain sudo[44498]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apxaqyyffbnakthegbcszizeayzgwjbd ; /usr/bin/python3
Dec 06 08:09:49 np0005548789.localdomain sudo[44498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:49 np0005548789.localdomain python3[44500]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:49 np0005548789.localdomain sudo[44498]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:49 np0005548789.localdomain sudo[44514]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rglgoreicyikmdybyyrbzbwwyoinzyxc ; /usr/bin/python3
Dec 06 08:09:49 np0005548789.localdomain sudo[44514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:50 np0005548789.localdomain python3[44516]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:50 np0005548789.localdomain sudo[44514]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:50 np0005548789.localdomain sudo[44530]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-firuozxoshzwjyhqflmctxkjxgtjsiur ; /usr/bin/python3
Dec 06 08:09:50 np0005548789.localdomain sudo[44530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:50 np0005548789.localdomain python3[44532]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:50 np0005548789.localdomain sudo[44530]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:50 np0005548789.localdomain sudo[44546]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrpwyrzyehpjvqnbokkxbwornogvezkc ; /usr/bin/python3
Dec 06 08:09:50 np0005548789.localdomain sudo[44546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:50 np0005548789.localdomain python3[44548]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:50 np0005548789.localdomain sudo[44546]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:50 np0005548789.localdomain sudo[44562]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-diclbopclmsjdtvipyhjrdsnanhpkyal ; /usr/bin/python3
Dec 06 08:09:50 np0005548789.localdomain sudo[44562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:50 np0005548789.localdomain python3[44564]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:50 np0005548789.localdomain sudo[44562]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:51 np0005548789.localdomain sudo[44578]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yatwbfdmxjrfdylchcisejgbuldgdsnu ; /usr/bin/python3
Dec 06 08:09:51 np0005548789.localdomain sudo[44578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:51 np0005548789.localdomain python3[44580]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:51 np0005548789.localdomain sudo[44578]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:51 np0005548789.localdomain sudo[44594]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxbzfxewyuqkxnhbvuqiqydnpaxmazac ; /usr/bin/python3
Dec 06 08:09:51 np0005548789.localdomain sudo[44594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:51 np0005548789.localdomain python3[44596]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:51 np0005548789.localdomain sudo[44594]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:51 np0005548789.localdomain sudo[44610]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgbaffkwdrijtmgoaklrkyhcfzcpntsr ; /usr/bin/python3
Dec 06 08:09:51 np0005548789.localdomain sudo[44610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:52 np0005548789.localdomain python3[44612]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Dec 06 08:09:52 np0005548789.localdomain groupadd[44613]: group added to /etc/group: name=qemu, GID=107
Dec 06 08:09:52 np0005548789.localdomain groupadd[44613]: group added to /etc/gshadow: name=qemu
Dec 06 08:09:52 np0005548789.localdomain groupadd[44613]: new group: name=qemu, GID=107
Dec 06 08:09:52 np0005548789.localdomain sudo[44610]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:52 np0005548789.localdomain sudo[44632]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqmrfwdqamozuqymdgvczjwwercqunhy ; /usr/bin/python3
Dec 06 08:09:52 np0005548789.localdomain sudo[44632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:52 np0005548789.localdomain python3[44634]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548789.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 06 08:09:52 np0005548789.localdomain useradd[44636]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=none
Dec 06 08:09:52 np0005548789.localdomain sudo[44632]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:52 np0005548789.localdomain sudo[44656]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgtfplzrxkkynatllhlnjmjqrmcbgefo ; /usr/bin/python3
Dec 06 08:09:52 np0005548789.localdomain sudo[44656]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:53 np0005548789.localdomain python3[44658]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Dec 06 08:09:53 np0005548789.localdomain sudo[44656]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:53 np0005548789.localdomain sudo[44672]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnicmfsfxundblutvapzkrnbchtratwf ; /usr/bin/python3
Dec 06 08:09:53 np0005548789.localdomain sudo[44672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:53 np0005548789.localdomain python3[44674]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:09:53 np0005548789.localdomain sudo[44672]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:53 np0005548789.localdomain sudo[44721]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwbgzrabhhrpzxsdzvoaarzveougypoz ; /usr/bin/python3
Dec 06 08:09:53 np0005548789.localdomain sudo[44721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:53 np0005548789.localdomain python3[44723]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:09:53 np0005548789.localdomain sudo[44721]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:54 np0005548789.localdomain sudo[44764]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxcnakbmelcwvekxchcupiphhnjzxift ; /usr/bin/python3
Dec 06 08:09:54 np0005548789.localdomain sudo[44764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:54 np0005548789.localdomain python3[44766]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008593.6286278-76979-185023022115093/source _original_basename=tmpuh2rpkk0 follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:54 np0005548789.localdomain sudo[44764]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:54 np0005548789.localdomain sudo[44794]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rotfwjlwnkveblpzqqfutufewewfrxds ; /usr/bin/python3
Dec 06 08:09:54 np0005548789.localdomain sudo[44794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:54 np0005548789.localdomain python3[44796]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 06 08:09:55 np0005548789.localdomain sudo[44794]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:55 np0005548789.localdomain sudo[44814]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zuiforbothdgnrgykbeplfhwgrdrzvdk ; /usr/bin/python3
Dec 06 08:09:55 np0005548789.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 06 08:09:55 np0005548789.localdomain sudo[44814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:55 np0005548789.localdomain python3[44816]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:09:55 np0005548789.localdomain sudo[44814]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:55 np0005548789.localdomain sudo[44830]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vuoqucrbtkvvvvgjqpizoesffgeyqoqn ; /usr/bin/python3
Dec 06 08:09:55 np0005548789.localdomain sudo[44830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:56 np0005548789.localdomain python3[44832]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:09:56 np0005548789.localdomain sudo[44830]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:56 np0005548789.localdomain sudo[44846]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssmoilmafsbzjwibygxvetxztupdsysl ; /usr/bin/python3
Dec 06 08:09:56 np0005548789.localdomain sudo[44846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:56 np0005548789.localdomain python3[44848]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Dec 06 08:09:57 np0005548789.localdomain sudo[44846]: pam_unix(sudo:session): session closed for user root
Dec 06 08:09:57 np0005548789.localdomain sudo[44866]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpebijnwnyqyibqpdclcxpdpsoljufhw ; /usr/bin/python3
Dec 06 08:09:57 np0005548789.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Dec 06 08:09:57 np0005548789.localdomain sudo[44866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:09:57 np0005548789.localdomain python3[44868]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:10:00 np0005548789.localdomain sudo[44866]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:00 np0005548789.localdomain sudo[44883]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-reklelbtiisthjjsajyfnejutmtadsyl ; /usr/bin/python3
Dec 06 08:10:00 np0005548789.localdomain sudo[44883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:00 np0005548789.localdomain python3[44885]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 08:10:01 np0005548789.localdomain sudo[44883]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:01 np0005548789.localdomain sudo[44944]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdnipiyxguiazovsucvsopcvqrghlukh ; /usr/bin/python3
Dec 06 08:10:01 np0005548789.localdomain sudo[44944]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:01 np0005548789.localdomain python3[44946]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:01 np0005548789.localdomain sudo[44944]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:01 np0005548789.localdomain sudo[44960]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfzassjgyjgwltbckmffjiaeqhniccke ; /usr/bin/python3
Dec 06 08:10:01 np0005548789.localdomain sudo[44960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:01 np0005548789.localdomain python3[44962]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:02 np0005548789.localdomain sudo[44960]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:02 np0005548789.localdomain sudo[45020]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzjwqipuzokoutnbqymxnhltwkmnybfe ; /usr/bin/python3
Dec 06 08:10:02 np0005548789.localdomain sudo[45020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:02 np0005548789.localdomain python3[45022]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:02 np0005548789.localdomain sudo[45020]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:02 np0005548789.localdomain sudo[45063]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxcrfijeqrxyhxeaeukvpfkunilevjyz ; /usr/bin/python3
Dec 06 08:10:02 np0005548789.localdomain sudo[45063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:02 np0005548789.localdomain python3[45065]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008602.1679893-77340-184528284703072/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=ac5a4647d8c1518748e8118ddace0562c0bf12f6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:03 np0005548789.localdomain sudo[45063]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:03 np0005548789.localdomain sudo[45125]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngfzyzhmsbukauoyaqvmwdgjnolvilub ; /usr/bin/python3
Dec 06 08:10:03 np0005548789.localdomain sudo[45125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:03 np0005548789.localdomain python3[45127]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:03 np0005548789.localdomain sudo[45125]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:03 np0005548789.localdomain sudo[45170]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptegoangkpgumoclqktycglrkoysomeu ; /usr/bin/python3
Dec 06 08:10:03 np0005548789.localdomain sudo[45170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:04 np0005548789.localdomain python3[45172]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008603.209031-77384-88706933372472/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:04 np0005548789.localdomain sudo[45170]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:04 np0005548789.localdomain sudo[45200]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jktidaljkgteajgafpdpnweeofpdfspi ; /usr/bin/python3
Dec 06 08:10:04 np0005548789.localdomain sudo[45200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:04 np0005548789.localdomain python3[45202]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:04 np0005548789.localdomain sudo[45200]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:04 np0005548789.localdomain sudo[45216]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ounqfggdfcrzakeygygqvowhasxrkeoi ; /usr/bin/python3
Dec 06 08:10:04 np0005548789.localdomain sudo[45216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:04 np0005548789.localdomain python3[45218]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:04 np0005548789.localdomain sudo[45216]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:04 np0005548789.localdomain sudo[45232]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uehpojqfrjifsqnyhmjnftrajrvvvpkb ; /usr/bin/python3
Dec 06 08:10:04 np0005548789.localdomain sudo[45232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:05 np0005548789.localdomain python3[45234]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:05 np0005548789.localdomain sudo[45232]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:05 np0005548789.localdomain sudo[45248]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdpdibivciguangfxqldjfpfpicxrums ; /usr/bin/python3
Dec 06 08:10:05 np0005548789.localdomain sudo[45248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:05 np0005548789.localdomain python3[45250]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:05 np0005548789.localdomain sudo[45248]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:06 np0005548789.localdomain sudo[45296]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipadlrgtjqcowchspfhdkbcxsvxgjxdf ; /usr/bin/python3
Dec 06 08:10:06 np0005548789.localdomain sudo[45296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:06 np0005548789.localdomain python3[45298]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:06 np0005548789.localdomain sudo[45296]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:06 np0005548789.localdomain sudo[45339]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-psongbbyskmnnbdnebxpxbmvyqbyuxcf ; /usr/bin/python3
Dec 06 08:10:06 np0005548789.localdomain sudo[45339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:06 np0005548789.localdomain python3[45341]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008605.8756714-77514-2091158185547/source _original_basename=tmpvbmkp_9c follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:06 np0005548789.localdomain sudo[45339]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:06 np0005548789.localdomain sudo[45369]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-diactgstxghtzdxewpitiklmprkdzslg ; /usr/bin/python3
Dec 06 08:10:06 np0005548789.localdomain sudo[45369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:06 np0005548789.localdomain python3[45371]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:06 np0005548789.localdomain sudo[45369]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:07 np0005548789.localdomain sudo[45385]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsnstillbppctcqxkplszuggwtikvknn ; /usr/bin/python3
Dec 06 08:10:07 np0005548789.localdomain sudo[45385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:07 np0005548789.localdomain python3[45387]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:07 np0005548789.localdomain sudo[45385]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:07 np0005548789.localdomain sudo[45401]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfvluofouduffxhxftodsjlsuxafvvrb ; /usr/bin/python3
Dec 06 08:10:07 np0005548789.localdomain sudo[45401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:07 np0005548789.localdomain python3[45403]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:10:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:10:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Cumulative writes: 3259 writes, 16K keys, 3259 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s
                                                          Cumulative WAL: 3259 writes, 145 syncs, 22.48 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3259 writes, 16K keys, 3259 commit groups, 1.0 writes per commit group, ingest: 14.68 MB, 0.02 MB/s
                                                          Interval WAL: 3259 writes, 145 syncs, 22.48 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b7610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b7610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b7610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 08:10:10 np0005548789.localdomain sudo[45401]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:11 np0005548789.localdomain sudo[45450]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slazldfhlzwtgtdkqrqwoghbuqtvnygt ; /usr/bin/python3
Dec 06 08:10:11 np0005548789.localdomain sudo[45450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:11 np0005548789.localdomain python3[45452]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:11 np0005548789.localdomain sudo[45450]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:11 np0005548789.localdomain sudo[45495]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxijtpmvrvahvjuqahrnmkgzytfpwnig ; /usr/bin/python3
Dec 06 08:10:11 np0005548789.localdomain sudo[45495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:11 np0005548789.localdomain python3[45497]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008611.103243-77971-262711034243748/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:11 np0005548789.localdomain sudo[45495]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:12 np0005548789.localdomain sudo[45526]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukfajtmsvzrnalyixtqvavlcjvewuyue ; /usr/bin/python3
Dec 06 08:10:12 np0005548789.localdomain sudo[45526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:12 np0005548789.localdomain python3[45528]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:10:12 np0005548789.localdomain systemd[1]: Stopping OpenSSH server daemon...
Dec 06 08:10:12 np0005548789.localdomain sshd[1130]: Received signal 15; terminating.
Dec 06 08:10:12 np0005548789.localdomain systemd[1]: sshd.service: Deactivated successfully.
Dec 06 08:10:12 np0005548789.localdomain systemd[1]: Stopped OpenSSH server daemon.
Dec 06 08:10:12 np0005548789.localdomain systemd[1]: sshd.service: Consumed 7.675s CPU time, read 2.1M from disk, written 52.0K to disk.
Dec 06 08:10:12 np0005548789.localdomain systemd[1]: Stopped target sshd-keygen.target.
Dec 06 08:10:12 np0005548789.localdomain systemd[1]: Stopping sshd-keygen.target...
Dec 06 08:10:12 np0005548789.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 08:10:12 np0005548789.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 08:10:12 np0005548789.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 08:10:12 np0005548789.localdomain systemd[1]: Reached target sshd-keygen.target.
Dec 06 08:10:12 np0005548789.localdomain systemd[1]: Starting OpenSSH server daemon...
Dec 06 08:10:12 np0005548789.localdomain sshd[45532]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:10:12 np0005548789.localdomain sshd[45532]: Server listening on 0.0.0.0 port 22.
Dec 06 08:10:12 np0005548789.localdomain sshd[45532]: Server listening on :: port 22.
Dec 06 08:10:12 np0005548789.localdomain systemd[1]: Started OpenSSH server daemon.
Dec 06 08:10:12 np0005548789.localdomain sudo[45526]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:12 np0005548789.localdomain sudo[45546]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkdcwzsnfwuuabiujnqwvkdydksnbqrn ; /usr/bin/python3
Dec 06 08:10:12 np0005548789.localdomain sudo[45546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:12 np0005548789.localdomain python3[45548]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:12 np0005548789.localdomain sudo[45546]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:10:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Cumulative writes: 3388 writes, 16K keys, 3388 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s
                                                          Cumulative WAL: 3388 writes, 198 syncs, 17.11 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3388 writes, 16K keys, 3388 commit groups, 1.0 writes per commit group, ingest: 15.28 MB, 0.03 MB/s
                                                          Interval WAL: 3388 writes, 198 syncs, 17.11 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468eb610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468eb610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.030       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.030       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.030       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468eb610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 08:10:13 np0005548789.localdomain sudo[45564]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojuphhhkvycmkoicnbmzgprjmpguabxh ; /usr/bin/python3
Dec 06 08:10:13 np0005548789.localdomain sudo[45564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:13 np0005548789.localdomain python3[45566]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:13 np0005548789.localdomain sudo[45564]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:14 np0005548789.localdomain sudo[45582]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvvbakbwpmyngmjprthodgetyhzqbonn ; /usr/bin/python3
Dec 06 08:10:14 np0005548789.localdomain sudo[45582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:14 np0005548789.localdomain python3[45584]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:10:16 np0005548789.localdomain sudo[45582]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:17 np0005548789.localdomain sudo[45631]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xurxjknfzxizwiktrqgomwleydiyolex ; /usr/bin/python3
Dec 06 08:10:17 np0005548789.localdomain sudo[45631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:17 np0005548789.localdomain python3[45633]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:17 np0005548789.localdomain sudo[45631]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:17 np0005548789.localdomain sudo[45649]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yosvkaejzbyiaawmqjtcvfkdatgoixqv ; /usr/bin/python3
Dec 06 08:10:17 np0005548789.localdomain sudo[45649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:18 np0005548789.localdomain python3[45651]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:18 np0005548789.localdomain sudo[45649]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:18 np0005548789.localdomain sudo[45679]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmxruuburvzafzpahupzendsatkcysde ; /usr/bin/python3
Dec 06 08:10:18 np0005548789.localdomain sudo[45679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:18 np0005548789.localdomain python3[45681]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:10:18 np0005548789.localdomain sudo[45679]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:19 np0005548789.localdomain sudo[45729]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecqgpfrmcxhirwhyhuvlekgipitorccq ; /usr/bin/python3
Dec 06 08:10:19 np0005548789.localdomain sudo[45729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:19 np0005548789.localdomain python3[45731]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:19 np0005548789.localdomain sudo[45729]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:19 np0005548789.localdomain sudo[45747]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcvjilxqrjsgxhbloqcphlezwekgufur ; /usr/bin/python3
Dec 06 08:10:19 np0005548789.localdomain sudo[45747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:19 np0005548789.localdomain python3[45749]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:19 np0005548789.localdomain sudo[45747]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:20 np0005548789.localdomain sudo[45777]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aznypgchiqkmhkcmrqqzxcihrejsjpdd ; /usr/bin/python3
Dec 06 08:10:20 np0005548789.localdomain sudo[45777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:20 np0005548789.localdomain python3[45779]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:10:20 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:10:20 np0005548789.localdomain systemd-sysv-generator[45808]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:10:20 np0005548789.localdomain systemd-rc-local-generator[45803]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:10:20 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:10:20 np0005548789.localdomain systemd[1]: Starting chronyd online sources service...
Dec 06 08:10:20 np0005548789.localdomain chronyc[45819]: 200 OK
Dec 06 08:10:20 np0005548789.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Dec 06 08:10:20 np0005548789.localdomain systemd[1]: Finished chronyd online sources service.
Dec 06 08:10:20 np0005548789.localdomain sudo[45777]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:21 np0005548789.localdomain sudo[45833]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgummaaeikeshshfbkgntpxogrjfnrso ; /usr/bin/python3
Dec 06 08:10:21 np0005548789.localdomain sudo[45833]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:21 np0005548789.localdomain python3[45835]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:21 np0005548789.localdomain chronyd[25988]: System clock was stepped by 0.000002 seconds
Dec 06 08:10:21 np0005548789.localdomain sudo[45833]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:21 np0005548789.localdomain sudo[45850]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnvdszziaikazhmfqgaynjeqfvibciok ; /usr/bin/python3
Dec 06 08:10:21 np0005548789.localdomain sudo[45850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:21 np0005548789.localdomain python3[45852]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:21 np0005548789.localdomain sudo[45850]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:21 np0005548789.localdomain sudo[45867]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imaobymtfghyfpwxzvliqhmdqoxgqaqd ; /usr/bin/python3
Dec 06 08:10:21 np0005548789.localdomain sudo[45867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:22 np0005548789.localdomain python3[45869]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:22 np0005548789.localdomain chronyd[25988]: System clock was stepped by 0.000017 seconds
Dec 06 08:10:22 np0005548789.localdomain sudo[45867]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:22 np0005548789.localdomain sudo[45884]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npjbdzkyyskjgmlowanllmmwgdkxudcx ; /usr/bin/python3
Dec 06 08:10:22 np0005548789.localdomain sudo[45884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:22 np0005548789.localdomain python3[45886]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:22 np0005548789.localdomain sudo[45884]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:22 np0005548789.localdomain sudo[45901]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ciryfzonaartrqwvxontmhannqtiflcu ; /usr/bin/python3
Dec 06 08:10:22 np0005548789.localdomain sudo[45901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:22 np0005548789.localdomain python3[45903]: ansible-timezone Invoked with name=UTC hwclock=None
Dec 06 08:10:22 np0005548789.localdomain systemd[1]: Starting Time & Date Service...
Dec 06 08:10:23 np0005548789.localdomain systemd[1]: Started Time & Date Service.
Dec 06 08:10:23 np0005548789.localdomain sudo[45901]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:23 np0005548789.localdomain sudo[45921]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szfqkhywggpadgyhzwwkejxuarxymphe ; /usr/bin/python3
Dec 06 08:10:23 np0005548789.localdomain sudo[45921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:24 np0005548789.localdomain python3[45923]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:24 np0005548789.localdomain sudo[45921]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:24 np0005548789.localdomain sudo[45938]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnkqsrrjszfikihkisdhvoxpffciybwf ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Dec 06 08:10:24 np0005548789.localdomain sudo[45938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:24 np0005548789.localdomain python3[45940]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:24 np0005548789.localdomain sudo[45938]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:24 np0005548789.localdomain sudo[45955]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yikexnmkvdvdqoxevjrpegktombdjxvs ; /usr/bin/python3
Dec 06 08:10:24 np0005548789.localdomain sudo[45955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:25 np0005548789.localdomain python3[45957]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Dec 06 08:10:25 np0005548789.localdomain sudo[45955]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:25 np0005548789.localdomain sudo[45971]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fikdxsteeptgpzlsvnwzlhzdzcincgbc ; /usr/bin/python3
Dec 06 08:10:25 np0005548789.localdomain sudo[45971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:25 np0005548789.localdomain python3[45973]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:10:25 np0005548789.localdomain sudo[45971]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:26 np0005548789.localdomain sudo[45987]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkmgmvifkishuuzucbialdjrvvghjghy ; /usr/bin/python3
Dec 06 08:10:26 np0005548789.localdomain sudo[45987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:26 np0005548789.localdomain python3[45989]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:26 np0005548789.localdomain sudo[45987]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:26 np0005548789.localdomain sudo[46003]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juygfabqohbnimhvoqbjkszhrdrgcgmw ; /usr/bin/python3
Dec 06 08:10:26 np0005548789.localdomain sudo[46003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:26 np0005548789.localdomain python3[46005]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:26 np0005548789.localdomain sudo[46003]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:26 np0005548789.localdomain sudo[46051]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhspkadydekknfobbefqskbxrxlaktbd ; /usr/bin/python3
Dec 06 08:10:26 np0005548789.localdomain sudo[46051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:26 np0005548789.localdomain python3[46053]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:26 np0005548789.localdomain sudo[46051]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:27 np0005548789.localdomain sudo[46094]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khdcxqythyoyvsvhxtnzvharkkoowtvv ; /usr/bin/python3
Dec 06 08:10:27 np0005548789.localdomain sudo[46094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:27 np0005548789.localdomain python3[46096]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008626.6419804-78802-199585271786632/source _original_basename=tmpalzo4kk7 follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:27 np0005548789.localdomain sudo[46094]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:27 np0005548789.localdomain sudo[46156]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uugvwfsaqariqwnfiatlmwovehibirkw ; /usr/bin/python3
Dec 06 08:10:27 np0005548789.localdomain sudo[46156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:27 np0005548789.localdomain python3[46158]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:27 np0005548789.localdomain sudo[46156]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:27 np0005548789.localdomain sudo[46199]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jyovwbimdveoyehpzdjiykmxrxxviwua ; /usr/bin/python3
Dec 06 08:10:28 np0005548789.localdomain sudo[46199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:28 np0005548789.localdomain python3[46201]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008627.4954515-78852-90190994906431/source _original_basename=tmpvy089fds follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:28 np0005548789.localdomain sudo[46199]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:28 np0005548789.localdomain sudo[46229]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzdwbdtdfgoaboemzafqchhjnrtvngxc ; /usr/bin/python3
Dec 06 08:10:28 np0005548789.localdomain sudo[46229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:28 np0005548789.localdomain python3[46231]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 08:10:28 np0005548789.localdomain sshd[46233]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:10:28 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:10:28 np0005548789.localdomain systemd-rc-local-generator[46259]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:10:28 np0005548789.localdomain systemd-sysv-generator[46262]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:10:28 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:10:28 np0005548789.localdomain sudo[46229]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:29 np0005548789.localdomain sudo[46284]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyyoaihaibzdrswheiedvlnhvqqdtxxx ; /usr/bin/python3
Dec 06 08:10:29 np0005548789.localdomain sudo[46284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:29 np0005548789.localdomain python3[46286]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:29 np0005548789.localdomain sudo[46284]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:29 np0005548789.localdomain sshd[46233]: Received disconnect from 74.94.234.151 port 60594:11: Bye Bye [preauth]
Dec 06 08:10:29 np0005548789.localdomain sshd[46233]: Disconnected from authenticating user root 74.94.234.151 port 60594 [preauth]
Dec 06 08:10:29 np0005548789.localdomain sudo[46300]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvdozzyosmeuigposbijzvyoryechroe ; /usr/bin/python3
Dec 06 08:10:29 np0005548789.localdomain sudo[46300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:29 np0005548789.localdomain python3[46302]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:29 np0005548789.localdomain sudo[46300]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:29 np0005548789.localdomain sudo[46317]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uezjlwhyzmsbwnvwvdzgzentsvcpvauj ; /usr/bin/python3
Dec 06 08:10:29 np0005548789.localdomain sudo[46317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:30 np0005548789.localdomain python3[46319]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:10:30 np0005548789.localdomain systemd[1]: run-netns-ns_temp.mount: Deactivated successfully.
Dec 06 08:10:30 np0005548789.localdomain sudo[46317]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:30 np0005548789.localdomain sudo[46334]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxajtuxchbnodbwpswduvtnvkiqgfxvh ; /usr/bin/python3
Dec 06 08:10:30 np0005548789.localdomain sudo[46334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:30 np0005548789.localdomain python3[46336]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:10:30 np0005548789.localdomain sudo[46334]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:30 np0005548789.localdomain sudo[46350]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkktyftvofpehqjuajwhyvqngfucdzkz ; /usr/bin/python3
Dec 06 08:10:30 np0005548789.localdomain sudo[46350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:30 np0005548789.localdomain python3[46352]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:30 np0005548789.localdomain sudo[46350]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:31 np0005548789.localdomain sudo[46398]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puguopqrlfrjnamvvakgazlfqcnpgioh ; /usr/bin/python3
Dec 06 08:10:31 np0005548789.localdomain sudo[46398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:31 np0005548789.localdomain python3[46400]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:10:31 np0005548789.localdomain sudo[46398]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:31 np0005548789.localdomain sudo[46441]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilfguritknlfqkovgtgenpjwsljgqkfn ; /usr/bin/python3
Dec 06 08:10:31 np0005548789.localdomain sudo[46441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:31 np0005548789.localdomain python3[46443]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008630.9105864-79105-215058964916488/source _original_basename=tmp8gldg326 follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:31 np0005548789.localdomain sudo[46441]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:47 np0005548789.localdomain sudo[46458]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:10:47 np0005548789.localdomain sudo[46458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:10:47 np0005548789.localdomain sudo[46458]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:47 np0005548789.localdomain sudo[46473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:10:47 np0005548789.localdomain sudo[46473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:10:48 np0005548789.localdomain sudo[46473]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:49 np0005548789.localdomain sudo[46520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:10:49 np0005548789.localdomain sudo[46520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:10:49 np0005548789.localdomain sudo[46520]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:53 np0005548789.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 06 08:10:57 np0005548789.localdomain sudo[46550]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sywrpwrqeyjxmbtpbswzyhodwwrjwwny ; /usr/bin/python3
Dec 06 08:10:57 np0005548789.localdomain sudo[46550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:57 np0005548789.localdomain python3[46552]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:10:57 np0005548789.localdomain sudo[46550]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:57 np0005548789.localdomain sudo[46566]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xblmzduxpcntzaebvzlrslmmssdysvza ; /usr/bin/python3
Dec 06 08:10:57 np0005548789.localdomain sudo[46566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:58 np0005548789.localdomain python3[46568]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None
Dec 06 08:10:58 np0005548789.localdomain sudo[46566]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:58 np0005548789.localdomain sudo[46582]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjjsimuvfsywigandkhyvklsmonhtcbs ; /usr/bin/python3
Dec 06 08:10:58 np0005548789.localdomain sudo[46582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:58 np0005548789.localdomain python3[46584]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:10:58 np0005548789.localdomain sudo[46582]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:58 np0005548789.localdomain sudo[46598]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqmizeufpsxrgdimzjogwqryjaaqjwit ; /usr/bin/python3
Dec 06 08:10:58 np0005548789.localdomain sudo[46598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:58 np0005548789.localdomain python3[46600]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:58 np0005548789.localdomain sudo[46598]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:58 np0005548789.localdomain sudo[46614]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qselrnlozdeqsgkebxuxisxylxpaakrd ; /usr/bin/python3
Dec 06 08:10:58 np0005548789.localdomain sudo[46614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:59 np0005548789.localdomain python3[46616]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:10:59 np0005548789.localdomain sudo[46614]: pam_unix(sudo:session): session closed for user root
Dec 06 08:10:59 np0005548789.localdomain sudo[46630]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zeyxjahzbhdphkdqljngbwzwaongnxil ; /usr/bin/python3
Dec 06 08:10:59 np0005548789.localdomain sudo[46630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:10:59 np0005548789.localdomain python3[46632]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 06 08:11:00 np0005548789.localdomain kernel: SELinux:  Converting 2706 SID table entries...
Dec 06 08:11:00 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 08:11:00 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 08:11:00 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 08:11:00 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 08:11:00 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 08:11:00 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 08:11:00 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 08:11:00 np0005548789.localdomain sudo[46630]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:00 np0005548789.localdomain sudo[46651]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpxauxihhupdvwcwozxaoakqjzyevmba ; /usr/bin/python3
Dec 06 08:11:00 np0005548789.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 06 08:11:00 np0005548789.localdomain sudo[46651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:00 np0005548789.localdomain python3[46653]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:11:00 np0005548789.localdomain sudo[46651]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:01 np0005548789.localdomain sudo[46667]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyuskvvjnamgksrphozbqjggsxsmxdjp ; /usr/bin/python3
Dec 06 08:11:01 np0005548789.localdomain sudo[46667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:01 np0005548789.localdomain sudo[46667]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:01 np0005548789.localdomain sudo[46715]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yakkbcuqetjhoszdxslifpncelkutngj ; /usr/bin/python3
Dec 06 08:11:01 np0005548789.localdomain sudo[46715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:01 np0005548789.localdomain sudo[46715]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:02 np0005548789.localdomain sudo[46758]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqmvxqqthhaenfposgdqmkslxglvpvov ; /usr/bin/python3
Dec 06 08:11:02 np0005548789.localdomain sudo[46758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:02 np0005548789.localdomain sudo[46758]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:02 np0005548789.localdomain sshd[46775]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:11:02 np0005548789.localdomain sudo[46790]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqtulnuqztqhmruhtgdussjfldboipmx ; /usr/bin/python3
Dec 06 08:11:02 np0005548789.localdomain sudo[46790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:03 np0005548789.localdomain python3[46792]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 
'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, 'nova_virtnodedevd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 
2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtproxyd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtqemud': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, 'nova_virtsecretd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtstoraged': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, 'rsyslog': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}}, 'step_4': {'ceilometer_agent_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'ceilometer_agent_ipmi': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'configure_cms_options': {'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, 'logrotate_crond': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 'nova_libvirt_init_secret': {'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, 'nova_migration_target': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 'ovn_controller': 
{'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 'ovn_metadata_agent': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 'setup_ovs_manager': {'command': 
['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}}, 'step_5': {'nova_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 'nova_wait_for_compute_service': {'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}}}
Dec 06 08:11:03 np0005548789.localdomain sudo[46790]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:03 np0005548789.localdomain rsyslogd[760]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ]
Dec 06 08:11:03 np0005548789.localdomain sudo[46806]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ietrgiwpjqnxiwmgbcdwyacatvtcnvxf ; /usr/bin/python3
Dec 06 08:11:03 np0005548789.localdomain sudo[46806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:03 np0005548789.localdomain python3[46808]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:11:03 np0005548789.localdomain sudo[46806]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:03 np0005548789.localdomain sudo[46822]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xfkzkxzllwszxbggoqehurphrqpwnpsz ; /usr/bin/python3
Dec 06 08:11:03 np0005548789.localdomain sudo[46822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:04 np0005548789.localdomain python3[46824]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:11:04 np0005548789.localdomain sudo[46822]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:04 np0005548789.localdomain sudo[46838]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jletfdghcigdpdlxxbyixdtkddrzeihy ; /usr/bin/python3
Dec 06 08:11:04 np0005548789.localdomain sudo[46838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:04 np0005548789.localdomain sshd[46775]: Received disconnect from 154.201.83.49 port 52950:11: Bye Bye [preauth]
Dec 06 08:11:04 np0005548789.localdomain sshd[46775]: Disconnected from authenticating user root 154.201.83.49 port 52950 [preauth]
Dec 06 08:11:04 np0005548789.localdomain python3[46840]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': 
True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 
'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': 
[{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}}
Dec 06 08:11:04 np0005548789.localdomain sudo[46838]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:09 np0005548789.localdomain sudo[46886]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhrxxzgahwizhmabinmhozathjcrysid ; /usr/bin/python3
Dec 06 08:11:09 np0005548789.localdomain sudo[46886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:09 np0005548789.localdomain python3[46888]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:11:09 np0005548789.localdomain sudo[46886]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:10 np0005548789.localdomain sudo[46929]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgyqfukcbywazjraabnczadwmquqvykb ; /usr/bin/python3
Dec 06 08:11:10 np0005548789.localdomain sudo[46929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:10 np0005548789.localdomain python3[46931]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008669.4534256-80758-148523531882988/source _original_basename=tmptet8qcjo follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:11:10 np0005548789.localdomain sudo[46929]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:10 np0005548789.localdomain sudo[46959]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlzrvedeigotnwvksfwvcceiwkpgpmqa ; /usr/bin/python3
Dec 06 08:11:10 np0005548789.localdomain sudo[46959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:10 np0005548789.localdomain python3[46961]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:11:10 np0005548789.localdomain sudo[46959]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:11 np0005548789.localdomain sudo[47009]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjivtxsprjiuicwolqrjuottachkhbtb ; /usr/bin/python3
Dec 06 08:11:11 np0005548789.localdomain sudo[47009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:11 np0005548789.localdomain sudo[47009]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:11 np0005548789.localdomain sudo[47052]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgjkqwwxfxtqpcbqcfhdfzsebtcqrsoi ; /usr/bin/python3
Dec 06 08:11:11 np0005548789.localdomain sudo[47052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:12 np0005548789.localdomain sudo[47052]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:12 np0005548789.localdomain sudo[47082]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtinwfaktumefooighiruqirgrzrqbue ; /usr/bin/python3
Dec 06 08:11:12 np0005548789.localdomain sudo[47082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:12 np0005548789.localdomain python3[47084]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:11:12 np0005548789.localdomain sudo[47082]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:13 np0005548789.localdomain sudo[47130]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgpekotidbnwegakacnxwbptkdnwfwfy ; /usr/bin/python3
Dec 06 08:11:13 np0005548789.localdomain sudo[47130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:13 np0005548789.localdomain sudo[47130]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:13 np0005548789.localdomain sshd[47159]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:11:13 np0005548789.localdomain sudo[47175]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdwmxzmldyfzosmomqfknjllahzncfcb ; /usr/bin/python3
Dec 06 08:11:13 np0005548789.localdomain sudo[47175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:13 np0005548789.localdomain sudo[47175]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:14 np0005548789.localdomain sudo[47205]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opojqaainxjgodzzipyflluaufmdyumk ; /usr/bin/python3
Dec 06 08:11:14 np0005548789.localdomain sudo[47205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:14 np0005548789.localdomain python3[47207]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 06 08:11:14 np0005548789.localdomain sudo[47205]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:14 np0005548789.localdomain sshd[47159]: Received disconnect from 195.250.72.168 port 34730:11: Bye Bye [preauth]
Dec 06 08:11:14 np0005548789.localdomain sshd[47159]: Disconnected from authenticating user root 195.250.72.168 port 34730 [preauth]
Dec 06 08:11:16 np0005548789.localdomain systemd[35904]: Created slice User Background Tasks Slice.
Dec 06 08:11:16 np0005548789.localdomain systemd[35904]: Starting Cleanup of User's Temporary Files and Directories...
Dec 06 08:11:16 np0005548789.localdomain systemd[35904]: Finished Cleanup of User's Temporary Files and Directories.
Dec 06 08:11:17 np0005548789.localdomain sudo[47222]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arvjmcgwypspprpdcchlqkeqosygxfca ; /usr/bin/python3
Dec 06 08:11:17 np0005548789.localdomain sudo[47222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:17 np0005548789.localdomain python3[47224]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:11:17 np0005548789.localdomain sudo[47222]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:18 np0005548789.localdomain sudo[47239]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjlouaoocezxpfjddnqxabxbmtchharc ; /usr/bin/python3
Dec 06 08:11:18 np0005548789.localdomain sudo[47239]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:18 np0005548789.localdomain python3[47241]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:11:22 np0005548789.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Dec 06 08:11:22 np0005548789.localdomain dbus-broker-launch[18452]: Noticed file-system modification, trigger reload.
Dec 06 08:11:22 np0005548789.localdomain dbus-broker-launch[18452]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 06 08:11:22 np0005548789.localdomain dbus-broker-launch[18452]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 06 08:11:22 np0005548789.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Dec 06 08:11:22 np0005548789.localdomain systemd[1]: Reexecuting.
Dec 06 08:11:22 np0005548789.localdomain systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 06 08:11:22 np0005548789.localdomain systemd[1]: Detected virtualization kvm.
Dec 06 08:11:22 np0005548789.localdomain systemd[1]: Detected architecture x86-64.
Dec 06 08:11:22 np0005548789.localdomain systemd-rc-local-generator[47294]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:11:22 np0005548789.localdomain systemd-sysv-generator[47300]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:11:22 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:11:31 np0005548789.localdomain kernel: SELinux:  Converting 2706 SID table entries...
Dec 06 08:11:31 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 08:11:31 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 08:11:31 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 08:11:31 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 08:11:31 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 08:11:31 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 08:11:31 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 08:11:31 np0005548789.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Dec 06 08:11:31 np0005548789.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 06 08:11:31 np0005548789.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:11:32 np0005548789.localdomain systemd-rc-local-generator[47439]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:11:32 np0005548789.localdomain systemd-sysv-generator[47443]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 08:11:32 np0005548789.localdomain systemd-journald[619]: Journal stopped
Dec 06 08:11:32 np0005548789.localdomain systemd-journald[619]: Received SIGTERM from PID 1 (systemd).
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: Stopping Journal Service...
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: Stopped Journal Service.
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: systemd-journald.service: Consumed 1.903s CPU time.
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: Starting Journal Service...
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: systemd-udevd.service: Consumed 3.044s CPU time.
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 06 08:11:32 np0005548789.localdomain systemd-journald[47810]: Journal started
Dec 06 08:11:32 np0005548789.localdomain systemd-journald[47810]: Runtime Journal (/run/log/journal/4b30904fc4748c16d0c72dbebcabab49) is 12.4M, max 314.7M, 302.3M free.
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: Started Journal Service.
Dec 06 08:11:32 np0005548789.localdomain systemd-journald[47810]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Dec 06 08:11:32 np0005548789.localdomain systemd-journald[47810]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 08:11:32 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 08:11:32 np0005548789.localdomain systemd-udevd[47819]: Using default interface naming scheme 'rhel-9.0'.
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 06 08:11:32 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 08:11:32 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:11:32 np0005548789.localdomain systemd-rc-local-generator[48356]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:11:32 np0005548789.localdomain systemd-sysv-generator[48359]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:11:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:11:33 np0005548789.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 08:11:33 np0005548789.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 08:11:33 np0005548789.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 08:11:33 np0005548789.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.288s CPU time.
Dec 06 08:11:33 np0005548789.localdomain systemd[1]: run-rc341a2c620854cb2a706901147bc79b1.service: Deactivated successfully.
Dec 06 08:11:33 np0005548789.localdomain systemd[1]: run-r691c2e9d44df4f228f6875fd4b471bbc.service: Deactivated successfully.
Dec 06 08:11:34 np0005548789.localdomain sudo[47239]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:34 np0005548789.localdomain sudo[48732]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmjgqyednrzbrvhznkiakyqiqtvpcouc ; /usr/bin/python3
Dec 06 08:11:34 np0005548789.localdomain sudo[48732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:35 np0005548789.localdomain python3[48734]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Dec 06 08:11:35 np0005548789.localdomain sudo[48732]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:35 np0005548789.localdomain sudo[48751]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwjfjjcqfiwjjpynkpwdjwdzoyndwtrd ; /usr/bin/python3
Dec 06 08:11:35 np0005548789.localdomain sudo[48751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:35 np0005548789.localdomain python3[48753]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:11:35 np0005548789.localdomain sudo[48751]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:36 np0005548789.localdomain sudo[48769]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xggtjxgbefiwyrntmctjwjthofxjhapf ; /usr/bin/python3
Dec 06 08:11:36 np0005548789.localdomain sudo[48769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:36 np0005548789.localdomain python3[48771]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:11:36 np0005548789.localdomain python3[48771]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Dec 06 08:11:36 np0005548789.localdomain python3[48771]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Dec 06 08:11:43 np0005548789.localdomain podman[48783]: 2025-12-06 08:11:36.723103586 +0000 UTC m=+0.045978759 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 06 08:11:43 np0005548789.localdomain python3[48771]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect bac901955dcf7a32a493c6ef724c092009bbc18467858aa8c55e916b8c2b2b8f --format json
Dec 06 08:11:43 np0005548789.localdomain sudo[48769]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:43 np0005548789.localdomain sudo[48882]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfshthfpyqgofdbxetfetngwjhdghziq ; /usr/bin/python3
Dec 06 08:11:43 np0005548789.localdomain sudo[48882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:44 np0005548789.localdomain python3[48884]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:11:44 np0005548789.localdomain python3[48884]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Dec 06 08:11:44 np0005548789.localdomain python3[48884]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Dec 06 08:11:49 np0005548789.localdomain sshd[48959]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:11:50 np0005548789.localdomain sudo[48961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:11:50 np0005548789.localdomain sudo[48961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:11:50 np0005548789.localdomain sudo[48961]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:50 np0005548789.localdomain sshd[48959]: Received disconnect from 74.94.234.151 port 59014:11: Bye Bye [preauth]
Dec 06 08:11:50 np0005548789.localdomain sshd[48959]: Disconnected from authenticating user root 74.94.234.151 port 59014 [preauth]
Dec 06 08:11:50 np0005548789.localdomain sudo[48976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:11:50 np0005548789.localdomain sudo[48976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:11:51 np0005548789.localdomain podman[48896]: 2025-12-06 08:11:44.178582322 +0000 UTC m=+0.034199437 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 06 08:11:51 np0005548789.localdomain python3[48884]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 44feaf8d87c1d40487578230316b622680576d805efdb45dfeea6aad464b41f1 --format json
Dec 06 08:11:51 np0005548789.localdomain sudo[48882]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:51 np0005548789.localdomain sudo[49084]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-socjchitfrylnmtisneaxweafpdijcnx ; /usr/bin/python3
Dec 06 08:11:51 np0005548789.localdomain sudo[49084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:11:51 np0005548789.localdomain podman[49099]: 2025-12-06 08:11:51.496384675 +0000 UTC m=+0.090054217 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, name=rhceph, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, vcs-type=git, version=7, ceph=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_BRANCH=main)
Dec 06 08:11:51 np0005548789.localdomain python3[49097]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:11:51 np0005548789.localdomain python3[49097]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Dec 06 08:11:51 np0005548789.localdomain python3[49097]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Dec 06 08:11:51 np0005548789.localdomain podman[49099]: 2025-12-06 08:11:51.607282757 +0000 UTC m=+0.200952349 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=1763362218, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph)
Dec 06 08:11:51 np0005548789.localdomain sudo[48976]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:51 np0005548789.localdomain sudo[49188]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:11:51 np0005548789.localdomain sudo[49188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:11:51 np0005548789.localdomain sudo[49188]: pam_unix(sudo:session): session closed for user root
Dec 06 08:11:51 np0005548789.localdomain sudo[49203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:11:51 np0005548789.localdomain sudo[49203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:11:52 np0005548789.localdomain sudo[49203]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:00 np0005548789.localdomain sudo[49408]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:12:00 np0005548789.localdomain sudo[49408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:12:00 np0005548789.localdomain sudo[49408]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:07 np0005548789.localdomain podman[49129]: 2025-12-06 08:11:51.642183605 +0000 UTC m=+0.047915470 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:12:07 np0005548789.localdomain python3[49097]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 3a088c12511c977065fdc5f1594cba7b1a79f163578a6ffd0ac4a475b8e67938 --format json
Dec 06 08:12:07 np0005548789.localdomain sudo[49084]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:08 np0005548789.localdomain sudo[49649]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxatfldqhtyoebpbqvpajiggzbdvgzzk ; /usr/bin/python3
Dec 06 08:12:08 np0005548789.localdomain sudo[49649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:08 np0005548789.localdomain python3[49651]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:08 np0005548789.localdomain python3[49651]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Dec 06 08:12:08 np0005548789.localdomain python3[49651]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Dec 06 08:12:21 np0005548789.localdomain podman[49664]: 2025-12-06 08:12:08.424226229 +0000 UTC m=+0.042978435 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:12:21 np0005548789.localdomain python3[49651]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 514d439186251360cf734cbc6d4a44c834664891872edf3798a653dfaacf10c0 --format json
Dec 06 08:12:21 np0005548789.localdomain sudo[49649]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:22 np0005548789.localdomain sudo[49765]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzciucuwbebsfkdhjysypqqyrlbvodem ; /usr/bin/python3
Dec 06 08:12:22 np0005548789.localdomain sudo[49765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:22 np0005548789.localdomain python3[49767]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:22 np0005548789.localdomain python3[49767]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Dec 06 08:12:22 np0005548789.localdomain python3[49767]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Dec 06 08:12:29 np0005548789.localdomain podman[49780]: 2025-12-06 08:12:22.399588726 +0000 UTC m=+0.044637355 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Dec 06 08:12:29 np0005548789.localdomain python3[49767]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect a9dd7a2ac6f35cb086249f87f74e2f8e74e7e2ad5141ce2228263be6faedce26 --format json
Dec 06 08:12:29 np0005548789.localdomain sudo[49765]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:30 np0005548789.localdomain sudo[49915]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtauaxwrvjmqpjnexahdmudkjgpejglu ; /usr/bin/python3
Dec 06 08:12:30 np0005548789.localdomain sudo[49915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:30 np0005548789.localdomain python3[49917]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:30 np0005548789.localdomain python3[49917]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Dec 06 08:12:30 np0005548789.localdomain python3[49917]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Dec 06 08:12:34 np0005548789.localdomain podman[49931]: 2025-12-06 08:12:30.324909617 +0000 UTC m=+0.035824519 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 06 08:12:34 np0005548789.localdomain python3[49917]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 24976907b2c2553304119aba5731a800204d664feed24ca9eb7f2b4c7d81016b --format json
Dec 06 08:12:34 np0005548789.localdomain sudo[49915]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:35 np0005548789.localdomain sudo[50007]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emvqwypwjvirftflqkjrsxvkhxpwrhgv ; /usr/bin/python3
Dec 06 08:12:35 np0005548789.localdomain sudo[50007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:35 np0005548789.localdomain python3[50009]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:35 np0005548789.localdomain python3[50009]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json
Dec 06 08:12:35 np0005548789.localdomain python3[50009]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false
Dec 06 08:12:37 np0005548789.localdomain podman[50022]: 2025-12-06 08:12:35.328120073 +0000 UTC m=+0.043086986 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 06 08:12:37 np0005548789.localdomain python3[50009]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 57163a7b21fdbb804a27897cb6e6052a5e5c7a339c45d663e80b52375a760dcf --format json
Dec 06 08:12:37 np0005548789.localdomain sudo[50007]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:37 np0005548789.localdomain sudo[50096]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vediiuncpcwafwhjjryyjtthhyqhnxlr ; /usr/bin/python3
Dec 06 08:12:37 np0005548789.localdomain sudo[50096]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:38 np0005548789.localdomain python3[50098]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:38 np0005548789.localdomain python3[50098]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json
Dec 06 08:12:38 np0005548789.localdomain python3[50098]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false
Dec 06 08:12:40 np0005548789.localdomain podman[50111]: 2025-12-06 08:12:38.13721956 +0000 UTC m=+0.030071113 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 06 08:12:40 np0005548789.localdomain python3[50098]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 076d82a27d63c8328729ed27ceb4291585ae18d017befe6fe353df7aa11715ae --format json
Dec 06 08:12:40 np0005548789.localdomain sudo[50096]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:40 np0005548789.localdomain sudo[50188]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqqnxzpioqpuohpjmtftccgtkroaodfn ; /usr/bin/python3
Dec 06 08:12:40 np0005548789.localdomain sudo[50188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:40 np0005548789.localdomain python3[50190]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:40 np0005548789.localdomain python3[50190]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json
Dec 06 08:12:40 np0005548789.localdomain python3[50190]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false
Dec 06 08:12:41 np0005548789.localdomain sshd[50212]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:12:42 np0005548789.localdomain sshd[50212]: Received disconnect from 154.201.83.49 port 57238:11: Bye Bye [preauth]
Dec 06 08:12:42 np0005548789.localdomain sshd[50212]: Disconnected from authenticating user root 154.201.83.49 port 57238 [preauth]
Dec 06 08:12:43 np0005548789.localdomain podman[50202]: 2025-12-06 08:12:41.019283458 +0000 UTC m=+0.042646793 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Dec 06 08:12:43 np0005548789.localdomain python3[50190]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d0dbcb95546840a8d088df044347a7877ad5ea45a2ddba0578e9bb5de4ab0da5 --format json
Dec 06 08:12:43 np0005548789.localdomain sudo[50188]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:43 np0005548789.localdomain sudo[50278]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgxexggodwnmoygkinrcunoipbjgcsic ; /usr/bin/python3
Dec 06 08:12:43 np0005548789.localdomain sudo[50278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:44 np0005548789.localdomain python3[50280]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:44 np0005548789.localdomain python3[50280]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json
Dec 06 08:12:44 np0005548789.localdomain python3[50280]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false
Dec 06 08:12:44 np0005548789.localdomain sshd[50306]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:12:45 np0005548789.localdomain sshd[50306]: Received disconnect from 195.250.72.168 port 48618:11: Bye Bye [preauth]
Dec 06 08:12:45 np0005548789.localdomain sshd[50306]: Disconnected from authenticating user root 195.250.72.168 port 48618 [preauth]
Dec 06 08:12:47 np0005548789.localdomain podman[50293]: 2025-12-06 08:12:44.111993586 +0000 UTC m=+0.045438840 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 06 08:12:47 np0005548789.localdomain python3[50280]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect e6e981540e553415b2d6eda490d7683db07164af2e7a0af8245623900338a4d6 --format json
Dec 06 08:12:47 np0005548789.localdomain sudo[50278]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:48 np0005548789.localdomain sudo[50380]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flmsodwtintmeaoqetkjpewqutokqwmt ; /usr/bin/python3
Dec 06 08:12:48 np0005548789.localdomain sudo[50380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:48 np0005548789.localdomain python3[50382]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Dec 06 08:12:48 np0005548789.localdomain python3[50382]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json
Dec 06 08:12:48 np0005548789.localdomain python3[50382]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false
Dec 06 08:12:50 np0005548789.localdomain podman[50396]: 2025-12-06 08:12:48.327865689 +0000 UTC m=+0.041551709 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 06 08:12:50 np0005548789.localdomain python3[50382]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 87ee88cbf01fb42e0b22747072843bcca6130a90eda4de6e74b3ccd847bb4040 --format json
Dec 06 08:12:50 np0005548789.localdomain sudo[50380]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:51 np0005548789.localdomain sudo[50470]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nygbdjqolupsuwqfnertdmwwtbbopsfp ; /usr/bin/python3
Dec 06 08:12:51 np0005548789.localdomain sudo[50470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:51 np0005548789.localdomain python3[50472]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:12:51 np0005548789.localdomain sudo[50470]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:51 np0005548789.localdomain sudo[50520]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nesxocsjmiuyvigujlnmmmbcjdieqhlt ; /usr/bin/python3
Dec 06 08:12:51 np0005548789.localdomain sudo[50520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:51 np0005548789.localdomain sudo[50520]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:51 np0005548789.localdomain sudo[50538]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymegytfclrevmzxltkhaactzuucqeifj ; /usr/bin/python3
Dec 06 08:12:51 np0005548789.localdomain sudo[50538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:51 np0005548789.localdomain sudo[50538]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:52 np0005548789.localdomain sudo[50642]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-henmhpvhxurjvorfsfrzvmpwcioquafc ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008772.1605537-83491-237608108090073/async_wrapper.py 305873006874 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008772.1605537-83491-237608108090073/AnsiballZ_command.py _
Dec 06 08:12:52 np0005548789.localdomain sudo[50642]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 08:12:52 np0005548789.localdomain ansible-async_wrapper.py[50644]: Invoked with 305873006874 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008772.1605537-83491-237608108090073/AnsiballZ_command.py _
Dec 06 08:12:52 np0005548789.localdomain ansible-async_wrapper.py[50647]: Starting module and watcher
Dec 06 08:12:52 np0005548789.localdomain ansible-async_wrapper.py[50647]: Start watching 50648 (3600)
Dec 06 08:12:52 np0005548789.localdomain ansible-async_wrapper.py[50648]: Start module (50648)
Dec 06 08:12:52 np0005548789.localdomain ansible-async_wrapper.py[50644]: Return async_wrapper task started.
Dec 06 08:12:52 np0005548789.localdomain sudo[50642]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:53 np0005548789.localdomain sudo[50666]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axotcgoknyejocrwbxyxamzdbxjiotsa ; /usr/bin/python3
Dec 06 08:12:53 np0005548789.localdomain sudo[50666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:12:53 np0005548789.localdomain python3[50668]: ansible-ansible.legacy.async_status Invoked with jid=305873006874.50644 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:12:53 np0005548789.localdomain sudo[50666]: pam_unix(sudo:session): session closed for user root
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:    (file & line not available)
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:    (file & line not available)
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Notice: Compiled catalog for np0005548789.localdomain in environment production in 0.12 seconds
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Notice: Applied catalog in 0.06 seconds
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Application:
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:    Initial environment: production
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:    Converged environment: production
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:          Run mode: user
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Changes:
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:             Total: 3
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Events:
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:           Success: 3
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:             Total: 3
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Resources:
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:           Changed: 3
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:       Out of sync: 3
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:             Total: 10
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Time:
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:          Schedule: 0.00
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:        Filebucket: 0.00
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:              File: 0.00
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:              Exec: 0.01
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:            Augeas: 0.02
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:    Transaction evaluation: 0.05
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:    Catalog application: 0.06
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:    Config retrieval: 0.16
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:          Last run: 1765008776
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:             Total: 0.06
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]: Version:
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:            Config: 1765008776
Dec 06 08:12:56 np0005548789.localdomain puppet-user[50652]:            Puppet: 7.10.0
Dec 06 08:12:57 np0005548789.localdomain ansible-async_wrapper.py[50648]: Module complete (50648)
Dec 06 08:12:57 np0005548789.localdomain ansible-async_wrapper.py[50647]: Done in kid B.
Dec 06 08:13:00 np0005548789.localdomain sudo[50780]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:13:00 np0005548789.localdomain sudo[50780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:13:00 np0005548789.localdomain sudo[50780]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:00 np0005548789.localdomain sudo[50795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:13:00 np0005548789.localdomain sudo[50795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:13:00 np0005548789.localdomain sudo[50795]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:01 np0005548789.localdomain sudo[50841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:13:01 np0005548789.localdomain sudo[50841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:13:01 np0005548789.localdomain sudo[50841]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:03 np0005548789.localdomain sudo[50869]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgvhcdtqywlbbhwnmjpdxbuinrgusmin ; /usr/bin/python3
Dec 06 08:13:03 np0005548789.localdomain sudo[50869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:03 np0005548789.localdomain python3[50871]: ansible-ansible.legacy.async_status Invoked with jid=305873006874.50644 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:13:03 np0005548789.localdomain sudo[50869]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:04 np0005548789.localdomain sudo[50885]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfohswmkucvflpeaqrwdkjpyqjllqkmy ; /usr/bin/python3
Dec 06 08:13:04 np0005548789.localdomain sudo[50885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:04 np0005548789.localdomain python3[50887]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:13:04 np0005548789.localdomain sudo[50885]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:04 np0005548789.localdomain sudo[50901]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwvboobckchjighqjnwhxpgierbqiedj ; /usr/bin/python3
Dec 06 08:13:04 np0005548789.localdomain sudo[50901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:04 np0005548789.localdomain python3[50903]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:13:04 np0005548789.localdomain sudo[50901]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:05 np0005548789.localdomain sudo[50949]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuyieojdqqezbnakeixurgpwnlnjzxhz ; /usr/bin/python3
Dec 06 08:13:05 np0005548789.localdomain sudo[50949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:05 np0005548789.localdomain python3[50951]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:13:05 np0005548789.localdomain sudo[50949]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:05 np0005548789.localdomain sudo[50992]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvqvwokusrxcoaqyxbupoenolcddxhtf ; /usr/bin/python3
Dec 06 08:13:05 np0005548789.localdomain sudo[50992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:05 np0005548789.localdomain python3[50994]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008784.9808478-83700-189720324973468/source _original_basename=tmpq8fh5mc5 follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:13:05 np0005548789.localdomain sudo[50992]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:05 np0005548789.localdomain sudo[51022]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynajbiisgcdlaomutsewujnaqhwoayyb ; /usr/bin/python3
Dec 06 08:13:05 np0005548789.localdomain sudo[51022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:06 np0005548789.localdomain python3[51024]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:06 np0005548789.localdomain sudo[51022]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:06 np0005548789.localdomain sudo[51038]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjvzrpfckcsacjrivprflqhvtxtulbrj ; /usr/bin/python3
Dec 06 08:13:06 np0005548789.localdomain sudo[51038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:07 np0005548789.localdomain sudo[51038]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:07 np0005548789.localdomain sudo[51125]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sujxdyxrtbnjgemdvpgiazaobnvirlrv ; /usr/bin/python3
Dec 06 08:13:07 np0005548789.localdomain sudo[51125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:07 np0005548789.localdomain python3[51127]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 06 08:13:07 np0005548789.localdomain sudo[51125]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:08 np0005548789.localdomain sudo[51144]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ueccbcxrhhedqeutqqbgskqfmyssueny ; /usr/bin/python3
Dec 06 08:13:08 np0005548789.localdomain sudo[51144]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:08 np0005548789.localdomain python3[51146]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 08:13:08 np0005548789.localdomain sudo[51144]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:08 np0005548789.localdomain sudo[51160]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubhipqobnwmlcltbzhepnersmydnbogt ; /usr/bin/python3
Dec 06 08:13:08 np0005548789.localdomain sudo[51160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:08 np0005548789.localdomain python3[51162]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005548789 step=1 update_config_hash_only=False
Dec 06 08:13:08 np0005548789.localdomain sudo[51160]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:09 np0005548789.localdomain sudo[51176]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-joswgpuzfpzotthngjbyhoyjonrvgugl ; /usr/bin/python3
Dec 06 08:13:09 np0005548789.localdomain sudo[51176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:09 np0005548789.localdomain python3[51178]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:09 np0005548789.localdomain sudo[51176]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:09 np0005548789.localdomain sudo[51192]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouivlujlhwbphbeggsvmbyvmlxutbwnb ; /usr/bin/python3
Dec 06 08:13:09 np0005548789.localdomain sudo[51192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:10 np0005548789.localdomain python3[51194]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 06 08:13:10 np0005548789.localdomain sudo[51192]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:10 np0005548789.localdomain sudo[51208]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrumufiwjfnxmlfoqjjvpofasrnsgwxb ; /usr/bin/python3
Dec 06 08:13:10 np0005548789.localdomain sudo[51208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:10 np0005548789.localdomain python3[51210]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 08:13:11 np0005548789.localdomain sudo[51208]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:11 np0005548789.localdomain sudo[51250]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrraxofabikellyvwhohukgkhdzemloe ; /usr/bin/python3
Dec 06 08:13:11 np0005548789.localdomain sudo[51250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:11 np0005548789.localdomain python3[51252]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False
Dec 06 08:13:12 np0005548789.localdomain podman[51402]: 2025-12-06 08:13:12.231674147 +0000 UTC m=+0.085827322 container create f735a830357edaf91b07470ca762854575514a50cfbb9cf4aaacf994a5cb9ec5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, 
build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=container-puppet-metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:13:12 np0005548789.localdomain podman[51447]: 2025-12-06 08:13:12.247333582 +0000 UTC m=+0.055533512 container create 9506016e406df32ac4299ea465041d6a128e0f3dc9ea98102586360564954bfa (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=container-puppet-crond, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Dec 06 08:13:12 np0005548789.localdomain systemd[1]: Started libpod-conmon-f735a830357edaf91b07470ca762854575514a50cfbb9cf4aaacf994a5cb9ec5.scope.
Dec 06 08:13:12 np0005548789.localdomain podman[51436]: 2025-12-06 08:13:12.267363832 +0000 UTC m=+0.092504828 container create 4704c7e23d4793bb8ec51f342b561857983350f0042af9af009532473579e109 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 
iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, container_name=container-puppet-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git)
Dec 06 08:13:12 np0005548789.localdomain systemd[1]: Started libpod-conmon-9506016e406df32ac4299ea465041d6a128e0f3dc9ea98102586360564954bfa.scope.
Dec 06 08:13:12 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:12 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93af1cca3c5bc2fd6a62da9c5b24ae5188daa60c95e07f180cd96e6bf14ae456/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:12 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:12 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62c1e5bed2cbe019aeb004cf1ae8b59e235260475c84680568f7ac998ec16abe/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:12 np0005548789.localdomain podman[51402]: 2025-12-06 08:13:12.194273757 +0000 UTC m=+0.048426952 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 06 08:13:12 np0005548789.localdomain podman[51402]: 2025-12-06 08:13:12.296895578 +0000 UTC m=+0.151048753 container init f735a830357edaf91b07470ca762854575514a50cfbb9cf4aaacf994a5cb9ec5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=container-puppet-metrics_qdr, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_puppet_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:13:12 np0005548789.localdomain systemd[1]: Started libpod-conmon-4704c7e23d4793bb8ec51f342b561857983350f0042af9af009532473579e109.scope.
Dec 06 08:13:12 np0005548789.localdomain podman[51402]: 2025-12-06 08:13:12.304868976 +0000 UTC m=+0.159022151 container start f735a830357edaf91b07470ca762854575514a50cfbb9cf4aaacf994a5cb9ec5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=container-puppet-metrics_qdr, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, release=1761123044)
Dec 06 08:13:12 np0005548789.localdomain podman[51402]: 2025-12-06 08:13:12.305182115 +0000 UTC m=+0.159335310 container attach f735a830357edaf91b07470ca762854575514a50cfbb9cf4aaacf994a5cb9ec5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, name=rhosp17/openstack-qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=container-puppet-metrics_qdr, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, version=17.1.12)
Dec 06 08:13:12 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:12 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/926f9ae59ae4620dbf9c3ec16ecc12b1f2dc289864a1e6ab337d5dcdccd2a7fe/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:12 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/926f9ae59ae4620dbf9c3ec16ecc12b1f2dc289864a1e6ab337d5dcdccd2a7fe/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:12 np0005548789.localdomain podman[51436]: 2025-12-06 08:13:12.321295685 +0000 UTC m=+0.146436681 container init 4704c7e23d4793bb8ec51f342b561857983350f0042af9af009532473579e109 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_puppet_step1, batch=17.1_20251118.1, 
vendor=Red Hat, Inc., container_name=container-puppet-iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:13:12 np0005548789.localdomain podman[51436]: 2025-12-06 08:13:12.222830263 +0000 UTC m=+0.047971269 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 06 08:13:12 np0005548789.localdomain podman[51447]: 2025-12-06 08:13:12.223223015 +0000 UTC m=+0.031422965 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 06 08:13:12 np0005548789.localdomain podman[51463]: 2025-12-06 08:13:12.2414888 +0000 UTC m=+0.032214949 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 06 08:13:12 np0005548789.localdomain podman[51452]: 2025-12-06 08:13:12.238015583 +0000 UTC m=+0.037190624 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:13:13 np0005548789.localdomain podman[51436]: 2025-12-06 08:13:13.345595715 +0000 UTC m=+1.170736701 container start 4704c7e23d4793bb8ec51f342b561857983350f0042af9af009532473579e109 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, architecture=x86_64, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, distribution-scope=public, container_name=container-puppet-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:13:13 np0005548789.localdomain podman[51436]: 2025-12-06 08:13:13.346999049 +0000 UTC m=+1.172140125 container attach 4704c7e23d4793bb8ec51f342b561857983350f0042af9af009532473579e109 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_puppet_step1, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 
iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, container_name=container-puppet-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Dec 06 08:13:13 np0005548789.localdomain podman[51452]: 2025-12-06 08:13:13.389022752 +0000 UTC m=+1.188197803 container create 329949042ceb45cabbb88c012af20b9f8b51f1de61f81364a852037377f8c8c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-nova_libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, vcs-type=git)
Dec 06 08:13:13 np0005548789.localdomain podman[51463]: 2025-12-06 08:13:13.452801509 +0000 UTC m=+1.243527648 container create b567bdc6731749e2c2df0cfc659cd5d0e02971a859c8934938817051a1ec8b8c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-collectd, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, container_name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:13:13 np0005548789.localdomain systemd[1]: Started libpod-conmon-329949042ceb45cabbb88c012af20b9f8b51f1de61f81364a852037377f8c8c1.scope.
Dec 06 08:13:13 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:13 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7114dad9ca5bd35640cc71d1bc50a0dfc385dbab46b008e450dae4492651614e/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:13 np0005548789.localdomain podman[51447]: 2025-12-06 08:13:13.483397238 +0000 UTC m=+1.291597178 container init 9506016e406df32ac4299ea465041d6a128e0f3dc9ea98102586360564954bfa (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, container_name=container-puppet-crond, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12)
Dec 06 08:13:13 np0005548789.localdomain podman[51452]: 2025-12-06 08:13:13.520349923 +0000 UTC m=+1.319524954 container init 329949042ceb45cabbb88c012af20b9f8b51f1de61f81364a852037377f8c8c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-nova_libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude 
tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 08:13:13 np0005548789.localdomain podman[51452]: 2025-12-06 08:13:13.598490835 +0000 UTC m=+1.397665906 container start 329949042ceb45cabbb88c012af20b9f8b51f1de61f81364a852037377f8c8c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=container-puppet-nova_libvirt, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4)
Dec 06 08:13:13 np0005548789.localdomain podman[51452]: 2025-12-06 08:13:13.598955409 +0000 UTC m=+1.398130460 container attach 329949042ceb45cabbb88c012af20b9f8b51f1de61f81364a852037377f8c8c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, container_name=container-puppet-nova_libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude 
tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044)
Dec 06 08:13:13 np0005548789.localdomain systemd[1]: Started libpod-conmon-b567bdc6731749e2c2df0cfc659cd5d0e02971a859c8934938817051a1ec8b8c.scope.
Dec 06 08:13:13 np0005548789.localdomain podman[51447]: 2025-12-06 08:13:13.667858035 +0000 UTC m=+1.476057975 container start 9506016e406df32ac4299ea465041d6a128e0f3dc9ea98102586360564954bfa (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, release=1761123044, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=container-puppet-crond, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12, url=https://www.redhat.com)
Dec 06 08:13:13 np0005548789.localdomain podman[51447]: 2025-12-06 08:13:13.668066592 +0000 UTC m=+1.476266532 container attach 9506016e406df32ac4299ea465041d6a128e0f3dc9ea98102586360564954bfa (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, version=17.1.12, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git)
Dec 06 08:13:13 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:13 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5d82191509656bbf6f64f1f50570f9d09f17aadb036e941dc9fdbfc1b9557da8/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:13 np0005548789.localdomain podman[51463]: 2025-12-06 08:13:13.693272692 +0000 UTC m=+1.483998861 container init b567bdc6731749e2c2df0cfc659cd5d0e02971a859c8934938817051a1ec8b8c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:13:13 np0005548789.localdomain podman[51463]: 2025-12-06 08:13:13.705245014 +0000 UTC m=+1.495971143 container start b567bdc6731749e2c2df0cfc659cd5d0e02971a859c8934938817051a1ec8b8c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, architecture=x86_64, container_name=container-puppet-collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true)
Dec 06 08:13:13 np0005548789.localdomain podman[51463]: 2025-12-06 08:13:13.705685108 +0000 UTC m=+1.496411327 container attach b567bdc6731749e2c2df0cfc659cd5d0e02971a859c8934938817051a1ec8b8c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:51:28Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=container-puppet-collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 06 08:13:14 np0005548789.localdomain podman[51327]: 2025-12-06 08:13:12.105943549 +0000 UTC m=+0.037577816 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Dec 06 08:13:14 np0005548789.localdomain podman[51872]: 2025-12-06 08:13:14.495894533 +0000 UTC m=+0.066505633 container create e6a13c3a9cefd2fa1b521b34f74f31f76fdd0b457acb3e515d649fdb95bdeb0c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, name=rhosp17/openstack-ceilometer-central, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, vendor=Red Hat, Inc., tcib_managed=true, container_name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-central, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-19T00:11:59Z, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:13:14 np0005548789.localdomain systemd[1]: Started libpod-conmon-e6a13c3a9cefd2fa1b521b34f74f31f76fdd0b457acb3e515d649fdb95bdeb0c.scope.
Dec 06 08:13:14 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:14 np0005548789.localdomain podman[51872]: 2025-12-06 08:13:14.462516648 +0000 UTC m=+0.033127768 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Dec 06 08:13:14 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c8811abddbaacb6f32387bcfbf857078338be762178d55f30b77fc76cc9dc19/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:14 np0005548789.localdomain podman[51872]: 2025-12-06 08:13:14.578463891 +0000 UTC m=+0.149075021 container init e6a13c3a9cefd2fa1b521b34f74f31f76fdd0b457acb3e515d649fdb95bdeb0c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=container-puppet-ceilometer, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-central, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-central-container, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:13:14 np0005548789.localdomain podman[51872]: 2025-12-06 08:13:14.598440132 +0000 UTC m=+0.169051232 container start e6a13c3a9cefd2fa1b521b34f74f31f76fdd0b457acb3e515d649fdb95bdeb0c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, version=17.1.12, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:59Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.buildah.version=1.41.4, container_name=container-puppet-ceilometer, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-central-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-central)
Dec 06 08:13:14 np0005548789.localdomain podman[51872]: 2025-12-06 08:13:14.598639868 +0000 UTC m=+0.169250978 container attach e6a13c3a9cefd2fa1b521b34f74f31f76fdd0b457acb3e515d649fdb95bdeb0c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-19T00:11:59Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, 
container_name=container-puppet-ceilometer, config_id=tripleo_puppet_step1, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-central-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:    (file & line not available)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:    (file & line not available)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Notice: Accepting previously invalid value for target type 'Integer'
Dec 06 08:13:15 np0005548789.localdomain systemd[1]: tmp-crun.HANlUp.mount: Deactivated successfully.
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:    (file & line not available)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Notice: Compiled catalog for np0005548789.localdomain in environment production in 0.14 seconds
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:    (file & line not available)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root'
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root'
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755'
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}34a9d8752d7fd9af34f20c785b68df439604d3fa295519168b060d10c3f23b42'
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Notice: Applied catalog in 0.03 seconds
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Application:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:    Initial environment: production
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:    Converged environment: production
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:          Run mode: user
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Changes:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:             Total: 7
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:    (file & line not available)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Events:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:           Success: 7
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:             Total: 7
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Resources:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:           Skipped: 13
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:           Changed: 5
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:       Out of sync: 5
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:             Total: 20
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Time:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:              File: 0.01
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:    Transaction evaluation: 0.02
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:    Catalog application: 0.03
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:    Config retrieval: 0.17
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:          Last run: 1765008795
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:             Total: 0.03
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]: Version:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:            Config: 1765008795
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51785]:            Puppet: 7.10.0
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Notice: Compiled catalog for np0005548789.localdomain in environment production in 0.10 seconds
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:    (file & line not available)
Dec 06 08:13:15 np0005548789.localdomain ovs-vsctl[52105]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]: Notice: Compiled catalog for np0005548789.localdomain in environment production in 0.08 seconds
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0'
Dec 06 08:13:15 np0005548789.localdomain crontab[52131]: (root) LIST (root)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully
Dec 06 08:13:15 np0005548789.localdomain crontab[52145]: (root) REPLACE (root)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]: Notice: Applied catalog in 0.06 seconds
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]: Application:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:    Initial environment: production
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:    Converged environment: production
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:          Run mode: user
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]: Changes:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:             Total: 2
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]: Events:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:           Success: 2
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:             Total: 2
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]: Resources:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:           Changed: 2
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:       Out of sync: 2
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:           Skipped: 7
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:             Total: 9
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]: Time:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:              File: 0.01
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:              Cron: 0.02
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:    Transaction evaluation: 0.05
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:    Catalog application: 0.06
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:    Config retrieval: 0.10
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:          Last run: 1765008795
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:             Total: 0.06
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]: Version:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:            Config: 1765008795
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51814]:            Puppet: 7.10.0
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51838]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51838]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51838]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51838]:    (file & line not available)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51819]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51819]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51819]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51819]:    (file & line not available)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51838]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51838]:    (file & line not available)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51819]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51819]:    (file & line not available)
Dec 06 08:13:15 np0005548789.localdomain systemd[1]: libpod-f735a830357edaf91b07470ca762854575514a50cfbb9cf4aaacf994a5cb9ec5.scope: Deactivated successfully.
Dec 06 08:13:15 np0005548789.localdomain systemd[1]: libpod-f735a830357edaf91b07470ca762854575514a50cfbb9cf4aaacf994a5cb9ec5.scope: Consumed 2.134s CPU time.
Dec 06 08:13:15 np0005548789.localdomain podman[52272]: 2025-12-06 08:13:15.649145461 +0000 UTC m=+0.040583409 container died f735a830357edaf91b07470ca762854575514a50cfbb9cf4aaacf994a5cb9ec5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_puppet_step1, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, container_name=container-puppet-metrics_qdr, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git)
Dec 06 08:13:15 np0005548789.localdomain systemd[1]: tmp-crun.K53d5d.mount: Deactivated successfully.
Dec 06 08:13:15 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f735a830357edaf91b07470ca762854575514a50cfbb9cf4aaacf994a5cb9ec5-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:15 np0005548789.localdomain podman[52272]: 2025-12-06 08:13:15.710544824 +0000 UTC m=+0.101982742 container cleanup f735a830357edaf91b07470ca762854575514a50cfbb9cf4aaacf994a5cb9ec5 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, architecture=x86_64, container_name=container-puppet-metrics_qdr, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_puppet_step1)
Dec 06 08:13:15 np0005548789.localdomain systemd[1]: libpod-conmon-f735a830357edaf91b07470ca762854575514a50cfbb9cf4aaacf994a5cb9ec5.scope: Deactivated successfully.
Dec 06 08:13:15 np0005548789.localdomain systemd[1]: libpod-9506016e406df32ac4299ea465041d6a128e0f3dc9ea98102586360564954bfa.scope: Deactivated successfully.
Dec 06 08:13:15 np0005548789.localdomain systemd[1]: libpod-9506016e406df32ac4299ea465041d6a128e0f3dc9ea98102586360564954bfa.scope: Consumed 2.041s CPU time.
Dec 06 08:13:15 np0005548789.localdomain python3[51252]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548789 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::qdr
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 06 08:13:15 np0005548789.localdomain podman[51447]: 2025-12-06 08:13:15.717432168 +0000 UTC m=+3.525632108 container died 9506016e406df32ac4299ea465041d6a128e0f3dc9ea98102586360564954bfa (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_puppet_step1, url=https://www.redhat.com, container_name=container-puppet-crond, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51819]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51819]: in a future release. Use nova::cinder::os_region_name instead
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51819]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51819]: in a future release. Use nova::cinder::catalog_info instead
Dec 06 08:13:15 np0005548789.localdomain podman[52309]: 2025-12-06 08:13:15.789695818 +0000 UTC m=+0.065834812 container cleanup 9506016e406df32ac4299ea465041d6a128e0f3dc9ea98102586360564954bfa (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, vcs-type=git, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=container-puppet-crond)
Dec 06 08:13:15 np0005548789.localdomain systemd[1]: libpod-conmon-9506016e406df32ac4299ea465041d6a128e0f3dc9ea98102586360564954bfa.scope: Deactivated successfully.
Dec 06 08:13:15 np0005548789.localdomain python3[51252]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548789 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Notice: Applied catalog in 0.45 seconds
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Application:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:    Initial environment: production
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:    Converged environment: production
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:          Run mode: user
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Changes:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:             Total: 4
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Events:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:           Success: 4
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:             Total: 4
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Resources:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:           Changed: 4
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:       Out of sync: 4
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:           Skipped: 8
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:             Total: 13
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Time:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:              File: 0.00
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:              Exec: 0.05
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:    Config retrieval: 0.13
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:            Augeas: 0.38
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:    Transaction evaluation: 0.44
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:    Catalog application: 0.45
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:          Last run: 1765008795
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:             Total: 0.45
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]: Version:
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:            Config: 1765008795
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51786]:            Puppet: 7.10.0
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51838]: Notice: Compiled catalog for np0005548789.localdomain in environment production in 0.35 seconds
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51819]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51819]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51819]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5)
Dec 06 08:13:15 np0005548789.localdomain puppet-user[51819]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51819]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51819]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set.
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51819]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb'
Dec 06 08:13:16 np0005548789.localdomain podman[52430]: 2025-12-06 08:13:16.07171081 +0000 UTC m=+0.058508095 container create ae8b38ba4b5ab4a99a42be92184928140343d5747d40ab542a9394bab47afa5c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, container_name=container-puppet-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}0d4e701b7b2398bbf396579a0713d46d3c496c79edc52f2e260456f359c9a46c'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba'
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: Started libpod-conmon-ae8b38ba4b5ab4a99a42be92184928140343d5747d40ab542a9394bab47afa5c.scope.
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7'
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:16 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b1be86c3700ae830e3281535879d3e256788ac5e6b2a708dc79f8a857d5536b/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain podman[52430]: 2025-12-06 08:13:16.11820038 +0000 UTC m=+0.104997675 container init ae8b38ba4b5ab4a99a42be92184928140343d5747d40ab542a9394bab47afa5c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, container_name=container-puppet-rsyslog, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046'
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: libpod-4704c7e23d4793bb8ec51f342b561857983350f0042af9af009532473579e109.scope: Deactivated successfully.
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: libpod-4704c7e23d4793bb8ec51f342b561857983350f0042af9af009532473579e109.scope: Consumed 2.604s CPU time.
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31'
Dec 06 08:13:16 np0005548789.localdomain podman[52430]: 2025-12-06 08:13:16.133219246 +0000 UTC m=+0.120016531 container start ae8b38ba4b5ab4a99a42be92184928140343d5747d40ab542a9394bab47afa5c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-rsyslog, distribution-scope=public, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_puppet_step1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, batch=17.1_20251118.1)
Dec 06 08:13:16 np0005548789.localdomain podman[51436]: 2025-12-06 08:13:16.133780694 +0000 UTC m=+3.958921700 container died 4704c7e23d4793bb8ec51f342b561857983350f0042af9af009532473579e109 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vcs-type=git)
Dec 06 08:13:16 np0005548789.localdomain podman[52430]: 2025-12-06 08:13:16.133404032 +0000 UTC m=+0.120201327 container attach ae8b38ba4b5ab4a99a42be92184928140343d5747d40ab542a9394bab47afa5c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-rsyslog, config_id=tripleo_puppet_step1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885'
Dec 06 08:13:16 np0005548789.localdomain podman[52430]: 2025-12-06 08:13:16.044428574 +0000 UTC m=+0.031225899 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62'
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51819]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used.
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Notice: Applied catalog in 0.23 seconds
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Application:
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:    Initial environment: production
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:    Converged environment: production
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:          Run mode: user
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Changes:
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:             Total: 43
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Events:
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:           Success: 43
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:             Total: 43
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Resources:
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:           Skipped: 14
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:           Changed: 38
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:       Out of sync: 38
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:             Total: 82
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Time:
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:    Concat fragment: 0.00
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:       Concat file: 0.00
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:              File: 0.12
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:    Transaction evaluation: 0.22
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:    Catalog application: 0.23
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:    Config retrieval: 0.48
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:          Last run: 1765008796
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:             Total: 0.23
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]: Version:
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:            Config: 1765008795
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51838]:            Puppet: 7.10.0
Dec 06 08:13:16 np0005548789.localdomain podman[52500]: 2025-12-06 08:13:16.199466329 +0000 UTC m=+0.056370838 container cleanup 4704c7e23d4793bb8ec51f342b561857983350f0042af9af009532473579e109 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_puppet_step1, release=1761123044, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-iscsid, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, managed_by=tripleo_ansible)
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: libpod-conmon-4704c7e23d4793bb8ec51f342b561857983350f0042af9af009532473579e109.scope: Deactivated successfully.
Dec 06 08:13:16 np0005548789.localdomain python3[51252]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548789 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::iscsid
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-926f9ae59ae4620dbf9c3ec16ecc12b1f2dc289864a1e6ab337d5dcdccd2a7fe-merged.mount: Deactivated successfully.
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4704c7e23d4793bb8ec51f342b561857983350f0042af9af009532473579e109-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-62c1e5bed2cbe019aeb004cf1ae8b59e235260475c84680568f7ac998ec16abe-merged.mount: Deactivated successfully.
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9506016e406df32ac4299ea465041d6a128e0f3dc9ea98102586360564954bfa-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-93af1cca3c5bc2fd6a62da9c5b24ae5188daa60c95e07f180cd96e6bf14ae456-merged.mount: Deactivated successfully.
Dec 06 08:13:16 np0005548789.localdomain podman[52477]: 2025-12-06 08:13:16.271894605 +0000 UTC m=+0.172935222 container create 973e97e54c593ac5d052f8871b0acf7de04e674b931c55cda2af5dfc37a43e47 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=container-puppet-ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: Started libpod-conmon-973e97e54c593ac5d052f8871b0acf7de04e674b931c55cda2af5dfc37a43e47.scope.
Dec 06 08:13:16 np0005548789.localdomain podman[52477]: 2025-12-06 08:13:16.232284477 +0000 UTC m=+0.133325134 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:16 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad3810d21826b8b406da9b30fe54843a6684913af1956a8eb23d071f707914a6/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:16 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad3810d21826b8b406da9b30fe54843a6684913af1956a8eb23d071f707914a6/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:16 np0005548789.localdomain podman[52477]: 2025-12-06 08:13:16.353562636 +0000 UTC m=+0.254603253 container init 973e97e54c593ac5d052f8871b0acf7de04e674b931c55cda2af5dfc37a43e47 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_id=tripleo_puppet_step1, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=container-puppet-ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 08:13:16 np0005548789.localdomain podman[52477]: 2025-12-06 08:13:16.364027171 +0000 UTC m=+0.265067818 container start 973e97e54c593ac5d052f8871b0acf7de04e674b931c55cda2af5dfc37a43e47 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_puppet_step1, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, batch=17.1_20251118.1, tcib_managed=true, container_name=container-puppet-ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 08:13:16 np0005548789.localdomain podman[52477]: 2025-12-06 08:13:16.364370681 +0000 UTC m=+0.265411318 container attach 973e97e54c593ac5d052f8871b0acf7de04e674b931c55cda2af5dfc37a43e47 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=container-puppet-ovn_controller, com.redhat.component=openstack-ovn-controller-container)
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: libpod-b567bdc6731749e2c2df0cfc659cd5d0e02971a859c8934938817051a1ec8b8c.scope: Deactivated successfully.
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: libpod-b567bdc6731749e2c2df0cfc659cd5d0e02971a859c8934938817051a1ec8b8c.scope: Consumed 2.606s CPU time.
Dec 06 08:13:16 np0005548789.localdomain podman[51463]: 2025-12-06 08:13:16.602794822 +0000 UTC m=+4.393520961 container died b567bdc6731749e2c2df0cfc659cd5d0e02971a859c8934938817051a1ec8b8c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=container-puppet-collectd, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]:    (file & line not available)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]:    (file & line not available)
Dec 06 08:13:16 np0005548789.localdomain podman[52687]: 2025-12-06 08:13:16.690466419 +0000 UTC m=+0.077684908 container cleanup b567bdc6731749e2c2df0cfc659cd5d0e02971a859c8934938817051a1ec8b8c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:51:28Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-collectd-container, distribution-scope=public, container_name=container-puppet-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:13:16 np0005548789.localdomain systemd[1]: libpod-conmon-b567bdc6731749e2c2df0cfc659cd5d0e02971a859c8934938817051a1ec8b8c.scope: Deactivated successfully.
Dec 06 08:13:16 np0005548789.localdomain python3[51252]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548789 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51819]: Notice: Compiled catalog for np0005548789.localdomain in environment production in 1.24 seconds
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26)
Dec 06 08:13:16 np0005548789.localdomain puppet-user[51916]: Notice: Compiled catalog for np0005548789.localdomain in environment production in 0.38 seconds
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}b5992c61c5e6c0fa60ac7720677a0efdfb73ceba695978e2f56794a0d035436f'
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe'
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Warning: Empty environment setting 'TLS_PASSWORD'
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]:    (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182)
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}8f9f91b7bc846aa12da1e2df7356fc45f862596082e133d7976104ee8d1893c1'
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain systemd[1]: tmp-crun.maWaK9.mount: Deactivated successfully.
Dec 06 08:13:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5d82191509656bbf6f64f1f50570f9d09f17aadb036e941dc9fdbfc1b9557da8-merged.mount: Deactivated successfully.
Dec 06 08:13:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b567bdc6731749e2c2df0cfc659cd5d0e02971a859c8934938817051a1ec8b8c-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Notice: Applied catalog in 0.43 seconds
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Application:
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:    Initial environment: production
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:    Converged environment: production
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:          Run mode: user
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Changes:
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:             Total: 31
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Events:
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:           Success: 31
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:             Total: 31
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Resources:
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:           Skipped: 22
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:           Changed: 31
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:       Out of sync: 31
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:             Total: 151
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Time:
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:           Package: 0.01
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:    Ceilometer config: 0.35
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:    Transaction evaluation: 0.42
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:    Catalog application: 0.43
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:    Config retrieval: 0.45
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:          Last run: 1765008797
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:         Resources: 0.00
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:             Total: 0.43
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]: Version:
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:            Config: 1765008796
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51916]:            Puppet: 7.10.0
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain systemd[1]: libpod-e6a13c3a9cefd2fa1b521b34f74f31f76fdd0b457acb3e515d649fdb95bdeb0c.scope: Deactivated successfully.
Dec 06 08:13:17 np0005548789.localdomain systemd[1]: libpod-e6a13c3a9cefd2fa1b521b34f74f31f76fdd0b457acb3e515d649fdb95bdeb0c.scope: Consumed 2.959s CPU time.
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[52519]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:17 np0005548789.localdomain puppet-user[52519]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:17 np0005548789.localdomain puppet-user[52519]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:17 np0005548789.localdomain puppet-user[52519]:    (file & line not available)
Dec 06 08:13:17 np0005548789.localdomain podman[51872]: 2025-12-06 08:13:17.883722448 +0000 UTC m=+3.454333598 container died e6a13c3a9cefd2fa1b521b34f74f31f76fdd0b457acb3e515d649fdb95bdeb0c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, build-date=2025-11-19T00:11:59Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-central, version=17.1.12, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=container-puppet-ceilometer, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain puppet-user[52519]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:17 np0005548789.localdomain puppet-user[52519]:    (file & line not available)
Dec 06 08:13:17 np0005548789.localdomain systemd[1]: tmp-crun.GBdEz1.mount: Deactivated successfully.
Dec 06 08:13:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e6a13c3a9cefd2fa1b521b34f74f31f76fdd0b457acb3e515d649fdb95bdeb0c-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2c8811abddbaacb6f32387bcfbf857078338be762178d55f30b77fc76cc9dc19-merged.mount: Deactivated successfully.
Dec 06 08:13:17 np0005548789.localdomain podman[52857]: 2025-12-06 08:13:17.979639622 +0000 UTC m=+0.086053929 container cleanup e6a13c3a9cefd2fa1b521b34f74f31f76fdd0b457acb3e515d649fdb95bdeb0c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ceilometer-central, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_puppet_step1, name=rhosp17/openstack-ceilometer-central, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-central-container, build-date=2025-11-19T00:11:59Z, container_name=container-puppet-ceilometer)
Dec 06 08:13:17 np0005548789.localdomain systemd[1]: libpod-conmon-e6a13c3a9cefd2fa1b521b34f74f31f76fdd0b457acb3e515d649fdb95bdeb0c.scope: Deactivated successfully.
Dec 06 08:13:17 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created
Dec 06 08:13:17 np0005548789.localdomain python3[51252]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548789 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]: Notice: Compiled catalog for np0005548789.localdomain in environment production in 0.28 seconds
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]:    (file & line not available)
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2'
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b'
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]:    (file & line not available)
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}c8a076a5cc4f95986ab769a4c95ebfeb53a6814b6c917c7233a963f16ed74f11'
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]: Notice: Applied catalog in 0.10 seconds
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]: Application:
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:    Initial environment: production
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:    Converged environment: production
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:          Run mode: user
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]: Changes:
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:             Total: 3
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]: Events:
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:           Success: 3
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:             Total: 3
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]: Resources:
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:           Skipped: 11
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:           Changed: 3
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:       Out of sync: 3
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:             Total: 25
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]: Time:
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:       Concat file: 0.00
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:    Concat fragment: 0.00
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:              File: 0.01
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:    Transaction evaluation: 0.09
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:    Catalog application: 0.10
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:    Config retrieval: 0.32
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:          Last run: 1765008798
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:             Total: 0.10
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]: Version:
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:            Config: 1765008797
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52519]:            Puppet: 7.10.0
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Notice: Compiled catalog for np0005548789.localdomain in environment production in 0.27 seconds
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain ovs-vsctl[52978]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain ovs-vsctl[52983]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-type=geneve
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain systemd[1]: libpod-ae8b38ba4b5ab4a99a42be92184928140343d5747d40ab542a9394bab47afa5c.scope: Deactivated successfully.
Dec 06 08:13:18 np0005548789.localdomain systemd[1]: libpod-ae8b38ba4b5ab4a99a42be92184928140343d5747d40ab542a9394bab47afa5c.scope: Consumed 2.319s CPU time.
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain ovs-vsctl[52993]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.107
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain podman[52430]: 2025-12-06 08:13:18.638977119 +0000 UTC m=+2.625774424 container died ae8b38ba4b5ab4a99a42be92184928140343d5747d40ab542a9394bab47afa5c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-rsyslog, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=container-puppet-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, managed_by=tripleo_ansible)
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain ovs-vsctl[53006]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005548789.localdomain
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005548789.novalocal' to 'np0005548789.localdomain'
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain systemd[1]: tmp-crun.3SSv0L.mount: Deactivated successfully.
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain ovs-vsctl[53009]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int
Dec 06 08:13:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ae8b38ba4b5ab4a99a42be92184928140343d5747d40ab542a9394bab47afa5c-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2b1be86c3700ae830e3281535879d3e256788ac5e6b2a708dc79f8a857d5536b-merged.mount: Deactivated successfully.
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain ovs-vsctl[53011]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain ovs-vsctl[53013]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain podman[52995]: 2025-12-06 08:13:18.802061514 +0000 UTC m=+0.150380162 container cleanup ae8b38ba4b5ab4a99a42be92184928140343d5747d40ab542a9394bab47afa5c (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog)
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain systemd[1]: libpod-conmon-ae8b38ba4b5ab4a99a42be92184928140343d5747d40ab542a9394bab47afa5c.scope: Deactivated successfully.
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain python3[51252]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548789 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 06 08:13:18 np0005548789.localdomain ovs-vsctl[53015]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}9f797f9d49cf12085061840a6e15e35ef08aaf3c80bbe03bcf23d28dd55767ae'
Dec 06 08:13:18 np0005548789.localdomain ovs-vsctl[53029]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain ovs-vsctl[53042]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain ovs-vsctl[53044]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:0b:71:f7
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain ovs-vsctl[53046]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain ovs-vsctl[53048]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[52585]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created
Dec 06 08:13:18 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain ovs-vsctl[53050]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]: Notice: Applied catalog in 0.56 seconds
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]: Application:
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:    Initial environment: production
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:    Converged environment: production
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:          Run mode: user
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]: Changes:
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:             Total: 14
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]: Events:
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:           Success: 14
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:             Total: 14
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]: Resources:
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:           Skipped: 12
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:           Changed: 14
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:       Out of sync: 14
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:             Total: 29
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]: Time:
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:              Exec: 0.02
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:    Config retrieval: 0.30
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:         Vs config: 0.48
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:    Transaction evaluation: 0.55
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:    Catalog application: 0.56
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:          Last run: 1765008799
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:             Total: 0.56
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]: Version:
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:            Config: 1765008798
Dec 06 08:13:19 np0005548789.localdomain puppet-user[52585]:            Puppet: 7.10.0
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created
Dec 06 08:13:19 np0005548789.localdomain systemd[1]: libpod-973e97e54c593ac5d052f8871b0acf7de04e674b931c55cda2af5dfc37a43e47.scope: Deactivated successfully.
Dec 06 08:13:19 np0005548789.localdomain systemd[1]: libpod-973e97e54c593ac5d052f8871b0acf7de04e674b931c55cda2af5dfc37a43e47.scope: Consumed 2.969s CPU time.
Dec 06 08:13:19 np0005548789.localdomain podman[52477]: 2025-12-06 08:13:19.501155415 +0000 UTC m=+3.402196042 container died 973e97e54c593ac5d052f8871b0acf7de04e674b931c55cda2af5dfc37a43e47 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, version=17.1.12, container_name=container-puppet-ovn_controller)
Dec 06 08:13:19 np0005548789.localdomain podman[52580]: 2025-12-06 08:13:16.445507836 +0000 UTC m=+0.040974861 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Dec 06 08:13:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ad3810d21826b8b406da9b30fe54843a6684913af1956a8eb23d071f707914a6-merged.mount: Deactivated successfully.
Dec 06 08:13:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-973e97e54c593ac5d052f8871b0acf7de04e674b931c55cda2af5dfc37a43e47-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:20 np0005548789.localdomain podman[53097]: 2025-12-06 08:13:20.010857376 +0000 UTC m=+0.499010010 container cleanup 973e97e54c593ac5d052f8871b0acf7de04e674b931c55cda2af5dfc37a43e47 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=container-puppet-ovn_controller, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64)
Dec 06 08:13:20 np0005548789.localdomain python3[51252]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548789 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::agents::ovn
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 06 08:13:20 np0005548789.localdomain systemd[1]: libpod-conmon-973e97e54c593ac5d052f8871b0acf7de04e674b931c55cda2af5dfc37a43e47.scope: Deactivated successfully.
Dec 06 08:13:20 np0005548789.localdomain podman[53136]: 2025-12-06 08:13:19.742248488 +0000 UTC m=+0.050075633 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Dec 06 08:13:20 np0005548789.localdomain podman[53136]: 2025-12-06 08:13:20.045687475 +0000 UTC m=+0.353514660 container create 17558ff0747fce48f7968377192af9e4833071686c98fabf2053cfd3378b2915 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:23:27Z, distribution-scope=public, name=rhosp17/openstack-neutron-server, container_name=container-puppet-neutron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-neutron-server-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 neutron-server)
Dec 06 08:13:20 np0005548789.localdomain systemd[1]: Started libpod-conmon-17558ff0747fce48f7968377192af9e4833071686c98fabf2053cfd3378b2915.scope.
Dec 06 08:13:20 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:20 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully
Dec 06 08:13:20 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2444c943287df738a2d4fa7e71e9d9b0754a541b609353f29be07295aed4c411/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:20 np0005548789.localdomain podman[53136]: 2025-12-06 08:13:20.121969819 +0000 UTC m=+0.429796974 container init 17558ff0747fce48f7968377192af9e4833071686c98fabf2053cfd3378b2915 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-server, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.component=openstack-neutron-server-container, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, container_name=container-puppet-neutron, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:23:27Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-server)
Dec 06 08:13:20 np0005548789.localdomain podman[53136]: 2025-12-06 08:13:20.127696806 +0000 UTC m=+0.435523961 container start 17558ff0747fce48f7968377192af9e4833071686c98fabf2053cfd3378b2915 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-server, name=rhosp17/openstack-neutron-server, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, vcs-type=git, release=1761123044, container_name=container-puppet-neutron, build-date=2025-11-19T00:23:27Z, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-server-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:13:20 np0005548789.localdomain podman[53136]: 2025-12-06 08:13:20.127928143 +0000 UTC m=+0.435755298 container attach 17558ff0747fce48f7968377192af9e4833071686c98fabf2053cfd3378b2915 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=container-puppet-neutron, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2025-11-19T00:23:27Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-server, config_id=tripleo_puppet_step1, com.redhat.component=openstack-neutron-server-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:13:20 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully
Dec 06 08:13:20 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created
Dec 06 08:13:20 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created
Dec 06 08:13:20 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created
Dec 06 08:13:20 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created
Dec 06 08:13:20 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created
Dec 06 08:13:20 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created
Dec 06 08:13:20 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Dec 06 08:13:20 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}3a12438802493a75725c4f7704f2af6db1ef72af396369e5de28f6f4d6a7ed98'
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Notice: Applied catalog in 4.73 seconds
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Application:
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:    Initial environment: production
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:    Converged environment: production
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:          Run mode: user
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Changes:
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:             Total: 183
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Events:
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:           Success: 183
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:             Total: 183
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Resources:
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:           Changed: 183
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:       Out of sync: 183
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:           Skipped: 57
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:             Total: 487
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Time:
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:    Concat fragment: 0.00
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:            Anchor: 0.00
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:         File line: 0.00
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:    Virtlogd config: 0.00
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:              Exec: 0.01
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:    Virtsecretd config: 0.01
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:    Virtqemud config: 0.01
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:           Package: 0.01
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:    Virtstoraged config: 0.01
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:    Virtnodedevd config: 0.02
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:    Virtproxyd config: 0.03
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:              File: 0.06
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:            Augeas: 1.17
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:    Config retrieval: 1.50
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:          Last run: 1765008801
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:       Nova config: 3.12
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:    Transaction evaluation: 4.66
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:    Catalog application: 4.73
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:         Resources: 0.00
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:       Concat file: 0.00
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:             Total: 4.73
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]: Version:
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:            Config: 1765008795
Dec 06 08:13:21 np0005548789.localdomain puppet-user[51819]:            Puppet: 7.10.0
Dec 06 08:13:21 np0005548789.localdomain puppet-user[53191]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]:    (file & line not available)
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]:    (file & line not available)
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Warning: Unknown variable: 'dhcp_agents_per_net'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37)
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: Compiled catalog for np0005548789.localdomain in environment production in 0.60 seconds
Dec 06 08:13:22 np0005548789.localdomain systemd[1]: libpod-329949042ceb45cabbb88c012af20b9f8b51f1de61f81364a852037377f8c8c1.scope: Deactivated successfully.
Dec 06 08:13:22 np0005548789.localdomain systemd[1]: libpod-329949042ceb45cabbb88c012af20b9f8b51f1de61f81364a852037377f8c8c1.scope: Consumed 8.552s CPU time.
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain podman[51452]: 2025-12-06 08:13:22.74381788 +0000 UTC m=+10.542992941 container died 329949042ceb45cabbb88c012af20b9f8b51f1de61f81364a852037377f8c8c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_puppet_step1, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, container_name=container-puppet-nova_libvirt, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4)
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain systemd[1]: tmp-crun.6xrvmn.mount: Deactivated successfully.
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-329949042ceb45cabbb88c012af20b9f8b51f1de61f81364a852037377f8c8c1-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-7114dad9ca5bd35640cc71d1bc50a0dfc385dbab46b008e450dae4492651614e-merged.mount: Deactivated successfully.
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain podman[53303]: 2025-12-06 08:13:22.889268479 +0000 UTC m=+0.137628937 container cleanup 329949042ceb45cabbb88c012af20b9f8b51f1de61f81364a852037377f8c8c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, build-date=2025-11-19T00:35:22Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude 
tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-nova_libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, io.openshift.expose-services=)
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain systemd[1]: libpod-conmon-329949042ceb45cabbb88c012af20b9f8b51f1de61f81364a852037377f8c8c1.scope: Deactivated successfully.
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain python3[51252]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548789 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages
                                                         # TODO(emilien): figure how to deal with libvirt profile.
                                                         # We'll probably treat it like we do with Neutron plugins.
                                                         # Until then, just include it in the default nova-compute role.
                                                         include tripleo::profile::base::nova::compute::libvirt
                                                         
                                                         include tripleo::profile::base::nova::libvirt
                                                         
                                                         include tripleo::profile::base::nova::compute::libvirt_guests
                                                         
                                                         include tripleo::profile::base::sshd
                                                         include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Dec 06 08:13:22 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]: Notice: Applied catalog in 0.55 seconds
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]: Application:
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:    Initial environment: production
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:    Converged environment: production
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:          Run mode: user
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]: Changes:
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:             Total: 33
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]: Events:
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:           Success: 33
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:             Total: 33
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]: Resources:
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:           Skipped: 21
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:           Changed: 33
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:       Out of sync: 33
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:             Total: 155
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]: Time:
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:         Resources: 0.00
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:    Ovn metadata agent config: 0.08
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:    Neutron config: 0.41
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:    Transaction evaluation: 0.54
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:    Catalog application: 0.55
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:    Config retrieval: 0.67
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:          Last run: 1765008803
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:             Total: 0.55
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]: Version:
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:            Config: 1765008802
Dec 06 08:13:23 np0005548789.localdomain puppet-user[53191]:            Puppet: 7.10.0
Dec 06 08:13:23 np0005548789.localdomain systemd[1]: libpod-17558ff0747fce48f7968377192af9e4833071686c98fabf2053cfd3378b2915.scope: Deactivated successfully.
Dec 06 08:13:23 np0005548789.localdomain systemd[1]: libpod-17558ff0747fce48f7968377192af9e4833071686c98fabf2053cfd3378b2915.scope: Consumed 3.496s CPU time.
Dec 06 08:13:23 np0005548789.localdomain podman[53136]: 2025-12-06 08:13:23.752160307 +0000 UTC m=+4.059987462 container died 17558ff0747fce48f7968377192af9e4833071686c98fabf2053cfd3378b2915 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-server, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-neutron, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.component=openstack-neutron-server-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2025-11-19T00:23:27Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:13:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-17558ff0747fce48f7968377192af9e4833071686c98fabf2053cfd3378b2915-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2444c943287df738a2d4fa7e71e9d9b0754a541b609353f29be07295aed4c411-merged.mount: Deactivated successfully.
Dec 06 08:13:23 np0005548789.localdomain podman[53377]: 2025-12-06 08:13:23.86968371 +0000 UTC m=+0.110123234 container cleanup 17558ff0747fce48f7968377192af9e4833071686c98fabf2053cfd3378b2915 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-server-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, build-date=2025-11-19T00:23:27Z, container_name=container-puppet-neutron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-server, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4)
Dec 06 08:13:23 np0005548789.localdomain systemd[1]: libpod-conmon-17558ff0747fce48f7968377192af9e4833071686c98fabf2053cfd3378b2915.scope: Deactivated successfully.
Dec 06 08:13:23 np0005548789.localdomain python3[51252]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005548789 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::ovn_metadata
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005548789', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Dec 06 08:13:24 np0005548789.localdomain sudo[51250]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:24 np0005548789.localdomain sudo[53428]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucjwvqxtwfyrglviyuyegoigxefmdxdz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:24 np0005548789.localdomain sudo[53428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:24 np0005548789.localdomain python3[53430]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:24 np0005548789.localdomain sudo[53428]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:25 np0005548789.localdomain sudo[53444]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krpfcmxjbygzjxmbiuouxqnthabxejjr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:25 np0005548789.localdomain sudo[53444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:25 np0005548789.localdomain sudo[53444]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:25 np0005548789.localdomain sudo[53460]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izzaddmwyiyluagbtgfybfamficpwnkk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:25 np0005548789.localdomain sudo[53460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:25 np0005548789.localdomain python3[53462]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:13:25 np0005548789.localdomain sudo[53460]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:26 np0005548789.localdomain sudo[53510]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gprprmfpatllisupenpleftzncfzerqt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:26 np0005548789.localdomain sudo[53510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:26 np0005548789.localdomain python3[53512]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:13:26 np0005548789.localdomain sudo[53510]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:26 np0005548789.localdomain sudo[53553]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrrgrgathjbqtquckvchpjrvoxefoqgx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:26 np0005548789.localdomain sudo[53553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:26 np0005548789.localdomain python3[53555]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008805.9646614-84320-120609074200236/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:26 np0005548789.localdomain sudo[53553]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:26 np0005548789.localdomain sudo[53615]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upetcbctlheulxcxclteioxdvevohiaf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:26 np0005548789.localdomain sudo[53615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:27 np0005548789.localdomain python3[53617]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:13:27 np0005548789.localdomain sudo[53615]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:27 np0005548789.localdomain sudo[53658]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqihtxawhdvddyyutkyfeibelivvkgrh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:27 np0005548789.localdomain sudo[53658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:27 np0005548789.localdomain python3[53660]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008806.766931-84320-138114916142107/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:27 np0005548789.localdomain sudo[53658]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:27 np0005548789.localdomain sudo[53720]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebvfdfqvnfbyumvimkeikysltqdkoghi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:27 np0005548789.localdomain sudo[53720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:28 np0005548789.localdomain python3[53722]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:13:28 np0005548789.localdomain sudo[53720]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:28 np0005548789.localdomain sudo[53763]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnxkxhanspowrjvekmkavdfyieskptne ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:28 np0005548789.localdomain sudo[53763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:28 np0005548789.localdomain python3[53765]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008807.7425575-84395-17108290442330/source dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:28 np0005548789.localdomain sudo[53763]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:28 np0005548789.localdomain sudo[53825]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbzuuiithpvoquyupwmjvglcyqksvxdq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:28 np0005548789.localdomain sudo[53825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:29 np0005548789.localdomain python3[53827]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:13:29 np0005548789.localdomain sudo[53825]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:29 np0005548789.localdomain sudo[53868]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjcvcitwairzfvcvzoiepjcoyykqdrku ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:29 np0005548789.localdomain sudo[53868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:29 np0005548789.localdomain python3[53870]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008808.7203999-84416-27283574029558/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:29 np0005548789.localdomain sudo[53868]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:29 np0005548789.localdomain sudo[53898]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vigsqtailsvgauftalchiqtagalrpzff ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:29 np0005548789.localdomain sudo[53898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:29 np0005548789.localdomain python3[53900]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:13:29 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:13:30 np0005548789.localdomain systemd-rc-local-generator[53922]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:13:30 np0005548789.localdomain systemd-sysv-generator[53926]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:13:30 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:13:30 np0005548789.localdomain systemd[1]: Starting dnf makecache...
Dec 06 08:13:30 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:13:30 np0005548789.localdomain systemd-sysv-generator[53968]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:13:30 np0005548789.localdomain systemd-rc-local-generator[53963]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:13:30 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:13:30 np0005548789.localdomain dnf[53938]: Updating Subscription Management repositories.
Dec 06 08:13:30 np0005548789.localdomain systemd[1]: Starting TripleO Container Shutdown...
Dec 06 08:13:30 np0005548789.localdomain systemd[1]: Finished TripleO Container Shutdown.
Dec 06 08:13:30 np0005548789.localdomain sudo[53898]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:30 np0005548789.localdomain sudo[54022]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otjdkvlytbrpqmzhzrkucmdtlbpuqlyq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:30 np0005548789.localdomain sudo[54022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:30 np0005548789.localdomain python3[54024]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:13:31 np0005548789.localdomain sudo[54022]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:31 np0005548789.localdomain sudo[54065]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scqsvpazvcbnqlteqearpoqlxkiybhnj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:31 np0005548789.localdomain sudo[54065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:31 np0005548789.localdomain python3[54067]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008810.6624763-84495-27298766745223/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:31 np0005548789.localdomain sudo[54065]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:31 np0005548789.localdomain sudo[54127]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehhtrepfrripuvifzcvfqsubppguvvpp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:31 np0005548789.localdomain sudo[54127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:31 np0005548789.localdomain python3[54129]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:13:31 np0005548789.localdomain sudo[54127]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:32 np0005548789.localdomain sudo[54170]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iaifealqbeherwbsycqqznlknyaguifx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:32 np0005548789.localdomain sudo[54170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:32 np0005548789.localdomain dnf[53938]: Failed determining last makecache time.
Dec 06 08:13:32 np0005548789.localdomain python3[54172]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008811.5923584-84532-23126471426552/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:32 np0005548789.localdomain sudo[54170]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:32 np0005548789.localdomain dnf[53938]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   32 kB/s | 4.1 kB     00:00
Dec 06 08:13:32 np0005548789.localdomain sudo[54202]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulihsmedvlcwqyylevcubhnehvgmlyhd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:32 np0005548789.localdomain sudo[54202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:32 np0005548789.localdomain dnf[53938]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   47 kB/s | 4.1 kB     00:00
Dec 06 08:13:32 np0005548789.localdomain dnf[53938]: Red Hat OpenStack Platform 17.1 for RHEL 9 x86_  48 kB/s | 4.0 kB     00:00
Dec 06 08:13:32 np0005548789.localdomain python3[54204]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:13:32 np0005548789.localdomain dnf[53938]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  62 kB/s | 4.5 kB     00:00
Dec 06 08:13:32 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:13:32 np0005548789.localdomain systemd-sysv-generator[54239]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:13:32 np0005548789.localdomain systemd-rc-local-generator[54233]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:13:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:13:32 np0005548789.localdomain dnf[53938]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  48 kB/s | 4.5 kB     00:00
Dec 06 08:13:33 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:13:33 np0005548789.localdomain dnf[53938]: Fast Datapath for RHEL 9 x86_64 (RPMs)           47 kB/s | 4.0 kB     00:00
Dec 06 08:13:33 np0005548789.localdomain systemd-sysv-generator[54278]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:13:33 np0005548789.localdomain systemd-rc-local-generator[54272]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:13:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:13:33 np0005548789.localdomain dnf[53938]: Red Hat Enterprise Linux 9 for x86_64 - High Av  48 kB/s | 4.0 kB     00:00
Dec 06 08:13:33 np0005548789.localdomain systemd[1]: Starting Create netns directory...
Dec 06 08:13:33 np0005548789.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 08:13:33 np0005548789.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 08:13:33 np0005548789.localdomain systemd[1]: Finished Create netns directory.
Dec 06 08:13:33 np0005548789.localdomain sudo[54202]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:33 np0005548789.localdomain dnf[53938]: Metadata cache created.
Dec 06 08:13:33 np0005548789.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 06 08:13:33 np0005548789.localdomain systemd[1]: Finished dnf makecache.
Dec 06 08:13:33 np0005548789.localdomain systemd[1]: dnf-makecache.service: Consumed 2.736s CPU time.
Dec 06 08:13:33 np0005548789.localdomain sudo[54301]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivxiqanhlabixuntawgpevruraywrjbh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:33 np0005548789.localdomain sudo[54301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: e8f60832f8f2382eeceefcaaff307d45
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: 4767aaabc3de112d8791c290aa2b669d
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 18576754feb36b85b5c8742ad9b5643d
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: 179caa3982511c1fd3314b961771f96c
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: 179caa3982511c1fd3314b961771f96c
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: 179caa3982511c1fd3314b961771f96c
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: 179caa3982511c1fd3314b961771f96c
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: 179caa3982511c1fd3314b961771f96c
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: 179caa3982511c1fd3314b961771f96c
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: 7a657a42c3cbd75086c59cf211d6fafe
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 728090aef247cfdd273031dadf6d1125
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 728090aef247cfdd273031dadf6d1125
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: 179caa3982511c1fd3314b961771f96c
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: 179caa3982511c1fd3314b961771f96c
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: 270cf6e6b67cba1ef197c7fa89d5bb20
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c
Dec 06 08:13:33 np0005548789.localdomain python3[54303]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: 179caa3982511c1fd3314b961771f96c
Dec 06 08:13:33 np0005548789.localdomain sudo[54301]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:34 np0005548789.localdomain sudo[54317]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwyjiufcsrfbfdworncxmitkkcfjesaz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:34 np0005548789.localdomain sudo[54317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:34 np0005548789.localdomain sudo[54317]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:35 np0005548789.localdomain sudo[54359]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ernobwcvffmpphyponyhsmlwgrquupcb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:35 np0005548789.localdomain sudo[54359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:35 np0005548789.localdomain python3[54361]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 06 08:13:35 np0005548789.localdomain podman[54400]: 2025-12-06 08:13:35.579806028 +0000 UTC m=+0.087750040 container create 977f365293d6490c82213a07a5e48ea92b9bd55fc3f3bbe6f0c9ec1022b39471 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-type=git, container_name=metrics_qdr_init_logs, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4)
Dec 06 08:13:35 np0005548789.localdomain systemd[1]: Started libpod-conmon-977f365293d6490c82213a07a5e48ea92b9bd55fc3f3bbe6f0c9ec1022b39471.scope.
Dec 06 08:13:35 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:35 np0005548789.localdomain podman[54400]: 2025-12-06 08:13:35.537041863 +0000 UTC m=+0.044985845 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 06 08:13:35 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4cfdd88bd7d8f375c1733109d5a26e8b3ffb2befe75a3de2b4653848e69b6e4/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:35 np0005548789.localdomain podman[54400]: 2025-12-06 08:13:35.65184499 +0000 UTC m=+0.159788972 container init 977f365293d6490c82213a07a5e48ea92b9bd55fc3f3bbe6f0c9ec1022b39471 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step1, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team)
Dec 06 08:13:35 np0005548789.localdomain podman[54400]: 2025-12-06 08:13:35.666257727 +0000 UTC m=+0.174201719 container start 977f365293d6490c82213a07a5e48ea92b9bd55fc3f3bbe6f0c9ec1022b39471 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, container_name=metrics_qdr_init_logs, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:13:35 np0005548789.localdomain podman[54400]: 2025-12-06 08:13:35.666656169 +0000 UTC m=+0.174600211 container attach 977f365293d6490c82213a07a5e48ea92b9bd55fc3f3bbe6f0c9ec1022b39471 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step1, release=1761123044, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr_init_logs, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:13:35 np0005548789.localdomain systemd[1]: libpod-977f365293d6490c82213a07a5e48ea92b9bd55fc3f3bbe6f0c9ec1022b39471.scope: Deactivated successfully.
Dec 06 08:13:35 np0005548789.localdomain podman[54400]: 2025-12-06 08:13:35.671136828 +0000 UTC m=+0.179080830 container died 977f365293d6490c82213a07a5e48ea92b9bd55fc3f3bbe6f0c9ec1022b39471 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr_init_logs, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044)
Dec 06 08:13:35 np0005548789.localdomain podman[54420]: 2025-12-06 08:13:35.771149247 +0000 UTC m=+0.086995957 container cleanup 977f365293d6490c82213a07a5e48ea92b9bd55fc3f3bbe6f0c9ec1022b39471 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=metrics_qdr_init_logs, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, release=1761123044, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:13:35 np0005548789.localdomain systemd[1]: libpod-conmon-977f365293d6490c82213a07a5e48ea92b9bd55fc3f3bbe6f0c9ec1022b39471.scope: Deactivated successfully.
Dec 06 08:13:35 np0005548789.localdomain python3[54361]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd
Dec 06 08:13:36 np0005548789.localdomain podman[54499]: 2025-12-06 08:13:36.228330274 +0000 UTC m=+0.083507499 container create 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:13:36 np0005548789.localdomain systemd[1]: Started libpod-conmon-203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.scope.
Dec 06 08:13:36 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:13:36 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/beaf327340ccd7215a759765519263eed11e8999b460cf785f7dbab3207ce8ee/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:36 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/beaf327340ccd7215a759765519263eed11e8999b460cf785f7dbab3207ce8ee/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Dec 06 08:13:36 np0005548789.localdomain podman[54499]: 2025-12-06 08:13:36.188096587 +0000 UTC m=+0.043273892 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 06 08:13:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:13:36 np0005548789.localdomain podman[54499]: 2025-12-06 08:13:36.30469197 +0000 UTC m=+0.159869275 container init 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64)
Dec 06 08:13:36 np0005548789.localdomain sudo[54520]: qdrouterd : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:13:36 np0005548789.localdomain sudo[54520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42465)
Dec 06 08:13:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:13:36 np0005548789.localdomain podman[54499]: 2025-12-06 08:13:36.339475868 +0000 UTC m=+0.194653103 container start 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, container_name=metrics_qdr)
Dec 06 08:13:36 np0005548789.localdomain python3[54361]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=e8f60832f8f2382eeceefcaaff307d45 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Dec 06 08:13:36 np0005548789.localdomain sudo[54520]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:36 np0005548789.localdomain podman[54521]: 2025-12-06 08:13:36.483315295 +0000 UTC m=+0.139164123 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, release=1761123044, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64)
Dec 06 08:13:36 np0005548789.localdomain systemd[1]: tmp-crun.n5AVBM.mount: Deactivated successfully.
Dec 06 08:13:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f4cfdd88bd7d8f375c1733109d5a26e8b3ffb2befe75a3de2b4653848e69b6e4-merged.mount: Deactivated successfully.
Dec 06 08:13:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-977f365293d6490c82213a07a5e48ea92b9bd55fc3f3bbe6f0c9ec1022b39471-userdata-shm.mount: Deactivated successfully.
Dec 06 08:13:36 np0005548789.localdomain sudo[54359]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:36 np0005548789.localdomain podman[54521]: 2025-12-06 08:13:36.727238484 +0000 UTC m=+0.383087312 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd)
Dec 06 08:13:36 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:13:36 np0005548789.localdomain sudo[54595]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knhctfqnylkurmrlkwwmeuxlbpjgccqw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:36 np0005548789.localdomain sudo[54595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:36 np0005548789.localdomain python3[54597]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:36 np0005548789.localdomain sudo[54595]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:37 np0005548789.localdomain sudo[54611]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xognmrhgearewtvmpvbaozdbdmydmgjq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:37 np0005548789.localdomain sudo[54611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:37 np0005548789.localdomain python3[54613]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:13:37 np0005548789.localdomain sudo[54611]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:37 np0005548789.localdomain sudo[54672]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfkayxqmiccwlevqjzveqquomikwiuxp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:37 np0005548789.localdomain sudo[54672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:37 np0005548789.localdomain python3[54674]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765008817.2148407-84696-207396134986591/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:37 np0005548789.localdomain sudo[54672]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:37 np0005548789.localdomain sudo[54688]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxagybbbzwevxpgmmzziayctzhaknzxr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:37 np0005548789.localdomain sudo[54688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:38 np0005548789.localdomain python3[54690]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 08:13:38 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:13:38 np0005548789.localdomain systemd-rc-local-generator[54714]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:13:38 np0005548789.localdomain systemd-sysv-generator[54719]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:13:38 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:13:38 np0005548789.localdomain sudo[54688]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:38 np0005548789.localdomain sudo[54740]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzrfsjzsbjtjwwihskkbeolzobkmnlon ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:13:38 np0005548789.localdomain sudo[54740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:39 np0005548789.localdomain python3[54742]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:13:39 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:13:39 np0005548789.localdomain systemd-sysv-generator[54773]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:13:39 np0005548789.localdomain systemd-rc-local-generator[54768]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:13:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:13:39 np0005548789.localdomain systemd[1]: Starting metrics_qdr container...
Dec 06 08:13:39 np0005548789.localdomain systemd[1]: Started metrics_qdr container.
Dec 06 08:13:39 np0005548789.localdomain sudo[54740]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:39 np0005548789.localdomain sudo[54820]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-begdllrxtfqxxyscfkfwedkasvugkpio ; /usr/bin/python3
Dec 06 08:13:39 np0005548789.localdomain sudo[54820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:39 np0005548789.localdomain python3[54822]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:39 np0005548789.localdomain sudo[54820]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:40 np0005548789.localdomain sudo[54868]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amkouyqglpuyzqiuvngpeecqyimhvxui ; /usr/bin/python3
Dec 06 08:13:40 np0005548789.localdomain sudo[54868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:40 np0005548789.localdomain sudo[54868]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:40 np0005548789.localdomain sudo[54911]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovwppwwivsxveglbbxwovsuuzsewmbke ; /usr/bin/python3
Dec 06 08:13:40 np0005548789.localdomain sudo[54911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:40 np0005548789.localdomain sudo[54911]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:41 np0005548789.localdomain sudo[54941]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgkhklkthgxesyikgqggkdrsufimbycy ; /usr/bin/python3
Dec 06 08:13:41 np0005548789.localdomain sudo[54941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:41 np0005548789.localdomain python3[54943]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005548789 step=1 update_config_hash_only=False
Dec 06 08:13:41 np0005548789.localdomain sudo[54941]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:41 np0005548789.localdomain sudo[54957]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frepttwmjudjxrderflwtxkpkzdidwpf ; /usr/bin/python3
Dec 06 08:13:41 np0005548789.localdomain sudo[54957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:41 np0005548789.localdomain python3[54959]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:13:41 np0005548789.localdomain sudo[54957]: pam_unix(sudo:session): session closed for user root
Dec 06 08:13:41 np0005548789.localdomain sudo[54973]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxmkhqyowiemdhjfjqgnhtdzovrwfvsk ; /usr/bin/python3
Dec 06 08:13:41 np0005548789.localdomain sudo[54973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:13:42 np0005548789.localdomain python3[54975]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 06 08:13:42 np0005548789.localdomain sudo[54973]: pam_unix(sudo:session): session closed for user root
Dec 06 08:14:01 np0005548789.localdomain sudo[54976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:14:01 np0005548789.localdomain sudo[54976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:14:01 np0005548789.localdomain sudo[54976]: pam_unix(sudo:session): session closed for user root
Dec 06 08:14:01 np0005548789.localdomain sudo[54991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:14:01 np0005548789.localdomain sudo[54991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:14:02 np0005548789.localdomain sudo[54991]: pam_unix(sudo:session): session closed for user root
Dec 06 08:14:03 np0005548789.localdomain sudo[55039]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:14:03 np0005548789.localdomain sudo[55039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:14:03 np0005548789.localdomain sudo[55039]: pam_unix(sudo:session): session closed for user root
Dec 06 08:14:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:14:06 np0005548789.localdomain podman[55054]: 2025-12-06 08:14:06.916306151 +0000 UTC m=+0.259745930 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:14:07 np0005548789.localdomain podman[55054]: 2025-12-06 08:14:07.098013051 +0000 UTC m=+0.259745930 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, config_id=tripleo_step1, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:14:07 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:14:11 np0005548789.localdomain sshd[55083]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:14:12 np0005548789.localdomain sshd[55083]: Received disconnect from 195.250.72.168 port 48378:11: Bye Bye [preauth]
Dec 06 08:14:12 np0005548789.localdomain sshd[55083]: Disconnected from authenticating user root 195.250.72.168 port 48378 [preauth]
Dec 06 08:14:15 np0005548789.localdomain sshd[55085]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:14:16 np0005548789.localdomain sshd[55085]: Received disconnect from 154.201.83.49 port 42108:11: Bye Bye [preauth]
Dec 06 08:14:16 np0005548789.localdomain sshd[55085]: Disconnected from authenticating user root 154.201.83.49 port 42108 [preauth]
Dec 06 08:14:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:14:37 np0005548789.localdomain podman[55088]: 2025-12-06 08:14:37.922845189 +0000 UTC m=+0.083528398 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, tcib_managed=true, distribution-scope=public, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Dec 06 08:14:38 np0005548789.localdomain podman[55088]: 2025-12-06 08:14:38.130730692 +0000 UTC m=+0.291413861 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, release=1761123044, tcib_managed=true, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Dec 06 08:14:38 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:15:03 np0005548789.localdomain sudo[55116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:15:03 np0005548789.localdomain sudo[55116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:15:03 np0005548789.localdomain sudo[55116]: pam_unix(sudo:session): session closed for user root
Dec 06 08:15:03 np0005548789.localdomain sudo[55131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:15:03 np0005548789.localdomain sudo[55131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:15:03 np0005548789.localdomain sudo[55131]: pam_unix(sudo:session): session closed for user root
Dec 06 08:15:04 np0005548789.localdomain sudo[55178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:15:04 np0005548789.localdomain sudo[55178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:15:04 np0005548789.localdomain sudo[55178]: pam_unix(sudo:session): session closed for user root
Dec 06 08:15:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:15:08 np0005548789.localdomain podman[55193]: 2025-12-06 08:15:08.921945215 +0000 UTC m=+0.079983321 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1761123044, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:15:09 np0005548789.localdomain podman[55193]: 2025-12-06 08:15:09.117359568 +0000 UTC m=+0.275397624 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step1, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.)
Dec 06 08:15:09 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:15:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:15:39 np0005548789.localdomain systemd[1]: tmp-crun.9ArRx9.mount: Deactivated successfully.
Dec 06 08:15:39 np0005548789.localdomain podman[55222]: 2025-12-06 08:15:39.928296018 +0000 UTC m=+0.087339686 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1)
Dec 06 08:15:40 np0005548789.localdomain podman[55222]: 2025-12-06 08:15:40.121204784 +0000 UTC m=+0.280248432 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12)
Dec 06 08:15:40 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:15:40 np0005548789.localdomain sshd[55252]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:15:41 np0005548789.localdomain sshd[55252]: Received disconnect from 195.250.72.168 port 54920:11: Bye Bye [preauth]
Dec 06 08:15:41 np0005548789.localdomain sshd[55252]: Disconnected from authenticating user root 195.250.72.168 port 54920 [preauth]
Dec 06 08:15:49 np0005548789.localdomain sshd[55254]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:15:51 np0005548789.localdomain sshd[55254]: Received disconnect from 154.201.83.49 port 56906:11: Bye Bye [preauth]
Dec 06 08:15:51 np0005548789.localdomain sshd[55254]: Disconnected from authenticating user root 154.201.83.49 port 56906 [preauth]
Dec 06 08:16:04 np0005548789.localdomain sudo[55256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:16:04 np0005548789.localdomain sudo[55256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:16:04 np0005548789.localdomain sudo[55256]: pam_unix(sudo:session): session closed for user root
Dec 06 08:16:04 np0005548789.localdomain sudo[55271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:16:04 np0005548789.localdomain sudo[55271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:16:05 np0005548789.localdomain sudo[55271]: pam_unix(sudo:session): session closed for user root
Dec 06 08:16:06 np0005548789.localdomain sudo[55317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:16:06 np0005548789.localdomain sudo[55317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:16:06 np0005548789.localdomain sudo[55317]: pam_unix(sudo:session): session closed for user root
Dec 06 08:16:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:16:10 np0005548789.localdomain podman[55332]: 2025-12-06 08:16:10.922885059 +0000 UTC m=+0.081911905 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, 
version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:16:11 np0005548789.localdomain podman[55332]: 2025-12-06 08:16:11.109966475 +0000 UTC m=+0.268993351 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z)
Dec 06 08:16:11 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:16:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:16:41 np0005548789.localdomain systemd[1]: tmp-crun.5zskWq.mount: Deactivated successfully.
Dec 06 08:16:41 np0005548789.localdomain podman[55363]: 2025-12-06 08:16:41.93435171 +0000 UTC m=+0.091101149 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, 
build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:16:42 np0005548789.localdomain podman[55363]: 2025-12-06 08:16:42.173448236 +0000 UTC m=+0.330197605 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, tcib_managed=true, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:16:42 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:17:06 np0005548789.localdomain sudo[55392]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:17:06 np0005548789.localdomain sudo[55392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:17:06 np0005548789.localdomain sudo[55392]: pam_unix(sudo:session): session closed for user root
Dec 06 08:17:06 np0005548789.localdomain sudo[55407]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:17:06 np0005548789.localdomain sudo[55407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:17:06 np0005548789.localdomain sudo[55407]: pam_unix(sudo:session): session closed for user root
Dec 06 08:17:07 np0005548789.localdomain sudo[55454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:17:07 np0005548789.localdomain sudo[55454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:17:07 np0005548789.localdomain sudo[55454]: pam_unix(sudo:session): session closed for user root
Dec 06 08:17:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:17:12 np0005548789.localdomain systemd[1]: tmp-crun.QC7XRG.mount: Deactivated successfully.
Dec 06 08:17:12 np0005548789.localdomain podman[55469]: 2025-12-06 08:17:12.921031341 +0000 UTC m=+0.083810382 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step1)
Dec 06 08:17:13 np0005548789.localdomain podman[55469]: 2025-12-06 08:17:13.160446928 +0000 UTC m=+0.323225899 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, tcib_managed=true)
Dec 06 08:17:13 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:17:29 np0005548789.localdomain sshd[55498]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:17:31 np0005548789.localdomain sshd[55498]: Received disconnect from 154.201.83.49 port 58926:11: Bye Bye [preauth]
Dec 06 08:17:31 np0005548789.localdomain sshd[55498]: Disconnected from authenticating user root 154.201.83.49 port 58926 [preauth]
Dec 06 08:17:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:17:43 np0005548789.localdomain podman[55500]: 2025-12-06 08:17:43.907013594 +0000 UTC m=+0.073025030 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-qdrouterd)
Dec 06 08:17:44 np0005548789.localdomain podman[55500]: 2025-12-06 08:17:44.128209206 +0000 UTC m=+0.294220612 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=metrics_qdr, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, version=17.1.12)
Dec 06 08:17:44 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:18:07 np0005548789.localdomain sudo[55530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:18:07 np0005548789.localdomain sudo[55530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:18:07 np0005548789.localdomain sudo[55530]: pam_unix(sudo:session): session closed for user root
Dec 06 08:18:07 np0005548789.localdomain sudo[55545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:18:07 np0005548789.localdomain sudo[55545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:18:08 np0005548789.localdomain sudo[55545]: pam_unix(sudo:session): session closed for user root
Dec 06 08:18:08 np0005548789.localdomain sudo[55592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:18:08 np0005548789.localdomain sudo[55592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:18:08 np0005548789.localdomain sudo[55592]: pam_unix(sudo:session): session closed for user root
Dec 06 08:18:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:18:14 np0005548789.localdomain podman[55607]: 2025-12-06 08:18:14.924109176 +0000 UTC m=+0.080923944 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, 
config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:18:15 np0005548789.localdomain podman[55607]: 2025-12-06 08:18:15.116090717 +0000 UTC m=+0.272905445 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, 
vcs-type=git, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:18:15 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:18:21 np0005548789.localdomain sshd[55637]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:18:23 np0005548789.localdomain sshd[55637]: Invalid user admin from 45.135.232.92 port 55290
Dec 06 08:18:23 np0005548789.localdomain sshd[55637]: Connection reset by invalid user admin 45.135.232.92 port 55290 [preauth]
Dec 06 08:18:23 np0005548789.localdomain sshd[55639]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:18:25 np0005548789.localdomain sshd[55639]: Connection reset by authenticating user root 45.135.232.92 port 55320 [preauth]
Dec 06 08:18:26 np0005548789.localdomain sshd[55641]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:18:27 np0005548789.localdomain sshd[55641]: Connection reset by authenticating user root 45.135.232.92 port 63244 [preauth]
Dec 06 08:18:27 np0005548789.localdomain sshd[55643]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:18:28 np0005548789.localdomain sshd[55643]: Invalid user guest from 45.135.232.92 port 63254
Dec 06 08:18:29 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 20 pg[2.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [3,1,5] r=1 lpr=20 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:29 np0005548789.localdomain sshd[55643]: Connection reset by invalid user guest 45.135.232.92 port 63254 [preauth]
Dec 06 08:18:29 np0005548789.localdomain sshd[55645]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:18:30 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 22 pg[3.0( empty local-lis/les=0/0 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [1,3,2] r=0 lpr=22 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:31 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 23 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [1,3,2] r=0 lpr=22 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:31 np0005548789.localdomain sshd[55645]: Connection reset by authenticating user root 45.135.232.92 port 63260 [preauth]
Dec 06 08:18:33 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 25 pg[5.0( empty local-lis/les=0/0 n=0 ec=25/25 lis/c=0/0 les/c/f=0/0/0 sis=25) [4,5,3] r=0 lpr=25 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:33 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 24 pg[4.0( empty local-lis/les=0/0 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [5,0,1] r=2 lpr=24 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:34 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 26 pg[5.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=0/0 les/c/f=0/0/0 sis=25) [4,5,3] r=0 lpr=25 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:37 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 29 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=29 pruub=15.217467308s) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 active pruub 1124.114501953s@ mbc={}] start_peering_interval up [3,1,5] -> [3,1,5], acting [3,1,5] -> [3,1,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:37 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 29 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=29 pruub=15.214870453s) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1124.114501953s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.1e( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.1f( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.1b( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.1c( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.a( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.9( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.8( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.7( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.5( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.4( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.3( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.6( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.2( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.1( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.b( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.c( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.e( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.f( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.10( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.11( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.d( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.13( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.12( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.14( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.15( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.17( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.16( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.18( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.19( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.1a( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:38 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 30 pg[2.1d( empty local-lis/les=20/21 n=0 ec=29/20 lis/c=20/20 les/c/f=21/21/0 sis=29) [3,1,5] r=1 lpr=29 pi=[20,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:39 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 31 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=31 pruub=15.685779572s) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active pruub 1126.021484375s@ mbc={}] start_peering_interval up [1,3,2] -> [1,3,2], acting [1,3,2] -> [1,3,2], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:39 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 31 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=31 pruub=10.205893517s) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 active pruub 1120.541748047s@ mbc={}] start_peering_interval up [5,0,1] -> [5,0,1], acting [5,0,1] -> [5,0,1], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:39 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 31 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=31 pruub=15.685779572s) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1126.021484375s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:39 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 31 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=31 pruub=10.202860832s) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1120.541748047s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.19( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.18( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.1a( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.f( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.e( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.c( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.1d( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.1( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.3( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.2( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.5( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.4( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.7( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.6( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.d( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.a( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.b( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.8( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.1b( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.9( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.16( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.14( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.12( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.15( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.13( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.17( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.10( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.11( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.1f( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.1e( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[4.1c( empty local-lis/les=24/25 n=0 ec=31/24 lis/c=24/24 les/c/f=25/25/0 sis=31) [5,0,1] r=2 lpr=31 pi=[24,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.1b( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.18( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.19( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.17( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.16( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.14( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.15( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.12( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.13( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.10( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.e( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.11( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.c( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.f( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.d( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.a( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.1( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.3( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.2( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.5( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.7( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.4( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.6( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.8( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.b( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.1a( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.1d( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.1c( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.1e( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.1f( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.9( empty local-lis/les=22/23 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.0( empty local-lis/les=31/32 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.1a( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.19( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.15( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.18( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.16( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.13( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.12( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.17( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.14( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.11( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.e( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.f( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.2( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.d( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.1( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.10( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.c( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.3( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.5( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.4( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.7( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.6( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.8( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.9( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.a( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.1d( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.1f( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.1c( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.1e( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.1b( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:40 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 32 pg[3.b( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=22/22 les/c/f=23/23/0 sis=31) [1,3,2] r=0 lpr=31 pi=[22,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:41 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 33 pg[5.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=8.847756386s) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 1117.332031250s@ mbc={}] start_peering_interval up [4,5,3] -> [4,5,3], acting [4,5,3] -> [4,5,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:41 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 33 pg[5.0( empty local-lis/les=25/26 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=8.847756386s) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown pruub 1117.332031250s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.19( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.16( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.15( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.14( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.17( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.13( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.11( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.12( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.10( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.e( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.f( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.d( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.c( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.4( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.1b( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.b( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.2( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.1( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.18( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.3( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.5( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.6( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.7( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.1a( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.8( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.a( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.9( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.1c( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.1f( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.1e( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.1d( empty local-lis/les=25/26 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.0( empty local-lis/les=33/34 n=0 ec=25/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.e( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.c( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.4( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.d( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.19( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.17( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.11( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.15( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.13( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.16( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.10( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.14( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.9( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.12( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.1e( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.f( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.7( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.8( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.5( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.1f( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.b( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.3( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.2( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.1c( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.1( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.18( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.a( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.1d( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.1a( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.1b( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:42 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 34 pg[5.6( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=25/25 les/c/f=26/26/0 sis=33) [4,5,3] r=0 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:44 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.0 deep-scrub starts
Dec 06 08:18:44 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.0 deep-scrub ok
Dec 06 08:18:45 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 3.0 deep-scrub starts
Dec 06 08:18:45 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 3.0 deep-scrub ok
Dec 06 08:18:45 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.e scrub starts
Dec 06 08:18:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:18:45 np0005548789.localdomain systemd[1]: tmp-crun.1VrhIS.mount: Deactivated successfully.
Dec 06 08:18:45 np0005548789.localdomain podman[55647]: 2025-12-06 08:18:45.925667846 +0000 UTC m=+0.085035641 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Dec 06 08:18:46 np0005548789.localdomain podman[55647]: 2025-12-06 08:18:46.122114945 +0000 UTC m=+0.281482780 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:18:46 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:18:46 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 35 pg[6.0( empty local-lis/les=0/0 n=0 ec=35/35 lis/c=0/0 les/c/f=0/0/0 sis=35) [5,0,1] r=2 lpr=35 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:46 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Dec 06 08:18:46 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec 06 08:18:47 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.4 deep-scrub starts
Dec 06 08:18:47 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.4 deep-scrub ok
Dec 06 08:18:48 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Dec 06 08:18:48 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Dec 06 08:18:48 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.d scrub starts
Dec 06 08:18:48 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.d scrub ok
Dec 06 08:18:48 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 37 pg[7.0( empty local-lis/les=0/0 n=0 ec=37/37 lis/c=0/0 les/c/f=0/0/0 sis=37) [0,1,5] r=1 lpr=37 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:49 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Dec 06 08:18:49 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Dec 06 08:18:50 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Dec 06 08:18:50 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Dec 06 08:18:50 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.c scrub starts
Dec 06 08:18:50 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.c scrub ok
Dec 06 08:18:52 np0005548789.localdomain sudo[55676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:18:52 np0005548789.localdomain sudo[55676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:18:52 np0005548789.localdomain sudo[55676]: pam_unix(sudo:session): session closed for user root
Dec 06 08:18:52 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Dec 06 08:18:52 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,3,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,5,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.1f( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.340268135s) [3,4,2] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518920898s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,2], acting [4,5,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.1f( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.340203285s) [3,4,2] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518920898s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,3,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[3.11( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,0,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.11( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338585854s) [3,4,5] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518066406s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,5], acting [4,5,3] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.1e( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338965416s) [5,3,4] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518676758s@ mbc={}] start_peering_interval up [4,5,3] -> [5,3,4], acting [4,5,3] -> [5,3,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.1c( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.339303017s) [0,4,2] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.519042969s@ mbc={}] start_peering_interval up [4,5,3] -> [0,4,2], acting [4,5,3] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.1d( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.339650154s) [5,0,4] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.519287109s@ mbc={}] start_peering_interval up [4,5,3] -> [5,0,4], acting [4,5,3] -> [5,0,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.1c( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.339261055s) [0,4,2] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.519042969s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.1e( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338898659s) [5,3,4] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518676758s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.1d( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.339530945s) [5,0,4] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.519287109s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.9( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338674545s) [0,4,2] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518676758s@ mbc={}] start_peering_interval up [4,5,3] -> [0,4,2], acting [4,5,3] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.11( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338496208s) [3,4,5] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518066406s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.9( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338641167s) [0,4,2] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518676758s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.7( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338445663s) [3,5,4] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518676758s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.a( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.339014053s) [2,1,3] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.519287109s@ mbc={}] start_peering_interval up [4,5,3] -> [2,1,3], acting [4,5,3] -> [2,1,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.7( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338404655s) [3,5,4] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518676758s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.a( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338947296s) [2,1,3] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.519287109s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.1a( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338918686s) [1,0,5] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.519287109s@ mbc={}] start_peering_interval up [4,5,3] -> [1,0,5], acting [4,5,3] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.8( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338329315s) [1,2,3] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518798828s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,3], acting [4,5,3] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.8( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338288307s) [1,2,3] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518798828s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.1a( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338768959s) [1,0,5] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.519287109s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.6( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338772774s) [2,0,4] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.519287109s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,4], acting [4,5,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.6( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338633537s) [2,0,4] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.519287109s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.18( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338482857s) [0,1,5] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.519165039s@ mbc={}] start_peering_interval up [4,5,3] -> [0,1,5], acting [4,5,3] -> [0,1,5], acting_primary 4 -> 0, up_primary 4 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.5( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338046074s) [5,3,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518798828s@ mbc={}] start_peering_interval up [4,5,3] -> [5,3,1], acting [4,5,3] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.18( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338452339s) [0,1,5] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.519165039s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.5( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.337989807s) [5,3,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518798828s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.3( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338069916s) [2,3,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518920898s@ mbc={}] start_peering_interval up [4,5,3] -> [2,3,1], acting [4,5,3] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.1( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338170052s) [3,5,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.519165039s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,1], acting [4,5,3] -> [3,5,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.1( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338128090s) [3,5,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.519165039s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.2( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.337833405s) [0,5,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518920898s@ mbc={}] start_peering_interval up [4,5,3] -> [0,5,1], acting [4,5,3] -> [0,5,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.3( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.337875366s) [2,3,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518920898s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.1b( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338096619s) [0,5,4] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.519287109s@ mbc={}] start_peering_interval up [4,5,3] -> [0,5,4], acting [4,5,3] -> [0,5,4], acting_primary 4 -> 0, up_primary 4 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.1b( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.338067055s) [0,5,4] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.519287109s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.2( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.337771416s) [0,5,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518920898s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.4( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.336598396s) [1,2,3] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.517944336s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,3], acting [4,5,3] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.4( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.336528778s) [1,2,3] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.517944336s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.12( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,2,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.15( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.336084366s) [3,2,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518310547s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,1], acting [4,5,3] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.c( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.335754395s) [5,0,4] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.517944336s@ mbc={}] start_peering_interval up [4,5,3] -> [5,0,4], acting [4,5,3] -> [5,0,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.c( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.335641861s) [5,0,4] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.517944336s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.d( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.335768700s) [4,0,2] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518066406s@ mbc={}] start_peering_interval up [4,5,3] -> [4,0,2], acting [4,5,3] -> [4,0,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.d( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.335768700s) [4,0,2] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.518066406s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.f( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.336028099s) [0,1,2] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518676758s@ mbc={}] start_peering_interval up [4,5,3] -> [0,1,2], acting [4,5,3] -> [0,1,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.f( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.335985184s) [0,1,2] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518676758s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.10( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.335522652s) [0,4,5] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518310547s@ mbc={}] start_peering_interval up [4,5,3] -> [0,4,5], acting [4,5,3] -> [0,4,5], acting_primary 4 -> 0, up_primary 4 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.14( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.335630417s) [2,4,0] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518554688s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.10( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.335442543s) [0,4,5] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518310547s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.14( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.335517883s) [2,4,0] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518554688s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.12( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.335613251s) [1,3,5] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518676758s@ mbc={}] start_peering_interval up [4,5,3] -> [1,3,5], acting [4,5,3] -> [1,3,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.15( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.335433006s) [3,2,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518310547s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.12( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.335539818s) [1,3,5] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518676758s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.17( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.334833145s) [5,1,0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.517944336s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,0], acting [4,5,3] -> [5,1,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.17( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.334788322s) [5,1,0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.517944336s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.16( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.334922791s) [3,5,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.518310547s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,1], acting [4,5,3] -> [3,5,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.19( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.334559441s) [2,3,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.517944336s@ mbc={}] start_peering_interval up [4,5,3] -> [2,3,1], acting [4,5,3] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.16( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.334863663s) [3,5,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.518310547s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[5.19( empty local-lis/les=33/34 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=13.334429741s) [2,3,1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.517944336s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.10( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,5,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [1,0,5] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.289616585s) [2,3,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.385009766s@ mbc={}] start_peering_interval up [1,3,2] -> [2,3,4], acting [1,3,2] -> [2,3,4], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.289537430s) [2,3,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.385009766s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.b( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.289740562s) [2,0,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.385375977s@ mbc={}] start_peering_interval up [1,3,2] -> [2,0,1], acting [1,3,2] -> [2,0,1], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.1b( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.242539406s) [1,0,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.338378906s@ mbc={}] start_peering_interval up [3,1,5] -> [1,0,2], acting [3,1,5] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.242082596s) [1,5,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.337890625s@ mbc={}] start_peering_interval up [3,1,5] -> [1,5,3], acting [3,1,5] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.1b( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.242539406s) [1,0,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.338378906s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.288921356s) [1,0,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.384887695s@ mbc={}] start_peering_interval up [1,3,2] -> [1,0,2], acting [1,3,2] -> [1,0,2], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.242082596s) [1,5,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.337890625s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.f( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,0,5] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.288921356s) [1,0,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1135.384887695s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.b( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.289569855s) [2,0,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.385375977s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.286011696s) [4,5,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.382202148s@ mbc={}] start_peering_interval up [1,3,2] -> [4,5,0], acting [1,3,2] -> [4,5,0], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.288366318s) [4,3,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.384765625s@ mbc={}] start_peering_interval up [1,3,2] -> [4,3,2], acting [1,3,2] -> [4,3,2], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.285967827s) [4,5,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.382202148s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.288305283s) [4,3,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.384765625s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.240776062s) [3,1,2] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.337524414s@ mbc={}] start_peering_interval up [3,1,5] -> [3,1,2], acting [3,1,5] -> [3,1,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.4( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.285346985s) [2,1,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.382080078s@ mbc={}] start_peering_interval up [1,3,2] -> [2,1,3], acting [1,3,2] -> [2,1,3], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,0,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.4( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.285290718s) [2,1,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.382080078s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.240674973s) [3,1,2] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.337524414s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[3.1a( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,5,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.240542412s) [1,2,0] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.337890625s@ mbc={}] start_peering_interval up [3,1,5] -> [1,2,0], acting [3,1,5] -> [1,2,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.240542412s) [1,2,0] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.337890625s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.239907265s) [3,2,1] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.337524414s@ mbc={}] start_peering_interval up [3,1,5] -> [3,2,1], acting [3,1,5] -> [3,2,1], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.239867210s) [3,2,1] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.337524414s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.284351349s) [3,5,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.381958008s@ mbc={}] start_peering_interval up [1,3,2] -> [3,5,4], acting [1,3,2] -> [3,5,4], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [1,2,3] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.284281731s) [3,5,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.381958008s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,3,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.2( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.245015144s) [3,5,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.343261719s@ mbc={}] start_peering_interval up [3,1,5] -> [3,5,4], acting [3,1,5] -> [3,5,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.283912659s) [2,1,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.382080078s@ mbc={}] start_peering_interval up [1,3,2] -> [2,1,3], acting [1,3,2] -> [2,1,3], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.283845901s) [2,1,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.382080078s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.2( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.244854927s) [3,5,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.343261719s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,3,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.b( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.238761902s) [1,3,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.337524414s@ mbc={}] start_peering_interval up [3,1,5] -> [1,3,2], acting [3,1,5] -> [1,3,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.d( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.282959938s) [3,1,2] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.381713867s@ mbc={}] start_peering_interval up [1,3,2] -> [3,1,2], acting [1,3,2] -> [3,1,2], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.b( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.238761902s) [1,3,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.337524414s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.285868645s) [3,5,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.384887695s@ mbc={}] start_peering_interval up [1,3,2] -> [3,5,1], acting [1,3,2] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.d( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.282713890s) [3,1,2] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.381713867s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.1( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.282619476s) [2,3,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.381713867s@ mbc={}] start_peering_interval up [1,3,2] -> [2,3,4], acting [1,3,2] -> [2,3,4], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.1( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.282588005s) [2,3,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.381713867s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.285821915s) [3,5,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.384887695s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.237878799s) [1,0,5] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.337280273s@ mbc={}] start_peering_interval up [3,1,5] -> [1,0,5], acting [3,1,5] -> [1,0,5], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.237878799s) [1,0,5] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.337280273s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.8( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [1,2,3] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.c( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.237279892s) [1,5,0] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.337158203s@ mbc={}] start_peering_interval up [3,1,5] -> [1,5,0], acting [3,1,5] -> [1,5,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.c( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.237279892s) [1,5,0] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.337158203s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.13( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.279194832s) [3,2,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.379760742s@ mbc={}] start_peering_interval up [1,3,2] -> [3,2,1], acting [1,3,2] -> [3,2,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.13( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.279093742s) [3,2,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.379760742s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[4.1f( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,3,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.12( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.235630035s) [4,2,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.336669922s@ mbc={}] start_peering_interval up [3,1,5] -> [4,2,3], acting [3,1,5] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.12( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.235581398s) [4,2,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.336669922s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.10( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.280607224s) [3,1,5] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.381835938s@ mbc={}] start_peering_interval up [1,3,2] -> [3,1,5], acting [1,3,2] -> [3,1,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.12( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.278309822s) [2,3,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.379760742s@ mbc={}] start_peering_interval up [1,3,2] -> [2,3,1], acting [1,3,2] -> [2,3,1], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.11( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.280846596s) [4,0,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.381591797s@ mbc={}] start_peering_interval up [1,3,2] -> [4,0,2], acting [1,3,2] -> [4,0,2], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.11( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.279998779s) [4,0,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.381591797s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.13( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.236655235s) [1,0,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.338256836s@ mbc={}] start_peering_interval up [3,1,5] -> [1,0,2], acting [3,1,5] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.13( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.236655235s) [1,0,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.338256836s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.10( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.280501366s) [3,1,5] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.381835938s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.14( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.277237892s) [3,4,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.380004883s@ mbc={}] start_peering_interval up [1,3,2] -> [3,4,2], acting [1,3,2] -> [3,4,2], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.15( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.276803970s) [1,3,5] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.379516602s@ mbc={}] start_peering_interval up [1,3,2] -> [1,3,5], acting [1,3,2] -> [1,3,5], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.14( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.234052658s) [3,4,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.336669922s@ mbc={}] start_peering_interval up [3,1,5] -> [3,4,2], acting [3,1,5] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.12( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.277863503s) [2,3,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.379760742s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[4.6( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,5,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.233534813s) [1,2,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.336669922s@ mbc={}] start_peering_interval up [3,1,5] -> [1,2,3], acting [3,1,5] -> [1,2,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.233534813s) [1,2,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.336669922s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.15( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.276803970s) [1,3,5] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1135.379516602s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[4.1d( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,3,5] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.14( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.276491165s) [3,4,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.380004883s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [1,3,5] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.14( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.232672691s) [3,4,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.336669922s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.238903999s) [3,1,2] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.343261719s@ mbc={}] start_peering_interval up [3,1,5] -> [3,1,2], acting [3,1,5] -> [3,1,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,0,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.238856316s) [3,1,2] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.343261719s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.18( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.238903999s) [4,3,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.343505859s@ mbc={}] start_peering_interval up [3,1,5] -> [4,3,2], acting [3,1,5] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.16( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.275229454s) [3,2,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.379638672s@ mbc={}] start_peering_interval up [1,3,2] -> [3,2,4], acting [1,3,2] -> [3,2,4], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[4.14( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,2,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.1c( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.271645546s) [1,2,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.376586914s@ mbc={}] start_peering_interval up [5,0,1] -> [1,2,3], acting [5,0,1] -> [1,2,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.18( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.238822937s) [4,3,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.343505859s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.1c( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.271645546s) [1,2,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1135.376586914s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.1a( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.238750458s) [3,4,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.343505859s@ mbc={}] start_peering_interval up [3,1,5] -> [3,4,2], acting [3,1,5] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.16( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.274788857s) [3,2,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.379638672s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[4.15( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,2,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.279878616s) [4,0,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.385009766s@ mbc={}] start_peering_interval up [1,3,2] -> [4,0,2], acting [1,3,2] -> [4,0,2], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.279705048s) [4,0,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.385009766s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.19( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.237913132s) [2,0,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.343505859s@ mbc={}] start_peering_interval up [3,1,5] -> [2,0,4], acting [3,1,5] -> [2,0,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.19( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.237832069s) [2,0,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.343505859s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.1e( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.270583153s) [2,0,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.376586914s@ mbc={}] start_peering_interval up [5,0,1] -> [2,0,1], acting [5,0,1] -> [2,0,1], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.18( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.273570061s) [5,1,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.379516602s@ mbc={}] start_peering_interval up [1,3,2] -> [5,1,3], acting [1,3,2] -> [5,1,3], acting_primary 1 -> 5, up_primary 1 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.18( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.273510933s) [5,1,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.379516602s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.1f( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.269839287s) [4,3,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.376098633s@ mbc={}] start_peering_interval up [5,0,1] -> [4,3,2], acting [5,0,1] -> [4,3,2], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.1a( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.238442421s) [3,4,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.343505859s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.1f( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.269789696s) [4,3,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.376098633s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.19( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272933960s) [5,0,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.379516602s@ mbc={}] start_peering_interval up [1,3,2] -> [5,0,1], acting [1,3,2] -> [5,0,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.11( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.270020485s) [2,0,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.376464844s@ mbc={}] start_peering_interval up [5,0,1] -> [2,0,4], acting [5,0,1] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.1e( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.270485878s) [2,0,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.376586914s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.10( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.269218445s) [5,4,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.376098633s@ mbc={}] start_peering_interval up [5,0,1] -> [5,4,3], acting [5,0,1] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.10( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.269165993s) [5,4,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.376098633s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.17( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272994995s) [5,3,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.379882812s@ mbc={}] start_peering_interval up [1,3,2] -> [5,3,1], acting [1,3,2] -> [5,3,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.11( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.269561768s) [2,0,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.376464844s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.13( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272998810s) [0,1,5] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.380126953s@ mbc={}] start_peering_interval up [5,0,1] -> [0,1,5], acting [5,0,1] -> [0,1,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.13( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272933960s) [0,1,5] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.380126953s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.12( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.268793106s) [5,1,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.376098633s@ mbc={}] start_peering_interval up [5,0,1] -> [5,1,3], acting [5,0,1] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.12( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.268742561s) [5,1,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.376098633s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.15( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.267452240s) [4,2,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.374877930s@ mbc={}] start_peering_interval up [5,0,1] -> [4,2,0], acting [5,0,1] -> [4,2,0], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.19( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272860527s) [5,0,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.379516602s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.14( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.267168999s) [4,2,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.374755859s@ mbc={}] start_peering_interval up [5,0,1] -> [4,2,0], acting [5,0,1] -> [4,2,0], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.17( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272921562s) [5,3,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.379882812s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.14( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.267083168s) [4,2,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.374755859s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.17( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.268669128s) [5,3,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.376586914s@ mbc={}] start_peering_interval up [5,0,1] -> [5,3,4], acting [5,0,1] -> [5,3,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.15( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.267400742s) [4,2,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.374877930s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.10( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.229127884s) [4,5,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.337280273s@ mbc={}] start_peering_interval up [3,1,5] -> [4,5,3], acting [3,1,5] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.10( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.229074478s) [4,5,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.337280273s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.11( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.228689194s) [0,2,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.337036133s@ mbc={}] start_peering_interval up [3,1,5] -> [0,2,4], acting [3,1,5] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.9( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.266458511s) [1,5,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.374755859s@ mbc={}] start_peering_interval up [5,0,1] -> [1,5,0], acting [5,0,1] -> [1,5,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.f( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.228619576s) [4,0,5] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.337036133s@ mbc={}] start_peering_interval up [3,1,5] -> [4,0,5], acting [3,1,5] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.9( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.266458511s) [1,5,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1135.374755859s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.11( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.228550911s) [0,2,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.337036133s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.17( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.268606186s) [5,3,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.376586914s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.16( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.234749794s) [0,1,2] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.343383789s@ mbc={}] start_peering_interval up [3,1,5] -> [0,1,2], acting [3,1,5] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.8( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.265679359s) [1,3,5] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.374389648s@ mbc={}] start_peering_interval up [5,0,1] -> [1,3,5], acting [5,0,1] -> [1,3,5], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.16( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.234657288s) [0,1,2] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.343383789s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.e( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.228191376s) [2,4,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.337036133s@ mbc={}] start_peering_interval up [3,1,5] -> [2,4,3], acting [3,1,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.8( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.265679359s) [1,3,5] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1135.374389648s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.e( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.228139877s) [2,4,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.337036133s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.f( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272604942s) [0,1,2] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.381591797s@ mbc={}] start_peering_interval up [1,3,2] -> [0,1,2], acting [1,3,2] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.f( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.228571892s) [4,0,5] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.337036133s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.c( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272802353s) [0,5,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.381958008s@ mbc={}] start_peering_interval up [1,3,2] -> [0,5,1], acting [1,3,2] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.c( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272746086s) [0,5,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.381958008s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.f( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.272253990s) [0,1,2] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.381591797s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.a( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264956474s) [0,5,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.374389648s@ mbc={}] start_peering_interval up [5,0,1] -> [0,5,1], acting [5,0,1] -> [0,5,1], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.b( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.270475388s) [2,0,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.380004883s@ mbc={}] start_peering_interval up [5,0,1] -> [2,0,4], acting [5,0,1] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.b( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.270392418s) [2,0,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.380004883s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.d( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264524460s) [3,1,5] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.374267578s@ mbc={}] start_peering_interval up [5,0,1] -> [3,1,5], acting [5,0,1] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.6( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264867783s) [4,5,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.374633789s@ mbc={}] start_peering_interval up [5,0,1] -> [4,5,0], acting [5,0,1] -> [4,5,0], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.d( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264428139s) [3,1,5] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.374267578s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.6( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264818192s) [4,5,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.374633789s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.7( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264459610s) [2,1,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.374389648s@ mbc={}] start_peering_interval up [5,0,1] -> [2,1,3], acting [5,0,1] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.7( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264404297s) [2,1,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.374389648s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.1( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.227999687s) [2,4,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.338134766s@ mbc={}] start_peering_interval up [3,1,5] -> [2,4,3], acting [3,1,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.227627754s) [2,1,3] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.337768555s@ mbc={}] start_peering_interval up [3,1,5] -> [2,1,3], acting [3,1,5] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.1( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.227949142s) [2,4,3] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.338134766s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.227521896s) [2,1,3] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.337768555s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.4( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.263625145s) [5,1,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.374023438s@ mbc={}] start_peering_interval up [5,0,1] -> [5,1,3], acting [5,0,1] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.4( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.263570786s) [5,1,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.374023438s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.271447182s) [0,2,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.381958008s@ mbc={}] start_peering_interval up [1,3,2] -> [0,2,4], acting [1,3,2] -> [0,2,4], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.5( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264107704s) [3,1,2] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.374633789s@ mbc={}] start_peering_interval up [5,0,1] -> [3,1,2], acting [5,0,1] -> [3,1,2], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.271395683s) [0,2,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.381958008s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.5( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264052391s) [3,1,2] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.374633789s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.2( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.270923615s) [5,4,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.381591797s@ mbc={}] start_peering_interval up [1,3,2] -> [5,4,0], acting [1,3,2] -> [5,4,0], acting_primary 1 -> 5, up_primary 1 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.a( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.264909744s) [0,5,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.374389648s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.2( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.263178825s) [1,3,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.374023438s@ mbc={}] start_peering_interval up [5,0,1] -> [1,3,2], acting [5,0,1] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.2( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.270862579s) [5,4,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.381591797s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.226953506s) [5,1,3] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.337890625s@ mbc={}] start_peering_interval up [3,1,5] -> [5,1,3], acting [3,1,5] -> [5,1,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.226895332s) [5,1,3] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.337890625s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.3( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.262713432s) [1,3,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.374145508s@ mbc={}] start_peering_interval up [5,0,1] -> [1,3,2], acting [5,0,1] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.270758629s) [5,0,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.382080078s@ mbc={}] start_peering_interval up [1,3,2] -> [5,0,4], acting [1,3,2] -> [5,0,4], acting_primary 1 -> 5, up_primary 1 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.e( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.261864662s) [3,4,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.373291016s@ mbc={}] start_peering_interval up [5,0,1] -> [3,4,2], acting [5,0,1] -> [3,4,2], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.2( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.263178825s) [1,3,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1135.374023438s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.3( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.262713432s) [1,3,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1135.374145508s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.e( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.261789322s) [3,4,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.373291016s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.270571709s) [5,0,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.382080078s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.1( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.261681557s) [4,0,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.373413086s@ mbc={}] start_peering_interval up [5,0,1] -> [4,0,2], acting [5,0,1] -> [4,0,2], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.8( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.231292725s) [0,1,2] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.343017578s@ mbc={}] start_peering_interval up [3,1,5] -> [0,1,2], acting [3,1,5] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.8( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.231239319s) [0,1,2] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.343017578s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.226190567s) [5,0,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.338012695s@ mbc={}] start_peering_interval up [3,1,5] -> [5,0,4], acting [3,1,5] -> [5,0,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.1( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.261618614s) [4,0,2] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.373413086s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.f( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.262058258s) [2,4,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.374145508s@ mbc={}] start_peering_interval up [5,0,1] -> [2,4,3], acting [5,0,1] -> [2,4,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.225929260s) [5,0,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.338012695s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.c( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.260646820s) [0,5,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.373168945s@ mbc={}] start_peering_interval up [5,0,1] -> [0,5,1], acting [5,0,1] -> [0,5,1], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.c( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.260542870s) [0,5,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.373168945s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.1d( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.260731697s) [4,3,5] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.373413086s@ mbc={}] start_peering_interval up [5,0,1] -> [4,3,5], acting [5,0,1] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.1a( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.266613007s) [4,5,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.379516602s@ mbc={}] start_peering_interval up [1,3,2] -> [4,5,0], acting [1,3,2] -> [4,5,0], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.1d( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.260600090s) [4,3,5] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.373413086s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.1a( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.266546249s) [4,5,0] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.379516602s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.1a( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.260270119s) [3,2,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.373168945s@ mbc={}] start_peering_interval up [5,0,1] -> [3,2,4], acting [5,0,1] -> [3,2,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.1a( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.260220528s) [3,2,4] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.373168945s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.1c( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.230233192s) [4,3,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.343383789s@ mbc={}] start_peering_interval up [3,1,5] -> [4,3,2], acting [3,1,5] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.1b( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.261491776s) [3,5,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.374633789s@ mbc={}] start_peering_interval up [5,0,1] -> [3,5,1], acting [5,0,1] -> [3,5,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.1c( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.271749496s) [0,5,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.385009766s@ mbc={}] start_peering_interval up [1,3,2] -> [0,5,1], acting [1,3,2] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.1c( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.230112076s) [4,3,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.343383789s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.1c( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.271697044s) [0,5,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.385009766s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.1b( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.261339188s) [3,5,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.374633789s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.1d( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.229953766s) [4,3,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.343505859s@ mbc={}] start_peering_interval up [3,1,5] -> [4,3,2], acting [3,1,5] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.18( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.259502411s) [3,2,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.373168945s@ mbc={}] start_peering_interval up [5,0,1] -> [3,2,1], acting [5,0,1] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.1e( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.229522705s) [5,1,3] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.343261719s@ mbc={}] start_peering_interval up [3,1,5] -> [5,1,3], acting [3,1,5] -> [5,1,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.18( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.259428978s) [3,2,1] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.373168945s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.19( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.259214401s) [1,2,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.373168945s@ mbc={}] start_peering_interval up [5,0,1] -> [1,2,0], acting [5,0,1] -> [1,2,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.f( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.260226250s) [2,4,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.374145508s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[4.19( empty local-lis/les=31/32 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.259214401s) [1,2,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown pruub 1135.373168945s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.270941734s) [5,4,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active pruub 1135.385009766s@ mbc={}] start_peering_interval up [1,3,2] -> [5,4,3], acting [1,3,2] -> [5,4,3], acting_primary 1 -> 5, up_primary 1 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.1e( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.229468346s) [5,1,3] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.343261719s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=31/32 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41 pruub=11.270884514s) [5,4,3] r=-1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1135.385009766s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.1d( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.229892731s) [4,3,2] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.343505859s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.1f( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.228734016s) [2,3,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active pruub 1133.343261719s@ mbc={}] start_peering_interval up [3,1,5] -> [2,3,4], acting [3,1,5] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:18:53 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[2.1f( empty local-lis/les=29/30 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41 pruub=9.228308678s) [2,3,4] r=-1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.343261719s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,0,4] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[4.10( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,4,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[4.17( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,3,4] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[3.2( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,4,0] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.9( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [5,0,4] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [5,4,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [5,3,1] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.1a( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [3,4,2] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.14( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [3,4,2] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [3,2,4] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[3.14( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [3,4,2] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.17( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [5,1,0] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[4.f( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,4,3] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [3,5,1] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [2,3,4] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [3,2,1] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.2( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [3,5,4] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [3,5,4] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [3,4,2] r=1 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.1( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [2,4,3] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[4.b( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,0,4] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [2,3,1] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [2,3,1] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.e( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [2,4,3] r=1 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,0,4] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.a( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [2,1,3] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [2,0,4] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[3.1( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,3,4] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [3,2,4] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[3.1f( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [2,3,4] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [3,5,1] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[3.3( empty local-lis/les=0/0 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [0,2,4] r=2 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [0,2,4] r=2 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [0,1,5] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [0,5,1] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[2.18( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,3,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[2.12( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,2,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[4.1f( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,3,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [0,1,2] r=1 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[2.10( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,5,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[4.6( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,5,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[2.a( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [1,5,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[4.15( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,2,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[4.14( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,2,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[2.f( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,0,5] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[4.1( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,0,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[4.8( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [1,3,5] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[4.3( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [1,3,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[2.1c( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,3,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[2.1d( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [4,3,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[3.9( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,3,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[3.11( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,0,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[4.19( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [1,2,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[4.2( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [1,3,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[2.5( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [1,2,0] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[5.4( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [1,2,3] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[2.1b( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [1,0,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[2.d( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [1,0,5] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[2.b( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [1,3,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[5.1a( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [1,0,5] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[2.c( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [1,5,0] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[4.9( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [1,5,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[2.13( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [1,0,2] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[4.1c( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [1,2,3] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[5.8( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [1,2,3] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[2.15( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=29/29 les/c/f=30/30/0 sis=41) [1,2,3] r=0 lpr=41 pi=[29,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[4.1d( empty local-lis/les=41/42 n=0 ec=31/24 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,3,5] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[3.8( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,5,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[3.1b( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,0,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[5.d( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 42 pg[3.1a( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [4,5,0] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[5.12( empty local-lis/les=41/42 n=0 ec=33/25 lis/c=33/33 les/c/f=34/34/0 sis=41) [1,3,5] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[3.15( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [1,3,5] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 42 pg[3.1d( empty local-lis/les=41/42 n=0 ec=31/22 lis/c=31/31 les/c/f=32/32/0 sis=41) [1,0,2] r=0 lpr=41 pi=[31,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:18:54 np0005548789.localdomain sudo[55691]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:18:54 np0005548789.localdomain sudo[55691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:18:54 np0005548789.localdomain sudo[55691]: pam_unix(sudo:session): session closed for user root
Dec 06 08:18:55 np0005548789.localdomain sudo[55706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:18:55 np0005548789.localdomain sudo[55706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:18:55 np0005548789.localdomain sudo[55706]: pam_unix(sudo:session): session closed for user root
Dec 06 08:18:58 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Dec 06 08:18:58 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 3.e scrub starts
Dec 06 08:18:59 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 3.e scrub ok
Dec 06 08:19:03 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 3.1d deep-scrub starts
Dec 06 08:19:03 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 3.1d deep-scrub ok
Dec 06 08:19:04 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.b scrub starts
Dec 06 08:19:04 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.b scrub ok
Dec 06 08:19:08 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Dec 06 08:19:08 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Dec 06 08:19:08 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Dec 06 08:19:08 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Dec 06 08:19:09 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.d scrub starts
Dec 06 08:19:09 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.d scrub ok
Dec 06 08:19:10 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Dec 06 08:19:10 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Dec 06 08:19:10 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Dec 06 08:19:10 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Dec 06 08:19:11 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.c scrub starts
Dec 06 08:19:11 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.c scrub ok
Dec 06 08:19:12 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Dec 06 08:19:12 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Dec 06 08:19:13 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.a deep-scrub starts
Dec 06 08:19:13 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.a deep-scrub ok
Dec 06 08:19:16 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Dec 06 08:19:16 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Dec 06 08:19:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:19:16 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Dec 06 08:19:16 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Dec 06 08:19:16 np0005548789.localdomain podman[55721]: 2025-12-06 08:19:16.920607594 +0000 UTC m=+0.083521684 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, 
build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:19:17 np0005548789.localdomain podman[55721]: 2025-12-06 08:19:17.118236499 +0000 UTC m=+0.281150589 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:19:17 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:19:17 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Dec 06 08:19:17 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Dec 06 08:19:18 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.b scrub starts
Dec 06 08:19:18 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.b scrub ok
Dec 06 08:19:18 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Dec 06 08:19:19 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Dec 06 08:19:19 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Dec 06 08:19:19 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Dec 06 08:19:20 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Dec 06 08:19:20 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Dec 06 08:19:22 np0005548789.localdomain sudo[55765]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nysbdrxwrohmjyxhllhsjyoeepihfpcf ; /usr/bin/python3
Dec 06 08:19:22 np0005548789.localdomain sudo[55765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:22 np0005548789.localdomain python3[55767]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:22 np0005548789.localdomain sudo[55765]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:24 np0005548789.localdomain sudo[55781]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdvxsyrqofnwxiiypwixqhvzbzzxlaft ; /usr/bin/python3
Dec 06 08:19:24 np0005548789.localdomain sudo[55781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:24 np0005548789.localdomain python3[55783]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:24 np0005548789.localdomain sudo[55781]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:25 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Dec 06 08:19:25 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Dec 06 08:19:26 np0005548789.localdomain sudo[55797]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vofxdzukgnbfxiohalolhvzskzsxdrqe ; /usr/bin/python3
Dec 06 08:19:26 np0005548789.localdomain sudo[55797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:26 np0005548789.localdomain python3[55799]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:26 np0005548789.localdomain sudo[55797]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:27 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Dec 06 08:19:27 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Dec 06 08:19:28 np0005548789.localdomain sudo[55845]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgeugcrjonxbgckjcnmdfnxsoijpgsnr ; /usr/bin/python3
Dec 06 08:19:28 np0005548789.localdomain sudo[55845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:29 np0005548789.localdomain python3[55847]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:19:29 np0005548789.localdomain sudo[55845]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:29 np0005548789.localdomain sudo[55888]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzislospuvbishmziimfwkhhhjgurrxg ; /usr/bin/python3
Dec 06 08:19:29 np0005548789.localdomain sudo[55888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:29 np0005548789.localdomain python3[55890]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009168.8126051-92087-50419872536182/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=9d631b6552ddeaa0e75a39b18f2bdb583e0e85e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:29 np0005548789.localdomain sudo[55888]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:30 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Dec 06 08:19:30 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Dec 06 08:19:31 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Dec 06 08:19:31 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Dec 06 08:19:32 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Dec 06 08:19:32 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Dec 06 08:19:33 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Dec 06 08:19:33 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Dec 06 08:19:34 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Dec 06 08:19:34 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Dec 06 08:19:34 np0005548789.localdomain sudo[55951]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgknxcedyutnrlscdkrubsdcizuxszvt ; /usr/bin/python3
Dec 06 08:19:34 np0005548789.localdomain sudo[55951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:34 np0005548789.localdomain python3[55953]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:19:34 np0005548789.localdomain sudo[55951]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:34 np0005548789.localdomain sudo[55994]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aganlxmfstemknikwffxzfcehlfhmmpd ; /usr/bin/python3
Dec 06 08:19:34 np0005548789.localdomain sudo[55994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:35 np0005548789.localdomain python3[55996]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009174.3886807-92087-195413774088246/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=04fcaa63c42fa3b2b702e4421ebc774041538ebd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:35 np0005548789.localdomain sudo[55994]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:35 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Dec 06 08:19:35 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Dec 06 08:19:36 np0005548789.localdomain ceph-osd[32665]: osd.4 43 crush map has features 432629239337189376, adjusting msgr requires for clients
Dec 06 08:19:36 np0005548789.localdomain ceph-osd[32665]: osd.4 43 crush map has features 432629239337189376 was 288514051259245057, adjusting msgr requires for mons
Dec 06 08:19:36 np0005548789.localdomain ceph-osd[32665]: osd.4 43 crush map has features 3314933000854323200, adjusting msgr requires for osds
Dec 06 08:19:36 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 43 pg[2.1f( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=13.512729645s) [5,3,4] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1177.215209961s@ mbc={}] start_peering_interval up [2,3,4] -> [5,3,4], acting [2,3,4] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:36 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 43 pg[2.1f( empty local-lis/les=41/42 n=0 ec=29/20 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=13.512590408s) [5,3,4] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1177.215209961s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:37 np0005548789.localdomain ceph-osd[31726]: osd.1 43 crush map has features 432629239337189376, adjusting msgr requires for clients
Dec 06 08:19:37 np0005548789.localdomain ceph-osd[31726]: osd.1 43 crush map has features 432629239337189376 was 288514051259245057, adjusting msgr requires for mons
Dec 06 08:19:37 np0005548789.localdomain ceph-osd[31726]: osd.1 43 crush map has features 3314933000854323200, adjusting msgr requires for osds
Dec 06 08:19:39 np0005548789.localdomain sudo[56056]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jysfxyvntbbrasmwgsmvgeeytvmvngis ; /usr/bin/python3
Dec 06 08:19:39 np0005548789.localdomain sudo[56056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:39 np0005548789.localdomain python3[56058]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:19:40 np0005548789.localdomain sudo[56056]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:40 np0005548789.localdomain sudo[56099]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsxscadmgltmwqeyuikjbjqpusjygvff ; /usr/bin/python3
Dec 06 08:19:40 np0005548789.localdomain sudo[56099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:40 np0005548789.localdomain python3[56101]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009179.7421753-92087-77202260007466/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=0cb3e740065655621c29366f25db5e0ef0002cd5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:40 np0005548789.localdomain sudo[56099]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:40 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Dec 06 08:19:40 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Dec 06 08:19:41 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 46 pg[6.0( empty local-lis/les=35/36 n=0 ec=35/35 lis/c=35/35 les/c/f=36/36/0 sis=46 pruub=9.460670471s) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 active pruub 1181.762695312s@ mbc={}] start_peering_interval up [5,0,1] -> [5,0,1], acting [5,0,1] -> [5,0,1], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:41 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 46 pg[6.0( empty local-lis/les=35/36 n=0 ec=35/35 lis/c=35/35 les/c/f=36/36/0 sis=46 pruub=9.457298279s) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1181.762695312s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.18( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.1a( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.1f( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.19( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.e( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.c( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.3( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.1( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.7( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.6( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.2( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.5( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.4( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.f( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.8( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.a( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.1b( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.9( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.b( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.14( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.16( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.17( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.15( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.10( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.13( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.1c( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.12( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.1d( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.11( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.1e( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:42 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 47 pg[6.d( empty local-lis/les=35/36 n=0 ec=46/35 lis/c=35/35 les/c/f=36/36/0 sis=46) [5,0,1] r=2 lpr=46 pi=[35,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:43 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 48 pg[7.0( v 40'39 (0'0,40'39] local-lis/les=37/38 n=22 ec=37/37 lis/c=37/37 les/c/f=38/38/0 sis=48 pruub=9.503730774s) [0,1,5] r=1 lpr=48 pi=[37,48)/1 luod=0'0 lua=40'37 crt=40'39 lcod 40'38 mlcod 0'0 active pruub 1183.820800781s@ mbc={}] start_peering_interval up [0,1,5] -> [0,1,5], acting [0,1,5] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:43 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 48 pg[7.0( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=37/37 lis/c=37/37 les/c/f=38/38/0 sis=48 pruub=9.501980782s) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 lcod 40'38 mlcod 0'0 unknown NOTIFY pruub 1183.820800781s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.c( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.a( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.b( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.9( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.8( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.e( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.4( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=2 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.5( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=2 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.3( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=2 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.7( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.6( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=2 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.1( v 40'39 (0'0,40'39] local-lis/les=37/38 n=2 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.d( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.2( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=2 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 49 pg[7.f( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=37/38 n=1 ec=48/37 lis/c=37/37 les/c/f=38/38/0 sis=48) [0,1,5] r=1 lpr=48 pi=[37,48)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Dec 06 08:19:44 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Dec 06 08:19:45 np0005548789.localdomain sudo[56161]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opxbwgjzhxlakasjzqbropulmtbkhbug ; /usr/bin/python3
Dec 06 08:19:45 np0005548789.localdomain sudo[56161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:45 np0005548789.localdomain python3[56163]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:19:45 np0005548789.localdomain sudo[56161]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:46 np0005548789.localdomain sudo[56206]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kufwlxjndvpzxukfltacmonulngtktmv ; /usr/bin/python3
Dec 06 08:19:46 np0005548789.localdomain sudo[56206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:46 np0005548789.localdomain python3[56208]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009185.5373125-92412-279512322772049/source _original_basename=tmp57evwyi0 follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:46 np0005548789.localdomain sudo[56206]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:46 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Dec 06 08:19:46 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Dec 06 08:19:46 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Dec 06 08:19:47 np0005548789.localdomain sudo[56268]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbmqqmieprleuumihoexvgamqypwdycj ; /usr/bin/python3
Dec 06 08:19:47 np0005548789.localdomain sudo[56268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:19:47 np0005548789.localdomain systemd[1]: tmp-crun.vu8IPH.mount: Deactivated successfully.
Dec 06 08:19:47 np0005548789.localdomain podman[56271]: 2025-12-06 08:19:47.446532368 +0000 UTC m=+0.082722959 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64)
Dec 06 08:19:47 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 4.19 deep-scrub starts
Dec 06 08:19:47 np0005548789.localdomain python3[56270]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:19:47 np0005548789.localdomain sudo[56268]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:47 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 4.19 deep-scrub ok
Dec 06 08:19:47 np0005548789.localdomain podman[56271]: 2025-12-06 08:19:47.630686345 +0000 UTC m=+0.266876996 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044)
Dec 06 08:19:47 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:19:47 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Dec 06 08:19:47 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Dec 06 08:19:47 np0005548789.localdomain sudo[56340]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqsngghfkvwyrbfornhyaxsxdxbydpwy ; /usr/bin/python3
Dec 06 08:19:47 np0005548789.localdomain sudo[56340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:47 np0005548789.localdomain python3[56342]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009187.1791785-92498-160004399388075/source _original_basename=tmp0ven4ftp follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:19:47 np0005548789.localdomain sudo[56340]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:48 np0005548789.localdomain sudo[56370]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jchscheraybpxzlodlcafgkeiqyfsurr ; /usr/bin/python3
Dec 06 08:19:48 np0005548789.localdomain sudo[56370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:48 np0005548789.localdomain python3[56372]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None
Dec 06 08:19:48 np0005548789.localdomain crontab[56373]: (root) LIST (root)
Dec 06 08:19:48 np0005548789.localdomain crontab[56374]: (root) REPLACE (root)
Dec 06 08:19:48 np0005548789.localdomain sudo[56370]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:48 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Dec 06 08:19:48 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Dec 06 08:19:48 np0005548789.localdomain sudo[56388]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqjusjvocrxnajragtgftktwtjmpusga ; /usr/bin/python3
Dec 06 08:19:48 np0005548789.localdomain sudo[56388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:48 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 2.f scrub starts
Dec 06 08:19:48 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 2.f scrub ok
Dec 06 08:19:48 np0005548789.localdomain python3[56390]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:19:48 np0005548789.localdomain sudo[56388]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:49 np0005548789.localdomain sudo[56438]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iizlvqyhgjtmndrpktmlebzlmzokkxve ; /usr/bin/python3
Dec 06 08:19:49 np0005548789.localdomain sudo[56438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.1e( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.028770447s) [4,0,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.329711914s@ mbc={}] start_peering_interval up [5,0,1] -> [4,0,2], acting [5,0,1] -> [4,0,2], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.13( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.029058456s) [2,4,3] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.330200195s@ mbc={}] start_peering_interval up [5,0,1] -> [2,4,3], acting [5,0,1] -> [2,4,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.1e( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.028700829s) [4,0,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.329711914s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.13( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.029010773s) [2,4,3] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.330200195s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.1c( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.028232574s) [1,5,3] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.329223633s@ mbc={}] start_peering_interval up [5,0,1] -> [1,5,3], acting [5,0,1] -> [1,5,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.d( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.028406143s) [0,5,1] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.330200195s@ mbc={}] start_peering_interval up [5,0,1] -> [0,5,1], acting [5,0,1] -> [0,5,1], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.1d( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.023814201s) [5,4,3] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.325195312s@ mbc={}] start_peering_interval up [5,0,1] -> [5,4,3], acting [5,0,1] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.12( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.026974678s) [4,0,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.328857422s@ mbc={}] start_peering_interval up [5,0,1] -> [4,0,2], acting [5,0,1] -> [4,0,2], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.11( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.027729988s) [5,1,3] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.329589844s@ mbc={}] start_peering_interval up [5,0,1] -> [5,1,3], acting [5,0,1] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.11( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.027685165s) [5,1,3] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.329589844s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.1c( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.028232574s) [1,5,3] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.329223633s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.1e( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [4,0,2] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.12( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.026546478s) [4,0,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.328857422s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.10( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.027506828s) [2,1,3] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.329956055s@ mbc={}] start_peering_interval up [5,0,1] -> [2,1,3], acting [5,0,1] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.10( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.027451515s) [2,1,3] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.329956055s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.17( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.027018547s) [1,2,0] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.329589844s@ mbc={}] start_peering_interval up [5,0,1] -> [1,2,0], acting [5,0,1] -> [1,2,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.d( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.027871132s) [0,5,1] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.330200195s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.12( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [4,0,2] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.1d( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.022918701s) [5,4,3] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.325195312s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.16( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.027281761s) [5,0,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.330078125s@ mbc={}] start_peering_interval up [5,0,1] -> [5,0,4], acting [5,0,1] -> [5,0,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.17( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.027018547s) [1,2,0] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.329589844s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.16( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.026957512s) [5,0,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.330078125s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.15( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.026705742s) [3,4,5] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.329711914s@ mbc={}] start_peering_interval up [5,0,1] -> [3,4,5], acting [5,0,1] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.15( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.026460648s) [3,4,5] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.329711914s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.14( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.025653839s) [5,3,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.329711914s@ mbc={}] start_peering_interval up [5,0,1] -> [5,3,4], acting [5,0,1] -> [5,3,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.a( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.025733948s) [0,5,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.329956055s@ mbc={}] start_peering_interval up [5,0,1] -> [0,5,4], acting [5,0,1] -> [0,5,4], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.a( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.025692940s) [0,5,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.329956055s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.9( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.020556450s) [5,1,0] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.324951172s@ mbc={}] start_peering_interval up [5,0,1] -> [5,1,0], acting [5,0,1] -> [5,1,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.f( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.024387360s) [5,4,0] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.328857422s@ mbc={}] start_peering_interval up [5,0,1] -> [5,4,0], acting [5,0,1] -> [5,4,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.9( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.020507812s) [5,1,0] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.324951172s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.b( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.020248413s) [2,3,1] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.324829102s@ mbc={}] start_peering_interval up [5,0,1] -> [2,3,1], acting [5,0,1] -> [2,3,1], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.b( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.020225525s) [2,3,1] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.324829102s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.f( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.024341583s) [5,4,0] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.328857422s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.8( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.020699501s) [3,1,5] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.325561523s@ mbc={}] start_peering_interval up [5,0,1] -> [3,1,5], acting [5,0,1] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.5( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.023706436s) [0,1,2] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.328735352s@ mbc={}] start_peering_interval up [5,0,1] -> [0,1,2], acting [5,0,1] -> [0,1,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.2( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.023640633s) [3,5,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.328735352s@ mbc={}] start_peering_interval up [5,0,1] -> [3,5,4], acting [5,0,1] -> [3,5,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.2( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.023591042s) [3,5,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.328735352s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.6( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.020242691s) [2,3,1] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.325439453s@ mbc={}] start_peering_interval up [5,0,1] -> [2,3,1], acting [5,0,1] -> [2,3,1], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.6( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.020191193s) [2,3,1] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.325439453s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.7( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.019382477s) [3,5,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.324829102s@ mbc={}] start_peering_interval up [5,0,1] -> [3,5,4], acting [5,0,1] -> [3,5,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.14( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.025566101s) [5,3,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.329711914s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.1( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.023789406s) [1,0,2] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.329223633s@ mbc={}] start_peering_interval up [5,0,1] -> [1,0,2], acting [5,0,1] -> [1,0,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.1( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.023789406s) [1,0,2] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.329223633s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.7( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.019342422s) [3,5,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.324829102s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.8( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.019848824s) [3,1,5] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.325561523s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.c( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.022847176s) [2,0,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.328735352s@ mbc={}] start_peering_interval up [5,0,1] -> [2,0,4], acting [5,0,1] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.3( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.022686005s) [0,4,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.328735352s@ mbc={}] start_peering_interval up [5,0,1] -> [0,4,2], acting [5,0,1] -> [0,4,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.5( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.022943497s) [0,1,2] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.328735352s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.3( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.022561073s) [0,4,2] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.328735352s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.e( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.018326759s) [3,2,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.324707031s@ mbc={}] start_peering_interval up [5,0,1] -> [3,2,4], acting [5,0,1] -> [3,2,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.e( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.018283844s) [3,2,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.324707031s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.18( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.018775940s) [5,3,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.325317383s@ mbc={}] start_peering_interval up [5,0,1] -> [5,3,4], acting [5,0,1] -> [5,3,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.19( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.017786980s) [0,5,1] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.324462891s@ mbc={}] start_peering_interval up [5,0,1] -> [0,5,1], acting [5,0,1] -> [0,5,1], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.18( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.018623352s) [5,3,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.325317383s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.19( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.017730713s) [0,5,1] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.324462891s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.1b( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.018284798s) [1,3,2] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.325317383s@ mbc={}] start_peering_interval up [5,0,1] -> [1,3,2], acting [5,0,1] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.1b( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.018284798s) [1,3,2] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.325317383s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.1f( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.017577171s) [5,4,0] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.324462891s@ mbc={}] start_peering_interval up [5,0,1] -> [5,4,0], acting [5,0,1] -> [5,4,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.1f( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.017308235s) [5,4,0] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.324462891s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.c( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.022798538s) [2,0,4] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.328735352s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.1a( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.022010803s) [3,4,5] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active pruub 1189.330078125s@ mbc={}] start_peering_interval up [5,0,1] -> [3,4,5], acting [5,0,1] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 50 pg[6.1a( empty local-lis/les=46/47 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=9.021780014s) [3,4,5] r=-1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.330078125s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:49 np0005548789.localdomain sudo[56438]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:49 np0005548789.localdomain sudo[56456]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbhmpbttognrqkcfdddndcbuelfxztke ; /usr/bin/python3
Dec 06 08:19:49 np0005548789.localdomain sudo[56456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 2.1d deep-scrub starts
Dec 06 08:19:49 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 2.1d deep-scrub ok
Dec 06 08:19:49 np0005548789.localdomain sudo[56456]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.1d( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [5,4,3] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.18( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [5,3,4] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.f( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [5,4,0] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.e( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [3,2,4] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.16( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [5,0,4] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.15( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [3,4,5] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.14( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [5,3,4] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.c( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [2,0,4] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.1f( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [5,4,0] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.7( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [3,5,4] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.13( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [2,4,3] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.2( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [3,5,4] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.1a( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [3,4,5] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.a( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [0,5,4] r=2 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 50 pg[6.3( empty local-lis/les=0/0 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [0,4,2] r=1 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 51 pg[6.1c( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [1,5,3] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 51 pg[6.1b( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [1,3,2] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 51 pg[6.1( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [1,0,2] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 51 pg[6.1e( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [4,0,2] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 51 pg[6.12( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [4,0,2] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 51 pg[6.17( empty local-lis/les=50/51 n=0 ec=46/35 lis/c=46/46 les/c/f=47/47/0 sis=50) [1,2,0] r=0 lpr=50 pi=[46,50)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:19:50 np0005548789.localdomain sudo[56560]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpvhluumyziaxfabmmvlvkbzfmskefxn ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009189.9660482-92654-46361919254174/async_wrapper.py 102559012711 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009189.9660482-92654-46361919254174/AnsiballZ_command.py _
Dec 06 08:19:50 np0005548789.localdomain sudo[56560]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 08:19:50 np0005548789.localdomain ansible-async_wrapper.py[56562]: Invoked with 102559012711 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009189.9660482-92654-46361919254174/AnsiballZ_command.py _
Dec 06 08:19:50 np0005548789.localdomain ansible-async_wrapper.py[56565]: Starting module and watcher
Dec 06 08:19:50 np0005548789.localdomain ansible-async_wrapper.py[56565]: Start watching 56566 (3600)
Dec 06 08:19:50 np0005548789.localdomain ansible-async_wrapper.py[56566]: Start module (56566)
Dec 06 08:19:50 np0005548789.localdomain ansible-async_wrapper.py[56562]: Return async_wrapper task started.
Dec 06 08:19:50 np0005548789.localdomain sudo[56560]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:50 np0005548789.localdomain sudo[56585]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvwqgawxetlcsssgexineydbfwunasyk ; /usr/bin/python3
Dec 06 08:19:50 np0005548789.localdomain sudo[56585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:19:50 np0005548789.localdomain python3[56587]: ansible-ansible.legacy.async_status Invoked with jid=102559012711.56562 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:19:50 np0005548789.localdomain sudo[56585]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:51 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Dec 06 08:19:51 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Dec 06 08:19:52 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 6.12 scrub starts
Dec 06 08:19:52 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Dec 06 08:19:52 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 6.12 scrub ok
Dec 06 08:19:53 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Dec 06 08:19:53 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 6.1e scrub ok
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:    (file & line not available)
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:    (file & line not available)
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]: Notice: Compiled catalog for np0005548789.localdomain in environment production in 0.13 seconds
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]: Notice: Applied catalog in 0.03 seconds
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]: Application:
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:    Initial environment: production
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:    Converged environment: production
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:          Run mode: user
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]: Changes:
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]: Events:
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]: Resources:
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:             Total: 10
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]: Time:
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:          Schedule: 0.00
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:              File: 0.00
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:              Exec: 0.00
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:            Augeas: 0.01
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:    Transaction evaluation: 0.02
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:    Catalog application: 0.03
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:    Config retrieval: 0.16
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:          Last run: 1765009194
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:        Filebucket: 0.00
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:             Total: 0.04
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]: Version:
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:            Config: 1765009194
Dec 06 08:19:54 np0005548789.localdomain puppet-user[56582]:            Puppet: 7.10.0
Dec 06 08:19:54 np0005548789.localdomain ansible-async_wrapper.py[56566]: Module complete (56566)
Dec 06 08:19:55 np0005548789.localdomain sudo[56699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:19:55 np0005548789.localdomain sudo[56699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:19:55 np0005548789.localdomain sudo[56699]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:55 np0005548789.localdomain sudo[56714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 08:19:55 np0005548789.localdomain sudo[56714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:19:55 np0005548789.localdomain ansible-async_wrapper.py[56565]: Done in kid B.
Dec 06 08:19:55 np0005548789.localdomain sudo[56714]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:55 np0005548789.localdomain sudo[56750]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:19:55 np0005548789.localdomain sudo[56750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:19:55 np0005548789.localdomain sudo[56750]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:55 np0005548789.localdomain sudo[56765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:19:55 np0005548789.localdomain sudo[56765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:19:56 np0005548789.localdomain sudo[56765]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:56 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 52 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.540736198s) [2,1,0] r=1 lpr=52 pi=[48,52)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1199.366455078s@ mbc={}] start_peering_interval up [0,1,5] -> [2,1,0], acting [0,1,5] -> [2,1,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:56 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 52 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.540463448s) [2,1,0] r=1 lpr=52 pi=[48,52)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1199.366210938s@ mbc={}] start_peering_interval up [0,1,5] -> [2,1,0], acting [0,1,5] -> [2,1,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:56 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 52 pg[7.2( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.540756226s) [2,1,0] r=1 lpr=52 pi=[48,52)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1199.366577148s@ mbc={}] start_peering_interval up [0,1,5] -> [2,1,0], acting [0,1,5] -> [2,1,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:56 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 52 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.540627480s) [2,1,0] r=1 lpr=52 pi=[48,52)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1199.366455078s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:56 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 52 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.540381432s) [2,1,0] r=1 lpr=52 pi=[48,52)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1199.366210938s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:56 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 52 pg[7.2( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.540667534s) [2,1,0] r=1 lpr=52 pi=[48,52)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1199.366577148s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:56 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 52 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.541256905s) [2,1,0] r=1 lpr=52 pi=[48,52)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1199.366943359s@ mbc={}] start_peering_interval up [0,1,5] -> [2,1,0], acting [0,1,5] -> [2,1,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:56 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 52 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=52 pruub=11.540686607s) [2,1,0] r=1 lpr=52 pi=[48,52)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1199.366943359s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:57 np0005548789.localdomain sudo[56812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:19:57 np0005548789.localdomain sudo[56812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:19:57 np0005548789.localdomain sudo[56812]: pam_unix(sudo:session): session closed for user root
Dec 06 08:19:57 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.e deep-scrub starts
Dec 06 08:19:57 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.e deep-scrub ok
Dec 06 08:19:59 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 54 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.096426964s) [5,3,4] r=-1 lpr=54 pi=[48,54)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1199.366455078s@ mbc={}] start_peering_interval up [0,1,5] -> [5,3,4], acting [0,1,5] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:59 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 54 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.096349716s) [5,3,4] r=-1 lpr=54 pi=[48,54)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1199.366455078s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:59 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 54 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.096376419s) [5,3,4] r=-1 lpr=54 pi=[48,54)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1199.366455078s@ mbc={}] start_peering_interval up [0,1,5] -> [5,3,4], acting [0,1,5] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:59 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 54 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.096095085s) [5,3,4] r=-1 lpr=54 pi=[48,54)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1199.366210938s@ mbc={}] start_peering_interval up [0,1,5] -> [5,3,4], acting [0,1,5] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:59 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 54 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.096679688s) [5,3,4] r=-1 lpr=54 pi=[48,54)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1199.366821289s@ mbc={}] start_peering_interval up [0,1,5] -> [5,3,4], acting [0,1,5] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:19:59 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 54 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.096002579s) [5,3,4] r=-1 lpr=54 pi=[48,54)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1199.366210938s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:59 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 54 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.096216202s) [5,3,4] r=-1 lpr=54 pi=[48,54)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1199.366455078s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:19:59 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 54 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54 pruub=9.096534729s) [5,3,4] r=-1 lpr=54 pi=[48,54)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1199.366821289s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:00 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 54 pg[7.b( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54) [5,3,4] r=2 lpr=54 pi=[48,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:00 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 54 pg[7.3( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54) [5,3,4] r=2 lpr=54 pi=[48,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:00 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 54 pg[7.7( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54) [5,3,4] r=2 lpr=54 pi=[48,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:00 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 54 pg[7.f( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=54) [5,3,4] r=2 lpr=54 pi=[48,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:00 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Dec 06 08:20:00 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Dec 06 08:20:01 np0005548789.localdomain sudo[56840]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ytqsxvyxupyxwblxiyreirloorpqokha ; /usr/bin/python3
Dec 06 08:20:01 np0005548789.localdomain sudo[56840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:01 np0005548789.localdomain python3[56842]: ansible-ansible.legacy.async_status Invoked with jid=102559012711.56562 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:20:01 np0005548789.localdomain sudo[56840]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:01 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Dec 06 08:20:01 np0005548789.localdomain sudo[56856]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igfuttwbuetkmaekhniozpecrqkrzorn ; /usr/bin/python3
Dec 06 08:20:01 np0005548789.localdomain sudo[56856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:01 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 56 pg[7.4( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=14.609109879s) [2,3,4] r=-1 lpr=56 pi=[48,56)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1207.366333008s@ mbc={}] start_peering_interval up [0,1,5] -> [2,3,4], acting [0,1,5] -> [2,3,4], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:01 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 56 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=14.609762192s) [2,3,4] r=-1 lpr=56 pi=[48,56)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1207.366943359s@ mbc={}] start_peering_interval up [0,1,5] -> [2,3,4], acting [0,1,5] -> [2,3,4], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:01 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 56 pg[7.4( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=14.609000206s) [2,3,4] r=-1 lpr=56 pi=[48,56)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1207.366333008s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:01 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 56 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=14.609653473s) [2,3,4] r=-1 lpr=56 pi=[48,56)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1207.366943359s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:01 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Dec 06 08:20:01 np0005548789.localdomain python3[56858]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:20:01 np0005548789.localdomain sudo[56856]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:02 np0005548789.localdomain sudo[56872]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbuejzobwbjlshnvstdtrwmvrenqkvqi ; /usr/bin/python3
Dec 06 08:20:02 np0005548789.localdomain sudo[56872]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:02 np0005548789.localdomain python3[56874]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:20:02 np0005548789.localdomain sudo[56872]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:02 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 56 pg[7.4( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=56) [2,3,4] r=2 lpr=56 pi=[48,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:02 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 56 pg[7.c( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=56) [2,3,4] r=2 lpr=56 pi=[48,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:02 np0005548789.localdomain sudo[56922]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnnubawukotaqsyhiqopvreqxmlhdtcx ; /usr/bin/python3
Dec 06 08:20:02 np0005548789.localdomain sudo[56922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:02 np0005548789.localdomain python3[56924]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:20:02 np0005548789.localdomain sudo[56922]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:03 np0005548789.localdomain sudo[56940]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atippbanaydprjovdkprurcmurgzmvfx ; /usr/bin/python3
Dec 06 08:20:03 np0005548789.localdomain sudo[56940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:03 np0005548789.localdomain python3[56942]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmppqet6sy3 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:20:03 np0005548789.localdomain sudo[56940]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:03 np0005548789.localdomain sudo[56970]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjiuexfkbgyldjxuubelnmqqbgxumpvy ; /usr/bin/python3
Dec 06 08:20:03 np0005548789.localdomain sudo[56970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:03 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 6.1b scrub starts
Dec 06 08:20:03 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 6.1b scrub ok
Dec 06 08:20:03 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 58 pg[7.d( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=58) [4,5,0] r=0 lpr=58 pi=[48,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:20:03 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 58 pg[7.5( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=58) [4,5,0] r=0 lpr=58 pi=[48,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:20:03 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 58 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=58 pruub=12.849601746s) [4,5,0] r=-1 lpr=58 pi=[48,58)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1207.367187500s@ mbc={}] start_peering_interval up [0,1,5] -> [4,5,0], acting [0,1,5] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:03 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 58 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=58 pruub=12.849519730s) [4,5,0] r=-1 lpr=58 pi=[48,58)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1207.367187500s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:03 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 58 pg[7.5( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=58 pruub=12.849435806s) [4,5,0] r=-1 lpr=58 pi=[48,58)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1207.367187500s@ mbc={}] start_peering_interval up [0,1,5] -> [4,5,0], acting [0,1,5] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:03 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 58 pg[7.5( v 40'39 (0'0,40'39] local-lis/les=48/49 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=58 pruub=12.849357605s) [4,5,0] r=-1 lpr=58 pi=[48,58)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1207.367187500s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:03 np0005548789.localdomain python3[56972]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:03 np0005548789.localdomain sudo[56970]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:03 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Dec 06 08:20:03 np0005548789.localdomain sudo[56987]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-caotzepkexuvwpbpucrzcnsequwppezq ; /usr/bin/python3
Dec 06 08:20:03 np0005548789.localdomain sudo[56987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:03 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Dec 06 08:20:04 np0005548789.localdomain sudo[56987]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:04 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 59 pg[7.5( v 40'39 lc 40'7 (0'0,40'39] local-lis/les=58/59 n=2 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=58) [4,5,0] r=0 lpr=58 pi=[48,58)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(2+0)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:04 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 59 pg[7.d( v 40'39 lc 40'8 (0'0,40'39] local-lis/les=58/59 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=58) [4,5,0] r=0 lpr=58 pi=[48,58)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(2+0)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:04 np0005548789.localdomain sudo[57074]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqwlsdzctjrxkqqhpxpjlpynbyvptwuu ; /usr/bin/python3
Dec 06 08:20:04 np0005548789.localdomain sudo[57074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:04 np0005548789.localdomain python3[57076]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 06 08:20:04 np0005548789.localdomain sudo[57074]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:05 np0005548789.localdomain sudo[57093]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtjhxndvfwwwnxwzzqizeeielwiaflke ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:05 np0005548789.localdomain sudo[57093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:05 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 60 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=52/53 n=1 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=60 pruub=8.290819168s) [5,0,4] r=-1 lpr=60 pi=[52,60)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1204.848999023s@ mbc={}] start_peering_interval up [2,1,0] -> [5,0,4], acting [2,1,0] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:05 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 60 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=52/53 n=2 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=60 pruub=8.287181854s) [5,0,4] r=-1 lpr=60 pi=[52,60)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1204.845581055s@ mbc={}] start_peering_interval up [2,1,0] -> [5,0,4], acting [2,1,0] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:05 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 60 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=52/53 n=2 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=60 pruub=8.286940575s) [5,0,4] r=-1 lpr=60 pi=[52,60)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1204.845581055s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:05 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 60 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=52/53 n=1 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=60 pruub=8.290492058s) [5,0,4] r=-1 lpr=60 pi=[52,60)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1204.848999023s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:05 np0005548789.localdomain python3[57095]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:05 np0005548789.localdomain sudo[57093]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:05 np0005548789.localdomain sudo[57109]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlkwshreqnkjtcncsimlbawrjwgznylx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:05 np0005548789.localdomain sudo[57109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:06 np0005548789.localdomain sudo[57109]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:06 np0005548789.localdomain sudo[57125]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-demimcihpbggulcxohfznwgjfvkqznxk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:06 np0005548789.localdomain sudo[57125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:06 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 6.1c scrub starts
Dec 06 08:20:06 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 61 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=54/55 n=1 ec=48/37 lis/c=54/54 les/c/f=55/55/0 sis=61 pruub=10.122196198s) [3,1,5] r=-1 lpr=61 pi=[54,61)/1 luod=0'0 crt=40'39 mlcod 0'0 active pruub 1203.795776367s@ mbc={}] start_peering_interval up [5,3,4] -> [3,1,5], acting [5,3,4] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:06 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 61 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=54/55 n=1 ec=48/37 lis/c=54/54 les/c/f=55/55/0 sis=61 pruub=10.122116089s) [3,1,5] r=-1 lpr=61 pi=[54,61)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1203.795776367s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:06 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 60 pg[7.6( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=60) [5,0,4] r=2 lpr=60 pi=[52,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:06 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 60 pg[7.e( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=60) [5,0,4] r=2 lpr=60 pi=[52,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:06 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 61 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=54/55 n=1 ec=48/37 lis/c=54/54 les/c/f=55/55/0 sis=61 pruub=10.119771004s) [3,1,5] r=-1 lpr=61 pi=[54,61)/1 luod=0'0 crt=40'39 mlcod 0'0 active pruub 1203.795654297s@ mbc={}] start_peering_interval up [5,3,4] -> [3,1,5], acting [5,3,4] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:06 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 61 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=54/55 n=1 ec=48/37 lis/c=54/54 les/c/f=55/55/0 sis=61 pruub=10.119685173s) [3,1,5] r=-1 lpr=61 pi=[54,61)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1203.795654297s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:06 np0005548789.localdomain python3[57127]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:20:06 np0005548789.localdomain sudo[57125]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:07 np0005548789.localdomain sudo[57175]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwudieqswwfoxgbzboetwbwncgiurcam ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:07 np0005548789.localdomain sudo[57175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:07 np0005548789.localdomain python3[57177]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:20:07 np0005548789.localdomain sudo[57175]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:07 np0005548789.localdomain sudo[57193]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsyzyqutrhqarqcnnehelvttwionhrsi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:07 np0005548789.localdomain sudo[57193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:07 np0005548789.localdomain python3[57195]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:07 np0005548789.localdomain sudo[57193]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:07 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 61 pg[7.7( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=54/54 les/c/f=55/55/0 sis=61) [3,1,5] r=1 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:07 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 61 pg[7.f( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=54/54 les/c/f=55/55/0 sis=61) [3,1,5] r=1 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:07 np0005548789.localdomain sudo[57255]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrtvimcekzlqsabcruneczphjjlgixls ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:07 np0005548789.localdomain sudo[57255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:07 np0005548789.localdomain python3[57257]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:20:07 np0005548789.localdomain sudo[57255]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:08 np0005548789.localdomain sudo[57273]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqzzelrdckjjwerobaaijnklbbkluwzd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:08 np0005548789.localdomain sudo[57273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:08 np0005548789.localdomain python3[57275]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:08 np0005548789.localdomain sudo[57273]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:08 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 7.d scrub starts
Dec 06 08:20:08 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 7.d scrub ok
Dec 06 08:20:08 np0005548789.localdomain sudo[57335]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdswdmqozvkknjfzczlkqqcpckkjxytk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:08 np0005548789.localdomain sudo[57335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:08 np0005548789.localdomain python3[57337]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:20:08 np0005548789.localdomain sudo[57335]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:08 np0005548789.localdomain sudo[57353]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axvysshwzkryscjyyybrcsecouqfslzs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:08 np0005548789.localdomain sudo[57353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:08 np0005548789.localdomain python3[57355]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:08 np0005548789.localdomain sudo[57353]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:20:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Cumulative writes: 5030 writes, 22K keys, 5030 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5030 writes, 506 syncs, 9.94 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 1771 writes, 6287 keys, 1771 commit groups, 1.0 writes per commit group, ingest: 2.42 MB, 0.00 MB/s
                                                          Interval WAL: 1771 writes, 361 syncs, 4.91 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b7610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b7610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b7610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 08:20:09 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 63 pg[7.8( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=63 pruub=15.088534355s) [2,0,1] r=2 lpr=63 pi=[48,63)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1215.366455078s@ mbc={}] start_peering_interval up [0,1,5] -> [2,0,1], acting [0,1,5] -> [2,0,1], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:09 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 63 pg[7.8( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=63 pruub=15.088434219s) [2,0,1] r=2 lpr=63 pi=[48,63)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1215.366455078s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:09 np0005548789.localdomain sudo[57415]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsailcyzxeoaisayewithwaujcdszshz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:09 np0005548789.localdomain sudo[57415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:09 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Dec 06 08:20:09 np0005548789.localdomain python3[57417]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:20:09 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Dec 06 08:20:09 np0005548789.localdomain sudo[57415]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:09 np0005548789.localdomain sudo[57433]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnfhyzpaoidsxzwzwkjgyftvueefwsfw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:09 np0005548789.localdomain sudo[57433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:09 np0005548789.localdomain python3[57435]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:09 np0005548789.localdomain sudo[57433]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:10 np0005548789.localdomain sudo[57463]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xuqqskjfknmnwjegrqidmxwnydvramln ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:10 np0005548789.localdomain sudo[57463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:10 np0005548789.localdomain python3[57465]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:20:10 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:20:10 np0005548789.localdomain systemd-sysv-generator[57491]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:20:10 np0005548789.localdomain systemd-rc-local-generator[57488]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:20:10 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:20:10 np0005548789.localdomain sudo[57463]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:11 np0005548789.localdomain sudo[57549]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-faboyjwcxmuvtaafrapttwdfowsdybuy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:11 np0005548789.localdomain sudo[57549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:11 np0005548789.localdomain python3[57551]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:20:11 np0005548789.localdomain sudo[57549]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:11 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 65 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=65 pruub=13.055438042s) [5,4,3] r=-1 lpr=65 pi=[48,65)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1215.366699219s@ mbc={}] start_peering_interval up [0,1,5] -> [5,4,3], acting [0,1,5] -> [5,4,3], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:11 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 65 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=48/49 n=1 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=65 pruub=13.055321693s) [5,4,3] r=-1 lpr=65 pi=[48,65)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1215.366699219s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:11 np0005548789.localdomain sudo[57567]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gexxnpytnkenfuddhhllmrhvierrkhzi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:11 np0005548789.localdomain sudo[57567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:11 np0005548789.localdomain python3[57569]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:11 np0005548789.localdomain sudo[57567]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:12 np0005548789.localdomain sudo[57629]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydvtsudvyrksdfbtfoilfgtpabbxzbre ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:12 np0005548789.localdomain sudo[57629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:12 np0005548789.localdomain python3[57631]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:20:12 np0005548789.localdomain sudo[57629]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:12 np0005548789.localdomain sudo[57647]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhhyobueyzuqluupfuijksmpvmwkmnpi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:12 np0005548789.localdomain sudo[57647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:12 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 65 pg[7.9( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=48/48 les/c/f=49/49/0 sis=65) [5,4,3] r=1 lpr=65 pi=[48,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:12 np0005548789.localdomain python3[57649]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:12 np0005548789.localdomain sudo[57647]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:12 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Dec 06 08:20:12 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Dec 06 08:20:12 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Dec 06 08:20:12 np0005548789.localdomain sudo[57677]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqllgqakgemjasfffxnbpgsxlgeedyyt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:12 np0005548789.localdomain sudo[57677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:20:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Cumulative writes: 4343 writes, 20K keys, 4343 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4343 writes, 459 syncs, 9.46 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 955 writes, 3346 keys, 955 commit groups, 1.0 writes per commit group, ingest: 1.76 MB, 0.00 MB/s
                                                          Interval WAL: 955 writes, 261 syncs, 3.66 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468eb610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468eb610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.030       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.030       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.030       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468eb610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.2 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 08:20:12 np0005548789.localdomain python3[57679]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:20:13 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:20:13 np0005548789.localdomain systemd-rc-local-generator[57704]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:20:13 np0005548789.localdomain systemd-sysv-generator[57709]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:20:13 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:20:13 np0005548789.localdomain systemd[1]: Starting Create netns directory...
Dec 06 08:20:13 np0005548789.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 08:20:13 np0005548789.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 08:20:13 np0005548789.localdomain systemd[1]: Finished Create netns directory.
Dec 06 08:20:13 np0005548789.localdomain sudo[57677]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:13 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Dec 06 08:20:13 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Dec 06 08:20:13 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 6.1c scrub starts
Dec 06 08:20:13 np0005548789.localdomain sudo[57735]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnvqhwcpznwerijqqnxzpcmyrdsrqqtn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:13 np0005548789.localdomain sudo[57735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:13 np0005548789.localdomain ceph-osd[31726]: log_channel(cluster) log [DBG] : 6.1c scrub ok
Dec 06 08:20:13 np0005548789.localdomain python3[57737]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 06 08:20:13 np0005548789.localdomain sudo[57735]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:14 np0005548789.localdomain sudo[57751]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djywyinxeosqvcbhjjzffhyyrwprcons ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:14 np0005548789.localdomain sudo[57751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:14 np0005548789.localdomain sudo[57751]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:15 np0005548789.localdomain sudo[57792]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qicirblnrpuvwetixgzvxlecpunctjel ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:20:15 np0005548789.localdomain sudo[57792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:15 np0005548789.localdomain python3[57794]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 06 08:20:15 np0005548789.localdomain podman[57869]: 2025-12-06 08:20:15.800919239 +0000 UTC m=+0.081660831 container create 15b3289dfb7ed52ab0d10f0af104ac3227bc5ecb093aa34d613c9c265f6e2f89 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step2, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud_init_logs, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt)
Dec 06 08:20:15 np0005548789.localdomain podman[57870]: 2025-12-06 08:20:15.831272785 +0000 UTC m=+0.106069756 container create 0d3b158edc684f8600676f0dcf0cb5a14357db1593e587d02be14e52e3f8b304 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.buildah.version=1.41.4, container_name=nova_compute_init_log, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:36:58Z, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step2, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:20:15 np0005548789.localdomain podman[57869]: 2025-12-06 08:20:15.751270595 +0000 UTC m=+0.032012227 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:20:15 np0005548789.localdomain podman[57870]: 2025-12-06 08:20:15.768023986 +0000 UTC m=+0.042821027 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:20:15 np0005548789.localdomain systemd[1]: Started libpod-conmon-15b3289dfb7ed52ab0d10f0af104ac3227bc5ecb093aa34d613c9c265f6e2f89.scope.
Dec 06 08:20:15 np0005548789.localdomain systemd[1]: Started libpod-conmon-0d3b158edc684f8600676f0dcf0cb5a14357db1593e587d02be14e52e3f8b304.scope.
Dec 06 08:20:15 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:20:15 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:20:15 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89ee8dead5a29d0553b978375c66d4bc010ba2732baca36dcfe5a54e3214c8ff/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Dec 06 08:20:15 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d7c22414f3b3b03ee747009a3ba1860a523968c599aef8234ba1bea94f6d58e/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:20:15 np0005548789.localdomain podman[57869]: 2025-12-06 08:20:15.909366357 +0000 UTC m=+0.190107949 container init 15b3289dfb7ed52ab0d10f0af104ac3227bc5ecb093aa34d613c9c265f6e2f89 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, name=rhosp17/openstack-nova-libvirt, version=17.1.12, config_id=tripleo_step2, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtqemud_init_logs, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 08:20:15 np0005548789.localdomain podman[57869]: 2025-12-06 08:20:15.920442455 +0000 UTC m=+0.201184027 container start 15b3289dfb7ed52ab0d10f0af104ac3227bc5ecb093aa34d613c9c265f6e2f89 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, name=rhosp17/openstack-nova-libvirt, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=nova_virtqemud_init_logs, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:20:15 np0005548789.localdomain python3[57794]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1765008053 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm
Dec 06 08:20:15 np0005548789.localdomain systemd[1]: libpod-15b3289dfb7ed52ab0d10f0af104ac3227bc5ecb093aa34d613c9c265f6e2f89.scope: Deactivated successfully.
Dec 06 08:20:15 np0005548789.localdomain podman[57870]: 2025-12-06 08:20:15.958788984 +0000 UTC m=+0.233585955 container init 0d3b158edc684f8600676f0dcf0cb5a14357db1593e587d02be14e52e3f8b304 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step2, release=1761123044, architecture=x86_64, container_name=nova_compute_init_log, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Dec 06 08:20:15 np0005548789.localdomain podman[57870]: 2025-12-06 08:20:15.973865254 +0000 UTC m=+0.248662255 container start 0d3b158edc684f8600676f0dcf0cb5a14357db1593e587d02be14e52e3f8b304 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute_init_log, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step2, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute)
Dec 06 08:20:15 np0005548789.localdomain systemd[1]: libpod-0d3b158edc684f8600676f0dcf0cb5a14357db1593e587d02be14e52e3f8b304.scope: Deactivated successfully.
Dec 06 08:20:15 np0005548789.localdomain python3[57794]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1765008053 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova
Dec 06 08:20:16 np0005548789.localdomain podman[57908]: 2025-12-06 08:20:16.004039334 +0000 UTC m=+0.060650291 container died 15b3289dfb7ed52ab0d10f0af104ac3227bc5ecb093aa34d613c9c265f6e2f89 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud_init_logs, config_id=tripleo_step2, managed_by=tripleo_ansible, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 08:20:16 np0005548789.localdomain podman[57933]: 2025-12-06 08:20:16.041970251 +0000 UTC m=+0.056824434 container died 0d3b158edc684f8600676f0dcf0cb5a14357db1593e587d02be14e52e3f8b304 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute_init_log, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, config_id=tripleo_step2, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:20:16 np0005548789.localdomain podman[57908]: 2025-12-06 08:20:16.153157022 +0000 UTC m=+0.209767909 container cleanup 15b3289dfb7ed52ab0d10f0af104ac3227bc5ecb093aa34d613c9c265f6e2f89 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step2, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtqemud_init_logs, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:20:16 np0005548789.localdomain systemd[1]: libpod-conmon-15b3289dfb7ed52ab0d10f0af104ac3227bc5ecb093aa34d613c9c265f6e2f89.scope: Deactivated successfully.
Dec 06 08:20:16 np0005548789.localdomain podman[57933]: 2025-12-06 08:20:16.19865768 +0000 UTC m=+0.213511823 container cleanup 0d3b158edc684f8600676f0dcf0cb5a14357db1593e587d02be14e52e3f8b304 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, container_name=nova_compute_init_log, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:20:16 np0005548789.localdomain systemd[1]: libpod-conmon-0d3b158edc684f8600676f0dcf0cb5a14357db1593e587d02be14e52e3f8b304.scope: Deactivated successfully.
Dec 06 08:20:16 np0005548789.localdomain podman[58045]: 2025-12-06 08:20:16.585617031 +0000 UTC m=+0.078704531 container create fe916b04b5a4965a58b15296c4cfd066cc2cd03ab9317bd7e14d1e94989cda28 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_id=tripleo_step2, vcs-type=git, container_name=create_virtlogd_wrapper, architecture=x86_64, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 08:20:16 np0005548789.localdomain systemd[1]: Started libpod-conmon-fe916b04b5a4965a58b15296c4cfd066cc2cd03ab9317bd7e14d1e94989cda28.scope.
Dec 06 08:20:16 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:20:16 np0005548789.localdomain podman[58045]: 2025-12-06 08:20:16.537740701 +0000 UTC m=+0.030828251 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:20:16 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b65786436fb22482b0004739b377157c398689a3d99c4b69d96c34a55e2067d5/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Dec 06 08:20:16 np0005548789.localdomain podman[58045]: 2025-12-06 08:20:16.650613583 +0000 UTC m=+0.143701093 container init fe916b04b5a4965a58b15296c4cfd066cc2cd03ab9317bd7e14d1e94989cda28 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, container_name=create_virtlogd_wrapper, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044)
Dec 06 08:20:16 np0005548789.localdomain podman[58045]: 2025-12-06 08:20:16.657318879 +0000 UTC m=+0.150406349 container start fe916b04b5a4965a58b15296c4cfd066cc2cd03ab9317bd7e14d1e94989cda28 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step2, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:20:16 np0005548789.localdomain podman[58045]: 2025-12-06 08:20:16.657546135 +0000 UTC m=+0.150633695 container attach fe916b04b5a4965a58b15296c4cfd066cc2cd03ab9317bd7e14d1e94989cda28 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step2, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=)
Dec 06 08:20:16 np0005548789.localdomain podman[58069]: 2025-12-06 08:20:16.687839789 +0000 UTC m=+0.133973007 container create 19fd50ca84f305510197cafcf52300c0564622e4e13459d1a5f1f90f7f835db6 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=create_haproxy_wrapper, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step2, vcs-type=git, vendor=Red Hat, Inc.)
Dec 06 08:20:16 np0005548789.localdomain systemd[1]: Started libpod-conmon-19fd50ca84f305510197cafcf52300c0564622e4e13459d1a5f1f90f7f835db6.scope.
Dec 06 08:20:16 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:20:16 np0005548789.localdomain podman[58069]: 2025-12-06 08:20:16.636449321 +0000 UTC m=+0.082582580 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 06 08:20:16 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17bd947360a6f617c72dcee5b3dca8b1bd8a672c96dfaa4c4c4ee08fb50892d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 08:20:16 np0005548789.localdomain podman[58069]: 2025-12-06 08:20:16.744861618 +0000 UTC m=+0.190994836 container init 19fd50ca84f305510197cafcf52300c0564622e4e13459d1a5f1f90f7f835db6 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step2, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, version=17.1.12, container_name=create_haproxy_wrapper, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 08:20:16 np0005548789.localdomain podman[58069]: 2025-12-06 08:20:16.753939675 +0000 UTC m=+0.200072893 container start 19fd50ca84f305510197cafcf52300c0564622e4e13459d1a5f1f90f7f835db6 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, tcib_managed=true, container_name=create_haproxy_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible)
Dec 06 08:20:16 np0005548789.localdomain podman[58069]: 2025-12-06 08:20:16.754607316 +0000 UTC m=+0.200740584 container attach 19fd50ca84f305510197cafcf52300c0564622e4e13459d1a5f1f90f7f835db6 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=create_haproxy_wrapper, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z)
Dec 06 08:20:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-7d7c22414f3b3b03ee747009a3ba1860a523968c599aef8234ba1bea94f6d58e-merged.mount: Deactivated successfully.
Dec 06 08:20:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0d3b158edc684f8600676f0dcf0cb5a14357db1593e587d02be14e52e3f8b304-userdata-shm.mount: Deactivated successfully.
Dec 06 08:20:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-89ee8dead5a29d0553b978375c66d4bc010ba2732baca36dcfe5a54e3214c8ff-merged.mount: Deactivated successfully.
Dec 06 08:20:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-15b3289dfb7ed52ab0d10f0af104ac3227bc5ecb093aa34d613c9c265f6e2f89-userdata-shm.mount: Deactivated successfully.
Dec 06 08:20:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:20:17 np0005548789.localdomain podman[58125]: 2025-12-06 08:20:17.917120181 +0000 UTC m=+0.081073203 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_id=tripleo_step1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:20:18 np0005548789.localdomain podman[58125]: 2025-12-06 08:20:18.085983091 +0000 UTC m=+0.249936103 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:20:18 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:20:18 np0005548789.localdomain ovs-vsctl[58186]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Dec 06 08:20:18 np0005548789.localdomain systemd[1]: libpod-fe916b04b5a4965a58b15296c4cfd066cc2cd03ab9317bd7e14d1e94989cda28.scope: Deactivated successfully.
Dec 06 08:20:18 np0005548789.localdomain systemd[1]: libpod-fe916b04b5a4965a58b15296c4cfd066cc2cd03ab9317bd7e14d1e94989cda28.scope: Consumed 2.186s CPU time.
Dec 06 08:20:18 np0005548789.localdomain podman[58336]: 2025-12-06 08:20:18.96469432 +0000 UTC m=+0.054655397 container died fe916b04b5a4965a58b15296c4cfd066cc2cd03ab9317bd7e14d1e94989cda28 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-nova-libvirt, release=1761123044, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, version=17.1.12, container_name=create_virtlogd_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, config_id=tripleo_step2, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:20:18 np0005548789.localdomain systemd[1]: tmp-crun.AsIgMl.mount: Deactivated successfully.
Dec 06 08:20:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fe916b04b5a4965a58b15296c4cfd066cc2cd03ab9317bd7e14d1e94989cda28-userdata-shm.mount: Deactivated successfully.
Dec 06 08:20:19 np0005548789.localdomain podman[58336]: 2025-12-06 08:20:19.006050252 +0000 UTC m=+0.096011279 container cleanup fe916b04b5a4965a58b15296c4cfd066cc2cd03ab9317bd7e14d1e94989cda28 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=create_virtlogd_wrapper, version=17.1.12, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_id=tripleo_step2, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc.)
Dec 06 08:20:19 np0005548789.localdomain systemd[1]: libpod-conmon-fe916b04b5a4965a58b15296c4cfd066cc2cd03ab9317bd7e14d1e94989cda28.scope: Deactivated successfully.
Dec 06 08:20:19 np0005548789.localdomain python3[57794]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1765008053 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper
Dec 06 08:20:19 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 67 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=52/53 n=1 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=67 pruub=10.368765831s) [4,2,3] r=-1 lpr=67 pi=[52,67)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1220.845825195s@ mbc={}] start_peering_interval up [2,1,0] -> [4,2,3], acting [2,1,0] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:19 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 67 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=52/53 n=1 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=67 pruub=10.368695259s) [4,2,3] r=-1 lpr=67 pi=[52,67)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1220.845825195s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:19 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 67 pg[7.a( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=67) [4,2,3] r=0 lpr=67 pi=[52,67)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 08:20:19 np0005548789.localdomain systemd[1]: libpod-19fd50ca84f305510197cafcf52300c0564622e4e13459d1a5f1f90f7f835db6.scope: Deactivated successfully.
Dec 06 08:20:19 np0005548789.localdomain systemd[1]: libpod-19fd50ca84f305510197cafcf52300c0564622e4e13459d1a5f1f90f7f835db6.scope: Consumed 2.154s CPU time.
Dec 06 08:20:19 np0005548789.localdomain podman[58069]: 2025-12-06 08:20:19.645271867 +0000 UTC m=+3.091405135 container died 19fd50ca84f305510197cafcf52300c0564622e4e13459d1a5f1f90f7f835db6 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, config_id=tripleo_step2, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=create_haproxy_wrapper, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:20:19 np0005548789.localdomain podman[58375]: 2025-12-06 08:20:19.765903966 +0000 UTC m=+0.111259244 container cleanup 19fd50ca84f305510197cafcf52300c0564622e4e13459d1a5f1f90f7f835db6 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=create_haproxy_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible)
Dec 06 08:20:19 np0005548789.localdomain systemd[1]: libpod-conmon-19fd50ca84f305510197cafcf52300c0564622e4e13459d1a5f1f90f7f835db6.scope: Deactivated successfully.
Dec 06 08:20:19 np0005548789.localdomain python3[57794]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers
Dec 06 08:20:19 np0005548789.localdomain sudo[57792]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-17bd947360a6f617c72dcee5b3dca8b1bd8a672c96dfaa4c4c4ee08fb50892d9-merged.mount: Deactivated successfully.
Dec 06 08:20:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-19fd50ca84f305510197cafcf52300c0564622e4e13459d1a5f1f90f7f835db6-userdata-shm.mount: Deactivated successfully.
Dec 06 08:20:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-b65786436fb22482b0004739b377157c398689a3d99c4b69d96c34a55e2067d5-merged.mount: Deactivated successfully.
Dec 06 08:20:20 np0005548789.localdomain sudo[58429]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnzsgdybmxvupdgacxargzpfufhesdfn ; /usr/bin/python3
Dec 06 08:20:20 np0005548789.localdomain sudo[58429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:20 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 68 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=67/68 n=1 ec=48/37 lis/c=52/52 les/c/f=53/53/0 sis=67) [4,2,3] r=0 lpr=67 pi=[52,67)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 08:20:20 np0005548789.localdomain python3[58431]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:20 np0005548789.localdomain sudo[58429]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:20 np0005548789.localdomain sudo[58477]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kspmdouejzyfzdydwnnqhaoqamukvabp ; /usr/bin/python3
Dec 06 08:20:20 np0005548789.localdomain sudo[58477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:21 np0005548789.localdomain sudo[58477]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:21 np0005548789.localdomain sudo[58520]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrkltuswgwfivmztovzyhdkfqibnkqsf ; /usr/bin/python3
Dec 06 08:20:21 np0005548789.localdomain sudo[58520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:21 np0005548789.localdomain sudo[58520]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:21 np0005548789.localdomain sudo[58550]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zagzddooprbixgqvoasypjiquezehbgv ; /usr/bin/python3
Dec 06 08:20:21 np0005548789.localdomain sudo[58550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:22 np0005548789.localdomain python3[58552]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005548789 step=2 update_config_hash_only=False
Dec 06 08:20:22 np0005548789.localdomain sudo[58550]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:22 np0005548789.localdomain sudo[58566]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwhtbanodoqxxsajteeakuzhomsejzno ; /usr/bin/python3
Dec 06 08:20:22 np0005548789.localdomain sudo[58566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:22 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec 06 08:20:22 np0005548789.localdomain python3[58568]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:20:22 np0005548789.localdomain sudo[58566]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:22 np0005548789.localdomain ceph-osd[32665]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec 06 08:20:22 np0005548789.localdomain sudo[58582]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzwiaclbudroeskyqzksnffnyhlbegbe ; /usr/bin/python3
Dec 06 08:20:22 np0005548789.localdomain sudo[58582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:20:22 np0005548789.localdomain python3[58584]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 06 08:20:23 np0005548789.localdomain sudo[58582]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:29 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 70 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=56/57 n=1 ec=48/37 lis/c=56/56 les/c/f=57/57/0 sis=70 pruub=13.193599701s) [0,5,4] r=2 lpr=70 pi=[56,70)/1 luod=0'0 crt=40'39 mlcod 0'0 active pruub 1229.589599609s@ mbc={}] start_peering_interval up [2,3,4] -> [0,5,4], acting [2,3,4] -> [0,5,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:29 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 70 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=56/57 n=1 ec=48/37 lis/c=56/56 les/c/f=57/57/0 sis=70 pruub=13.193515778s) [0,5,4] r=2 lpr=70 pi=[56,70)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1229.589599609s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:31 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 72 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=58/59 n=2 ec=48/37 lis/c=58/58 les/c/f=59/59/0 sis=72 pruub=13.232522011s) [3,2,1] r=-1 lpr=72 pi=[58,72)/1 crt=40'39 mlcod 0'0 active pruub 1231.655029297s@ mbc={255={}}] start_peering_interval up [4,5,0] -> [3,2,1], acting [4,5,0] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:31 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 72 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=58/59 n=2 ec=48/37 lis/c=58/58 les/c/f=59/59/0 sis=72 pruub=13.232336998s) [3,2,1] r=-1 lpr=72 pi=[58,72)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1231.655029297s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:32 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 72 pg[7.d( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=58/58 les/c/f=59/59/0 sis=72) [3,2,1] r=2 lpr=72 pi=[58,72)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:33 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 74 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=60/61 n=1 ec=48/37 lis/c=60/60 les/c/f=61/61/0 sis=74 pruub=13.228281975s) [2,1,3] r=-1 lpr=74 pi=[60,74)/1 luod=0'0 crt=40'39 mlcod 0'0 active pruub 1233.692016602s@ mbc={}] start_peering_interval up [5,0,4] -> [2,1,3], acting [5,0,4] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:33 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 74 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=60/61 n=1 ec=48/37 lis/c=60/60 les/c/f=61/61/0 sis=74 pruub=13.228202820s) [2,1,3] r=-1 lpr=74 pi=[60,74)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1233.692016602s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:34 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 74 pg[7.e( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=60/60 les/c/f=61/61/0 sis=74) [2,1,3] r=1 lpr=74 pi=[60,74)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:35 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 76 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=61/62 n=1 ec=48/37 lis/c=61/61 les/c/f=62/62/0 sis=76 pruub=12.203485489s) [2,4,3] r=-1 lpr=76 pi=[61,76)/1 luod=0'0 crt=40'39 mlcod 0'0 active pruub 1238.626586914s@ mbc={}] start_peering_interval up [3,1,5] -> [2,4,3], acting [3,1,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 06 08:20:35 np0005548789.localdomain ceph-osd[31726]: osd.1 pg_epoch: 76 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=61/62 n=1 ec=48/37 lis/c=61/61 les/c/f=62/62/0 sis=76 pruub=12.203265190s) [2,4,3] r=-1 lpr=76 pi=[61,76)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1238.626586914s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:36 np0005548789.localdomain ceph-osd[32665]: osd.4 pg_epoch: 76 pg[7.f( empty local-lis/les=0/0 n=0 ec=48/37 lis/c=61/61 les/c/f=62/62/0 sis=76) [2,4,3] r=1 lpr=76 pi=[61,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Dec 06 08:20:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:20:48 np0005548789.localdomain systemd[1]: tmp-crun.27yuHl.mount: Deactivated successfully.
Dec 06 08:20:48 np0005548789.localdomain podman[58585]: 2025-12-06 08:20:48.915151982 +0000 UTC m=+0.077363071 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, architecture=x86_64, io.openshift.expose-services=)
Dec 06 08:20:49 np0005548789.localdomain podman[58585]: 2025-12-06 08:20:49.115163942 +0000 UTC m=+0.277375051 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:20:49 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:20:57 np0005548789.localdomain sudo[58614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:20:57 np0005548789.localdomain sudo[58614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:20:57 np0005548789.localdomain sudo[58614]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:57 np0005548789.localdomain sudo[58629]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:20:57 np0005548789.localdomain sudo[58629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:20:57 np0005548789.localdomain sudo[58629]: pam_unix(sudo:session): session closed for user root
Dec 06 08:20:58 np0005548789.localdomain sudo[58675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:20:58 np0005548789.localdomain sudo[58675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:20:58 np0005548789.localdomain sudo[58675]: pam_unix(sudo:session): session closed for user root
Dec 06 08:21:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:21:19 np0005548789.localdomain podman[58690]: 2025-12-06 08:21:19.93106655 +0000 UTC m=+0.091035327 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:21:20 np0005548789.localdomain podman[58690]: 2025-12-06 08:21:20.108313637 +0000 UTC m=+0.268282454 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, config_id=tripleo_step1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Dec 06 08:21:20 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:21:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:21:50 np0005548789.localdomain podman[58720]: 2025-12-06 08:21:50.916704142 +0000 UTC m=+0.075751081 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Dec 06 08:21:51 np0005548789.localdomain podman[58720]: 2025-12-06 08:21:51.085390187 +0000 UTC m=+0.244437026 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4)
Dec 06 08:21:51 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:21:58 np0005548789.localdomain sudo[58749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:21:58 np0005548789.localdomain sudo[58749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:21:58 np0005548789.localdomain sudo[58749]: pam_unix(sudo:session): session closed for user root
Dec 06 08:21:58 np0005548789.localdomain sudo[58764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:21:58 np0005548789.localdomain sudo[58764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:21:59 np0005548789.localdomain podman[58849]: 2025-12-06 08:21:59.660964923 +0000 UTC m=+0.080508357 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, release=1763362218, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, distribution-scope=public, vendor=Red Hat, Inc., version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:21:59 np0005548789.localdomain podman[58849]: 2025-12-06 08:21:59.781092546 +0000 UTC m=+0.200635950 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, release=1763362218, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 08:22:00 np0005548789.localdomain sudo[58764]: pam_unix(sudo:session): session closed for user root
Dec 06 08:22:00 np0005548789.localdomain sudo[58916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:22:00 np0005548789.localdomain sudo[58916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:22:00 np0005548789.localdomain sudo[58916]: pam_unix(sudo:session): session closed for user root
Dec 06 08:22:00 np0005548789.localdomain sudo[58931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:22:00 np0005548789.localdomain sudo[58931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:22:00 np0005548789.localdomain sudo[58931]: pam_unix(sudo:session): session closed for user root
Dec 06 08:22:01 np0005548789.localdomain sudo[58979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:22:01 np0005548789.localdomain sudo[58979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:22:01 np0005548789.localdomain sudo[58979]: pam_unix(sudo:session): session closed for user root
Dec 06 08:22:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:22:21 np0005548789.localdomain systemd[1]: tmp-crun.V9l7At.mount: Deactivated successfully.
Dec 06 08:22:21 np0005548789.localdomain podman[58994]: 2025-12-06 08:22:21.938515005 +0000 UTC m=+0.091178776 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1)
Dec 06 08:22:22 np0005548789.localdomain podman[58994]: 2025-12-06 08:22:22.13425045 +0000 UTC m=+0.286914211 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, tcib_managed=true, vcs-type=git, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd)
Dec 06 08:22:22 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:22:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:22:52 np0005548789.localdomain podman[59022]: 2025-12-06 08:22:52.92957614 +0000 UTC m=+0.087538056 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:22:53 np0005548789.localdomain podman[59022]: 2025-12-06 08:22:53.152226301 +0000 UTC m=+0.310188177 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, vcs-type=git, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Dec 06 08:22:53 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:23:01 np0005548789.localdomain sudo[59051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:23:01 np0005548789.localdomain sudo[59051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:23:01 np0005548789.localdomain sudo[59051]: pam_unix(sudo:session): session closed for user root
Dec 06 08:23:01 np0005548789.localdomain sudo[59066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:23:01 np0005548789.localdomain sudo[59066]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:23:02 np0005548789.localdomain sudo[59066]: pam_unix(sudo:session): session closed for user root
Dec 06 08:23:02 np0005548789.localdomain sudo[59113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:23:02 np0005548789.localdomain sudo[59113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:23:02 np0005548789.localdomain sudo[59113]: pam_unix(sudo:session): session closed for user root
Dec 06 08:23:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:23:23 np0005548789.localdomain podman[59128]: 2025-12-06 08:23:23.924226286 +0000 UTC m=+0.086837684 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.12, architecture=x86_64, tcib_managed=true, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:23:24 np0005548789.localdomain podman[59128]: 2025-12-06 08:23:24.131382377 +0000 UTC m=+0.293993775 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-qdrouterd)
Dec 06 08:23:24 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:23:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:23:54 np0005548789.localdomain podman[59159]: 2025-12-06 08:23:54.920747937 +0000 UTC m=+0.081524393 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, container_name=metrics_qdr)
Dec 06 08:23:55 np0005548789.localdomain podman[59159]: 2025-12-06 08:23:55.117184583 +0000 UTC m=+0.277961079 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, config_id=tripleo_step1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd)
Dec 06 08:23:55 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:24:02 np0005548789.localdomain sudo[59188]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:24:02 np0005548789.localdomain sudo[59188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:24:02 np0005548789.localdomain sudo[59188]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:02 np0005548789.localdomain sudo[59203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:24:02 np0005548789.localdomain sudo[59203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:24:03 np0005548789.localdomain sudo[59203]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:04 np0005548789.localdomain sudo[59250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:24:04 np0005548789.localdomain sudo[59250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:24:04 np0005548789.localdomain sudo[59250]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:24:25 np0005548789.localdomain systemd[1]: tmp-crun.VydYiQ.mount: Deactivated successfully.
Dec 06 08:24:25 np0005548789.localdomain podman[59265]: 2025-12-06 08:24:25.924219477 +0000 UTC m=+0.086568713 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1)
Dec 06 08:24:26 np0005548789.localdomain podman[59265]: 2025-12-06 08:24:26.123000348 +0000 UTC m=+0.285349564 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.openshift.expose-services=)
Dec 06 08:24:26 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:24:52 np0005548789.localdomain sudo[59340]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pioaqsoqjhvoxxjvtfyyzcmqnnwpdemp ; /usr/bin/python3
Dec 06 08:24:52 np0005548789.localdomain sudo[59340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:24:52 np0005548789.localdomain python3[59342]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:24:52 np0005548789.localdomain sudo[59340]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:52 np0005548789.localdomain sudo[59385]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-noazqmnnlsllynhgkxquhyacgmlxvytw ; /usr/bin/python3
Dec 06 08:24:52 np0005548789.localdomain sudo[59385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:24:52 np0005548789.localdomain python3[59387]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009492.2425036-98735-234126235061465/source _original_basename=tmp7nhgmnu9 follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:24:52 np0005548789.localdomain sudo[59385]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:53 np0005548789.localdomain sudo[59415]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwnhkssadnzdugkcwnvwpxalnzojekfj ; /usr/bin/python3
Dec 06 08:24:53 np0005548789.localdomain sudo[59415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:24:53 np0005548789.localdomain python3[59417]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:24:53 np0005548789.localdomain sudo[59415]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:54 np0005548789.localdomain sudo[59465]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xstwzsucjyknnettaaddlswgqhywjipp ; /usr/bin/python3
Dec 06 08:24:54 np0005548789.localdomain sudo[59465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:24:54 np0005548789.localdomain sudo[59465]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:54 np0005548789.localdomain sudo[59483]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llgjuwmprfjtdkhdmxoicdkelvawmiws ; /usr/bin/python3
Dec 06 08:24:54 np0005548789.localdomain sudo[59483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:24:54 np0005548789.localdomain sudo[59483]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:55 np0005548789.localdomain sudo[59587]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqszhzlsxxknycdgmzlkqwjaeojaltwo ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009495.0948784-98891-228612148580821/async_wrapper.py 965851980157 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009495.0948784-98891-228612148580821/AnsiballZ_command.py _
Dec 06 08:24:55 np0005548789.localdomain sudo[59587]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 08:24:55 np0005548789.localdomain ansible-async_wrapper.py[59589]: Invoked with 965851980157 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009495.0948784-98891-228612148580821/AnsiballZ_command.py _
Dec 06 08:24:55 np0005548789.localdomain ansible-async_wrapper.py[59592]: Starting module and watcher
Dec 06 08:24:55 np0005548789.localdomain ansible-async_wrapper.py[59592]: Start watching 59593 (3600)
Dec 06 08:24:55 np0005548789.localdomain ansible-async_wrapper.py[59593]: Start module (59593)
Dec 06 08:24:55 np0005548789.localdomain ansible-async_wrapper.py[59589]: Return async_wrapper task started.
Dec 06 08:24:55 np0005548789.localdomain sudo[59587]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:55 np0005548789.localdomain sudo[59608]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxgufadgbdcqahzwhxsstmegpdfaakni ; /usr/bin/python3
Dec 06 08:24:55 np0005548789.localdomain sudo[59608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:24:55 np0005548789.localdomain python3[59613]: ansible-ansible.legacy.async_status Invoked with jid=965851980157.59589 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:24:55 np0005548789.localdomain sudo[59608]: pam_unix(sudo:session): session closed for user root
Dec 06 08:24:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:24:56 np0005548789.localdomain systemd[1]: tmp-crun.28v0AJ.mount: Deactivated successfully.
Dec 06 08:24:56 np0005548789.localdomain podman[59627]: 2025-12-06 08:24:56.883217903 +0000 UTC m=+0.049053995 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:24:57 np0005548789.localdomain podman[59627]: 2025-12-06 08:24:57.037479409 +0000 UTC m=+0.203315521 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 08:24:57 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:    (file & line not available)
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:    (file & line not available)
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]: Notice: Compiled catalog for np0005548789.localdomain in environment production in 0.11 seconds
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]: Notice: Applied catalog in 0.04 seconds
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]: Application:
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:    Initial environment: production
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:    Converged environment: production
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:          Run mode: user
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]: Changes:
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]: Events:
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]: Resources:
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:             Total: 10
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]: Time:
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:          Schedule: 0.00
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:              File: 0.00
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:            Augeas: 0.01
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:              Exec: 0.01
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:    Transaction evaluation: 0.03
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:    Catalog application: 0.04
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:    Config retrieval: 0.15
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:          Last run: 1765009499
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:        Filebucket: 0.00
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:             Total: 0.04
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]: Version:
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:            Config: 1765009499
Dec 06 08:24:59 np0005548789.localdomain puppet-user[59612]:            Puppet: 7.10.0
Dec 06 08:24:59 np0005548789.localdomain ansible-async_wrapper.py[59593]: Module complete (59593)
Dec 06 08:25:00 np0005548789.localdomain ansible-async_wrapper.py[59592]: Done in kid B.
Dec 06 08:25:04 np0005548789.localdomain sudo[59752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:25:04 np0005548789.localdomain sudo[59752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:25:04 np0005548789.localdomain sudo[59752]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:04 np0005548789.localdomain sudo[59767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:25:04 np0005548789.localdomain sudo[59767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:25:05 np0005548789.localdomain sudo[59767]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:05 np0005548789.localdomain sudo[59815]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:25:05 np0005548789.localdomain sudo[59815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:25:05 np0005548789.localdomain sudo[59815]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:06 np0005548789.localdomain sudo[59843]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtszfksbwpnwuundrjquhryhuxdihjsu ; /usr/bin/python3
Dec 06 08:25:06 np0005548789.localdomain sudo[59843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:06 np0005548789.localdomain python3[59845]: ansible-ansible.legacy.async_status Invoked with jid=965851980157.59589 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:25:06 np0005548789.localdomain sudo[59843]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:06 np0005548789.localdomain sudo[59859]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxwljrujgokqlononufawgtdpbmetpgo ; /usr/bin/python3
Dec 06 08:25:06 np0005548789.localdomain sudo[59859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:07 np0005548789.localdomain python3[59861]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:25:07 np0005548789.localdomain sudo[59859]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:07 np0005548789.localdomain sudo[59875]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntlovodmuxuptwatgyqmggcdtfxyvbxm ; /usr/bin/python3
Dec 06 08:25:07 np0005548789.localdomain sudo[59875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:07 np0005548789.localdomain python3[59877]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:07 np0005548789.localdomain sudo[59875]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:07 np0005548789.localdomain sudo[59925]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plalwewgldjsbxopuuklultyxdislpjh ; /usr/bin/python3
Dec 06 08:25:07 np0005548789.localdomain sudo[59925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:08 np0005548789.localdomain python3[59927]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:25:08 np0005548789.localdomain sudo[59925]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:08 np0005548789.localdomain sudo[59943]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdikrrffrapijneqyxumnskwkdkeqrus ; /usr/bin/python3
Dec 06 08:25:08 np0005548789.localdomain sudo[59943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:08 np0005548789.localdomain python3[59945]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpr43d29l3 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:25:08 np0005548789.localdomain sudo[59943]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:08 np0005548789.localdomain sudo[59973]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpuqufodabqpgrxwytfmkgdvmeohwkrp ; /usr/bin/python3
Dec 06 08:25:08 np0005548789.localdomain sudo[59973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:08 np0005548789.localdomain python3[59975]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:08 np0005548789.localdomain sudo[59973]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:08 np0005548789.localdomain sudo[59989]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eszeccxgzeeihqfbgamlsypjzpnpbnfv ; /usr/bin/python3
Dec 06 08:25:08 np0005548789.localdomain sudo[59989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:09 np0005548789.localdomain sudo[59989]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:09 np0005548789.localdomain sudo[60076]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgstccocmhhxbpvbeulriqijlvdtpmah ; /usr/bin/python3
Dec 06 08:25:09 np0005548789.localdomain sudo[60076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:09 np0005548789.localdomain python3[60078]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 06 08:25:09 np0005548789.localdomain sudo[60076]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:10 np0005548789.localdomain sudo[60095]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mevsxdtdyopqwdctzxmiivhoyztvhrrd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:10 np0005548789.localdomain sudo[60095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:10 np0005548789.localdomain python3[60097]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:10 np0005548789.localdomain sudo[60095]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:10 np0005548789.localdomain sudo[60111]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-buiwurcsfsqkymqaflhmkexirilzcium ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:10 np0005548789.localdomain sudo[60111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:11 np0005548789.localdomain sudo[60111]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:11 np0005548789.localdomain sudo[60127]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxwyjlmatzepigqggqiijftktzisgsev ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:11 np0005548789.localdomain sudo[60127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:11 np0005548789.localdomain python3[60129]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:11 np0005548789.localdomain sudo[60127]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:12 np0005548789.localdomain sudo[60177]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqidncwnlqigcxagvfuizetegidxhygs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:12 np0005548789.localdomain sudo[60177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:12 np0005548789.localdomain python3[60179]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:25:12 np0005548789.localdomain sudo[60177]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:12 np0005548789.localdomain sudo[60195]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wypnpylanfxvophglrnoplkfkgtwqxog ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:12 np0005548789.localdomain sudo[60195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:12 np0005548789.localdomain python3[60197]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:12 np0005548789.localdomain sudo[60195]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:12 np0005548789.localdomain sudo[60257]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-koakropktsqksvaqjoezjtxgbrfwftwd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:12 np0005548789.localdomain sudo[60257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:12 np0005548789.localdomain python3[60259]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:25:13 np0005548789.localdomain sudo[60257]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:13 np0005548789.localdomain sudo[60275]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gsqksjosqpmwxcmzdcgbfqinowhzxeqf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:13 np0005548789.localdomain sudo[60275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:13 np0005548789.localdomain python3[60277]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:13 np0005548789.localdomain sudo[60275]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:13 np0005548789.localdomain sudo[60337]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okongovlhglilyckubfjtsgznueqagxk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:13 np0005548789.localdomain sudo[60337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:13 np0005548789.localdomain python3[60339]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:25:13 np0005548789.localdomain sudo[60337]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:13 np0005548789.localdomain sudo[60355]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgjspyujvvjmhuejqwmauoistcbkbjym ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:13 np0005548789.localdomain sudo[60355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:14 np0005548789.localdomain python3[60357]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:14 np0005548789.localdomain sudo[60355]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:14 np0005548789.localdomain sudo[60417]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwcsuplbieenhzjqljkjgbcumhvwlpgc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:14 np0005548789.localdomain sudo[60417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:14 np0005548789.localdomain python3[60419]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:25:14 np0005548789.localdomain sudo[60417]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:14 np0005548789.localdomain sudo[60435]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jictvyaqsqusltftsfbrjfqiuodsxbix ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:14 np0005548789.localdomain sudo[60435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:14 np0005548789.localdomain python3[60437]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:15 np0005548789.localdomain sudo[60435]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:15 np0005548789.localdomain sudo[60465]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmhzsyzfgfjoncmbtdeymoobnxoxeqsp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:15 np0005548789.localdomain sudo[60465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:15 np0005548789.localdomain python3[60467]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:15 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:15 np0005548789.localdomain systemd-rc-local-generator[60490]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:15 np0005548789.localdomain systemd-sysv-generator[60494]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:15 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:15 np0005548789.localdomain sudo[60465]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:16 np0005548789.localdomain sudo[60550]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdmgkrnvyudvnsixqslkxnllmimyyzuv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:16 np0005548789.localdomain sudo[60550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:16 np0005548789.localdomain python3[60552]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:25:16 np0005548789.localdomain sudo[60550]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:16 np0005548789.localdomain sudo[60568]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmvhggsemjliiooxxuyvwsscejpccrsf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:16 np0005548789.localdomain sudo[60568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:16 np0005548789.localdomain python3[60570]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:16 np0005548789.localdomain sudo[60568]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:17 np0005548789.localdomain sudo[60630]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhaqyaeihpckirpfzkfqdfcatifkjgoc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:17 np0005548789.localdomain sudo[60630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:17 np0005548789.localdomain python3[60632]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:25:17 np0005548789.localdomain sudo[60630]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:17 np0005548789.localdomain sudo[60648]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzukbrqitwjmeqbczygfizyoqnoxoavg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:17 np0005548789.localdomain sudo[60648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:17 np0005548789.localdomain python3[60650]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:17 np0005548789.localdomain sudo[60648]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:17 np0005548789.localdomain sudo[60678]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzvkxgaebacpfmbctnmdncaxsvukcdys ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:17 np0005548789.localdomain sudo[60678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:18 np0005548789.localdomain python3[60680]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:18 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:18 np0005548789.localdomain systemd-sysv-generator[60709]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:18 np0005548789.localdomain systemd-rc-local-generator[60705]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:18 np0005548789.localdomain systemd[1]: Starting Create netns directory...
Dec 06 08:25:18 np0005548789.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 08:25:18 np0005548789.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 08:25:18 np0005548789.localdomain systemd[1]: Finished Create netns directory.
Dec 06 08:25:18 np0005548789.localdomain sudo[60678]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:18 np0005548789.localdomain sudo[60734]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obrphjavezyxrfdrvkdzltbroemfrkij ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:18 np0005548789.localdomain sudo[60734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:19 np0005548789.localdomain python3[60736]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 06 08:25:19 np0005548789.localdomain sudo[60734]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:19 np0005548789.localdomain sudo[60750]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmwnqwdonepvfbmxfxcmlotjwwebhosj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:19 np0005548789.localdomain sudo[60750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:19 np0005548789.localdomain sudo[60750]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:20 np0005548789.localdomain sudo[60793]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nuffwfljeobcmjpzgxzdgszkzagekuwh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:20 np0005548789.localdomain sudo[60793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:21 np0005548789.localdomain python3[60795]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 06 08:25:21 np0005548789.localdomain podman[60940]: 2025-12-06 08:25:21.412354083 +0000 UTC m=+0.068153200 container create afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, version=17.1.12)
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started libpod-conmon-afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f.scope.
Dec 06 08:25:21 np0005548789.localdomain podman[60948]: 2025-12-06 08:25:21.453450551 +0000 UTC m=+0.091858005 container create c55a3fa9476956f37d3ecfbe7a06aced3ea8b321c934918e4f504b9cf2d8fc82 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:35:22Z, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtlogd_wrapper, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, config_id=tripleo_step3)
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain podman[60940]: 2025-12-06 08:25:21.383629882 +0000 UTC m=+0.039428999 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 06 08:25:21 np0005548789.localdomain podman[60940]: 2025-12-06 08:25:21.482280535 +0000 UTC m=+0.138079652 container init afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=rsyslog, version=17.1.12, release=1761123044, architecture=x86_64, build-date=2025-11-18T22:49:49Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, name=rhosp17/openstack-rsyslog, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1)
Dec 06 08:25:21 np0005548789.localdomain podman[60940]: 2025-12-06 08:25:21.48930331 +0000 UTC m=+0.145102417 container start afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-rsyslog, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']})
Dec 06 08:25:21 np0005548789.localdomain python3[60795]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=7a657a42c3cbd75086c59cf211d6fafe --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 06 08:25:21 np0005548789.localdomain podman[61002]: 2025-12-06 08:25:21.49680929 +0000 UTC m=+0.075909976 container create ef28ead9699a3a453a7c64489f8848159f375497f80472e3334975e7b928c9e1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, container_name=ceilometer_init_log, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4)
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started libpod-conmon-c55a3fa9476956f37d3ecfbe7a06aced3ea8b321c934918e4f504b9cf2d8fc82.scope.
Dec 06 08:25:21 np0005548789.localdomain sudo[61029]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:21 np0005548789.localdomain sudo[61029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:21 np0005548789.localdomain podman[60948]: 2025-12-06 08:25:21.407747051 +0000 UTC m=+0.046154505 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef97a8ce410352459d2cd2d839f0c4f3d007fd27d6a886085c43dfe3ff9df394/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef97a8ce410352459d2cd2d839f0c4f3d007fd27d6a886085c43dfe3ff9df394/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef97a8ce410352459d2cd2d839f0c4f3d007fd27d6a886085c43dfe3ff9df394/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef97a8ce410352459d2cd2d839f0c4f3d007fd27d6a886085c43dfe3ff9df394/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef97a8ce410352459d2cd2d839f0c4f3d007fd27d6a886085c43dfe3ff9df394/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef97a8ce410352459d2cd2d839f0c4f3d007fd27d6a886085c43dfe3ff9df394/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef97a8ce410352459d2cd2d839f0c4f3d007fd27d6a886085c43dfe3ff9df394/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain podman[60954]: 2025-12-06 08:25:21.42011953 +0000 UTC m=+0.058955288 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:25:21 np0005548789.localdomain podman[60954]: 2025-12-06 08:25:21.523669933 +0000 UTC m=+0.162505661 container create b995fb7c95a2e9bcba8263167f958b9d9a1aa5d19b80b46a281a635d52d8c08d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_statedir_owner, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started libpod-conmon-ef28ead9699a3a453a7c64489f8848159f375497f80472e3334975e7b928c9e1.scope.
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:21 np0005548789.localdomain podman[60989]: 2025-12-06 08:25:21.552390434 +0000 UTC m=+0.143712076 container create 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container)
Dec 06 08:25:21 np0005548789.localdomain sudo[61029]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e232d99afeeb95c94065c4aa6c90831e0f37d94aede849daf1e3af8b69b5b465/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started libpod-conmon-b995fb7c95a2e9bcba8263167f958b9d9a1aa5d19b80b46a281a635d52d8c08d.scope.
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: libpod-afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f.scope: Deactivated successfully.
Dec 06 08:25:21 np0005548789.localdomain podman[61002]: 2025-12-06 08:25:21.463076707 +0000 UTC m=+0.042177383 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Dec 06 08:25:21 np0005548789.localdomain podman[61002]: 2025-12-06 08:25:21.565100122 +0000 UTC m=+0.144200808 container init ef28ead9699a3a453a7c64489f8848159f375497f80472e3334975e7b928c9e1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, tcib_managed=true, container_name=ceilometer_init_log, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step3)
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/761acef0a8624aecb4fcf3ced8fe890c9e02485b13878d736631aa63ee8f2874/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/761acef0a8624aecb4fcf3ced8fe890c9e02485b13878d736631aa63ee8f2874/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/761acef0a8624aecb4fcf3ced8fe890c9e02485b13878d736631aa63ee8f2874/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain podman[61002]: 2025-12-06 08:25:21.572718406 +0000 UTC m=+0.151819082 container start ef28ead9699a3a453a7c64489f8848159f375497f80472e3334975e7b928c9e1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, container_name=ceilometer_init_log, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team)
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: libpod-ef28ead9699a3a453a7c64489f8848159f375497f80472e3334975e7b928c9e1.scope: Deactivated successfully.
Dec 06 08:25:21 np0005548789.localdomain podman[60948]: 2025-12-06 08:25:21.577908315 +0000 UTC m=+0.216315759 container init c55a3fa9476956f37d3ecfbe7a06aced3ea8b321c934918e4f504b9cf2d8fc82 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtlogd_wrapper, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, tcib_managed=true)
Dec 06 08:25:21 np0005548789.localdomain podman[60954]: 2025-12-06 08:25:21.580534116 +0000 UTC m=+0.219369864 container init b995fb7c95a2e9bcba8263167f958b9d9a1aa5d19b80b46a281a635d52d8c08d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_statedir_owner, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started libpod-conmon-2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.scope.
Dec 06 08:25:21 np0005548789.localdomain podman[60948]: 2025-12-06 08:25:21.591141811 +0000 UTC m=+0.229549255 container start c55a3fa9476956f37d3ecfbe7a06aced3ea8b321c934918e4f504b9cf2d8fc82 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, container_name=nova_virtlogd_wrapper, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:21 np0005548789.localdomain python3[60795]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=179caa3982511c1fd3314b961771f96c --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume 
/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d980d54738e5f040d62ff40bb9abb4b1931da0f4c80c1ba3031e7feabd416146/merged/scripts supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d980d54738e5f040d62ff40bb9abb4b1931da0f4c80c1ba3031e7feabd416146/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:21 np0005548789.localdomain podman[60989]: 2025-12-06 08:25:21.507514189 +0000 UTC m=+0.098835821 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 06 08:25:21 np0005548789.localdomain sudo[61083]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:21 np0005548789.localdomain systemd-logind[766]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Created slice User Slice of UID 0.
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 06 08:25:21 np0005548789.localdomain podman[61059]: 2025-12-06 08:25:21.633643903 +0000 UTC m=+0.056060919 container died afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, tcib_managed=true, release=1761123044, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64)
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:25:21 np0005548789.localdomain podman[60989]: 2025-12-06 08:25:21.638624655 +0000 UTC m=+0.229946277 container init 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step3)
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: libpod-b995fb7c95a2e9bcba8263167f958b9d9a1aa5d19b80b46a281a635d52d8c08d.scope: Deactivated successfully.
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 06 08:25:21 np0005548789.localdomain sudo[61112]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Starting User Manager for UID 0...
Dec 06 08:25:21 np0005548789.localdomain systemd-logind[766]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:25:21 np0005548789.localdomain podman[60989]: 2025-12-06 08:25:21.672203714 +0000 UTC m=+0.263525336 container start 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, container_name=collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044)
Dec 06 08:25:21 np0005548789.localdomain python3[60795]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=4767aaabc3de112d8791c290aa2b669d --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:21 np0005548789.localdomain podman[61070]: 2025-12-06 08:25:21.684000756 +0000 UTC m=+0.096544769 container died ef28ead9699a3a453a7c64489f8848159f375497f80472e3334975e7b928c9e1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_init_log, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1761123044, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step3, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:25:21 np0005548789.localdomain podman[61059]: 2025-12-06 08:25:21.761196941 +0000 UTC m=+0.183613927 container cleanup afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-rsyslog, io.openshift.expose-services=, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, version=17.1.12, config_id=tripleo_step3, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog)
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: libpod-conmon-afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f.scope: Deactivated successfully.
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: Queued start job for default target Main User Target.
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: Created slice User Application Slice.
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: Reached target Paths.
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: Reached target Timers.
Dec 06 08:25:21 np0005548789.localdomain podman[60954]: 2025-12-06 08:25:21.79184537 +0000 UTC m=+0.430681088 container start b995fb7c95a2e9bcba8263167f958b9d9a1aa5d19b80b46a281a635d52d8c08d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step3, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_statedir_owner, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public)
Dec 06 08:25:21 np0005548789.localdomain podman[60954]: 2025-12-06 08:25:21.792371176 +0000 UTC m=+0.431206904 container attach b995fb7c95a2e9bcba8263167f958b9d9a1aa5d19b80b46a281a635d52d8c08d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_statedir_owner, config_id=tripleo_step3, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, 
batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: Starting D-Bus User Message Bus Socket...
Dec 06 08:25:21 np0005548789.localdomain podman[61114]: 2025-12-06 08:25:21.793818221 +0000 UTC m=+0.137455153 container died b995fb7c95a2e9bcba8263167f958b9d9a1aa5d19b80b46a281a635d52d8c08d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, container_name=nova_statedir_owner, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z)
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: Starting Create User's Volatile Files and Directories...
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: Finished Create User's Volatile Files and Directories.
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: Listening on D-Bus User Message Bus Socket.
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: Reached target Sockets.
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: Reached target Basic System.
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: Reached target Main User Target.
Dec 06 08:25:21 np0005548789.localdomain systemd[61115]: Startup finished in 114ms.
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started User Manager for UID 0.
Dec 06 08:25:21 np0005548789.localdomain podman[61070]: 2025-12-06 08:25:21.812607187 +0000 UTC m=+0.225151170 container cleanup ef28ead9699a3a453a7c64489f8848159f375497f80472e3334975e7b928c9e1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-19T00:12:45Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step3, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_init_log, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started Session c1 of User root.
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: Started Session c2 of User root.
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: libpod-conmon-ef28ead9699a3a453a7c64489f8848159f375497f80472e3334975e7b928c9e1.scope: Deactivated successfully.
Dec 06 08:25:21 np0005548789.localdomain sudo[61083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:21 np0005548789.localdomain sudo[61112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:21 np0005548789.localdomain podman[61114]: 2025-12-06 08:25:21.884477468 +0000 UTC m=+0.228114420 container cleanup b995fb7c95a2e9bcba8263167f958b9d9a1aa5d19b80b46a281a635d52d8c08d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, container_name=nova_statedir_owner, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, com.redhat.component=openstack-nova-compute-container, version=17.1.12, tcib_managed=true, 
maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Dec 06 08:25:21 np0005548789.localdomain python3[60795]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1765008053 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py
Dec 06 08:25:21 np0005548789.localdomain sudo[61083]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: libpod-conmon-b995fb7c95a2e9bcba8263167f958b9d9a1aa5d19b80b46a281a635d52d8c08d.scope: Deactivated successfully.
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: session-c1.scope: Deactivated successfully.
Dec 06 08:25:21 np0005548789.localdomain podman[61123]: 2025-12-06 08:25:21.801853407 +0000 UTC m=+0.123534476 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=collectd, io.openshift.expose-services=)
Dec 06 08:25:21 np0005548789.localdomain sudo[61112]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: session-c2.scope: Deactivated successfully.
Dec 06 08:25:21 np0005548789.localdomain podman[61123]: 2025-12-06 08:25:21.932575072 +0000 UTC m=+0.254256101 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:25:21 np0005548789.localdomain podman[61123]: unhealthy
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:25:21 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Failed with result 'exit-code'.
Dec 06 08:25:22 np0005548789.localdomain podman[61300]: 2025-12-06 08:25:22.230056298 +0000 UTC m=+0.075943848 container create 5519229fe01370c92f47b0e8e46cddb6cb973cc807c8388a053612d951244964 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 08:25:22 np0005548789.localdomain systemd[1]: Started libpod-conmon-5519229fe01370c92f47b0e8e46cddb6cb973cc807c8388a053612d951244964.scope.
Dec 06 08:25:22 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:22 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a1de60586d08fb5b92518366abbb713eb2f96c306cc36f6078e1ea96b940056/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:22 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a1de60586d08fb5b92518366abbb713eb2f96c306cc36f6078e1ea96b940056/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:22 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a1de60586d08fb5b92518366abbb713eb2f96c306cc36f6078e1ea96b940056/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:22 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a1de60586d08fb5b92518366abbb713eb2f96c306cc36f6078e1ea96b940056/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:22 np0005548789.localdomain podman[61300]: 2025-12-06 08:25:22.201226074 +0000 UTC m=+0.047113584 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:22 np0005548789.localdomain podman[61300]: 2025-12-06 08:25:22.303862829 +0000 UTC m=+0.149750339 container init 5519229fe01370c92f47b0e8e46cddb6cb973cc807c8388a053612d951244964 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, vcs-type=git, tcib_managed=true)
Dec 06 08:25:22 np0005548789.localdomain podman[61300]: 2025-12-06 08:25:22.313431992 +0000 UTC m=+0.159319492 container start 5519229fe01370c92f47b0e8e46cddb6cb973cc807c8388a053612d951244964 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4)
Dec 06 08:25:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8-merged.mount: Deactivated successfully.
Dec 06 08:25:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f-userdata-shm.mount: Deactivated successfully.
Dec 06 08:25:22 np0005548789.localdomain python3[60795]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer
Dec 06 08:25:23 np0005548789.localdomain podman[61381]: 2025-12-06 08:25:23.013460601 +0000 UTC m=+0.066925282 container create 2914dfad5be61e80048556735bb44e4f1907a2e2df52ff8faede941ddfde7367 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', 
'/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtsecretd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc.)
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: Started libpod-conmon-2914dfad5be61e80048556735bb44e4f1907a2e2df52ff8faede941ddfde7367.scope.
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:23 np0005548789.localdomain podman[61381]: 2025-12-06 08:25:22.984493674 +0000 UTC m=+0.037958365 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fc0aec4e92feba574efc8a2831ff4547cdf669c9ba74f30ce1106436a335beb/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fc0aec4e92feba574efc8a2831ff4547cdf669c9ba74f30ce1106436a335beb/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fc0aec4e92feba574efc8a2831ff4547cdf669c9ba74f30ce1106436a335beb/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fc0aec4e92feba574efc8a2831ff4547cdf669c9ba74f30ce1106436a335beb/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fc0aec4e92feba574efc8a2831ff4547cdf669c9ba74f30ce1106436a335beb/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fc0aec4e92feba574efc8a2831ff4547cdf669c9ba74f30ce1106436a335beb/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fc0aec4e92feba574efc8a2831ff4547cdf669c9ba74f30ce1106436a335beb/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain podman[61381]: 2025-12-06 08:25:23.095163254 +0000 UTC m=+0.148627945 container init 2914dfad5be61e80048556735bb44e4f1907a2e2df52ff8faede941ddfde7367 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step3, container_name=nova_virtsecretd, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:25:23 np0005548789.localdomain podman[61381]: 2025-12-06 08:25:23.106993428 +0000 UTC m=+0.160458119 container start 2914dfad5be61e80048556735bb44e4f1907a2e2df52ff8faede941ddfde7367 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtsecretd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step3, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git)
Dec 06 08:25:23 np0005548789.localdomain python3[60795]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=179caa3982511c1fd3314b961771f96c --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:23 np0005548789.localdomain sudo[61402]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:23 np0005548789.localdomain systemd-logind[766]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: Started Session c3 of User root.
Dec 06 08:25:23 np0005548789.localdomain sudo[61402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:23 np0005548789.localdomain sudo[61402]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: session-c3.scope: Deactivated successfully.
Dec 06 08:25:23 np0005548789.localdomain podman[61517]: 2025-12-06 08:25:23.498557345 +0000 UTC m=+0.061622119 container create 77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.openshift.expose-services=, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=nova_virtnodedevd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z)
Dec 06 08:25:23 np0005548789.localdomain podman[61524]: 2025-12-06 08:25:23.540884693 +0000 UTC m=+0.089998740 container create b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, container_name=iscsid, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: Started libpod-conmon-77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5.scope.
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29aef16efd0d0d4913740c423de7a8c374d7bce415829f4c5401764f17811d20/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29aef16efd0d0d4913740c423de7a8c374d7bce415829f4c5401764f17811d20/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29aef16efd0d0d4913740c423de7a8c374d7bce415829f4c5401764f17811d20/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29aef16efd0d0d4913740c423de7a8c374d7bce415829f4c5401764f17811d20/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29aef16efd0d0d4913740c423de7a8c374d7bce415829f4c5401764f17811d20/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29aef16efd0d0d4913740c423de7a8c374d7bce415829f4c5401764f17811d20/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29aef16efd0d0d4913740c423de7a8c374d7bce415829f4c5401764f17811d20/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain podman[61517]: 2025-12-06 08:25:23.56303715 +0000 UTC m=+0.126101924 container init 77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, container_name=nova_virtnodedevd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., version=17.1.12)
Dec 06 08:25:23 np0005548789.localdomain podman[61517]: 2025-12-06 08:25:23.571525851 +0000 UTC m=+0.134590635 container start 77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step3, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vendor=Red Hat, Inc., container_name=nova_virtnodedevd, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git)
Dec 06 08:25:23 np0005548789.localdomain podman[61517]: 2025-12-06 08:25:23.46834462 +0000 UTC m=+0.031409414 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:23 np0005548789.localdomain python3[60795]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=179caa3982511c1fd3314b961771f96c --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: Started libpod-conmon-b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.scope.
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef5a2dfca972201470637fe24151a27f03799cbb5a942d988431f305c9ea334c/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef5a2dfca972201470637fe24151a27f03799cbb5a942d988431f305c9ea334c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:23 np0005548789.localdomain podman[61524]: 2025-12-06 08:25:23.493422888 +0000 UTC m=+0.042537005 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 06 08:25:23 np0005548789.localdomain sudo[61552]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:23 np0005548789.localdomain systemd-logind[766]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: Started Session c4 of User root.
Dec 06 08:25:23 np0005548789.localdomain sudo[61552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:25:23 np0005548789.localdomain podman[61524]: 2025-12-06 08:25:23.621002697 +0000 UTC m=+0.170116754 container init b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=iscsid, distribution-scope=public, config_id=tripleo_step3, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team)
Dec 06 08:25:23 np0005548789.localdomain sudo[61571]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:25:23 np0005548789.localdomain podman[61524]: 2025-12-06 08:25:23.655711651 +0000 UTC m=+0.204825688 container start b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vcs-type=git)
Dec 06 08:25:23 np0005548789.localdomain systemd-logind[766]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: Started Session c5 of User root.
Dec 06 08:25:23 np0005548789.localdomain python3[60795]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=18576754feb36b85b5c8742ad9b5643d --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Dec 06 08:25:23 np0005548789.localdomain sudo[61571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:23 np0005548789.localdomain sudo[61552]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: session-c4.scope: Deactivated successfully.
Dec 06 08:25:23 np0005548789.localdomain podman[61578]: 2025-12-06 08:25:23.724206919 +0000 UTC m=+0.060951259 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, container_name=iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid)
Dec 06 08:25:23 np0005548789.localdomain sudo[61571]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: session-c5.scope: Deactivated successfully.
Dec 06 08:25:23 np0005548789.localdomain kernel: Loading iSCSI transport class v2.0-870.
Dec 06 08:25:23 np0005548789.localdomain podman[61578]: 2025-12-06 08:25:23.759257823 +0000 UTC m=+0.096002183 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, container_name=iscsid, 
com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, release=1761123044, architecture=x86_64, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git)
Dec 06 08:25:23 np0005548789.localdomain podman[61578]: unhealthy
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:25:23 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Failed with result 'exit-code'.
Dec 06 08:25:25 np0005548789.localdomain podman[61693]: 2025-12-06 08:25:25.117125989 +0000 UTC m=+0.101517271 container create 92a0134fb6ae7fa0506c791c4569a09e7c0cdb7fcb636d7ea6233a4978e1275d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Dec 06 08:25:25 np0005548789.localdomain podman[61693]: 2025-12-06 08:25:25.074229895 +0000 UTC m=+0.058621187 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:25 np0005548789.localdomain systemd[1]: Started libpod-conmon-92a0134fb6ae7fa0506c791c4569a09e7c0cdb7fcb636d7ea6233a4978e1275d.scope.
Dec 06 08:25:25 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3718780a2d803326e6f6ad6743b607a5e1167128961c2a398f3fd685de43043/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3718780a2d803326e6f6ad6743b607a5e1167128961c2a398f3fd685de43043/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3718780a2d803326e6f6ad6743b607a5e1167128961c2a398f3fd685de43043/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3718780a2d803326e6f6ad6743b607a5e1167128961c2a398f3fd685de43043/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3718780a2d803326e6f6ad6743b607a5e1167128961c2a398f3fd685de43043/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3718780a2d803326e6f6ad6743b607a5e1167128961c2a398f3fd685de43043/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3718780a2d803326e6f6ad6743b607a5e1167128961c2a398f3fd685de43043/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain podman[61693]: 2025-12-06 08:25:25.221978752 +0000 UTC m=+0.206370044 container init 92a0134fb6ae7fa0506c791c4569a09e7c0cdb7fcb636d7ea6233a4978e1275d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtstoraged, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team)
Dec 06 08:25:25 np0005548789.localdomain podman[61693]: 2025-12-06 08:25:25.229871733 +0000 UTC m=+0.214263025 container start 92a0134fb6ae7fa0506c791c4569a09e7c0cdb7fcb636d7ea6233a4978e1275d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Dec 06 08:25:25 np0005548789.localdomain python3[60795]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=179caa3982511c1fd3314b961771f96c --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:25 np0005548789.localdomain sudo[61712]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:25 np0005548789.localdomain systemd-logind[766]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:25 np0005548789.localdomain systemd[1]: Started Session c6 of User root.
Dec 06 08:25:25 np0005548789.localdomain sudo[61712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:25 np0005548789.localdomain sudo[61712]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:25 np0005548789.localdomain systemd[1]: session-c6.scope: Deactivated successfully.
Dec 06 08:25:25 np0005548789.localdomain podman[61795]: 2025-12-06 08:25:25.692895272 +0000 UTC m=+0.063646372 container create e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtqemud, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:25:25 np0005548789.localdomain systemd[1]: Started libpod-conmon-e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a.scope.
Dec 06 08:25:25 np0005548789.localdomain podman[61795]: 2025-12-06 08:25:25.65532777 +0000 UTC m=+0.026078930 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:25 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9893e3b6825fe8589fe9eca74d23476479efd735e739d26cb203759d2b267e35/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9893e3b6825fe8589fe9eca74d23476479efd735e739d26cb203759d2b267e35/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9893e3b6825fe8589fe9eca74d23476479efd735e739d26cb203759d2b267e35/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9893e3b6825fe8589fe9eca74d23476479efd735e739d26cb203759d2b267e35/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9893e3b6825fe8589fe9eca74d23476479efd735e739d26cb203759d2b267e35/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9893e3b6825fe8589fe9eca74d23476479efd735e739d26cb203759d2b267e35/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9893e3b6825fe8589fe9eca74d23476479efd735e739d26cb203759d2b267e35/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9893e3b6825fe8589fe9eca74d23476479efd735e739d26cb203759d2b267e35/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:25 np0005548789.localdomain podman[61795]: 2025-12-06 08:25:25.784085015 +0000 UTC m=+0.154836105 container init e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, version=17.1.12, vcs-type=git, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, url=https://www.redhat.com, release=1761123044, container_name=nova_virtqemud, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 08:25:25 np0005548789.localdomain podman[61795]: 2025-12-06 08:25:25.802546381 +0000 UTC m=+0.173297481 container start e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, config_id=tripleo_step3, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, container_name=nova_virtqemud, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Dec 06 08:25:25 np0005548789.localdomain python3[60795]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=179caa3982511c1fd3314b961771f96c --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume 
/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:25 np0005548789.localdomain sudo[61815]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:25 np0005548789.localdomain systemd-logind[766]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:25 np0005548789.localdomain systemd[1]: Started Session c7 of User root.
Dec 06 08:25:25 np0005548789.localdomain sudo[61815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:25 np0005548789.localdomain sudo[61815]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:25 np0005548789.localdomain systemd[1]: session-c7.scope: Deactivated successfully.
Dec 06 08:25:26 np0005548789.localdomain podman[61903]: 2025-12-06 08:25:26.313198178 +0000 UTC m=+0.087806751 container create abf33a7ce64d174f5aeca10ae9ef2b118248dbae4e0fd3e1c43527aa9d26cefa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtproxyd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 
'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:25:26 np0005548789.localdomain systemd[1]: Started libpod-conmon-abf33a7ce64d174f5aeca10ae9ef2b118248dbae4e0fd3e1c43527aa9d26cefa.scope.
Dec 06 08:25:26 np0005548789.localdomain podman[61903]: 2025-12-06 08:25:26.269795158 +0000 UTC m=+0.044403781 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:26 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:26 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c3c28666d804509c2b20602368c8cd77799587988a2d1a1789a75cb16c60a47/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:26 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c3c28666d804509c2b20602368c8cd77799587988a2d1a1789a75cb16c60a47/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:26 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c3c28666d804509c2b20602368c8cd77799587988a2d1a1789a75cb16c60a47/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:26 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c3c28666d804509c2b20602368c8cd77799587988a2d1a1789a75cb16c60a47/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:26 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c3c28666d804509c2b20602368c8cd77799587988a2d1a1789a75cb16c60a47/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:26 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c3c28666d804509c2b20602368c8cd77799587988a2d1a1789a75cb16c60a47/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:26 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c3c28666d804509c2b20602368c8cd77799587988a2d1a1789a75cb16c60a47/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:26 np0005548789.localdomain podman[61903]: 2025-12-06 08:25:26.386415661 +0000 UTC m=+0.161024254 container init abf33a7ce64d174f5aeca10ae9ef2b118248dbae4e0fd3e1c43527aa9d26cefa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtproxyd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Dec 06 08:25:26 np0005548789.localdomain podman[61903]: 2025-12-06 08:25:26.395794879 +0000 UTC m=+0.170403482 container start abf33a7ce64d174f5aeca10ae9ef2b118248dbae4e0fd3e1c43527aa9d26cefa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtproxyd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step3, build-date=2025-11-19T00:35:22Z, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, distribution-scope=public)
Dec 06 08:25:26 np0005548789.localdomain python3[60795]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=179caa3982511c1fd3314b961771f96c --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:25:26 np0005548789.localdomain systemd[1]: tmp-crun.tP470m.mount: Deactivated successfully.
Dec 06 08:25:26 np0005548789.localdomain sudo[61922]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:26 np0005548789.localdomain systemd-logind[766]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:25:26 np0005548789.localdomain systemd[1]: Started Session c8 of User root.
Dec 06 08:25:26 np0005548789.localdomain sudo[61922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:26 np0005548789.localdomain sudo[61922]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:26 np0005548789.localdomain systemd[1]: session-c8.scope: Deactivated successfully.
Dec 06 08:25:26 np0005548789.localdomain sudo[60793]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:26 np0005548789.localdomain sudo[61982]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odvvlrudriujfhbczbpfqhiqzqktencs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:26 np0005548789.localdomain sudo[61982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:27 np0005548789.localdomain python3[61984]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:27 np0005548789.localdomain sudo[61982]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:27 np0005548789.localdomain sudo[61998]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovvpvecrqremuidvuddrremkjseujbpo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:25:27 np0005548789.localdomain sudo[61998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:27 np0005548789.localdomain systemd[1]: tmp-crun.wVdgkN.mount: Deactivated successfully.
Dec 06 08:25:27 np0005548789.localdomain podman[62000]: 2025-12-06 08:25:27.309922828 +0000 UTC m=+0.103822182 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, vcs-type=git)
Dec 06 08:25:27 np0005548789.localdomain python3[62001]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:27 np0005548789.localdomain sudo[61998]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:27 np0005548789.localdomain sudo[62043]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqbxlsiksqnbkskjimfensayeakeiyoi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:27 np0005548789.localdomain sudo[62043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:27 np0005548789.localdomain podman[62000]: 2025-12-06 08:25:27.512978919 +0000 UTC m=+0.306878193 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, release=1761123044, io.openshift.expose-services=, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12)
Dec 06 08:25:27 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:25:27 np0005548789.localdomain python3[62045]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:27 np0005548789.localdomain sudo[62043]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:27 np0005548789.localdomain sudo[62059]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpifromnwpywmlhrjgljiximnsjyuipt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:27 np0005548789.localdomain sudo[62059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:27 np0005548789.localdomain python3[62061]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:27 np0005548789.localdomain sudo[62059]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:27 np0005548789.localdomain sudo[62075]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-beoikvsdioqvqphdbpujmymimfkqyzcm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:27 np0005548789.localdomain sudo[62075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:28 np0005548789.localdomain python3[62077]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:28 np0005548789.localdomain sudo[62075]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:28 np0005548789.localdomain sudo[62091]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mifpunbxesdgknkmtvjfeeydbjjumbkt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:28 np0005548789.localdomain sudo[62091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:28 np0005548789.localdomain python3[62093]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:28 np0005548789.localdomain sudo[62091]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:28 np0005548789.localdomain sudo[62107]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgjejfgqozpqayokvsouhpcdoofplfis ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:28 np0005548789.localdomain sudo[62107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:28 np0005548789.localdomain python3[62109]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:28 np0005548789.localdomain sudo[62107]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:28 np0005548789.localdomain sudo[62123]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddnqgglotfwxsjqbfuulolmujolizfsj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:28 np0005548789.localdomain sudo[62123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:28 np0005548789.localdomain python3[62125]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:28 np0005548789.localdomain sudo[62123]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:28 np0005548789.localdomain sudo[62139]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkckeztkmbfgmthohmppwwvrfsqpcqvs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:28 np0005548789.localdomain sudo[62139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:29 np0005548789.localdomain python3[62141]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:29 np0005548789.localdomain sudo[62139]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:29 np0005548789.localdomain sudo[62155]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbnncsohsubwzpltbfrbcbdoebursibt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:29 np0005548789.localdomain sudo[62155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:29 np0005548789.localdomain python3[62157]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:29 np0005548789.localdomain sudo[62155]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:29 np0005548789.localdomain sudo[62171]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdijoosqmyxfqywnnaxdqoaoetboviwq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:29 np0005548789.localdomain sudo[62171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:29 np0005548789.localdomain python3[62173]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:29 np0005548789.localdomain sudo[62171]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:29 np0005548789.localdomain sudo[62187]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jritwgalhveaxnvlbamjmodzkmzwgwcj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:29 np0005548789.localdomain sshd[62189]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:25:29 np0005548789.localdomain sudo[62187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:29 np0005548789.localdomain sshd[62189]: error: kex_exchange_identification: Connection closed by remote host
Dec 06 08:25:29 np0005548789.localdomain sshd[62189]: Connection closed by 92.118.39.95 port 42446
Dec 06 08:25:29 np0005548789.localdomain python3[62190]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:29 np0005548789.localdomain sudo[62187]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:29 np0005548789.localdomain sudo[62204]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlgoburgtyrmrnmtdejqgajmklhatmpc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:29 np0005548789.localdomain sudo[62204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:30 np0005548789.localdomain python3[62206]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:30 np0005548789.localdomain sudo[62204]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:30 np0005548789.localdomain sudo[62220]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzqapwtisvvjnktdznkpsfquavmjqxsr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:30 np0005548789.localdomain sudo[62220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:30 np0005548789.localdomain python3[62222]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:30 np0005548789.localdomain sudo[62220]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:30 np0005548789.localdomain sudo[62236]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbecxnkwnfltxpcycjwbvmzxnshcngdc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:30 np0005548789.localdomain sudo[62236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:30 np0005548789.localdomain python3[62238]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:30 np0005548789.localdomain sudo[62236]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:30 np0005548789.localdomain sudo[62252]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utpcdnlrhggywxgfwmtrriefukoilorx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:30 np0005548789.localdomain sudo[62252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:30 np0005548789.localdomain python3[62254]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:30 np0005548789.localdomain sudo[62252]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:30 np0005548789.localdomain sudo[62268]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqmoxphyuvyminguzzhtdjrgjkohlyhz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:30 np0005548789.localdomain sudo[62268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:31 np0005548789.localdomain python3[62270]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:31 np0005548789.localdomain sudo[62268]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:31 np0005548789.localdomain sudo[62284]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ughgpegngtrnsweotylpphqosgbyabzs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:31 np0005548789.localdomain sudo[62284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:31 np0005548789.localdomain python3[62286]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:25:31 np0005548789.localdomain sudo[62284]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:31 np0005548789.localdomain sudo[62345]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aptfympjapjvolwlpvwnjbnhrmubvytz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:31 np0005548789.localdomain sudo[62345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:31 np0005548789.localdomain python3[62347]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009531.4189758-100154-124419602684867/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:31 np0005548789.localdomain sudo[62345]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:32 np0005548789.localdomain sudo[62374]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxyejysqdrwldcgzmhrcjvazcttacvto ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:32 np0005548789.localdomain sudo[62374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:32 np0005548789.localdomain python3[62376]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009531.4189758-100154-124419602684867/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:32 np0005548789.localdomain sudo[62374]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:32 np0005548789.localdomain sudo[62403]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyshyxaljubfgzznkulgvjvumxopcdzz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:32 np0005548789.localdomain sudo[62403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:33 np0005548789.localdomain python3[62405]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009531.4189758-100154-124419602684867/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:33 np0005548789.localdomain sudo[62403]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:33 np0005548789.localdomain sudo[62432]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgkukgzcwgwqumwhdmgnptcjrhldlogb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:33 np0005548789.localdomain sudo[62432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:33 np0005548789.localdomain python3[62434]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009531.4189758-100154-124419602684867/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:33 np0005548789.localdomain sudo[62432]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:33 np0005548789.localdomain sudo[62461]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uovldmwdluhqxkhhozjixkqratlmkqlo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:33 np0005548789.localdomain sudo[62461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:34 np0005548789.localdomain python3[62463]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009531.4189758-100154-124419602684867/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:34 np0005548789.localdomain sudo[62461]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:34 np0005548789.localdomain sudo[62490]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzwbzcdprscaenzbkpdwbqgvzwkgfzxm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:34 np0005548789.localdomain sudo[62490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:34 np0005548789.localdomain python3[62492]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009531.4189758-100154-124419602684867/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:34 np0005548789.localdomain sudo[62490]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:34 np0005548789.localdomain sudo[62519]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nryqsxwxshhooqpkrsjkljwpyqshemwe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:34 np0005548789.localdomain sudo[62519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:35 np0005548789.localdomain python3[62521]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009531.4189758-100154-124419602684867/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:35 np0005548789.localdomain sudo[62519]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:35 np0005548789.localdomain sudo[62548]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tceuondxdifhybbzpesqfvgmngruuldk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:35 np0005548789.localdomain sudo[62548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:35 np0005548789.localdomain python3[62550]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009531.4189758-100154-124419602684867/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:35 np0005548789.localdomain sudo[62548]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:36 np0005548789.localdomain sudo[62577]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkajqbdfokdplqnfeekukenqrnxfdeaf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:36 np0005548789.localdomain sudo[62577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:36 np0005548789.localdomain python3[62579]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009531.4189758-100154-124419602684867/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:36 np0005548789.localdomain sudo[62577]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:36 np0005548789.localdomain sudo[62593]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrpmozujlkmqhvolvdglsieaoyzxllfp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:36 np0005548789.localdomain sudo[62593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:36 np0005548789.localdomain python3[62595]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 08:25:36 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:36 np0005548789.localdomain systemd-rc-local-generator[62621]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:36 np0005548789.localdomain systemd-sysv-generator[62624]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:36 np0005548789.localdomain systemd[1]: Stopping User Manager for UID 0...
Dec 06 08:25:36 np0005548789.localdomain systemd[61115]: Activating special unit Exit the Session...
Dec 06 08:25:36 np0005548789.localdomain systemd[61115]: Stopped target Main User Target.
Dec 06 08:25:36 np0005548789.localdomain systemd[61115]: Stopped target Basic System.
Dec 06 08:25:36 np0005548789.localdomain systemd[61115]: Stopped target Paths.
Dec 06 08:25:36 np0005548789.localdomain systemd[61115]: Stopped target Sockets.
Dec 06 08:25:36 np0005548789.localdomain systemd[61115]: Stopped target Timers.
Dec 06 08:25:36 np0005548789.localdomain systemd[61115]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 08:25:36 np0005548789.localdomain systemd[61115]: Closed D-Bus User Message Bus Socket.
Dec 06 08:25:36 np0005548789.localdomain systemd[61115]: Stopped Create User's Volatile Files and Directories.
Dec 06 08:25:36 np0005548789.localdomain systemd[61115]: Removed slice User Application Slice.
Dec 06 08:25:36 np0005548789.localdomain systemd[61115]: Reached target Shutdown.
Dec 06 08:25:36 np0005548789.localdomain systemd[61115]: Finished Exit the Session.
Dec 06 08:25:36 np0005548789.localdomain systemd[61115]: Reached target Exit the Session.
Dec 06 08:25:36 np0005548789.localdomain sudo[62593]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:36 np0005548789.localdomain systemd[1]: user@0.service: Deactivated successfully.
Dec 06 08:25:36 np0005548789.localdomain systemd[1]: Stopped User Manager for UID 0.
Dec 06 08:25:36 np0005548789.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 06 08:25:36 np0005548789.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 06 08:25:36 np0005548789.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 06 08:25:36 np0005548789.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 06 08:25:36 np0005548789.localdomain systemd[1]: Removed slice User Slice of UID 0.
Dec 06 08:25:37 np0005548789.localdomain sudo[62646]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vztcqeclkvxdhdzmqytqiudyshdhzeni ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:37 np0005548789.localdomain sudo[62646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:37 np0005548789.localdomain python3[62648]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:37 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:37 np0005548789.localdomain systemd-sysv-generator[62678]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:37 np0005548789.localdomain systemd-rc-local-generator[62675]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:37 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:37 np0005548789.localdomain systemd[1]: Starting collectd container...
Dec 06 08:25:37 np0005548789.localdomain systemd[1]: Started collectd container.
Dec 06 08:25:37 np0005548789.localdomain sudo[62646]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:38 np0005548789.localdomain sudo[62712]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wshpsajpjmnhiorsjzmdxcwnlyplgamw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:38 np0005548789.localdomain sudo[62712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:38 np0005548789.localdomain python3[62714]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:38 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:38 np0005548789.localdomain systemd-sysv-generator[62743]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:38 np0005548789.localdomain systemd-rc-local-generator[62739]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:38 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:38 np0005548789.localdomain systemd[1]: Starting iscsid container...
Dec 06 08:25:38 np0005548789.localdomain systemd[1]: Started iscsid container.
Dec 06 08:25:38 np0005548789.localdomain sudo[62712]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:39 np0005548789.localdomain sudo[62778]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arjruocbgigytvywgusqotvfjesvsspz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:39 np0005548789.localdomain sudo[62778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:39 np0005548789.localdomain python3[62780]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:39 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:39 np0005548789.localdomain systemd-sysv-generator[62810]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:39 np0005548789.localdomain systemd-rc-local-generator[62806]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:39 np0005548789.localdomain systemd[1]: Starting nova_virtlogd_wrapper container...
Dec 06 08:25:39 np0005548789.localdomain systemd[1]: Started nova_virtlogd_wrapper container.
Dec 06 08:25:39 np0005548789.localdomain sudo[62778]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:40 np0005548789.localdomain sudo[62845]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzsfcxaxpmurqbsgqcvlrongaualdibr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:40 np0005548789.localdomain sudo[62845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:40 np0005548789.localdomain python3[62847]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:40 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:40 np0005548789.localdomain systemd-rc-local-generator[62873]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:40 np0005548789.localdomain systemd-sysv-generator[62879]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:40 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:40 np0005548789.localdomain systemd[1]: Starting nova_virtnodedevd container...
Dec 06 08:25:41 np0005548789.localdomain tripleo-start-podman-container[62887]: Creating additional drop-in dependency for "nova_virtnodedevd" (77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5)
Dec 06 08:25:41 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:41 np0005548789.localdomain systemd-rc-local-generator[62943]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:41 np0005548789.localdomain systemd-sysv-generator[62946]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:41 np0005548789.localdomain systemd[1]: Started nova_virtnodedevd container.
Dec 06 08:25:41 np0005548789.localdomain sudo[62845]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:41 np0005548789.localdomain sudo[62968]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfrqtaowxhqrzdsmnyosejxqvwgmzexm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:41 np0005548789.localdomain sudo[62968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:42 np0005548789.localdomain python3[62970]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:42 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:42 np0005548789.localdomain systemd-rc-local-generator[63001]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:42 np0005548789.localdomain systemd-sysv-generator[63005]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:42 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:42 np0005548789.localdomain systemd[1]: Starting nova_virtproxyd container...
Dec 06 08:25:42 np0005548789.localdomain tripleo-start-podman-container[63010]: Creating additional drop-in dependency for "nova_virtproxyd" (abf33a7ce64d174f5aeca10ae9ef2b118248dbae4e0fd3e1c43527aa9d26cefa)
Dec 06 08:25:42 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:42 np0005548789.localdomain systemd-rc-local-generator[63072]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:42 np0005548789.localdomain systemd-sysv-generator[63075]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:42 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:42 np0005548789.localdomain systemd[1]: Started nova_virtproxyd container.
Dec 06 08:25:43 np0005548789.localdomain sudo[62968]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:43 np0005548789.localdomain sudo[63094]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bysawzbvrxrkxkttnzvfrwoalbtnebkz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:43 np0005548789.localdomain sudo[63094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:43 np0005548789.localdomain python3[63096]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:44 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:44 np0005548789.localdomain systemd-sysv-generator[63128]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:44 np0005548789.localdomain systemd-rc-local-generator[63123]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:44 np0005548789.localdomain systemd[1]: Starting nova_virtqemud container...
Dec 06 08:25:45 np0005548789.localdomain tripleo-start-podman-container[63136]: Creating additional drop-in dependency for "nova_virtqemud" (e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a)
Dec 06 08:25:45 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:45 np0005548789.localdomain systemd-rc-local-generator[63196]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:45 np0005548789.localdomain systemd-sysv-generator[63200]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:45 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:45 np0005548789.localdomain systemd[1]: Started nova_virtqemud container.
Dec 06 08:25:45 np0005548789.localdomain sudo[63094]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:45 np0005548789.localdomain sudo[63219]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypziqqklfvwzfoqiqplmqhpylposhrcm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:45 np0005548789.localdomain sudo[63219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:45 np0005548789.localdomain python3[63221]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:46 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:46 np0005548789.localdomain systemd-rc-local-generator[63246]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:46 np0005548789.localdomain systemd-sysv-generator[63252]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:46 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:46 np0005548789.localdomain systemd[1]: Starting nova_virtsecretd container...
Dec 06 08:25:46 np0005548789.localdomain tripleo-start-podman-container[63261]: Creating additional drop-in dependency for "nova_virtsecretd" (2914dfad5be61e80048556735bb44e4f1907a2e2df52ff8faede941ddfde7367)
Dec 06 08:25:46 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:46 np0005548789.localdomain systemd-rc-local-generator[63317]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:46 np0005548789.localdomain systemd-sysv-generator[63323]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:46 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:46 np0005548789.localdomain systemd[1]: Started nova_virtsecretd container.
Dec 06 08:25:46 np0005548789.localdomain sudo[63219]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:47 np0005548789.localdomain sudo[63342]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgvhjzawqhsecumzuiqwdxwsezzkmyfy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:47 np0005548789.localdomain sudo[63342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:47 np0005548789.localdomain python3[63344]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:47 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:47 np0005548789.localdomain systemd-sysv-generator[63370]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:47 np0005548789.localdomain systemd-rc-local-generator[63367]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:47 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:47 np0005548789.localdomain systemd[1]: Starting nova_virtstoraged container...
Dec 06 08:25:48 np0005548789.localdomain tripleo-start-podman-container[63384]: Creating additional drop-in dependency for "nova_virtstoraged" (92a0134fb6ae7fa0506c791c4569a09e7c0cdb7fcb636d7ea6233a4978e1275d)
Dec 06 08:25:48 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:48 np0005548789.localdomain systemd-rc-local-generator[63440]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:48 np0005548789.localdomain systemd-sysv-generator[63443]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:48 np0005548789.localdomain systemd[1]: Started nova_virtstoraged container.
Dec 06 08:25:48 np0005548789.localdomain sudo[63342]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:48 np0005548789.localdomain sudo[63465]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xizgfkthmhpygidrvvbmyyhkpfbycfur ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:25:48 np0005548789.localdomain sudo[63465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:49 np0005548789.localdomain python3[63467]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:25:49 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:25:49 np0005548789.localdomain systemd-rc-local-generator[63496]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:25:49 np0005548789.localdomain systemd-sysv-generator[63501]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:25:49 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:25:49 np0005548789.localdomain systemd[1]: Starting rsyslog container...
Dec 06 08:25:49 np0005548789.localdomain systemd[1]: tmp-crun.XJui79.mount: Deactivated successfully.
Dec 06 08:25:49 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:49 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:49 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:49 np0005548789.localdomain podman[63507]: 2025-12-06 08:25:49.685818382 +0000 UTC m=+0.136390400 container init afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=rsyslog, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., release=1761123044)
Dec 06 08:25:49 np0005548789.localdomain podman[63507]: 2025-12-06 08:25:49.694332993 +0000 UTC m=+0.144905011 container start afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, container_name=rsyslog, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-rsyslog, version=17.1.12, tcib_managed=true)
Dec 06 08:25:49 np0005548789.localdomain podman[63507]: rsyslog
Dec 06 08:25:49 np0005548789.localdomain systemd[1]: Started rsyslog container.
Dec 06 08:25:49 np0005548789.localdomain sudo[63525]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:49 np0005548789.localdomain sudo[63525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:49 np0005548789.localdomain sudo[63465]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:49 np0005548789.localdomain sudo[63525]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:49 np0005548789.localdomain systemd[1]: libpod-afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f.scope: Deactivated successfully.
Dec 06 08:25:49 np0005548789.localdomain podman[63529]: 2025-12-06 08:25:49.842063249 +0000 UTC m=+0.050108636 container died afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, container_name=rsyslog, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-rsyslog, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git)
Dec 06 08:25:49 np0005548789.localdomain podman[63529]: 2025-12-06 08:25:49.91485839 +0000 UTC m=+0.122903737 container cleanup afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, container_name=rsyslog, name=rhosp17/openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step3, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Dec 06 08:25:49 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:25:50 np0005548789.localdomain podman[63557]: 2025-12-06 08:25:50.00654819 +0000 UTC m=+0.058460163 container cleanup afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, 
batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step3, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044)
Dec 06 08:25:50 np0005548789.localdomain podman[63557]: rsyslog
Dec 06 08:25:50 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 06 08:25:50 np0005548789.localdomain sudo[63581]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxknpzwwnbrihrfqzukywspmgrpdibhw ; /usr/bin/python3
Dec 06 08:25:50 np0005548789.localdomain sudo[63581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:50 np0005548789.localdomain python3[63583]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:50 np0005548789.localdomain sudo[63581]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:50 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1.
Dec 06 08:25:50 np0005548789.localdomain systemd[1]: Stopped rsyslog container.
Dec 06 08:25:50 np0005548789.localdomain systemd[1]: Starting rsyslog container...
Dec 06 08:25:50 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:50 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:50 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:50 np0005548789.localdomain podman[63584]: 2025-12-06 08:25:50.451945487 +0000 UTC m=+0.108319100 container init afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, distribution-scope=public, config_id=tripleo_step3)
Dec 06 08:25:50 np0005548789.localdomain podman[63584]: 2025-12-06 08:25:50.461229341 +0000 UTC m=+0.117602904 container start afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container)
Dec 06 08:25:50 np0005548789.localdomain podman[63584]: rsyslog
Dec 06 08:25:50 np0005548789.localdomain systemd[1]: Started rsyslog container.
Dec 06 08:25:50 np0005548789.localdomain sudo[63605]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:50 np0005548789.localdomain sudo[63605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:50 np0005548789.localdomain sudo[63605]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:50 np0005548789.localdomain systemd[1]: libpod-afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f.scope: Deactivated successfully.
Dec 06 08:25:50 np0005548789.localdomain podman[63613]: 2025-12-06 08:25:50.615008583 +0000 UTC m=+0.054754918 container died afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, build-date=2025-11-18T22:49:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc.)
Dec 06 08:25:50 np0005548789.localdomain systemd[1]: tmp-crun.g0APtl.mount: Deactivated successfully.
Dec 06 08:25:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f-userdata-shm.mount: Deactivated successfully.
Dec 06 08:25:50 np0005548789.localdomain podman[63613]: 2025-12-06 08:25:50.650988836 +0000 UTC m=+0.090735151 container cleanup afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, name=rhosp17/openstack-rsyslog, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-rsyslog, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=rsyslog, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 06 08:25:50 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:25:50 np0005548789.localdomain podman[63652]: 2025-12-06 08:25:50.72978835 +0000 UTC m=+0.047604049 container cleanup afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, container_name=rsyslog, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:49Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public)
Dec 06 08:25:50 np0005548789.localdomain podman[63652]: rsyslog
Dec 06 08:25:50 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 06 08:25:50 np0005548789.localdomain sudo[63678]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxyorsjjlehfhokszmbokhvbkhwviavf ; /usr/bin/python3
Dec 06 08:25:50 np0005548789.localdomain sudo[63678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:50 np0005548789.localdomain sudo[63678]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2.
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: Stopped rsyslog container.
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: Starting rsyslog container...
Dec 06 08:25:51 np0005548789.localdomain sudo[63728]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smfzbptpauoalyqruhkzgqheroxjaoqs ; /usr/bin/python3
Dec 06 08:25:51 np0005548789.localdomain sudo[63728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:51 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:51 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:51 np0005548789.localdomain podman[63709]: 2025-12-06 08:25:51.200096341 +0000 UTC m=+0.112749735 container init afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:25:51 np0005548789.localdomain podman[63709]: 2025-12-06 08:25:51.208013594 +0000 UTC m=+0.120666998 container start afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=rsyslog, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-rsyslog, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, url=https://www.redhat.com)
Dec 06 08:25:51 np0005548789.localdomain podman[63709]: rsyslog
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: Started rsyslog container.
Dec 06 08:25:51 np0005548789.localdomain sudo[63744]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:51 np0005548789.localdomain sudo[63744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:51 np0005548789.localdomain sudo[63744]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: libpod-afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f.scope: Deactivated successfully.
Dec 06 08:25:51 np0005548789.localdomain podman[63747]: 2025-12-06 08:25:51.347076004 +0000 UTC m=+0.036057075 container died afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:25:51 np0005548789.localdomain sudo[63728]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:51 np0005548789.localdomain podman[63747]: 2025-12-06 08:25:51.370519353 +0000 UTC m=+0.059500404 container cleanup afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, container_name=rsyslog, name=rhosp17/openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, build-date=2025-11-18T22:49:49Z, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12)
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:25:51 np0005548789.localdomain podman[63761]: 2025-12-06 08:25:51.447840592 +0000 UTC m=+0.051260652 container cleanup afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, tcib_managed=true, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, release=1761123044, container_name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Dec 06 08:25:51 np0005548789.localdomain podman[63761]: rsyslog
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3.
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: Stopped rsyslog container.
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: Starting rsyslog container...
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8-merged.mount: Deactivated successfully.
Dec 06 08:25:51 np0005548789.localdomain sudo[63807]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amizhvhgragcplelwnfktjuqirjxnklv ; /usr/bin/python3
Dec 06 08:25:51 np0005548789.localdomain sudo[63807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:51 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:51 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:51 np0005548789.localdomain podman[63796]: 2025-12-06 08:25:51.716308729 +0000 UTC m=+0.125522427 container init afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=rsyslog, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog)
Dec 06 08:25:51 np0005548789.localdomain podman[63796]: 2025-12-06 08:25:51.72515842 +0000 UTC m=+0.134372118 container start afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, managed_by=tripleo_ansible, container_name=rsyslog, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-rsyslog, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:25:51 np0005548789.localdomain podman[63796]: rsyslog
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: Started rsyslog container.
Dec 06 08:25:51 np0005548789.localdomain sudo[63821]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:51 np0005548789.localdomain sudo[63821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:51 np0005548789.localdomain python3[63813]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005548789 step=3 update_config_hash_only=False
Dec 06 08:25:51 np0005548789.localdomain sudo[63821]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: libpod-afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f.scope: Deactivated successfully.
Dec 06 08:25:51 np0005548789.localdomain sudo[63807]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:51 np0005548789.localdomain podman[63824]: 2025-12-06 08:25:51.865273262 +0000 UTC m=+0.038223861 container died afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-rsyslog, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, version=17.1.12, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:25:51 np0005548789.localdomain podman[63824]: 2025-12-06 08:25:51.889884806 +0000 UTC m=+0.062835355 container cleanup afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:25:51 np0005548789.localdomain podman[63837]: 2025-12-06 08:25:51.978801471 +0000 UTC m=+0.057791492 container cleanup afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=rsyslog, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1)
Dec 06 08:25:51 np0005548789.localdomain podman[63837]: rsyslog
Dec 06 08:25:51 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:25:52 np0005548789.localdomain podman[63850]: 2025-12-06 08:25:52.102573503 +0000 UTC m=+0.086088279 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com)
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4.
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: Stopped rsyslog container.
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: Starting rsyslog container...
Dec 06 08:25:52 np0005548789.localdomain podman[63850]: 2025-12-06 08:25:52.143206689 +0000 UTC m=+0.126721435 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:25:52 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:52 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Dec 06 08:25:52 np0005548789.localdomain podman[63870]: 2025-12-06 08:25:52.250814196 +0000 UTC m=+0.111834808 container init afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, build-date=2025-11-18T22:49:49Z, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, release=1761123044)
Dec 06 08:25:52 np0005548789.localdomain podman[63870]: 2025-12-06 08:25:52.2594366 +0000 UTC m=+0.120457222 container start afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, build-date=2025-11-18T22:49:49Z, version=17.1.12, name=rhosp17/openstack-rsyslog, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc.)
Dec 06 08:25:52 np0005548789.localdomain podman[63870]: rsyslog
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: Started rsyslog container.
Dec 06 08:25:52 np0005548789.localdomain sudo[63901]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbscxfggmdfpmzbmpmvcwrqkdxrsxdqk ; /usr/bin/python3
Dec 06 08:25:52 np0005548789.localdomain sudo[63901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:52 np0005548789.localdomain sudo[63905]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:25:52 np0005548789.localdomain sudo[63905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:25:52 np0005548789.localdomain sudo[63905]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: libpod-afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f.scope: Deactivated successfully.
Dec 06 08:25:52 np0005548789.localdomain python3[63906]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:25:52 np0005548789.localdomain sudo[63901]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:52 np0005548789.localdomain podman[63909]: 2025-12-06 08:25:52.461681216 +0000 UTC m=+0.083588852 container died afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=rsyslog, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step3, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:25:52 np0005548789.localdomain podman[63909]: 2025-12-06 08:25:52.486018833 +0000 UTC m=+0.107926419 container cleanup afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z)
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:25:52 np0005548789.localdomain podman[63922]: 2025-12-06 08:25:52.57861785 +0000 UTC m=+0.056601085 container cleanup afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, distribution-scope=public, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '7a657a42c3cbd75086c59cf211d6fafe'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2025-11-18T22:49:49Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3)
Dec 06 08:25:52 np0005548789.localdomain podman[63922]: rsyslog
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8-merged.mount: Deactivated successfully.
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-afa881aa4ca20970e313be385764e9619d76190f61a0727580f0c398406f954f-userdata-shm.mount: Deactivated successfully.
Dec 06 08:25:52 np0005548789.localdomain sudo[63948]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csqccjomyrisnrhonfykgpxeuwmyfytg ; /usr/bin/python3
Dec 06 08:25:52 np0005548789.localdomain sudo[63948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:25:52 np0005548789.localdomain python3[63950]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5.
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: Stopped rsyslog container.
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly.
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Dec 06 08:25:52 np0005548789.localdomain systemd[1]: Failed to start rsyslog container.
Dec 06 08:25:52 np0005548789.localdomain sudo[63948]: pam_unix(sudo:session): session closed for user root
Dec 06 08:25:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:25:53 np0005548789.localdomain podman[63951]: 2025-12-06 08:25:53.909454778 +0000 UTC m=+0.073898395 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-iscsid, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=)
Dec 06 08:25:53 np0005548789.localdomain podman[63951]: 2025-12-06 08:25:53.924074325 +0000 UTC m=+0.088517982 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:25:53 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:25:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:25:57 np0005548789.localdomain podman[63970]: 2025-12-06 08:25:57.915073493 +0000 UTC m=+0.078387183 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.12, container_name=metrics_qdr)
Dec 06 08:25:58 np0005548789.localdomain podman[63970]: 2025-12-06 08:25:58.11110138 +0000 UTC m=+0.274415080 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
config_id=tripleo_step1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:25:58 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:26:05 np0005548789.localdomain sudo[63999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:26:05 np0005548789.localdomain sudo[63999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:26:05 np0005548789.localdomain sudo[63999]: pam_unix(sudo:session): session closed for user root
Dec 06 08:26:05 np0005548789.localdomain sudo[64014]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:26:05 np0005548789.localdomain sudo[64014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:26:06 np0005548789.localdomain sudo[64014]: pam_unix(sudo:session): session closed for user root
Dec 06 08:26:07 np0005548789.localdomain sudo[64062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:26:07 np0005548789.localdomain sudo[64062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:26:07 np0005548789.localdomain sudo[64062]: pam_unix(sudo:session): session closed for user root
Dec 06 08:26:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:26:22 np0005548789.localdomain podman[64077]: 2025-12-06 08:26:22.917564408 +0000 UTC m=+0.079208378 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Dec 06 08:26:22 np0005548789.localdomain podman[64077]: 2025-12-06 08:26:22.947597908 +0000 UTC m=+0.109241888 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Dec 06 08:26:22 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:26:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:26:24 np0005548789.localdomain podman[64098]: 2025-12-06 08:26:24.914601658 +0000 UTC m=+0.077982170 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, tcib_managed=true, vcs-type=git, build-date=2025-11-18T23:44:13Z, container_name=iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Dec 06 08:26:24 np0005548789.localdomain podman[64098]: 2025-12-06 08:26:24.949627591 +0000 UTC m=+0.113008103 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, container_name=iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Dec 06 08:26:24 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:26:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:26:28 np0005548789.localdomain podman[64118]: 2025-12-06 08:26:28.904023727 +0000 UTC m=+0.066941582 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, release=1761123044, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:26:29 np0005548789.localdomain podman[64118]: 2025-12-06 08:26:29.12017932 +0000 UTC m=+0.283097175 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:26:29 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:26:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:26:53 np0005548789.localdomain podman[64146]: 2025-12-06 08:26:53.918719794 +0000 UTC m=+0.081659257 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, 
container_name=collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:26:53 np0005548789.localdomain podman[64146]: 2025-12-06 08:26:53.932396402 +0000 UTC m=+0.095335865 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:26:53 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:26:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:26:55 np0005548789.localdomain systemd[1]: tmp-crun.dYXxMM.mount: Deactivated successfully.
Dec 06 08:26:55 np0005548789.localdomain podman[64166]: 2025-12-06 08:26:55.913075302 +0000 UTC m=+0.067194415 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, container_name=iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:26:55 np0005548789.localdomain podman[64166]: 2025-12-06 08:26:55.950127437 +0000 UTC m=+0.104246530 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, 
architecture=x86_64, container_name=iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public)
Dec 06 08:26:55 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:26:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:26:59 np0005548789.localdomain systemd[1]: tmp-crun.PZYkzZ.mount: Deactivated successfully.
Dec 06 08:26:59 np0005548789.localdomain podman[64185]: 2025-12-06 08:26:59.919980873 +0000 UTC m=+0.086510052 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Dec 06 08:27:00 np0005548789.localdomain podman[64185]: 2025-12-06 08:27:00.107515216 +0000 UTC m=+0.274044395 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, 
config_id=tripleo_step1, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:27:00 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:27:07 np0005548789.localdomain sudo[64214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:27:07 np0005548789.localdomain sudo[64214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:27:07 np0005548789.localdomain sudo[64214]: pam_unix(sudo:session): session closed for user root
Dec 06 08:27:07 np0005548789.localdomain sudo[64229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:27:07 np0005548789.localdomain sudo[64229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:27:07 np0005548789.localdomain sudo[64229]: pam_unix(sudo:session): session closed for user root
Dec 06 08:27:08 np0005548789.localdomain sudo[64275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:27:08 np0005548789.localdomain sudo[64275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:27:08 np0005548789.localdomain sudo[64275]: pam_unix(sudo:session): session closed for user root
Dec 06 08:27:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:27:24 np0005548789.localdomain podman[64290]: 2025-12-06 08:27:24.907876711 +0000 UTC m=+0.072253585 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, container_name=collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:27:24 np0005548789.localdomain podman[64290]: 2025-12-06 08:27:24.940860145 +0000 UTC m=+0.105237039 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step3, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:27:24 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:27:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:27:26 np0005548789.localdomain podman[64310]: 2025-12-06 08:27:26.91968838 +0000 UTC m=+0.085589033 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:27:26 np0005548789.localdomain podman[64310]: 2025-12-06 08:27:26.928901096 +0000 UTC m=+0.094801759 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, release=1761123044, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid)
Dec 06 08:27:26 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:27:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:27:30 np0005548789.localdomain systemd[1]: tmp-crun.1vZO6Y.mount: Deactivated successfully.
Dec 06 08:27:30 np0005548789.localdomain podman[64329]: 2025-12-06 08:27:30.940880816 +0000 UTC m=+0.106095025 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, vcs-type=git, tcib_managed=true, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:27:31 np0005548789.localdomain podman[64329]: 2025-12-06 08:27:31.137153051 +0000 UTC m=+0.302367270 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, release=1761123044, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:27:31 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:27:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:27:55 np0005548789.localdomain podman[64358]: 2025-12-06 08:27:55.919003425 +0000 UTC m=+0.080803621 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z)
Dec 06 08:27:55 np0005548789.localdomain podman[64358]: 2025-12-06 08:27:55.956290168 +0000 UTC m=+0.118090344 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com)
Dec 06 08:27:55 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:27:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:27:57 np0005548789.localdomain podman[64378]: 2025-12-06 08:27:57.914418556 +0000 UTC m=+0.077252246 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1)
Dec 06 08:27:57 np0005548789.localdomain podman[64378]: 2025-12-06 08:27:57.953111269 +0000 UTC m=+0.115944909 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-iscsid-container, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:27:57 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:28:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:28:01 np0005548789.localdomain podman[64396]: 2025-12-06 08:28:01.914399698 +0000 UTC m=+0.077046169 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=metrics_qdr)
Dec 06 08:28:02 np0005548789.localdomain podman[64396]: 2025-12-06 08:28:02.132173534 +0000 UTC m=+0.294819965 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, version=17.1.12, release=1761123044, vcs-type=git, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack 
Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc.)
Dec 06 08:28:02 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:28:08 np0005548789.localdomain sudo[64425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:28:08 np0005548789.localdomain sudo[64425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:28:08 np0005548789.localdomain sudo[64425]: pam_unix(sudo:session): session closed for user root
Dec 06 08:28:08 np0005548789.localdomain sudo[64440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:28:08 np0005548789.localdomain sudo[64440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:28:09 np0005548789.localdomain sudo[64440]: pam_unix(sudo:session): session closed for user root
Dec 06 08:28:10 np0005548789.localdomain sudo[64485]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:28:10 np0005548789.localdomain sudo[64485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:28:10 np0005548789.localdomain sudo[64485]: pam_unix(sudo:session): session closed for user root
Dec 06 08:28:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:28:26 np0005548789.localdomain podman[64500]: 2025-12-06 08:28:26.915889186 +0000 UTC m=+0.078468482 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-type=git)
Dec 06 08:28:26 np0005548789.localdomain podman[64500]: 2025-12-06 08:28:26.929079869 +0000 UTC m=+0.091659155 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, 
name=rhosp17/openstack-collectd, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z)
Dec 06 08:28:26 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:28:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:28:28 np0005548789.localdomain podman[64521]: 2025-12-06 08:28:28.922279132 +0000 UTC m=+0.083849432 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:28:28 np0005548789.localdomain podman[64521]: 2025-12-06 08:28:28.955789862 +0000 UTC m=+0.117360112 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git)
Dec 06 08:28:28 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:28:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:28:32 np0005548789.localdomain podman[64539]: 2025-12-06 08:28:32.918079411 +0000 UTC m=+0.080078680 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.12, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Dec 06 08:28:33 np0005548789.localdomain podman[64539]: 2025-12-06 08:28:33.113238152 +0000 UTC m=+0.275237471 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:28:33 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:28:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:28:57 np0005548789.localdomain podman[64568]: 2025-12-06 08:28:57.915904467 +0000 UTC m=+0.078596526 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:28:57 np0005548789.localdomain podman[64568]: 2025-12-06 08:28:57.923136033 +0000 UTC m=+0.085828102 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, vendor=Red Hat, Inc., container_name=collectd, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_id=tripleo_step3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12)
Dec 06 08:28:57 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:28:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:28:59 np0005548789.localdomain podman[64588]: 2025-12-06 08:28:59.919331605 +0000 UTC m=+0.077200533 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, version=17.1.12, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:28:59 np0005548789.localdomain podman[64588]: 2025-12-06 08:28:59.956249026 +0000 UTC m=+0.114117954 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git)
Dec 06 08:28:59 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:29:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:29:03 np0005548789.localdomain podman[64606]: 2025-12-06 08:29:03.893103197 +0000 UTC m=+0.059613970 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, container_name=metrics_qdr, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, version=17.1.12)
Dec 06 08:29:04 np0005548789.localdomain podman[64606]: 2025-12-06 08:29:04.101265756 +0000 UTC m=+0.267776519 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., container_name=metrics_qdr, version=17.1.12)
Dec 06 08:29:04 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:29:06 np0005548789.localdomain sshd[64635]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:29:08 np0005548789.localdomain sshd[64635]: Invalid user admin from 91.202.233.33 port 65518
Dec 06 08:29:09 np0005548789.localdomain sshd[64635]: Connection reset by invalid user admin 91.202.233.33 port 65518 [preauth]
Dec 06 08:29:09 np0005548789.localdomain sshd[64637]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:29:10 np0005548789.localdomain sudo[64639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:29:10 np0005548789.localdomain sudo[64639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:29:10 np0005548789.localdomain sudo[64639]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:10 np0005548789.localdomain sudo[64654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:29:10 np0005548789.localdomain sudo[64654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:29:11 np0005548789.localdomain sudo[64654]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:11 np0005548789.localdomain sudo[64702]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:29:11 np0005548789.localdomain sudo[64702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:29:11 np0005548789.localdomain sudo[64702]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:11 np0005548789.localdomain sudo[64717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 08:29:11 np0005548789.localdomain sudo[64717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:29:12 np0005548789.localdomain sudo[64717]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:12 np0005548789.localdomain sshd[64637]: Connection reset by authenticating user root 91.202.233.33 port 65528 [preauth]
Dec 06 08:29:12 np0005548789.localdomain sshd[64750]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:29:14 np0005548789.localdomain sshd[64750]: Connection reset by authenticating user root 91.202.233.33 port 23422 [preauth]
Dec 06 08:29:14 np0005548789.localdomain sshd[64752]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:29:17 np0005548789.localdomain sudo[64754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:29:17 np0005548789.localdomain sudo[64754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:29:17 np0005548789.localdomain sudo[64754]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:17 np0005548789.localdomain sshd[64752]: Connection reset by authenticating user root 91.202.233.33 port 23430 [preauth]
Dec 06 08:29:17 np0005548789.localdomain sshd[64769]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:29:20 np0005548789.localdomain sshd[64769]: Connection reset by authenticating user root 91.202.233.33 port 23438 [preauth]
Dec 06 08:29:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:29:28 np0005548789.localdomain podman[64772]: 2025-12-06 08:29:28.923346628 +0000 UTC m=+0.087531313 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3)
Dec 06 08:29:28 np0005548789.localdomain podman[64772]: 2025-12-06 08:29:28.933390757 +0000 UTC m=+0.097575472 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, container_name=collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:29:28 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:29:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:29:30 np0005548789.localdomain podman[64793]: 2025-12-06 08:29:30.912461569 +0000 UTC m=+0.076820033 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:29:30 np0005548789.localdomain podman[64793]: 2025-12-06 08:29:30.92156282 +0000 UTC m=+0.085921234 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Dec 06 08:29:30 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:29:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:29:34 np0005548789.localdomain podman[64813]: 2025-12-06 08:29:34.928010827 +0000 UTC m=+0.088639515 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=metrics_qdr, config_id=tripleo_step1, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Dec 06 08:29:35 np0005548789.localdomain podman[64813]: 2025-12-06 08:29:35.135191887 +0000 UTC m=+0.295820515 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, container_name=metrics_qdr, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-type=git)
Dec 06 08:29:35 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:29:44 np0005548789.localdomain sudo[64888]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmvpwsetrrupimyzycuujjbsjwxkuivz ; /usr/bin/python3
Dec 06 08:29:44 np0005548789.localdomain sudo[64888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:44 np0005548789.localdomain python3[64890]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:29:44 np0005548789.localdomain sudo[64888]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:45 np0005548789.localdomain sudo[64933]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvsjwbkcwsnifsmvtkhpbyhxufiblehj ; /usr/bin/python3
Dec 06 08:29:45 np0005548789.localdomain sudo[64933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:45 np0005548789.localdomain python3[64935]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009784.6719065-107269-30986519723805/source _original_basename=tmp1uftwivi follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:29:45 np0005548789.localdomain sudo[64933]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:46 np0005548789.localdomain sudo[64995]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etlieylfklznrvommnudxrdgymaecoti ; /usr/bin/python3
Dec 06 08:29:46 np0005548789.localdomain sudo[64995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:46 np0005548789.localdomain python3[64997]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:29:46 np0005548789.localdomain sudo[64995]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:46 np0005548789.localdomain sudo[65038]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlkstvkpjcaszjthrxzksbortjlphdnv ; /usr/bin/python3
Dec 06 08:29:46 np0005548789.localdomain sudo[65038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:47 np0005548789.localdomain python3[65040]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009786.3400743-107379-225241340822480/source _original_basename=tmphmniqcct follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:29:47 np0005548789.localdomain sudo[65038]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:47 np0005548789.localdomain sudo[65100]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vossgvemzblakgybhucxdmtodwarrnus ; /usr/bin/python3
Dec 06 08:29:47 np0005548789.localdomain sudo[65100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:47 np0005548789.localdomain python3[65102]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:29:47 np0005548789.localdomain sudo[65100]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:47 np0005548789.localdomain sudo[65143]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oixzuiahxadtplzvncgsgmeqqlorxnbs ; /usr/bin/python3
Dec 06 08:29:47 np0005548789.localdomain sudo[65143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:47 np0005548789.localdomain python3[65145]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009787.2976177-107430-92736625283838/source _original_basename=tmpujw47kzv follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:29:47 np0005548789.localdomain sudo[65143]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:48 np0005548789.localdomain sudo[65205]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhcvcuohfvyjnugobjycuufecyvycjnm ; /usr/bin/python3
Dec 06 08:29:48 np0005548789.localdomain sudo[65205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:48 np0005548789.localdomain python3[65207]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:29:48 np0005548789.localdomain sudo[65205]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:48 np0005548789.localdomain sudo[65248]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-diuqxviirldntuuyzkxdmdlnfgqxfikm ; /usr/bin/python3
Dec 06 08:29:48 np0005548789.localdomain sudo[65248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:48 np0005548789.localdomain python3[65250]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009788.2169046-107487-40054258302436/source _original_basename=tmpilrv0q1p follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:29:48 np0005548789.localdomain sudo[65248]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:49 np0005548789.localdomain sudo[65278]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asmbxwkglbamgkuomnqpjkqftqvpxuaw ; /usr/bin/python3
Dec 06 08:29:49 np0005548789.localdomain sudo[65278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:49 np0005548789.localdomain python3[65280]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 08:29:49 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:29:49 np0005548789.localdomain systemd-sysv-generator[65310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:29:49 np0005548789.localdomain systemd-rc-local-generator[65305]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:29:49 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:29:49 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:29:49 np0005548789.localdomain systemd-sysv-generator[65343]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:29:49 np0005548789.localdomain systemd-rc-local-generator[65339]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:29:49 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:29:50 np0005548789.localdomain sudo[65278]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:50 np0005548789.localdomain sudo[65367]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjdweavnpyotwzhkjkeawnguhkmicxhd ; /usr/bin/python3
Dec 06 08:29:50 np0005548789.localdomain sudo[65367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:50 np0005548789.localdomain python3[65369]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:29:50 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:29:50 np0005548789.localdomain systemd-sysv-generator[65400]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:29:50 np0005548789.localdomain systemd-rc-local-generator[65395]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:29:50 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:29:51 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:29:51 np0005548789.localdomain systemd-sysv-generator[65439]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:29:51 np0005548789.localdomain systemd-rc-local-generator[65436]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:29:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:29:51 np0005548789.localdomain systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m.
Dec 06 08:29:51 np0005548789.localdomain sudo[65367]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:51 np0005548789.localdomain sudo[65459]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-crdkizaadbxrmlbucpywvktkntvnzaiy ; /usr/bin/python3
Dec 06 08:29:51 np0005548789.localdomain sudo[65459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:51 np0005548789.localdomain python3[65461]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:29:51 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:29:51 np0005548789.localdomain systemd-sysv-generator[65488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:29:51 np0005548789.localdomain systemd-rc-local-generator[65485]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:29:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:29:52 np0005548789.localdomain sudo[65459]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:52 np0005548789.localdomain sudo[65543]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kehqjpygtbpahzubndrmflspssmjxiov ; /usr/bin/python3
Dec 06 08:29:52 np0005548789.localdomain sudo[65543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:52 np0005548789.localdomain python3[65545]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:29:52 np0005548789.localdomain sudo[65543]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:52 np0005548789.localdomain sudo[65586]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxmfzxdotrfxljvqicanqqmjhkjsfvud ; /usr/bin/python3
Dec 06 08:29:52 np0005548789.localdomain sudo[65586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:53 np0005548789.localdomain python3[65588]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009792.268081-107625-97876280645722/source _original_basename=tmpsvnm9f6v follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:29:53 np0005548789.localdomain sudo[65586]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:53 np0005548789.localdomain sudo[65616]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjvwgguvuldplqjgirywabfaddyerjly ; /usr/bin/python3
Dec 06 08:29:53 np0005548789.localdomain sudo[65616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:53 np0005548789.localdomain python3[65618]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:29:53 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:29:53 np0005548789.localdomain systemd-sysv-generator[65644]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:29:53 np0005548789.localdomain systemd-rc-local-generator[65640]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:29:53 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:29:53 np0005548789.localdomain systemd[1]: Reached target tripleo_nova_libvirt.target.
Dec 06 08:29:53 np0005548789.localdomain sudo[65616]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:54 np0005548789.localdomain sudo[65670]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpfvrbwphpoohdyihzzylvodsuzqvgwg ; /usr/bin/python3
Dec 06 08:29:54 np0005548789.localdomain sudo[65670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:54 np0005548789.localdomain python3[65672]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:29:54 np0005548789.localdomain sudo[65670]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:54 np0005548789.localdomain sudo[65720]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sleaprafzytfwvxirghgkihggbwpemqo ; /usr/bin/python3
Dec 06 08:29:54 np0005548789.localdomain sudo[65720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:54 np0005548789.localdomain sudo[65720]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:55 np0005548789.localdomain sudo[65738]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrqxjijaklnuxtodyiworrcagqhuaeqp ; /usr/bin/python3
Dec 06 08:29:55 np0005548789.localdomain sudo[65738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:55 np0005548789.localdomain sudo[65738]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:55 np0005548789.localdomain sudo[65842]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqttcxarwozzclleohemiopetciymkpp ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009795.454371-107727-27560256888861/async_wrapper.py 436720557326 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009795.454371-107727-27560256888861/AnsiballZ_command.py _
Dec 06 08:29:55 np0005548789.localdomain sudo[65842]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 08:29:56 np0005548789.localdomain ansible-async_wrapper.py[65844]: Invoked with 436720557326 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009795.454371-107727-27560256888861/AnsiballZ_command.py _
Dec 06 08:29:56 np0005548789.localdomain ansible-async_wrapper.py[65847]: Starting module and watcher
Dec 06 08:29:56 np0005548789.localdomain ansible-async_wrapper.py[65847]: Start watching 65848 (3600)
Dec 06 08:29:56 np0005548789.localdomain ansible-async_wrapper.py[65848]: Start module (65848)
Dec 06 08:29:56 np0005548789.localdomain ansible-async_wrapper.py[65844]: Return async_wrapper task started.
Dec 06 08:29:56 np0005548789.localdomain sudo[65842]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:56 np0005548789.localdomain sudo[65863]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlbzumwzcbzronhboxriteattvqrtgev ; /usr/bin/python3
Dec 06 08:29:56 np0005548789.localdomain sudo[65863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:29:56 np0005548789.localdomain python3[65868]: ansible-ansible.legacy.async_status Invoked with jid=436720557326.65844 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:29:56 np0005548789.localdomain sudo[65863]: pam_unix(sudo:session): session closed for user root
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]:    (file & line not available)
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]:    (file & line not available)
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:29:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 06 08:29:59 np0005548789.localdomain puppet-user[65867]: Notice: Compiled catalog for np0005548789.localdomain in environment production in 0.20 seconds
Dec 06 08:29:59 np0005548789.localdomain systemd[1]: tmp-crun.Jmu2K1.mount: Deactivated successfully.
Dec 06 08:29:59 np0005548789.localdomain podman[65978]: 2025-12-06 08:29:59.921667627 +0000 UTC m=+0.086171241 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, 
batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:29:59 np0005548789.localdomain podman[65978]: 2025-12-06 08:29:59.963063762 +0000 UTC m=+0.127567376 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step3, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044)
Dec 06 08:29:59 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:30:01 np0005548789.localdomain ansible-async_wrapper.py[65847]: 65848 still running (3600)
Dec 06 08:30:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:30:01 np0005548789.localdomain systemd[1]: tmp-crun.K8qj44.mount: Deactivated successfully.
Dec 06 08:30:01 np0005548789.localdomain podman[66006]: 2025-12-06 08:30:01.927175938 +0000 UTC m=+0.088540922 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible)
Dec 06 08:30:01 np0005548789.localdomain podman[66006]: 2025-12-06 08:30:01.966185102 +0000 UTC m=+0.127550056 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:30:01 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:30:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:30:05 np0005548789.localdomain systemd[1]: tmp-crun.iHuGtZ.mount: Deactivated successfully.
Dec 06 08:30:05 np0005548789.localdomain podman[66092]: 2025-12-06 08:30:05.924453818 +0000 UTC m=+0.084751762 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Dec 06 08:30:06 np0005548789.localdomain ansible-async_wrapper.py[65847]: 65848 still running (3595)
Dec 06 08:30:06 np0005548789.localdomain podman[66092]: 2025-12-06 08:30:06.122911205 +0000 UTC m=+0.283209099 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, container_name=metrics_qdr, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:30:06 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:30:06 np0005548789.localdomain sudo[66135]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgdepblbwvdzqojmctmflwvrvjhkiwbl ; /usr/bin/python3
Dec 06 08:30:06 np0005548789.localdomain sudo[66135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:06 np0005548789.localdomain python3[66137]: ansible-ansible.legacy.async_status Invoked with jid=436720557326.65844 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:30:06 np0005548789.localdomain sudo[66135]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:06 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:30:06 np0005548789.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 08:30:06 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:07 np0005548789.localdomain systemd-sysv-generator[66221]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:07 np0005548789.localdomain systemd-rc-local-generator[66212]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:07 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:07 np0005548789.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 08:30:07 np0005548789.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 08:30:07 np0005548789.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 08:30:07 np0005548789.localdomain systemd[1]: run-r37ea743efc694d4ba5d5352f0f592192.service: Deactivated successfully.
Dec 06 08:30:08 np0005548789.localdomain puppet-user[65867]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Dec 06 08:30:08 np0005548789.localdomain puppet-user[65867]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}75e8aaaefd4e6fd8bd6c608f0585c92d05d76aa2865d7c4690c8ebae838476c5'
Dec 06 08:30:08 np0005548789.localdomain puppet-user[65867]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Dec 06 08:30:08 np0005548789.localdomain puppet-user[65867]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Dec 06 08:30:08 np0005548789.localdomain puppet-user[65867]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Dec 06 08:30:08 np0005548789.localdomain puppet-user[65867]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Dec 06 08:30:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:30:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.1 total, 600.0 interval
                                                          Cumulative writes: 5168 writes, 22K keys, 5168 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5168 writes, 575 syncs, 8.99 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 138 writes, 383 keys, 138 commit groups, 1.0 writes per commit group, ingest: 0.37 MB, 0.00 MB/s
                                                          Interval WAL: 138 writes, 69 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 08:30:11 np0005548789.localdomain ansible-async_wrapper.py[65847]: 65848 still running (3590)
Dec 06 08:30:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:30:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.2 total, 600.0 interval
                                                          Cumulative writes: 4467 writes, 20K keys, 4467 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4467 writes, 521 syncs, 8.57 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 124 writes, 346 keys, 124 commit groups, 1.0 writes per commit group, ingest: 0.29 MB, 0.00 MB/s
                                                          Interval WAL: 124 writes, 62 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 08:30:13 np0005548789.localdomain puppet-user[65867]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Dec 06 08:30:13 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:13 np0005548789.localdomain systemd-rc-local-generator[67263]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:13 np0005548789.localdomain systemd-sysv-generator[67267]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:13 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:14 np0005548789.localdomain systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Dec 06 08:30:14 np0005548789.localdomain snmpd[67279]: Can't find directory of RPM packages
Dec 06 08:30:14 np0005548789.localdomain snmpd[67279]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Dec 06 08:30:14 np0005548789.localdomain systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Dec 06 08:30:14 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:14 np0005548789.localdomain systemd-rc-local-generator[67307]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:14 np0005548789.localdomain systemd-sysv-generator[67310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:14 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:14 np0005548789.localdomain systemd-sysv-generator[67348]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:14 np0005548789.localdomain systemd-rc-local-generator[67342]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]: Notice: Applied catalog in 14.94 seconds
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]: Application:
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:    Initial environment: production
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:    Converged environment: production
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:          Run mode: user
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]: Changes:
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:             Total: 8
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]: Events:
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:           Success: 8
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:             Total: 8
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]: Resources:
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:         Restarted: 1
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:           Changed: 8
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:       Out of sync: 8
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:             Total: 19
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]: Time:
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:        Filebucket: 0.00
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:          Schedule: 0.00
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:            Augeas: 0.01
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:              File: 0.09
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:    Config retrieval: 0.26
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:           Service: 1.18
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:    Transaction evaluation: 14.93
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:    Catalog application: 14.94
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:          Last run: 1765009814
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:              Exec: 5.05
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:           Package: 8.42
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:             Total: 14.94
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]: Version:
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:            Config: 1765009799
Dec 06 08:30:14 np0005548789.localdomain puppet-user[65867]:            Puppet: 7.10.0
Dec 06 08:30:14 np0005548789.localdomain ansible-async_wrapper.py[65848]: Module complete (65848)
Dec 06 08:30:16 np0005548789.localdomain ansible-async_wrapper.py[65847]: Done in kid B.
Dec 06 08:30:16 np0005548789.localdomain sudo[67367]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kibwwtluwwzhzfmsvozhmjdnwqnzfefs ; /usr/bin/python3
Dec 06 08:30:16 np0005548789.localdomain sudo[67367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:16 np0005548789.localdomain python3[67369]: ansible-ansible.legacy.async_status Invoked with jid=436720557326.65844 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:30:16 np0005548789.localdomain sudo[67367]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:17 np0005548789.localdomain sudo[67396]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puunyrlgmtfygufnciwbcnmjnzxkffgo ; /usr/bin/python3
Dec 06 08:30:17 np0005548789.localdomain sudo[67396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:17 np0005548789.localdomain sudo[67370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:30:17 np0005548789.localdomain sudo[67370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:30:17 np0005548789.localdomain sudo[67370]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:17 np0005548789.localdomain sudo[67401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 08:30:17 np0005548789.localdomain sudo[67401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:30:17 np0005548789.localdomain python3[67399]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:30:17 np0005548789.localdomain sudo[67396]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:17 np0005548789.localdomain sudo[67429]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpgbwchjaukqlslgfecwtlbqoxieokhz ; /usr/bin/python3
Dec 06 08:30:17 np0005548789.localdomain sudo[67429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:17 np0005548789.localdomain python3[67431]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:17 np0005548789.localdomain sudo[67429]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:17 np0005548789.localdomain sudo[67401]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:18 np0005548789.localdomain sudo[67454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:30:18 np0005548789.localdomain sudo[67454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:30:18 np0005548789.localdomain sudo[67454]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:18 np0005548789.localdomain sudo[67469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:30:18 np0005548789.localdomain sudo[67469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:30:18 np0005548789.localdomain sudo[67529]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flmywozsuulmdboqckfeepsjsqoimxlo ; /usr/bin/python3
Dec 06 08:30:18 np0005548789.localdomain sudo[67529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:18 np0005548789.localdomain python3[67531]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:30:18 np0005548789.localdomain sudo[67529]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:18 np0005548789.localdomain sudo[67562]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcqdavgsenevrfaagvmhlnuldqzvwlfh ; /usr/bin/python3
Dec 06 08:30:18 np0005548789.localdomain sudo[67562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:18 np0005548789.localdomain sudo[67469]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:18 np0005548789.localdomain python3[67566]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp56jj7vza recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:30:18 np0005548789.localdomain sudo[67562]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:18 np0005548789.localdomain sudo[67609]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hsgzgpbpthcsjpgbhngziwpgtfdtyjls ; /usr/bin/python3
Dec 06 08:30:18 np0005548789.localdomain sudo[67609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:19 np0005548789.localdomain python3[67611]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:19 np0005548789.localdomain sudo[67609]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:19 np0005548789.localdomain sudo[67625]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdyiulwiyogccbvmoeapeknbgfnnljnm ; /usr/bin/python3
Dec 06 08:30:19 np0005548789.localdomain sudo[67625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:19 np0005548789.localdomain sudo[67625]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:19 np0005548789.localdomain sudo[67712]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynvxsowxqzojvbawkgiqjppzbqgzwjle ; /usr/bin/python3
Dec 06 08:30:19 np0005548789.localdomain sudo[67712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:20 np0005548789.localdomain python3[67714]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 06 08:30:20 np0005548789.localdomain sudo[67712]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:20 np0005548789.localdomain sudo[67731]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uggvyllrwkzurvegpljkzmsolklalyfw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:20 np0005548789.localdomain sudo[67731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:20 np0005548789.localdomain python3[67733]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:20 np0005548789.localdomain sudo[67731]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:21 np0005548789.localdomain sudo[67747]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-koxjmxvajtfhxppacpktqjaoyvfhucuo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:21 np0005548789.localdomain sudo[67747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:21 np0005548789.localdomain sudo[67747]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:21 np0005548789.localdomain sudo[67763]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxtcihfrplwnmjlvjhzxrmeroejcchny ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:21 np0005548789.localdomain sudo[67763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:21 np0005548789.localdomain sudo[67766]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:30:21 np0005548789.localdomain sudo[67766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:30:21 np0005548789.localdomain sudo[67766]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:21 np0005548789.localdomain python3[67765]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:21 np0005548789.localdomain sudo[67763]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:22 np0005548789.localdomain sudo[67828]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlfhmrjtsrmzbcdkwxoobkrwoxmypmbl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:22 np0005548789.localdomain sudo[67828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:22 np0005548789.localdomain python3[67830]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:30:22 np0005548789.localdomain sudo[67828]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:22 np0005548789.localdomain sudo[67846]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgtyrnbacsxxwmnuorplbaybjmvzoxen ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:22 np0005548789.localdomain sudo[67846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:22 np0005548789.localdomain python3[67848]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:22 np0005548789.localdomain sudo[67846]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:23 np0005548789.localdomain sudo[67908]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eksjimevwgreabdwxyyxbmyrycycpiun ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:23 np0005548789.localdomain sudo[67908]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:23 np0005548789.localdomain python3[67910]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:30:23 np0005548789.localdomain sudo[67908]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:23 np0005548789.localdomain sudo[67926]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxejfwtpuxqwdpqtphbkijztnggblesm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:23 np0005548789.localdomain sudo[67926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:23 np0005548789.localdomain python3[67928]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:23 np0005548789.localdomain sudo[67926]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:23 np0005548789.localdomain sudo[67988]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osusfucxyiovsdsynwmiyoforqnlmdts ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:23 np0005548789.localdomain sudo[67988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:24 np0005548789.localdomain python3[67990]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:30:24 np0005548789.localdomain sudo[67988]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:24 np0005548789.localdomain sudo[68006]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nusznhyqfcleifimkmehlnmvavdurkky ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:24 np0005548789.localdomain sudo[68006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:24 np0005548789.localdomain python3[68008]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:24 np0005548789.localdomain sudo[68006]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:24 np0005548789.localdomain sudo[68068]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffbwhgkkorhfntzpkoyvuzwawnsjgrmw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:24 np0005548789.localdomain sudo[68068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:24 np0005548789.localdomain python3[68070]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:30:24 np0005548789.localdomain sudo[68068]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:25 np0005548789.localdomain sudo[68086]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfgnafbxmvnztvfexbzaqdieccxfnxco ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:25 np0005548789.localdomain sudo[68086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:25 np0005548789.localdomain python3[68088]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:25 np0005548789.localdomain sudo[68086]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:25 np0005548789.localdomain sudo[68116]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klrxmqvzcoltrmhwoqlsljyzdoqzgsja ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:25 np0005548789.localdomain sudo[68116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:25 np0005548789.localdomain python3[68118]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:25 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:25 np0005548789.localdomain systemd-sysv-generator[68143]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:25 np0005548789.localdomain systemd-rc-local-generator[68139]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:25 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:26 np0005548789.localdomain sudo[68116]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:26 np0005548789.localdomain sudo[68202]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbojmlxsccgxbgyhvvymatvaylwjpued ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:26 np0005548789.localdomain sudo[68202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:26 np0005548789.localdomain python3[68204]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:30:26 np0005548789.localdomain sudo[68202]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:26 np0005548789.localdomain sudo[68220]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ziqsuafddmalllzafaifxcvpvcqtzoer ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:26 np0005548789.localdomain sudo[68220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:26 np0005548789.localdomain python3[68222]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:26 np0005548789.localdomain sudo[68220]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:27 np0005548789.localdomain sudo[68282]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iodpnvubzqxncyvskruwvebzexcstdmb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:27 np0005548789.localdomain sudo[68282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:27 np0005548789.localdomain python3[68284]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:30:27 np0005548789.localdomain sudo[68282]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:27 np0005548789.localdomain sudo[68300]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efipczlnglvgrqiwitmlmrwclskvzvsi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:27 np0005548789.localdomain sudo[68300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:27 np0005548789.localdomain python3[68302]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:27 np0005548789.localdomain sudo[68300]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:27 np0005548789.localdomain sudo[68330]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czqglcgzqsybqepuwdnrxaxkgpnjrjud ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:27 np0005548789.localdomain sudo[68330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:28 np0005548789.localdomain python3[68332]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:28 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:28 np0005548789.localdomain systemd-sysv-generator[68361]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:28 np0005548789.localdomain systemd-rc-local-generator[68358]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:28 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:28 np0005548789.localdomain systemd[1]: Starting Create netns directory...
Dec 06 08:30:28 np0005548789.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 08:30:28 np0005548789.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 08:30:28 np0005548789.localdomain systemd[1]: Finished Create netns directory.
Dec 06 08:30:28 np0005548789.localdomain sudo[68330]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:28 np0005548789.localdomain sudo[68387]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xinrgvdncmqwrxcuzbblcofovtcakcex ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:28 np0005548789.localdomain sudo[68387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:28 np0005548789.localdomain python3[68389]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 06 08:30:29 np0005548789.localdomain sudo[68387]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:29 np0005548789.localdomain sudo[68403]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqcvcwydsleupxhvjtgfcfwwljmkqrwf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:29 np0005548789.localdomain sudo[68403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:29 np0005548789.localdomain sudo[68403]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:30:30 np0005548789.localdomain systemd[1]: tmp-crun.dzboxF.mount: Deactivated successfully.
Dec 06 08:30:30 np0005548789.localdomain podman[68433]: 2025-12-06 08:30:30.935868342 +0000 UTC m=+0.091181958 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Dec 06 08:30:30 np0005548789.localdomain podman[68433]: 2025-12-06 08:30:30.947023475 +0000 UTC m=+0.102337041 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, container_name=collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1)
Dec 06 08:30:30 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:30:30 np0005548789.localdomain sudo[68467]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdnaoswhtbraklidtiruwycfospxzikm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:30 np0005548789.localdomain sudo[68467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:31 np0005548789.localdomain python3[68469]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 06 08:30:31 np0005548789.localdomain podman[68617]: 2025-12-06 08:30:31.437257104 +0000 UTC m=+0.070868185 container create b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4)
Dec 06 08:30:31 np0005548789.localdomain podman[68614]: 2025-12-06 08:30:31.459550518 +0000 UTC m=+0.096289065 container create 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, name=rhosp17/openstack-cron, vcs-type=git, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com)
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started libpod-conmon-b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.scope.
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started libpod-conmon-04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.scope.
Dec 06 08:30:31 np0005548789.localdomain podman[68614]: 2025-12-06 08:30:31.395102611 +0000 UTC m=+0.031841178 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:31 np0005548789.localdomain podman[68617]: 2025-12-06 08:30:31.401626901 +0000 UTC m=+0.035238012 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Dec 06 08:30:31 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd6425452938e99a947d98ed440c416f97c1a47fc1a973380479b4612f15ab3d/merged/var/log/containers supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:31 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c7edd91eaf927f0cc6c745dda6c529d67c09cf793c73be3335bece938eeb713d/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:31 np0005548789.localdomain podman[68631]: 2025-12-06 08:30:31.513976428 +0000 UTC m=+0.132662511 container create 192809174d5f26e681c631d5f2ef11ed9ed7f1d8dce6529b3a87b11ae6c7b9dd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step4, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, container_name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:30:31 np0005548789.localdomain podman[68631]: 2025-12-06 08:30:31.430163116 +0000 UTC m=+0.048849209 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 06 08:30:31 np0005548789.localdomain podman[68653]: 2025-12-06 08:30:31.451278344 +0000 UTC m=+0.036982735 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Dec 06 08:30:31 np0005548789.localdomain podman[68669]: 2025-12-06 08:30:31.556523633 +0000 UTC m=+0.131508836 container create e15d86348a4b5f0d83945da433efd8fb855d49225082af18e8a77aab7058aff1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, container_name=configure_cms_options, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started libpod-conmon-192809174d5f26e681c631d5f2ef11ed9ed7f1d8dce6529b3a87b11ae6c7b9dd.scope.
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:31 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6427be8e8aab7c88f5d6611b3222842e716badb64f49ed94370155c549e54807/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started libpod-conmon-e15d86348a4b5f0d83945da433efd8fb855d49225082af18e8a77aab7058aff1.scope.
Dec 06 08:30:31 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6427be8e8aab7c88f5d6611b3222842e716badb64f49ed94370155c549e54807/merged/etc/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:31 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6427be8e8aab7c88f5d6611b3222842e716badb64f49ed94370155c549e54807/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:31 np0005548789.localdomain podman[68631]: 2025-12-06 08:30:31.58642507 +0000 UTC m=+0.205111153 container init 192809174d5f26e681c631d5f2ef11ed9ed7f1d8dce6529b3a87b11ae6c7b9dd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:35:22Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container)
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:31 np0005548789.localdomain podman[68631]: 2025-12-06 08:30:31.594908751 +0000 UTC m=+0.213594834 container start 192809174d5f26e681c631d5f2ef11ed9ed7f1d8dce6529b3a87b11ae6c7b9dd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_libvirt_init_secret, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git)
Dec 06 08:30:31 np0005548789.localdomain podman[68631]: 2025-12-06 08:30:31.595600131 +0000 UTC m=+0.214286214 container attach 192809174d5f26e681c631d5f2ef11ed9ed7f1d8dce6529b3a87b11ae6c7b9dd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, container_name=nova_libvirt_init_secret, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc.)
Dec 06 08:30:31 np0005548789.localdomain podman[68653]: 2025-12-06 08:30:31.611384616 +0000 UTC m=+0.197088987 container create a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:30:31 np0005548789.localdomain podman[68669]: 2025-12-06 08:30:31.521338194 +0000 UTC m=+0.096323427 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:30:31 np0005548789.localdomain podman[68617]: 2025-12-06 08:30:31.631338108 +0000 UTC m=+0.264949189 container init b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Dec 06 08:30:31 np0005548789.localdomain sudo[68721]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:30:31 np0005548789.localdomain sudo[68721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 06 08:30:31 np0005548789.localdomain podman[68669]: 2025-12-06 08:30:31.648529685 +0000 UTC m=+0.223514878 container init e15d86348a4b5f0d83945da433efd8fb855d49225082af18e8a77aab7058aff1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=configure_cms_options, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git)
Dec 06 08:30:31 np0005548789.localdomain podman[68669]: 2025-12-06 08:30:31.654127237 +0000 UTC m=+0.229112430 container start e15d86348a4b5f0d83945da433efd8fb855d49225082af18e8a77aab7058aff1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.buildah.version=1.41.4, container_name=configure_cms_options, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044)
Dec 06 08:30:31 np0005548789.localdomain podman[68669]: 2025-12-06 08:30:31.654523169 +0000 UTC m=+0.229508382 container attach e15d86348a4b5f0d83945da433efd8fb855d49225082af18e8a77aab7058aff1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=configure_cms_options, release=1761123044, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']})
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:30:31 np0005548789.localdomain podman[68617]: 2025-12-06 08:30:31.659598395 +0000 UTC m=+0.293209476 container start b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, vcs-type=git, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, 
url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:30:31 np0005548789.localdomain python3[68469]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=728090aef247cfdd273031dadf6d1125 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:30:31 np0005548789.localdomain podman[68614]: 2025-12-06 08:30:31.680837417 +0000 UTC m=+0.317575984 container init 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., distribution-scope=public, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, container_name=logrotate_crond)
Dec 06 08:30:31 np0005548789.localdomain sudo[68753]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:30:31 np0005548789.localdomain sudo[68753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:30:31 np0005548789.localdomain sudo[68721]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: libpod-192809174d5f26e681c631d5f2ef11ed9ed7f1d8dce6529b3a87b11ae6c7b9dd.scope: Deactivated successfully.
Dec 06 08:30:31 np0005548789.localdomain podman[68631]: 2025-12-06 08:30:31.718381688 +0000 UTC m=+0.337067781 container died 192809174d5f26e681c631d5f2ef11ed9ed7f1d8dce6529b3a87b11ae6c7b9dd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:30:31 np0005548789.localdomain sudo[68753]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:31 np0005548789.localdomain crond[68752]: (CRON) STARTUP (1.5.7)
Dec 06 08:30:31 np0005548789.localdomain crond[68752]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 0% if used.)
Dec 06 08:30:31 np0005548789.localdomain crond[68752]: (CRON) INFO (running with inotify support)
Dec 06 08:30:31 np0005548789.localdomain podman[68730]: 2025-12-06 08:30:31.744882492 +0000 UTC m=+0.079379787 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:12:45Z, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:30:31 np0005548789.localdomain ovs-vsctl[68785]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: libpod-e15d86348a4b5f0d83945da433efd8fb855d49225082af18e8a77aab7058aff1.scope: Deactivated successfully.
Dec 06 08:30:31 np0005548789.localdomain podman[68669]: 2025-12-06 08:30:31.752438373 +0000 UTC m=+0.327423596 container died e15d86348a4b5f0d83945da433efd8fb855d49225082af18e8a77aab7058aff1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=configure_cms_options, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z)
Dec 06 08:30:31 np0005548789.localdomain podman[68614]: 2025-12-06 08:30:31.762613986 +0000 UTC m=+0.399352543 container start 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, version=17.1.12, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:30:31 np0005548789.localdomain python3[68469]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started libpod-conmon-a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.scope.
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:31 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15de5573c617e73fedd1daaecfac821d4b4021582e250a3cae6d24e4b8e4cd51/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:31 np0005548789.localdomain podman[68756]: 2025-12-06 08:30:31.825271097 +0000 UTC m=+0.112956756 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, container_name=logrotate_crond, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, managed_by=tripleo_ansible)
Dec 06 08:30:31 np0005548789.localdomain podman[68730]: 2025-12-06 08:30:31.860574841 +0000 UTC m=+0.195072156 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4)
Dec 06 08:30:31 np0005548789.localdomain podman[68730]: unhealthy
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Failed with result 'exit-code'.
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:30:31 np0005548789.localdomain podman[68653]: 2025-12-06 08:30:31.886041992 +0000 UTC m=+0.471746383 container init a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1)
Dec 06 08:30:31 np0005548789.localdomain sudo[68846]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:30:31 np0005548789.localdomain sudo[68846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:30:31 np0005548789.localdomain podman[68756]: 2025-12-06 08:30:31.912470043 +0000 UTC m=+0.200155752 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-192809174d5f26e681c631d5f2ef11ed9ed7f1d8dce6529b3a87b11ae6c7b9dd-userdata-shm.mount: Deactivated successfully.
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-6427be8e8aab7c88f5d6611b3222842e716badb64f49ed94370155c549e54807-merged.mount: Deactivated successfully.
Dec 06 08:30:31 np0005548789.localdomain podman[68768]: 2025-12-06 08:30:31.965121068 +0000 UTC m=+0.235209337 container cleanup 192809174d5f26e681c631d5f2ef11ed9ed7f1d8dce6529b3a87b11ae6c7b9dd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., container_name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:30:31 np0005548789.localdomain systemd[1]: libpod-conmon-192809174d5f26e681c631d5f2ef11ed9ed7f1d8dce6529b3a87b11ae6c7b9dd.scope: Deactivated successfully.
Dec 06 08:30:31 np0005548789.localdomain python3[68469]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=179caa3982511c1fd3314b961771f96c --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack
Dec 06 08:30:31 np0005548789.localdomain sudo[68846]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:30:32 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-30c5044896505b9166e77885065d3af47cd1d5cde049e01332c2cc6c18ba5026-merged.mount: Deactivated successfully.
Dec 06 08:30:32 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e15d86348a4b5f0d83945da433efd8fb855d49225082af18e8a77aab7058aff1-userdata-shm.mount: Deactivated successfully.
Dec 06 08:30:32 np0005548789.localdomain podman[68789]: 2025-12-06 08:30:32.024244742 +0000 UTC m=+0.258200322 container cleanup e15d86348a4b5f0d83945da433efd8fb855d49225082af18e8a77aab7058aff1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=configure_cms_options, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, version=17.1.12, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:30:32 np0005548789.localdomain systemd[1]: libpod-conmon-e15d86348a4b5f0d83945da433efd8fb855d49225082af18e8a77aab7058aff1.scope: Deactivated successfully.
Dec 06 08:30:32 np0005548789.localdomain python3[68469]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1765008053 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi
Dec 06 08:30:32 np0005548789.localdomain podman[68653]: 2025-12-06 08:30:32.132610577 +0000 UTC m=+0.718314948 container start a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Dec 06 08:30:32 np0005548789.localdomain python3[68469]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=728090aef247cfdd273031dadf6d1125 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Dec 06 08:30:32 np0005548789.localdomain podman[68861]: 2025-12-06 08:30:32.158612374 +0000 UTC m=+0.226681945 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=ceilometer_agent_compute, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, 
url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team)
Dec 06 08:30:32 np0005548789.localdomain podman[68861]: 2025-12-06 08:30:32.168121916 +0000 UTC m=+0.236191467 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, vcs-type=git, architecture=x86_64, 
io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:30:32 np0005548789.localdomain podman[68861]: unhealthy
Dec 06 08:30:32 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:30:32 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Failed with result 'exit-code'.
Dec 06 08:30:32 np0005548789.localdomain podman[68901]: 2025-12-06 08:30:32.081632433 +0000 UTC m=+0.056696850 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, version=17.1.12, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, container_name=iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:30:32 np0005548789.localdomain podman[69010]: 2025-12-06 08:30:32.289980484 +0000 UTC m=+0.076503438 container create 237adc5bc2c172820a03ac5be6390ecc453f7c348da1b610d2cf6f305320a6af (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, container_name=setup_ovs_manager, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:30:32 np0005548789.localdomain podman[68901]: 2025-12-06 08:30:32.317042635 +0000 UTC m=+0.292107102 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, 
name=rhosp17/openstack-iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, architecture=x86_64, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4)
Dec 06 08:30:32 np0005548789.localdomain systemd[1]: Started libpod-conmon-237adc5bc2c172820a03ac5be6390ecc453f7c348da1b610d2cf6f305320a6af.scope.
Dec 06 08:30:32 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:30:32 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:32 np0005548789.localdomain podman[69010]: 2025-12-06 08:30:32.34885005 +0000 UTC m=+0.135372994 container init 237adc5bc2c172820a03ac5be6390ecc453f7c348da1b610d2cf6f305320a6af (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, container_name=setup_ovs_manager, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., 
batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 08:30:32 np0005548789.localdomain podman[69010]: 2025-12-06 08:30:32.356396122 +0000 UTC m=+0.142919066 container start 237adc5bc2c172820a03ac5be6390ecc453f7c348da1b610d2cf6f305320a6af (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, container_name=setup_ovs_manager, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, 
release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:30:32 np0005548789.localdomain podman[69010]: 2025-12-06 08:30:32.356543146 +0000 UTC m=+0.143066110 container attach 237adc5bc2c172820a03ac5be6390ecc453f7c348da1b610d2cf6f305320a6af (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, tcib_managed=true, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=setup_ovs_manager, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64)
Dec 06 08:30:32 np0005548789.localdomain podman[69010]: 2025-12-06 08:30:32.257618442 +0000 UTC m=+0.044141426 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 06 08:30:32 np0005548789.localdomain podman[69042]: 2025-12-06 08:30:32.394835052 +0000 UTC m=+0.062368846 container create 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:30:32 np0005548789.localdomain systemd[1]: Started libpod-conmon-38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.scope.
Dec 06 08:30:32 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:32 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16e4342af8bf5958b38bc295034feee0dd1522d1c796e48d3acbadb880cc49ff/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:30:32 np0005548789.localdomain podman[69042]: 2025-12-06 08:30:32.453417739 +0000 UTC m=+0.120951523 container init 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container)
Dec 06 08:30:32 np0005548789.localdomain sudo[69069]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:30:32 np0005548789.localdomain podman[69042]: 2025-12-06 08:30:32.367501183 +0000 UTC m=+0.035034967 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:30:32 np0005548789.localdomain sudo[69069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:30:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:30:32 np0005548789.localdomain podman[69042]: 2025-12-06 08:30:32.483990886 +0000 UTC m=+0.151524700 container start 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Dec 06 08:30:32 np0005548789.localdomain python3[68469]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=179caa3982511c1fd3314b961771f96c --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro 
--volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:30:32 np0005548789.localdomain sudo[69069]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:32 np0005548789.localdomain sshd[69100]: Server listening on 0.0.0.0 port 2022.
Dec 06 08:30:32 np0005548789.localdomain sshd[69100]: Server listening on :: port 2022.
Dec 06 08:30:32 np0005548789.localdomain podman[69071]: 2025-12-06 08:30:32.577365221 +0000 UTC m=+0.086430893 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public)
Dec 06 08:30:32 np0005548789.localdomain sudo[69117]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmppqwefemp/privsep.sock
Dec 06 08:30:32 np0005548789.localdomain sudo[69117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 06 08:30:32 np0005548789.localdomain podman[69071]: 2025-12-06 08:30:32.934137676 +0000 UTC m=+0.443203328 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.buildah.version=1.41.4, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:30:32 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:30:33 np0005548789.localdomain kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 06 08:30:33 np0005548789.localdomain sudo[69117]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:35 np0005548789.localdomain ovs-vsctl[69249]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: libpod-237adc5bc2c172820a03ac5be6390ecc453f7c348da1b610d2cf6f305320a6af.scope: Deactivated successfully.
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: libpod-237adc5bc2c172820a03ac5be6390ecc453f7c348da1b610d2cf6f305320a6af.scope: Consumed 2.881s CPU time.
Dec 06 08:30:35 np0005548789.localdomain podman[69010]: 2025-12-06 08:30:35.237384826 +0000 UTC m=+3.023907830 container died 237adc5bc2c172820a03ac5be6390ecc453f7c348da1b610d2cf6f305320a6af (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=setup_ovs_manager, vcs-type=git, release=1761123044, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z)
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-237adc5bc2c172820a03ac5be6390ecc453f7c348da1b610d2cf6f305320a6af-userdata-shm.mount: Deactivated successfully.
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-9a3d7308d621b6c88232a02a8b3043ca87b4d41445011f4603e8c51c260b2591-merged.mount: Deactivated successfully.
Dec 06 08:30:35 np0005548789.localdomain podman[69250]: 2025-12-06 08:30:35.344043778 +0000 UTC m=+0.096352346 container cleanup 237adc5bc2c172820a03ac5be6390ecc453f7c348da1b610d2cf6f305320a6af (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=setup_ovs_manager, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: libpod-conmon-237adc5bc2c172820a03ac5be6390ecc453f7c348da1b610d2cf6f305320a6af.scope: Deactivated successfully.
Dec 06 08:30:35 np0005548789.localdomain python3[68469]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1765008053 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765008053'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata
Dec 06 08:30:35 np0005548789.localdomain podman[69362]: 2025-12-06 08:30:35.717862537 +0000 UTC m=+0.072837665 container create 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public)
Dec 06 08:30:35 np0005548789.localdomain podman[69363]: 2025-12-06 08:30:35.751877961 +0000 UTC m=+0.097371159 container create 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: Started libpod-conmon-87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.scope.
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: Started libpod-conmon-1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.scope.
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:35 np0005548789.localdomain podman[69362]: 2025-12-06 08:30:35.67655975 +0000 UTC m=+0.031534948 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 06 08:30:35 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31be4bfa33f1fd50cae755746783d85d8683e10cd2caa7fdf7edb677e543b7f9/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:35 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31be4bfa33f1fd50cae755746783d85d8683e10cd2caa7fdf7edb677e543b7f9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:35 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31be4bfa33f1fd50cae755746783d85d8683e10cd2caa7fdf7edb677e543b7f9/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:30:35 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c9f3dee13341691faa8d64763052b5a13a4b5ff224f1a26e18c82d56bd99001/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:35 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c9f3dee13341691faa8d64763052b5a13a4b5ff224f1a26e18c82d56bd99001/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:35 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c9f3dee13341691faa8d64763052b5a13a4b5ff224f1a26e18c82d56bd99001/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff)
Dec 06 08:30:35 np0005548789.localdomain podman[69363]: 2025-12-06 08:30:35.699769002 +0000 UTC m=+0.045262220 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:30:35 np0005548789.localdomain podman[69362]: 2025-12-06 08:30:35.809095645 +0000 UTC m=+0.164070853 container init 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=)
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:30:35 np0005548789.localdomain podman[69363]: 2025-12-06 08:30:35.827930194 +0000 UTC m=+0.173423422 container init 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1)
Dec 06 08:30:35 np0005548789.localdomain sudo[69400]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:30:35 np0005548789.localdomain sudo[69400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:30:35 np0005548789.localdomain systemd-logind[766]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:30:35 np0005548789.localdomain podman[69362]: 2025-12-06 08:30:35.850605639 +0000 UTC m=+0.205580807 container start 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: Created slice User Slice of UID 0.
Dec 06 08:30:35 np0005548789.localdomain python3[68469]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=270cf6e6b67cba1ef197c7fa89d5bb20 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 06 08:30:35 np0005548789.localdomain systemd[1]: Starting User Manager for UID 0...
Dec 06 08:30:35 np0005548789.localdomain sudo[69400]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:35 np0005548789.localdomain podman[69363]: 2025-12-06 08:30:35.894671861 +0000 UTC m=+0.240165059 container start 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true)
Dec 06 08:30:35 np0005548789.localdomain python3[68469]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Dec 06 08:30:35 np0005548789.localdomain systemd[69420]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:30:35 np0005548789.localdomain podman[69405]: 2025-12-06 08:30:35.966862706 +0000 UTC m=+0.107889511 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, architecture=x86_64, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:30:36 np0005548789.localdomain podman[69405]: 2025-12-06 08:30:36.006007557 +0000 UTC m=+0.147034352 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public)
Dec 06 08:30:36 np0005548789.localdomain podman[69405]: unhealthy
Dec 06 08:30:36 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:30:36 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 08:30:36 np0005548789.localdomain systemd[69420]: Queued start job for default target Main User Target.
Dec 06 08:30:36 np0005548789.localdomain systemd[69420]: Created slice User Application Slice.
Dec 06 08:30:36 np0005548789.localdomain systemd[69420]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 06 08:30:36 np0005548789.localdomain systemd[69420]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 08:30:36 np0005548789.localdomain systemd[69420]: Reached target Paths.
Dec 06 08:30:36 np0005548789.localdomain systemd[69420]: Reached target Timers.
Dec 06 08:30:36 np0005548789.localdomain systemd[69420]: Starting D-Bus User Message Bus Socket...
Dec 06 08:30:36 np0005548789.localdomain systemd[69420]: Starting Create User's Volatile Files and Directories...
Dec 06 08:30:36 np0005548789.localdomain systemd[69420]: Finished Create User's Volatile Files and Directories.
Dec 06 08:30:36 np0005548789.localdomain systemd[69420]: Listening on D-Bus User Message Bus Socket.
Dec 06 08:30:36 np0005548789.localdomain systemd[69420]: Reached target Sockets.
Dec 06 08:30:36 np0005548789.localdomain systemd[69420]: Reached target Basic System.
Dec 06 08:30:36 np0005548789.localdomain systemd[69420]: Reached target Main User Target.
Dec 06 08:30:36 np0005548789.localdomain systemd[69420]: Startup finished in 120ms.
Dec 06 08:30:36 np0005548789.localdomain systemd[1]: Started User Manager for UID 0.
Dec 06 08:30:36 np0005548789.localdomain systemd[1]: Started Session c9 of User root.
Dec 06 08:30:36 np0005548789.localdomain podman[69419]: 2025-12-06 08:30:36.090975714 +0000 UTC m=+0.198873913 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Dec 06 08:30:36 np0005548789.localdomain podman[69419]: 2025-12-06 08:30:36.103978413 +0000 UTC m=+0.211876652 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Dec 06 08:30:36 np0005548789.localdomain podman[69419]: unhealthy
Dec 06 08:30:36 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:30:36 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 08:30:36 np0005548789.localdomain sudo[68467]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:36 np0005548789.localdomain systemd[1]: session-c9.scope: Deactivated successfully.
Dec 06 08:30:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:30:36 np0005548789.localdomain kernel: device br-int entered promiscuous mode
Dec 06 08:30:36 np0005548789.localdomain NetworkManager[5973]: <info>  [1765009836.1762] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11)
Dec 06 08:30:36 np0005548789.localdomain systemd-udevd[69524]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 08:30:36 np0005548789.localdomain podman[69508]: 2025-12-06 08:30:36.208064226 +0000 UTC m=+0.049665565 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12)
Dec 06 08:30:36 np0005548789.localdomain kernel: device genev_sys_6081 entered promiscuous mode
Dec 06 08:30:36 np0005548789.localdomain NetworkManager[5973]: <info>  [1765009836.2598] device (genev_sys_6081): carrier: link connected
Dec 06 08:30:36 np0005548789.localdomain systemd-udevd[69530]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 08:30:36 np0005548789.localdomain NetworkManager[5973]: <info>  [1765009836.2601] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12)
Dec 06 08:30:36 np0005548789.localdomain podman[69508]: 2025-12-06 08:30:36.390161212 +0000 UTC m=+0.231762571 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, tcib_managed=true, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:30:36 np0005548789.localdomain sudo[69563]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drpizxrdlljdgasakzqkzcojwiqkmqxf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:36 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:30:36 np0005548789.localdomain sudo[69563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:36 np0005548789.localdomain python3[69565]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:36 np0005548789.localdomain sudo[69563]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:36 np0005548789.localdomain sudo[69579]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okybehihzhqvqfpjyjyxjptujpkqmecv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:36 np0005548789.localdomain sudo[69579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:36 np0005548789.localdomain python3[69581]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:36 np0005548789.localdomain sudo[69579]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:36 np0005548789.localdomain sudo[69595]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqimjvaeadtfhhotfsuocvwouorlictr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:36 np0005548789.localdomain sudo[69595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:37 np0005548789.localdomain python3[69597]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:37 np0005548789.localdomain sudo[69595]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:37 np0005548789.localdomain sudo[69611]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysbkrjtsrwbogialvnvivrmsnnmiawdd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:37 np0005548789.localdomain sudo[69611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:37 np0005548789.localdomain python3[69613]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:37 np0005548789.localdomain sudo[69611]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:37 np0005548789.localdomain sudo[69627]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eracnzttoskpcuqmpwhnqrxecukmqkon ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:37 np0005548789.localdomain sudo[69627]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:37 np0005548789.localdomain python3[69629]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:37 np0005548789.localdomain sudo[69631]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf privsep-helper --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --privsep_context neutron.privileged.default --privsep_sock_path /tmp/tmpe2b0cwoz/privsep.sock
Dec 06 08:30:37 np0005548789.localdomain sudo[69631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Dec 06 08:30:37 np0005548789.localdomain sudo[69627]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:37 np0005548789.localdomain sudo[69646]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghvznydeadpomxlrqhrvmmpkapdghsxs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:37 np0005548789.localdomain sudo[69646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:37 np0005548789.localdomain python3[69648]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:37 np0005548789.localdomain sudo[69646]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:37 np0005548789.localdomain sudo[69663]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxmonbrcoykvcrhhrmasucavbcmbqsmq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:37 np0005548789.localdomain sudo[69663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:38 np0005548789.localdomain python3[69665]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:38 np0005548789.localdomain sudo[69663]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:38 np0005548789.localdomain sudo[69679]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpnnmtnvmvqdriocahboupinkfxqequa ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:38 np0005548789.localdomain sudo[69679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:38 np0005548789.localdomain sudo[69631]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:38 np0005548789.localdomain python3[69681]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:38 np0005548789.localdomain sudo[69679]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:38 np0005548789.localdomain sudo[69697]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzipavbltfecewixhxtriblhtnvdefdi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:38 np0005548789.localdomain sudo[69697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:38 np0005548789.localdomain python3[69699]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:38 np0005548789.localdomain sudo[69697]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:38 np0005548789.localdomain sudo[69715]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfqtzdwmbsiwouyalakropxekutlvyts ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:38 np0005548789.localdomain sudo[69715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:38 np0005548789.localdomain python3[69717]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:38 np0005548789.localdomain sudo[69715]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:38 np0005548789.localdomain sudo[69731]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmhlcsveipfydhjaltrgrewfccfbyowe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:38 np0005548789.localdomain sudo[69731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:38 np0005548789.localdomain python3[69733]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:38 np0005548789.localdomain sudo[69731]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:39 np0005548789.localdomain sudo[69747]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqmppdapahvggwktsulpknjrqyycpclv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:39 np0005548789.localdomain sudo[69747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:39 np0005548789.localdomain python3[69749]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:30:39 np0005548789.localdomain sudo[69747]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:39 np0005548789.localdomain sudo[69808]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtktaccauohiorfwjywohbnutkpimfvn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:39 np0005548789.localdomain sudo[69808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:39 np0005548789.localdomain python3[69810]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009839.2854874-109029-179926828752600/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:39 np0005548789.localdomain sudo[69808]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:40 np0005548789.localdomain sudo[69837]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flolgptsqvaikbxiqorikjlijskcvgzx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:40 np0005548789.localdomain sudo[69837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:40 np0005548789.localdomain python3[69839]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009839.2854874-109029-179926828752600/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:40 np0005548789.localdomain sudo[69837]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:40 np0005548789.localdomain sudo[69866]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwdthfrbmxquhfjcdzueiwaonzdgqlbp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:40 np0005548789.localdomain sudo[69866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:40 np0005548789.localdomain python3[69868]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009839.2854874-109029-179926828752600/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:40 np0005548789.localdomain sudo[69866]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:41 np0005548789.localdomain sudo[69895]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezzmeohpqwdkyrhdzptzzsbcduevfdoc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:41 np0005548789.localdomain sudo[69895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:41 np0005548789.localdomain python3[69897]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009839.2854874-109029-179926828752600/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:41 np0005548789.localdomain sudo[69895]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:41 np0005548789.localdomain sudo[69924]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvnepnjdgyqihafbdpktwzjxhmuypgek ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:41 np0005548789.localdomain sudo[69924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:41 np0005548789.localdomain python3[69926]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009839.2854874-109029-179926828752600/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:41 np0005548789.localdomain sudo[69924]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:42 np0005548789.localdomain sudo[69953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iinwqoidbfedapyoxavccdlxippwapvn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:42 np0005548789.localdomain sudo[69953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:42 np0005548789.localdomain python3[69955]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765009839.2854874-109029-179926828752600/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:42 np0005548789.localdomain sudo[69953]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:42 np0005548789.localdomain sudo[69969]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbeohydzbzpuhrynbobapprdzpfihnnl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:42 np0005548789.localdomain sudo[69969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:42 np0005548789.localdomain python3[69971]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 08:30:42 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:42 np0005548789.localdomain systemd-sysv-generator[69999]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:42 np0005548789.localdomain systemd-rc-local-generator[69996]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:42 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:42 np0005548789.localdomain sudo[69969]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:43 np0005548789.localdomain sudo[70021]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuwdqfccyjcpiklvjqlwwyqisvzvtklg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:43 np0005548789.localdomain sudo[70021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:43 np0005548789.localdomain python3[70023]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:43 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:43 np0005548789.localdomain systemd-rc-local-generator[70052]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:43 np0005548789.localdomain systemd-sysv-generator[70057]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:43 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:44 np0005548789.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Dec 06 08:30:44 np0005548789.localdomain tripleo-start-podman-container[70063]: Creating additional drop-in dependency for "ceilometer_agent_compute" (a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9)
Dec 06 08:30:44 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:44 np0005548789.localdomain systemd-rc-local-generator[70111]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:44 np0005548789.localdomain systemd-sysv-generator[70117]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:44 np0005548789.localdomain systemd[1]: Started ceilometer_agent_compute container.
Dec 06 08:30:44 np0005548789.localdomain sudo[70021]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:44 np0005548789.localdomain sudo[70144]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvztbpavmxynaembarbcsgcnhembtxjb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:44 np0005548789.localdomain sudo[70144]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:45 np0005548789.localdomain python3[70146]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:45 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:45 np0005548789.localdomain systemd-sysv-generator[70176]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:45 np0005548789.localdomain systemd-rc-local-generator[70168]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:45 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:45 np0005548789.localdomain systemd[1]: Starting ceilometer_agent_ipmi container...
Dec 06 08:30:45 np0005548789.localdomain systemd[1]: Started ceilometer_agent_ipmi container.
Dec 06 08:30:45 np0005548789.localdomain sudo[70144]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:45 np0005548789.localdomain sudo[70210]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rucjmiwzfcgaatyuqbnbtthweimxmxbx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:45 np0005548789.localdomain sudo[70210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:46 np0005548789.localdomain python3[70212]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:46 np0005548789.localdomain systemd[1]: Stopping User Manager for UID 0...
Dec 06 08:30:46 np0005548789.localdomain systemd[69420]: Activating special unit Exit the Session...
Dec 06 08:30:46 np0005548789.localdomain systemd[69420]: Stopped target Main User Target.
Dec 06 08:30:46 np0005548789.localdomain systemd[69420]: Stopped target Basic System.
Dec 06 08:30:46 np0005548789.localdomain systemd[69420]: Stopped target Paths.
Dec 06 08:30:46 np0005548789.localdomain systemd[69420]: Stopped target Sockets.
Dec 06 08:30:46 np0005548789.localdomain systemd[69420]: Stopped target Timers.
Dec 06 08:30:46 np0005548789.localdomain systemd[69420]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 08:30:46 np0005548789.localdomain systemd[69420]: Closed D-Bus User Message Bus Socket.
Dec 06 08:30:46 np0005548789.localdomain systemd[69420]: Stopped Create User's Volatile Files and Directories.
Dec 06 08:30:46 np0005548789.localdomain systemd[69420]: Removed slice User Application Slice.
Dec 06 08:30:46 np0005548789.localdomain systemd[69420]: Reached target Shutdown.
Dec 06 08:30:46 np0005548789.localdomain systemd[69420]: Finished Exit the Session.
Dec 06 08:30:46 np0005548789.localdomain systemd[69420]: Reached target Exit the Session.
Dec 06 08:30:46 np0005548789.localdomain systemd[1]: user@0.service: Deactivated successfully.
Dec 06 08:30:46 np0005548789.localdomain systemd[1]: Stopped User Manager for UID 0.
Dec 06 08:30:46 np0005548789.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 06 08:30:46 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:46 np0005548789.localdomain systemd-rc-local-generator[70238]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:46 np0005548789.localdomain systemd-sysv-generator[70241]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:46 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:46 np0005548789.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 06 08:30:46 np0005548789.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 06 08:30:46 np0005548789.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 06 08:30:46 np0005548789.localdomain systemd[1]: Removed slice User Slice of UID 0.
Dec 06 08:30:46 np0005548789.localdomain systemd[1]: Starting logrotate_crond container...
Dec 06 08:30:46 np0005548789.localdomain systemd[1]: Started logrotate_crond container.
Dec 06 08:30:46 np0005548789.localdomain sudo[70210]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:47 np0005548789.localdomain sudo[70279]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggssprbiinkzkhhxybeezphqrrwwkkpl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:47 np0005548789.localdomain sudo[70279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:47 np0005548789.localdomain python3[70281]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:47 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:47 np0005548789.localdomain systemd-rc-local-generator[70305]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:47 np0005548789.localdomain systemd-sysv-generator[70308]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:47 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:47 np0005548789.localdomain systemd[1]: Starting nova_migration_target container...
Dec 06 08:30:47 np0005548789.localdomain systemd[1]: Started nova_migration_target container.
Dec 06 08:30:47 np0005548789.localdomain sudo[70279]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:48 np0005548789.localdomain sudo[70346]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqytgrjuwshxvpyovkusnqrhkmpqldrb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:48 np0005548789.localdomain sudo[70346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:48 np0005548789.localdomain python3[70348]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:48 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:48 np0005548789.localdomain systemd-rc-local-generator[70373]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:48 np0005548789.localdomain systemd-sysv-generator[70378]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:48 np0005548789.localdomain systemd[1]: Starting ovn_controller container...
Dec 06 08:30:49 np0005548789.localdomain tripleo-start-podman-container[70388]: Creating additional drop-in dependency for "ovn_controller" (1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076)
Dec 06 08:30:49 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:49 np0005548789.localdomain systemd-sysv-generator[70451]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:49 np0005548789.localdomain systemd-rc-local-generator[70448]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:49 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:49 np0005548789.localdomain systemd[1]: Started ovn_controller container.
Dec 06 08:30:49 np0005548789.localdomain sudo[70346]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:49 np0005548789.localdomain sudo[70470]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqkooaiuhyaodkwckhtznzkogqagkpjk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:30:49 np0005548789.localdomain sudo[70470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:50 np0005548789.localdomain python3[70472]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:30:50 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:30:50 np0005548789.localdomain systemd-sysv-generator[70504]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:30:50 np0005548789.localdomain systemd-rc-local-generator[70496]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:30:50 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:30:50 np0005548789.localdomain systemd[1]: Starting ovn_metadata_agent container...
Dec 06 08:30:50 np0005548789.localdomain systemd[1]: Started ovn_metadata_agent container.
Dec 06 08:30:50 np0005548789.localdomain sudo[70470]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:50 np0005548789.localdomain sudo[70552]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsumopxqoomfaygmnsdezxwcxylcsfgz ; /usr/bin/python3
Dec 06 08:30:50 np0005548789.localdomain sudo[70552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:50 np0005548789.localdomain python3[70554]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:50 np0005548789.localdomain sudo[70552]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:51 np0005548789.localdomain sudo[70600]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqzloflirnahuituseuulokhccgykuic ; /usr/bin/python3
Dec 06 08:30:51 np0005548789.localdomain sudo[70600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:51 np0005548789.localdomain sudo[70600]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:51 np0005548789.localdomain sudo[70643]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnfjhburyhrybafdffgagvbthrtrmboo ; /usr/bin/python3
Dec 06 08:30:51 np0005548789.localdomain sudo[70643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:52 np0005548789.localdomain sudo[70643]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:52 np0005548789.localdomain sudo[70673]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbewfilbfcylnzwucadvsmlcfrlhezwu ; /usr/bin/python3
Dec 06 08:30:52 np0005548789.localdomain sudo[70673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:52 np0005548789.localdomain python3[70675]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005548789 step=4 update_config_hash_only=False
Dec 06 08:30:52 np0005548789.localdomain sudo[70673]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:53 np0005548789.localdomain sudo[70689]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbmhwufmrzzajjpqhvxgdkekqjenwvun ; /usr/bin/python3
Dec 06 08:30:53 np0005548789.localdomain sudo[70689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:53 np0005548789.localdomain python3[70691]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:30:53 np0005548789.localdomain sudo[70689]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:53 np0005548789.localdomain sudo[70705]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywycwckhrkgtqetvuazjahaqmsuohfzt ; /usr/bin/python3
Dec 06 08:30:53 np0005548789.localdomain sudo[70705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:30:53 np0005548789.localdomain python3[70707]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 06 08:30:53 np0005548789.localdomain sudo[70705]: pam_unix(sudo:session): session closed for user root
Dec 06 08:30:56 np0005548789.localdomain sshd[70709]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:30:56 np0005548789.localdomain sshd[70709]: Invalid user solana from 92.118.39.95 port 32798
Dec 06 08:30:56 np0005548789.localdomain sshd[70709]: Connection closed by invalid user solana 92.118.39.95 port 32798 [preauth]
Dec 06 08:31:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:31:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:31:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:31:01 np0005548789.localdomain podman[70712]: 2025-12-06 08:31:01.945341334 +0000 UTC m=+0.107858032 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:31:01 np0005548789.localdomain podman[70712]: 2025-12-06 08:31:01.984040566 +0000 UTC m=+0.146557184 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, vcs-type=git, container_name=collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Dec 06 08:31:01 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:31:02 np0005548789.localdomain systemd[1]: tmp-crun.oZvhxE.mount: Deactivated successfully.
Dec 06 08:31:02 np0005548789.localdomain podman[70729]: 2025-12-06 08:31:02.039084107 +0000 UTC m=+0.084584184 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:31:02 np0005548789.localdomain podman[70729]: 2025-12-06 08:31:02.072008561 +0000 UTC m=+0.117508648 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:12:45Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 06 08:31:02 np0005548789.localdomain podman[70730]: 2025-12-06 08:31:02.084807772 +0000 UTC m=+0.129275637 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, name=rhosp17/openstack-cron, distribution-scope=public, 
url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, container_name=logrotate_crond, architecture=x86_64, io.buildah.version=1.41.4)
Dec 06 08:31:02 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:31:02 np0005548789.localdomain podman[70730]: 2025-12-06 08:31:02.092905029 +0000 UTC m=+0.137372914 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, container_name=logrotate_crond, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:31:02 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:31:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:31:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:31:02 np0005548789.localdomain podman[70778]: 2025-12-06 08:31:02.896923783 +0000 UTC m=+0.060414996 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, batch=17.1_20251118.1)
Dec 06 08:31:02 np0005548789.localdomain podman[70778]: 2025-12-06 08:31:02.922423791 +0000 UTC m=+0.085914994 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 08:31:02 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:31:03 np0005548789.localdomain systemd[1]: tmp-crun.i3Nmrg.mount: Deactivated successfully.
Dec 06 08:31:03 np0005548789.localdomain podman[70779]: 2025-12-06 08:31:03.002582068 +0000 UTC m=+0.163190223 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, managed_by=tripleo_ansible)
Dec 06 08:31:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:31:03 np0005548789.localdomain podman[70779]: 2025-12-06 08:31:03.036202055 +0000 UTC m=+0.196810220 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, name=rhosp17/openstack-iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, tcib_managed=true)
Dec 06 08:31:03 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:31:03 np0005548789.localdomain podman[70826]: 2025-12-06 08:31:03.087966044 +0000 UTC m=+0.063124578 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, batch=17.1_20251118.1, 
konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Dec 06 08:31:03 np0005548789.localdomain podman[70826]: 2025-12-06 08:31:03.484355225 +0000 UTC m=+0.459513769 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:31:03 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:31:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:31:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:31:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:31:06 np0005548789.localdomain systemd[1]: tmp-crun.UAQdFI.mount: Deactivated successfully.
Dec 06 08:31:06 np0005548789.localdomain podman[70850]: 2025-12-06 08:31:06.908553481 +0000 UTC m=+0.071486233 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=)
Dec 06 08:31:06 np0005548789.localdomain systemd[1]: tmp-crun.upn2SK.mount: Deactivated successfully.
Dec 06 08:31:06 np0005548789.localdomain podman[70849]: 2025-12-06 08:31:06.970157042 +0000 UTC m=+0.133094095 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Dec 06 08:31:07 np0005548789.localdomain podman[70851]: 2025-12-06 08:31:07.01370203 +0000 UTC m=+0.172431674 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.expose-services=, version=17.1.12)
Dec 06 08:31:07 np0005548789.localdomain podman[70849]: 2025-12-06 08:31:07.024231302 +0000 UTC m=+0.187168365 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 08:31:07 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:31:07 np0005548789.localdomain podman[70851]: 2025-12-06 08:31:07.05919074 +0000 UTC m=+0.217920394 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:31:07 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:31:07 np0005548789.localdomain podman[70850]: 2025-12-06 08:31:07.110545957 +0000 UTC m=+0.273478719 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:31:07 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:31:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 08:31:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 08:31:21 np0005548789.localdomain sudo[70922]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:31:21 np0005548789.localdomain sudo[70922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:31:21 np0005548789.localdomain sudo[70922]: pam_unix(sudo:session): session closed for user root
Dec 06 08:31:22 np0005548789.localdomain sudo[70937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:31:22 np0005548789.localdomain sudo[70937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:31:22 np0005548789.localdomain sudo[70937]: pam_unix(sudo:session): session closed for user root
Dec 06 08:31:22 np0005548789.localdomain sudo[70983]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:31:22 np0005548789.localdomain sudo[70983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:31:22 np0005548789.localdomain sudo[70983]: pam_unix(sudo:session): session closed for user root
Dec 06 08:31:22 np0005548789.localdomain sudo[70998]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- inventory --format=json-pretty --filter-for-batch
Dec 06 08:31:22 np0005548789.localdomain sudo[70998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:31:23 np0005548789.localdomain podman[71054]: 
Dec 06 08:31:23 np0005548789.localdomain podman[71054]: 2025-12-06 08:31:23.597646981 +0000 UTC m=+0.076750754 container create 06eded59e56a478e3b341c8b5aa6fd712a3fe774dfa303980f3425e4ff3242df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_hypatia, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, distribution-scope=public, release=1763362218, name=rhceph, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 08:31:23 np0005548789.localdomain systemd[1]: Started libpod-conmon-06eded59e56a478e3b341c8b5aa6fd712a3fe774dfa303980f3425e4ff3242df.scope.
Dec 06 08:31:23 np0005548789.localdomain podman[71054]: 2025-12-06 08:31:23.566000185 +0000 UTC m=+0.045103988 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:31:23 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:31:23 np0005548789.localdomain podman[71054]: 2025-12-06 08:31:23.688597658 +0000 UTC m=+0.167701421 container init 06eded59e56a478e3b341c8b5aa6fd712a3fe774dfa303980f3425e4ff3242df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_hypatia, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, release=1763362218, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:31:23 np0005548789.localdomain podman[71054]: 2025-12-06 08:31:23.702198483 +0000 UTC m=+0.181302246 container start 06eded59e56a478e3b341c8b5aa6fd712a3fe774dfa303980f3425e4ff3242df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_hypatia, distribution-scope=public, release=1763362218, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git)
Dec 06 08:31:23 np0005548789.localdomain podman[71054]: 2025-12-06 08:31:23.702479341 +0000 UTC m=+0.181583154 container attach 06eded59e56a478e3b341c8b5aa6fd712a3fe774dfa303980f3425e4ff3242df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_hypatia, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public)
Dec 06 08:31:23 np0005548789.localdomain relaxed_hypatia[71069]: 167 167
Dec 06 08:31:23 np0005548789.localdomain systemd[1]: libpod-06eded59e56a478e3b341c8b5aa6fd712a3fe774dfa303980f3425e4ff3242df.scope: Deactivated successfully.
Dec 06 08:31:23 np0005548789.localdomain podman[71054]: 2025-12-06 08:31:23.70505231 +0000 UTC m=+0.184156073 container died 06eded59e56a478e3b341c8b5aa6fd712a3fe774dfa303980f3425e4ff3242df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_hypatia, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vcs-type=git, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_CLEAN=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 08:31:23 np0005548789.localdomain podman[71074]: 2025-12-06 08:31:23.791823809 +0000 UTC m=+0.074744673 container remove 06eded59e56a478e3b341c8b5aa6fd712a3fe774dfa303980f3425e4ff3242df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_hypatia, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.buildah.version=1.41.4, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=)
Dec 06 08:31:23 np0005548789.localdomain systemd[1]: libpod-conmon-06eded59e56a478e3b341c8b5aa6fd712a3fe774dfa303980f3425e4ff3242df.scope: Deactivated successfully.
Dec 06 08:31:23 np0005548789.localdomain podman[71095]: 
Dec 06 08:31:23 np0005548789.localdomain podman[71095]: 2025-12-06 08:31:23.998570599 +0000 UTC m=+0.069436640 container create 2e4b9aa03f61b6547271efbb4f49ca3ed56f6495247ff4adbe6ad3f8b1fa5e58 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_carson, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.41.4)
Dec 06 08:31:24 np0005548789.localdomain systemd[1]: Started libpod-conmon-2e4b9aa03f61b6547271efbb4f49ca3ed56f6495247ff4adbe6ad3f8b1fa5e58.scope.
Dec 06 08:31:24 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:31:24 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c85c724648715eadbccbb0bc5f6d6728d35df2e52420deaaae74774b2b67e410/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 08:31:24 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c85c724648715eadbccbb0bc5f6d6728d35df2e52420deaaae74774b2b67e410/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:31:24 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c85c724648715eadbccbb0bc5f6d6728d35df2e52420deaaae74774b2b67e410/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 08:31:24 np0005548789.localdomain podman[71095]: 2025-12-06 08:31:24.054747364 +0000 UTC m=+0.125613385 container init 2e4b9aa03f61b6547271efbb4f49ca3ed56f6495247ff4adbe6ad3f8b1fa5e58 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_carson, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_CLEAN=True, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, build-date=2025-11-26T19:44:28Z, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True)
Dec 06 08:31:24 np0005548789.localdomain podman[71095]: 2025-12-06 08:31:24.064226704 +0000 UTC m=+0.135092725 container start 2e4b9aa03f61b6547271efbb4f49ca3ed56f6495247ff4adbe6ad3f8b1fa5e58 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_carson, RELEASE=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z)
Dec 06 08:31:24 np0005548789.localdomain podman[71095]: 2025-12-06 08:31:24.064395979 +0000 UTC m=+0.135262050 container attach 2e4b9aa03f61b6547271efbb4f49ca3ed56f6495247ff4adbe6ad3f8b1fa5e58 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_carson, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, ceph=True, release=1763362218, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:31:24 np0005548789.localdomain podman[71095]: 2025-12-06 08:31:23.970032348 +0000 UTC m=+0.040898419 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 08:31:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ae677c7704d873443cf126f0407d68f81442eed3a10be4e81453f59c6f47f9d4-merged.mount: Deactivated successfully.
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]: [
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:     {
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:         "available": false,
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:         "ceph_device": false,
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:         "lsm_data": {},
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:         "lvs": [],
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:         "path": "/dev/sr0",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:         "rejected_reasons": [
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "Insufficient space (<5GB)",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "Has a FileSystem"
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:         ],
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:         "sys_api": {
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "actuators": null,
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "device_nodes": "sr0",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "human_readable_size": "482.00 KB",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "id_bus": "ata",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "model": "QEMU DVD-ROM",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "nr_requests": "2",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "partitions": {},
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "path": "/dev/sr0",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "removable": "1",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "rev": "2.5+",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "ro": "0",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "rotational": "1",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "sas_address": "",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "sas_device_handle": "",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "scheduler_mode": "mq-deadline",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "sectors": 0,
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "sectorsize": "2048",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "size": 493568.0,
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "support_discard": "0",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "type": "disk",
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:             "vendor": "QEMU"
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:         }
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]:     }
Dec 06 08:31:24 np0005548789.localdomain determined_carson[71110]: ]
Dec 06 08:31:25 np0005548789.localdomain systemd[1]: libpod-2e4b9aa03f61b6547271efbb4f49ca3ed56f6495247ff4adbe6ad3f8b1fa5e58.scope: Deactivated successfully.
Dec 06 08:31:25 np0005548789.localdomain podman[73127]: 2025-12-06 08:31:25.083989593 +0000 UTC m=+0.048181441 container died 2e4b9aa03f61b6547271efbb4f49ca3ed56f6495247ff4adbe6ad3f8b1fa5e58 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_carson, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, ceph=True, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, architecture=x86_64, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 08:31:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c85c724648715eadbccbb0bc5f6d6728d35df2e52420deaaae74774b2b67e410-merged.mount: Deactivated successfully.
Dec 06 08:31:25 np0005548789.localdomain podman[73127]: 2025-12-06 08:31:25.123066106 +0000 UTC m=+0.087257934 container remove 2e4b9aa03f61b6547271efbb4f49ca3ed56f6495247ff4adbe6ad3f8b1fa5e58 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_carson, vcs-type=git, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, name=rhceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, release=1763362218, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 08:31:25 np0005548789.localdomain systemd[1]: libpod-conmon-2e4b9aa03f61b6547271efbb4f49ca3ed56f6495247ff4adbe6ad3f8b1fa5e58.scope: Deactivated successfully.
Dec 06 08:31:25 np0005548789.localdomain sudo[70998]: pam_unix(sudo:session): session closed for user root
Dec 06 08:31:25 np0005548789.localdomain sudo[73142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:31:25 np0005548789.localdomain sudo[73142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:31:25 np0005548789.localdomain sudo[73142]: pam_unix(sudo:session): session closed for user root
Dec 06 08:31:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:31:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:31:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:31:32 np0005548789.localdomain podman[73159]: 2025-12-06 08:31:32.923471891 +0000 UTC m=+0.082981734 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z)
Dec 06 08:31:32 np0005548789.localdomain podman[73159]: 2025-12-06 08:31:32.953076925 +0000 UTC m=+0.112586798 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com)
Dec 06 08:31:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:31:32 np0005548789.localdomain podman[73157]: 2025-12-06 08:31:32.972988193 +0000 UTC m=+0.133645001 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_id=tripleo_step4, release=1761123044, build-date=2025-11-18T22:49:32Z, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron)
Dec 06 08:31:32 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:31:32 np0005548789.localdomain podman[73157]: 2025-12-06 08:31:32.98011982 +0000 UTC m=+0.140776638 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4)
Dec 06 08:31:32 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:31:33 np0005548789.localdomain systemd[1]: tmp-crun.qmMigW.mount: Deactivated successfully.
Dec 06 08:31:33 np0005548789.localdomain podman[73208]: 2025-12-06 08:31:33.052299123 +0000 UTC m=+0.075145595 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public)
Dec 06 08:31:33 np0005548789.localdomain podman[73208]: 2025-12-06 08:31:33.079960028 +0000 UTC m=+0.102806470 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team)
Dec 06 08:31:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:31:33 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:31:33 np0005548789.localdomain podman[73158]: 2025-12-06 08:31:33.032375085 +0000 UTC m=+0.193284371 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com)
Dec 06 08:31:33 np0005548789.localdomain podman[73158]: 2025-12-06 08:31:33.16945153 +0000 UTC m=+0.330360846 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:31:33 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:31:33 np0005548789.localdomain podman[73245]: 2025-12-06 08:31:33.187003115 +0000 UTC m=+0.084202631 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:31:33 np0005548789.localdomain podman[73245]: 2025-12-06 08:31:33.19533537 +0000 UTC m=+0.092534856 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, build-date=2025-11-18T23:44:13Z, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Dec 06 08:31:33 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:31:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:31:33 np0005548789.localdomain podman[73268]: 2025-12-06 08:31:33.906891841 +0000 UTC m=+0.069733199 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1)
Dec 06 08:31:34 np0005548789.localdomain podman[73268]: 2025-12-06 08:31:34.286249602 +0000 UTC m=+0.449090960 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Dec 06 08:31:34 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:31:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:31:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:31:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:31:37 np0005548789.localdomain podman[73293]: 2025-12-06 08:31:37.934119586 +0000 UTC m=+0.090399770 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:31:37 np0005548789.localdomain podman[73292]: 2025-12-06 08:31:37.913048632 +0000 UTC m=+0.074459403 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:31:37 np0005548789.localdomain podman[73291]: 2025-12-06 08:31:37.97324212 +0000 UTC m=+0.137098406 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, container_name=ovn_controller, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 08:31:37 np0005548789.localdomain podman[73293]: 2025-12-06 08:31:37.990077574 +0000 UTC m=+0.146357788 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:31:37 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:31:38 np0005548789.localdomain podman[73291]: 2025-12-06 08:31:38.019391159 +0000 UTC m=+0.183247395 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:31:38 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:31:38 np0005548789.localdomain podman[73292]: 2025-12-06 08:31:38.095136821 +0000 UTC m=+0.256547612 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, container_name=metrics_qdr, distribution-scope=public, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Dec 06 08:31:38 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:32:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:32:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:32:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:32:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:32:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:32:03 np0005548789.localdomain systemd[1]: tmp-crun.JGxccm.mount: Deactivated successfully.
Dec 06 08:32:03 np0005548789.localdomain podman[73367]: 2025-12-06 08:32:03.897821284 +0000 UTC m=+0.061790927 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-cron, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Dec 06 08:32:03 np0005548789.localdomain podman[73373]: 2025-12-06 08:32:03.973294108 +0000 UTC m=+0.126317167 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:32:04 np0005548789.localdomain podman[73368]: 2025-12-06 08:32:04.008520914 +0000 UTC m=+0.170558408 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, release=1761123044, distribution-scope=public, name=rhosp17/openstack-collectd, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, tcib_managed=true, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:32:04 np0005548789.localdomain podman[73379]: 2025-12-06 08:32:03.960928741 +0000 UTC m=+0.116014723 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com)
Dec 06 08:32:04 np0005548789.localdomain podman[73368]: 2025-12-06 08:32:04.019995654 +0000 UTC m=+0.182033158 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-collectd-container, container_name=collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044)
Dec 06 08:32:04 np0005548789.localdomain podman[73369]: 2025-12-06 08:32:03.929325026 +0000 UTC m=+0.085578163 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20251118.1)
Dec 06 08:32:04 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:32:04 np0005548789.localdomain podman[73367]: 2025-12-06 08:32:04.032714642 +0000 UTC m=+0.196684285 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, 
name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:32:04 np0005548789.localdomain podman[73379]: 2025-12-06 08:32:04.040061076 +0000 UTC m=+0.195147028 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:32:04 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:32:04 np0005548789.localdomain podman[73369]: 2025-12-06 08:32:04.061081408 +0000 UTC m=+0.217334565 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, version=17.1.12, release=1761123044, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:32:04 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:32:04 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:32:04 np0005548789.localdomain podman[73373]: 2025-12-06 08:32:04.083546143 +0000 UTC m=+0.236569222 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T23:44:13Z, vcs-type=git, architecture=x86_64, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4)
Dec 06 08:32:04 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:32:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:32:04 np0005548789.localdomain podman[73476]: 2025-12-06 08:32:04.913501919 +0000 UTC m=+0.075041102 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, distribution-scope=public, tcib_managed=true)
Dec 06 08:32:05 np0005548789.localdomain podman[73476]: 2025-12-06 08:32:05.338192453 +0000 UTC m=+0.499731616 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Dec 06 08:32:05 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:32:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:32:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:32:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:32:08 np0005548789.localdomain systemd[1]: tmp-crun.xPJYBR.mount: Deactivated successfully.
Dec 06 08:32:08 np0005548789.localdomain podman[73501]: 2025-12-06 08:32:08.923550169 +0000 UTC m=+0.082422776 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vendor=Red Hat, 
Inc., com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_id=tripleo_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:32:08 np0005548789.localdomain podman[73500]: 2025-12-06 08:32:08.968710218 +0000 UTC m=+0.131402273 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_controller, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, 
name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:32:09 np0005548789.localdomain podman[73502]: 2025-12-06 08:32:09.018350963 +0000 UTC m=+0.174330612 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible)
Dec 06 08:32:09 np0005548789.localdomain podman[73500]: 2025-12-06 08:32:09.044413928 +0000 UTC m=+0.207105993 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:32:09 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:32:09 np0005548789.localdomain podman[73502]: 2025-12-06 08:32:09.075234979 +0000 UTC m=+0.231214628 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, 
batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.expose-services=)
Dec 06 08:32:09 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:32:09 np0005548789.localdomain podman[73501]: 2025-12-06 08:32:09.118987265 +0000 UTC m=+0.277859822 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:32:09 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:32:09 np0005548789.localdomain systemd[1]: tmp-crun.JuIaC0.mount: Deactivated successfully.
Dec 06 08:32:15 np0005548789.localdomain sshd[73576]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:32:20 np0005548789.localdomain sshd[73576]: Invalid user www from 45.140.17.124 port 55938
Dec 06 08:32:21 np0005548789.localdomain sshd[73576]: Connection reset by invalid user www 45.140.17.124 port 55938 [preauth]
Dec 06 08:32:21 np0005548789.localdomain sshd[73578]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:32:22 np0005548789.localdomain sshd[73578]: Invalid user oracle from 45.140.17.124 port 55974
Dec 06 08:32:23 np0005548789.localdomain sshd[73578]: Connection reset by invalid user oracle 45.140.17.124 port 55974 [preauth]
Dec 06 08:32:23 np0005548789.localdomain sshd[73580]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:32:25 np0005548789.localdomain sudo[73582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:32:25 np0005548789.localdomain sudo[73582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:32:25 np0005548789.localdomain sudo[73582]: pam_unix(sudo:session): session closed for user root
Dec 06 08:32:25 np0005548789.localdomain sudo[73597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:32:25 np0005548789.localdomain sudo[73597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:32:26 np0005548789.localdomain sshd[73580]: Connection reset by authenticating user root 45.140.17.124 port 41152 [preauth]
Dec 06 08:32:26 np0005548789.localdomain sshd[73694]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:32:26 np0005548789.localdomain podman[73683]: 2025-12-06 08:32:26.690669676 +0000 UTC m=+0.144561323 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, io.openshift.tags=rhceph ceph, release=1763362218, io.openshift.expose-services=, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_BRANCH=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., RELEASE=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 08:32:26 np0005548789.localdomain podman[73683]: 2025-12-06 08:32:26.793460104 +0000 UTC m=+0.247351771 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.expose-services=, name=rhceph, vcs-type=git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, release=1763362218)
Dec 06 08:32:27 np0005548789.localdomain sudo[73597]: pam_unix(sudo:session): session closed for user root
Dec 06 08:32:27 np0005548789.localdomain sudo[73749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:32:27 np0005548789.localdomain sudo[73749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:32:27 np0005548789.localdomain sudo[73749]: pam_unix(sudo:session): session closed for user root
Dec 06 08:32:27 np0005548789.localdomain sudo[73764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:32:27 np0005548789.localdomain sudo[73764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:32:27 np0005548789.localdomain sudo[73764]: pam_unix(sudo:session): session closed for user root
Dec 06 08:32:28 np0005548789.localdomain sshd[73694]: Connection reset by authenticating user root 45.140.17.124 port 41172 [preauth]
Dec 06 08:32:28 np0005548789.localdomain sudo[73810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:32:28 np0005548789.localdomain sudo[73810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:32:28 np0005548789.localdomain sudo[73810]: pam_unix(sudo:session): session closed for user root
Dec 06 08:32:28 np0005548789.localdomain sshd[73825]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:32:30 np0005548789.localdomain sshd[73825]: Connection reset by authenticating user root 45.140.17.124 port 41186 [preauth]
Dec 06 08:32:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:32:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:32:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:32:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:32:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:32:34 np0005548789.localdomain systemd[1]: tmp-crun.HcpkEd.mount: Deactivated successfully.
Dec 06 08:32:34 np0005548789.localdomain podman[73830]: 2025-12-06 08:32:34.929416852 +0000 UTC m=+0.087696738 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:32:34 np0005548789.localdomain podman[73828]: 2025-12-06 08:32:34.974960452 +0000 UTC m=+0.132577058 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.4, container_name=collectd, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container)
Dec 06 08:32:34 np0005548789.localdomain podman[73830]: 2025-12-06 08:32:34.994441076 +0000 UTC m=+0.152721012 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, maintainer=OpenStack TripleO Team)
Dec 06 08:32:35 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:32:35 np0005548789.localdomain podman[73828]: 2025-12-06 08:32:35.008516486 +0000 UTC m=+0.166133112 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, distribution-scope=public, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, 
konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:32:35 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:32:35 np0005548789.localdomain podman[73829]: 2025-12-06 08:32:35.069679733 +0000 UTC m=+0.227653189 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container)
Dec 06 08:32:35 np0005548789.localdomain podman[73829]: 2025-12-06 08:32:35.121281479 +0000 UTC m=+0.279254925 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12)
Dec 06 08:32:35 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:32:35 np0005548789.localdomain podman[73831]: 2025-12-06 08:32:35.135259756 +0000 UTC m=+0.288474937 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:32:35 np0005548789.localdomain podman[73831]: 2025-12-06 08:32:35.163975722 +0000 UTC m=+0.317190883 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:32:35 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:32:35 np0005548789.localdomain podman[73827]: 2025-12-06 08:32:35.235546726 +0000 UTC m=+0.392653936 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-type=git, container_name=logrotate_crond, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4)
Dec 06 08:32:35 np0005548789.localdomain podman[73827]: 2025-12-06 08:32:35.267914105 +0000 UTC m=+0.425021285 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, version=17.1.12, name=rhosp17/openstack-cron, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:32:35 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:32:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:32:35 np0005548789.localdomain podman[73937]: 2025-12-06 08:32:35.916879436 +0000 UTC m=+0.077854648 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-type=git, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container)
Dec 06 08:32:36 np0005548789.localdomain podman[73937]: 2025-12-06 08:32:36.286282542 +0000 UTC m=+0.447257804 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-nova-compute, release=1761123044, container_name=nova_migration_target, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z)
Dec 06 08:32:36 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:32:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:32:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:32:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:32:39 np0005548789.localdomain podman[73961]: 2025-12-06 08:32:39.893018151 +0000 UTC m=+0.058582900 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, container_name=ovn_controller)
Dec 06 08:32:39 np0005548789.localdomain podman[73961]: 2025-12-06 08:32:39.935090845 +0000 UTC m=+0.100655604 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Dec 06 08:32:39 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:32:39 np0005548789.localdomain podman[73968]: 2025-12-06 08:32:39.947202344 +0000 UTC m=+0.101595372 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 06 08:32:40 np0005548789.localdomain podman[73968]: 2025-12-06 08:32:40.008165785 +0000 UTC m=+0.162558823 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.)
Dec 06 08:32:40 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:32:40 np0005548789.localdomain podman[73962]: 2025-12-06 08:32:40.025417912 +0000 UTC m=+0.187442393 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step1, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd)
Dec 06 08:32:40 np0005548789.localdomain podman[73962]: 2025-12-06 08:32:40.219206747 +0000 UTC m=+0.381231268 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 06 08:32:40 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:32:40 np0005548789.localdomain systemd[1]: tmp-crun.uGMz2E.mount: Deactivated successfully.
Dec 06 08:33:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:33:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:33:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:33:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:33:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:33:05 np0005548789.localdomain systemd[1]: tmp-crun.IjPAEY.mount: Deactivated successfully.
Dec 06 08:33:05 np0005548789.localdomain podman[74037]: 2025-12-06 08:33:05.912729327 +0000 UTC m=+0.074438472 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp17/openstack-cron, version=17.1.12, container_name=logrotate_crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 06 08:33:05 np0005548789.localdomain podman[74038]: 2025-12-06 08:33:05.96750809 +0000 UTC m=+0.125557324 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:33:05 np0005548789.localdomain podman[74038]: 2025-12-06 08:33:05.973515883 +0000 UTC m=+0.131565107 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-collectd, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:33:05 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:33:05 np0005548789.localdomain podman[74037]: 2025-12-06 08:33:05.997238627 +0000 UTC m=+0.158947752 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, tcib_managed=true, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, url=https://www.redhat.com, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:33:06 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:33:06 np0005548789.localdomain podman[74040]: 2025-12-06 08:33:06.077442316 +0000 UTC m=+0.231392925 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:33:06 np0005548789.localdomain podman[74041]: 2025-12-06 08:33:05.949893461 +0000 UTC m=+0.108383118 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, architecture=x86_64)
Dec 06 08:33:06 np0005548789.localdomain podman[74040]: 2025-12-06 08:33:06.111050541 +0000 UTC m=+0.265001150 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:33:06 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:33:06 np0005548789.localdomain podman[74041]: 2025-12-06 08:33:06.13427791 +0000 UTC m=+0.292767587 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:33:06 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:33:06 np0005548789.localdomain podman[74039]: 2025-12-06 08:33:05.900238956 +0000 UTC m=+0.063083527 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute)
Dec 06 08:33:06 np0005548789.localdomain podman[74039]: 2025-12-06 08:33:06.182538693 +0000 UTC m=+0.345383314 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, version=17.1.12)
Dec 06 08:33:06 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:33:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:33:06 np0005548789.localdomain systemd[1]: tmp-crun.kLeSDz.mount: Deactivated successfully.
Dec 06 08:33:06 np0005548789.localdomain podman[74142]: 2025-12-06 08:33:06.953455227 +0000 UTC m=+0.110067351 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:33:07 np0005548789.localdomain podman[74142]: 2025-12-06 08:33:07.306808783 +0000 UTC m=+0.463420937 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step4)
Dec 06 08:33:07 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:33:07 np0005548789.localdomain sshd[74165]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:33:08 np0005548789.localdomain sshd[74165]: Invalid user ubuntu from 92.118.39.95 port 47810
Dec 06 08:33:08 np0005548789.localdomain sshd[74165]: Connection closed by invalid user ubuntu 92.118.39.95 port 47810 [preauth]
Dec 06 08:33:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:33:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:33:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:33:10 np0005548789.localdomain podman[74167]: 2025-12-06 08:33:10.93180075 +0000 UTC m=+0.094148476 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true)
Dec 06 08:33:10 np0005548789.localdomain podman[74167]: 2025-12-06 08:33:10.958113693 +0000 UTC m=+0.120461449 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:33:10 np0005548789.localdomain systemd[1]: tmp-crun.KZaxrX.mount: Deactivated successfully.
Dec 06 08:33:10 np0005548789.localdomain podman[74168]: 2025-12-06 08:33:10.979428234 +0000 UTC m=+0.138167450 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:33:10 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:33:11 np0005548789.localdomain podman[74169]: 2025-12-06 08:33:11.032516274 +0000 UTC m=+0.187823574 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64)
Dec 06 08:33:11 np0005548789.localdomain podman[74169]: 2025-12-06 08:33:11.07462773 +0000 UTC m=+0.229934990 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1)
Dec 06 08:33:11 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:33:11 np0005548789.localdomain podman[74168]: 2025-12-06 08:33:11.157626103 +0000 UTC m=+0.316365349 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:33:11 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:33:28 np0005548789.localdomain sudo[74243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:33:28 np0005548789.localdomain sudo[74243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:33:28 np0005548789.localdomain sudo[74243]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:28 np0005548789.localdomain sudo[74258]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:33:28 np0005548789.localdomain sudo[74258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:33:29 np0005548789.localdomain sudo[74258]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:30 np0005548789.localdomain sudo[74306]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:33:30 np0005548789.localdomain sudo[74306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:33:30 np0005548789.localdomain sudo[74306]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:33:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:33:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:33:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:33:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:33:36 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:33:36 np0005548789.localdomain recover_tripleo_nova_virtqemud[74349]: 61814
Dec 06 08:33:36 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:33:36 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:33:36 np0005548789.localdomain systemd[1]: tmp-crun.6LEVvV.mount: Deactivated successfully.
Dec 06 08:33:36 np0005548789.localdomain podman[74322]: 2025-12-06 08:33:36.950904537 +0000 UTC m=+0.105292405 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step3, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, 
description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible)
Dec 06 08:33:36 np0005548789.localdomain systemd[1]: tmp-crun.e1OaA0.mount: Deactivated successfully.
Dec 06 08:33:36 np0005548789.localdomain podman[74321]: 2025-12-06 08:33:36.992264029 +0000 UTC m=+0.146400409 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 
17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, container_name=logrotate_crond, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1)
Dec 06 08:33:37 np0005548789.localdomain podman[74321]: 2025-12-06 08:33:37.030120525 +0000 UTC m=+0.184256895 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, release=1761123044, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Dec 06 08:33:37 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:33:37 np0005548789.localdomain podman[74323]: 2025-12-06 08:33:37.045546916 +0000 UTC m=+0.197567132 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, 
managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:33:37 np0005548789.localdomain podman[74322]: 2025-12-06 08:33:37.085335211 +0000 UTC m=+0.239723049 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, name=rhosp17/openstack-collectd, architecture=x86_64, 
konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, distribution-scope=public)
Dec 06 08:33:37 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:33:37 np0005548789.localdomain podman[74323]: 2025-12-06 08:33:37.150448038 +0000 UTC m=+0.302468204 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:33:37 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:33:37 np0005548789.localdomain podman[74324]: 2025-12-06 08:33:37.130104358 +0000 UTC m=+0.278090051 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, batch=17.1_20251118.1, container_name=iscsid, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red 
Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container)
Dec 06 08:33:37 np0005548789.localdomain podman[74327]: 2025-12-06 08:33:37.151002495 +0000 UTC m=+0.288506108 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, 
name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:33:37 np0005548789.localdomain podman[74324]: 2025-12-06 08:33:37.211048269 +0000 UTC m=+0.359033952 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, distribution-scope=public, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, vcs-type=git, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:33:37 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:33:37 np0005548789.localdomain podman[74327]: 2025-12-06 08:33:37.23467838 +0000 UTC m=+0.372182033 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12)
Dec 06 08:33:37 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:33:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:33:37 np0005548789.localdomain podman[74438]: 2025-12-06 08:33:37.925429226 +0000 UTC m=+0.081098066 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, name=rhosp17/openstack-nova-compute, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible)
Dec 06 08:33:38 np0005548789.localdomain podman[74438]: 2025-12-06 08:33:38.331735879 +0000 UTC m=+0.487404699 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, 
vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:33:38 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:33:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:33:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:33:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:33:41 np0005548789.localdomain podman[74462]: 2025-12-06 08:33:41.930617878 +0000 UTC m=+0.089979938 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:33:41 np0005548789.localdomain systemd[1]: tmp-crun.80YkBg.mount: Deactivated successfully.
Dec 06 08:33:41 np0005548789.localdomain podman[74463]: 2025-12-06 08:33:41.97589092 +0000 UTC m=+0.134358323 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, release=1761123044, build-date=2025-11-19T00:14:25Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12)
Dec 06 08:33:42 np0005548789.localdomain podman[74461]: 2025-12-06 08:33:42.037369636 +0000 UTC m=+0.199632844 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, release=1761123044, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red 
Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, vcs-type=git)
Dec 06 08:33:42 np0005548789.localdomain podman[74461]: 2025-12-06 08:33:42.086393883 +0000 UTC m=+0.248657091 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team)
Dec 06 08:33:42 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:33:42 np0005548789.localdomain podman[74463]: 2025-12-06 08:33:42.140840645 +0000 UTC m=+0.299308038 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true)
Dec 06 08:33:42 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:33:42 np0005548789.localdomain podman[74462]: 2025-12-06 08:33:42.197176384 +0000 UTC m=+0.356538384 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1)
Dec 06 08:33:42 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:33:42 np0005548789.localdomain systemd[1]: tmp-crun.V4Edor.mount: Deactivated successfully.
Dec 06 08:33:47 np0005548789.localdomain sudo[74583]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xirgldphnsatzguzvmzwhcpdyfwtryiz ; /usr/bin/python3
Dec 06 08:33:47 np0005548789.localdomain sudo[74583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:33:48 np0005548789.localdomain python3[74585]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:33:48 np0005548789.localdomain sudo[74583]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:48 np0005548789.localdomain sudo[74628]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbodizmndlivloeenphavsbmkegtinrd ; /usr/bin/python3
Dec 06 08:33:48 np0005548789.localdomain sudo[74628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:33:48 np0005548789.localdomain python3[74630]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765010027.6962512-113371-280628189266036/source _original_basename=tmpc84d5vut follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:33:48 np0005548789.localdomain sudo[74628]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:49 np0005548789.localdomain sudo[74658]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eanwxavxgonxmakvvpelbstyammlkped ; /usr/bin/python3
Dec 06 08:33:49 np0005548789.localdomain sudo[74658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:33:49 np0005548789.localdomain python3[74660]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:33:49 np0005548789.localdomain sudo[74658]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:50 np0005548789.localdomain sudo[74708]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewcuomoawhhaeicrtwibhltumautisby ; /usr/bin/python3
Dec 06 08:33:50 np0005548789.localdomain sudo[74708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:33:50 np0005548789.localdomain sudo[74708]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:50 np0005548789.localdomain sudo[74726]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdnoxnjeutszziuefcvxcxxqhfurnnjp ; /usr/bin/python3
Dec 06 08:33:50 np0005548789.localdomain sudo[74726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:33:50 np0005548789.localdomain sudo[74726]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:51 np0005548789.localdomain sudo[74830]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyzkslhstvsbwimcaqqaqunpcsxswdwv ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765010030.7180123-113613-108250585370319/async_wrapper.py 923341096259 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765010030.7180123-113613-108250585370319/AnsiballZ_command.py _
Dec 06 08:33:51 np0005548789.localdomain sudo[74830]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 08:33:51 np0005548789.localdomain ansible-async_wrapper.py[74832]: Invoked with 923341096259 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765010030.7180123-113613-108250585370319/AnsiballZ_command.py _
Dec 06 08:33:51 np0005548789.localdomain ansible-async_wrapper.py[74835]: Starting module and watcher
Dec 06 08:33:51 np0005548789.localdomain ansible-async_wrapper.py[74835]: Start watching 74836 (3600)
Dec 06 08:33:51 np0005548789.localdomain ansible-async_wrapper.py[74836]: Start module (74836)
Dec 06 08:33:51 np0005548789.localdomain ansible-async_wrapper.py[74832]: Return async_wrapper task started.
Dec 06 08:33:51 np0005548789.localdomain sudo[74830]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:51 np0005548789.localdomain sudo[74851]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilyzujkypztqjzbxjtkuiqrgqolwpulg ; /usr/bin/python3
Dec 06 08:33:51 np0005548789.localdomain sudo[74851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:33:51 np0005548789.localdomain python3[74856]: ansible-ansible.legacy.async_status Invoked with jid=923341096259.74832 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:33:51 np0005548789.localdomain sudo[74851]: pam_unix(sudo:session): session closed for user root
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:    (file: /etc/puppet/hiera.yaml)
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Warning: Undefined variable '::deploy_config_name';
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:    (file & line not available)
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:    (file & line not available)
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Notice: Compiled catalog for np0005548789.localdomain in environment production in 0.22 seconds
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Notice: Applied catalog in 0.31 seconds
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Application:
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:    Initial environment: production
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:    Converged environment: production
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:          Run mode: user
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Changes:
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Events:
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Resources:
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:             Total: 19
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Time:
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:        Filebucket: 0.00
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:           Package: 0.00
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:          Schedule: 0.00
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:              Exec: 0.01
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:            Augeas: 0.01
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:              File: 0.02
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:           Service: 0.06
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:    Config retrieval: 0.29
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:    Transaction evaluation: 0.30
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:    Catalog application: 0.31
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:          Last run: 1765010035
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:             Total: 0.32
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]: Version:
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:            Config: 1765010035
Dec 06 08:33:55 np0005548789.localdomain puppet-user[74855]:            Puppet: 7.10.0
Dec 06 08:33:55 np0005548789.localdomain ansible-async_wrapper.py[74836]: Module complete (74836)
Dec 06 08:33:56 np0005548789.localdomain ansible-async_wrapper.py[74835]: Done in kid B.
Dec 06 08:34:01 np0005548789.localdomain sudo[74992]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcenaddlehzaphxcxeebudbyoceamdmx ; /usr/bin/python3
Dec 06 08:34:01 np0005548789.localdomain sudo[74992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:01 np0005548789.localdomain python3[74994]: ansible-ansible.legacy.async_status Invoked with jid=923341096259.74832 mode=status _async_dir=/tmp/.ansible_async
Dec 06 08:34:01 np0005548789.localdomain sudo[74992]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:02 np0005548789.localdomain sudo[75008]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jktcddgpeqbuojomytzxnnmoecwaprjk ; /usr/bin/python3
Dec 06 08:34:02 np0005548789.localdomain sudo[75008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:02 np0005548789.localdomain python3[75010]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:34:02 np0005548789.localdomain sudo[75008]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:02 np0005548789.localdomain sudo[75024]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpfshkblygkeipcqljpmcqzfvavpicoy ; /usr/bin/python3
Dec 06 08:34:02 np0005548789.localdomain sudo[75024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:02 np0005548789.localdomain python3[75026]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:34:02 np0005548789.localdomain sudo[75024]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:03 np0005548789.localdomain sudo[75074]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gsawtibrstsxsxsaywylohmcsdifotyo ; /usr/bin/python3
Dec 06 08:34:03 np0005548789.localdomain sudo[75074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:03 np0005548789.localdomain python3[75076]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:34:03 np0005548789.localdomain sudo[75074]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:03 np0005548789.localdomain sudo[75092]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-weppyxbgvhhoeasqoqjmclghaugderep ; /usr/bin/python3
Dec 06 08:34:03 np0005548789.localdomain sudo[75092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:03 np0005548789.localdomain python3[75094]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpke4vg_sj recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 08:34:03 np0005548789.localdomain sudo[75092]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:03 np0005548789.localdomain sudo[75122]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivyjwkrvsikhbcdpnlgcdzahfvtmvygl ; /usr/bin/python3
Dec 06 08:34:03 np0005548789.localdomain sudo[75122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:04 np0005548789.localdomain python3[75124]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:04 np0005548789.localdomain sudo[75122]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:04 np0005548789.localdomain sudo[75138]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdzovsylgetfjmpnwjyydcwmzrixhzlo ; /usr/bin/python3
Dec 06 08:34:04 np0005548789.localdomain sudo[75138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:04 np0005548789.localdomain sudo[75138]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:05 np0005548789.localdomain sudo[75227]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icxipoklwbdmvrvhdzitotspfqjenzcd ; /usr/bin/python3
Dec 06 08:34:05 np0005548789.localdomain sudo[75227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:05 np0005548789.localdomain python3[75229]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 06 08:34:05 np0005548789.localdomain sudo[75227]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:05 np0005548789.localdomain sudo[75246]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnccevoelmacfrquigeccgdobutbzowu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:05 np0005548789.localdomain sudo[75246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:05 np0005548789.localdomain python3[75248]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:05 np0005548789.localdomain sudo[75246]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:06 np0005548789.localdomain sudo[75262]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ganvlikktwrwkfpgydaxbdrjglosxhgg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:06 np0005548789.localdomain sudo[75262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:06 np0005548789.localdomain sudo[75262]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:06 np0005548789.localdomain sudo[75278]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drompjfvhniruadxpwqxsrvkxmgrzzlb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:06 np0005548789.localdomain sudo[75278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:06 np0005548789.localdomain python3[75280]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:34:06 np0005548789.localdomain sudo[75278]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:07 np0005548789.localdomain sudo[75328]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzajnhxepwcbxflraiicfjcdekztinev ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:07 np0005548789.localdomain sudo[75328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:34:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:34:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:34:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:34:07 np0005548789.localdomain systemd[1]: tmp-crun.PfyWMK.mount: Deactivated successfully.
Dec 06 08:34:07 np0005548789.localdomain podman[75332]: 2025-12-06 08:34:07.366638338 +0000 UTC m=+0.109906116 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.expose-services=, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-collectd)
Dec 06 08:34:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:34:07 np0005548789.localdomain python3[75330]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:34:07 np0005548789.localdomain podman[75331]: 2025-12-06 08:34:07.393009382 +0000 UTC m=+0.134672042 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4)
Dec 06 08:34:07 np0005548789.localdomain sudo[75328]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:07 np0005548789.localdomain podman[75332]: 2025-12-06 08:34:07.48005915 +0000 UTC m=+0.223326918 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Dec 06 08:34:07 np0005548789.localdomain sudo[75431]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idztotsuwvxuuzgcngeuyvufvkchdwnt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:07 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:34:07 np0005548789.localdomain sudo[75431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:07 np0005548789.localdomain podman[75333]: 2025-12-06 08:34:07.454630603 +0000 UTC m=+0.193668332 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Dec 06 08:34:07 np0005548789.localdomain podman[75334]: 2025-12-06 08:34:07.470943561 +0000 UTC m=+0.205026919 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:34:07 np0005548789.localdomain podman[75385]: 2025-12-06 08:34:07.554932695 +0000 UTC m=+0.170592918 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, 
managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:34:07 np0005548789.localdomain podman[75334]: 2025-12-06 08:34:07.557834664 +0000 UTC m=+0.291918052 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 
17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-iscsid)
Dec 06 08:34:07 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:34:07 np0005548789.localdomain podman[75331]: 2025-12-06 08:34:07.583315252 +0000 UTC m=+0.324977912 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, container_name=logrotate_crond, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12)
Dec 06 08:34:07 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:34:07 np0005548789.localdomain podman[75385]: 2025-12-06 08:34:07.604579301 +0000 UTC m=+0.220239544 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 06 08:34:07 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:34:07 np0005548789.localdomain python3[75439]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:07 np0005548789.localdomain podman[75333]: 2025-12-06 08:34:07.635279008 +0000 UTC m=+0.374316737 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, config_id=tripleo_step4, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vendor=Red Hat, Inc.)
Dec 06 08:34:07 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:34:07 np0005548789.localdomain sudo[75431]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:07 np0005548789.localdomain sudo[75519]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bebijreifhpkhjnmjgtedbehdhulgzll ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:07 np0005548789.localdomain sudo[75519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:08 np0005548789.localdomain python3[75521]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:34:08 np0005548789.localdomain sudo[75519]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:08 np0005548789.localdomain sudo[75537]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtsmsahhxooaemudebjhnmlvnpndgojw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:08 np0005548789.localdomain sudo[75537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:34:08 np0005548789.localdomain python3[75539]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:08 np0005548789.localdomain sudo[75537]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:08 np0005548789.localdomain podman[75540]: 2025-12-06 08:34:08.46183677 +0000 UTC m=+0.071422131 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target)
Dec 06 08:34:08 np0005548789.localdomain sudo[75622]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvxamvrqcspgsqhcapuzmccmaywtmowq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:08 np0005548789.localdomain sudo[75622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:08 np0005548789.localdomain podman[75540]: 2025-12-06 08:34:08.876357753 +0000 UTC m=+0.485943174 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com)
Dec 06 08:34:08 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:34:09 np0005548789.localdomain python3[75624]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:34:09 np0005548789.localdomain sudo[75622]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:09 np0005548789.localdomain sudo[75640]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htvfdwikcukxkjfkxqmmngqvvsvzhxpt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:09 np0005548789.localdomain sudo[75640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:09 np0005548789.localdomain python3[75642]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:09 np0005548789.localdomain sudo[75640]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:09 np0005548789.localdomain sudo[75702]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuxhnavvoeqjfinfgbvrgfsypcoehrll ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:09 np0005548789.localdomain sudo[75702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:09 np0005548789.localdomain python3[75704]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:34:09 np0005548789.localdomain sudo[75702]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:09 np0005548789.localdomain sudo[75720]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbqemyslzkutfsjmedybpexqiurtlnzt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:09 np0005548789.localdomain sudo[75720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:10 np0005548789.localdomain python3[75722]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:10 np0005548789.localdomain sudo[75720]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:10 np0005548789.localdomain sudo[75750]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqrgcopsztepnfauqpmjshsomwyjtgjd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:10 np0005548789.localdomain sudo[75750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:10 np0005548789.localdomain python3[75752]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:34:10 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:34:10 np0005548789.localdomain systemd-rc-local-generator[75775]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:34:10 np0005548789.localdomain systemd-sysv-generator[75779]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:34:10 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:34:11 np0005548789.localdomain sudo[75750]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:11 np0005548789.localdomain sudo[75836]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwmsusszhkqbntbdmtsiveqwvhgpwrlb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:11 np0005548789.localdomain sudo[75836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:11 np0005548789.localdomain python3[75838]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:34:11 np0005548789.localdomain sudo[75836]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:11 np0005548789.localdomain sudo[75854]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kuuprkhzbtnorklohhlsoewkjblfvmpp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:11 np0005548789.localdomain sudo[75854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:11 np0005548789.localdomain python3[75856]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:11 np0005548789.localdomain sudo[75854]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:12 np0005548789.localdomain sudo[75916]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzszrkqctdizuxhgnsdymvlyhbonqxqh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:12 np0005548789.localdomain sudo[75916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:34:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:34:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:34:12 np0005548789.localdomain podman[75920]: 2025-12-06 08:34:12.296623841 +0000 UTC m=+0.092301289 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, io.openshift.expose-services=, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible)
Dec 06 08:34:12 np0005548789.localdomain python3[75918]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 08:34:12 np0005548789.localdomain podman[75919]: 2025-12-06 08:34:12.344809621 +0000 UTC m=+0.139930422 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, distribution-scope=public)
Dec 06 08:34:12 np0005548789.localdomain sudo[75916]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:12 np0005548789.localdomain podman[75919]: 2025-12-06 08:34:12.379354266 +0000 UTC m=+0.174475047 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:34:12 np0005548789.localdomain systemd[1]: tmp-crun.FmSKgv.mount: Deactivated successfully.
Dec 06 08:34:12 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:34:12 np0005548789.localdomain podman[75950]: 2025-12-06 08:34:12.40205761 +0000 UTC m=+0.099324454 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.openshift.expose-services=, container_name=metrics_qdr, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044)
Dec 06 08:34:12 np0005548789.localdomain podman[75920]: 2025-12-06 08:34:12.425499895 +0000 UTC m=+0.221177403 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:34:12 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:34:12 np0005548789.localdomain sudo[76010]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfqfgxjdqfpwimtleyhqlapdhryrxaux ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:12 np0005548789.localdomain sudo[76010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:12 np0005548789.localdomain python3[76012]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:34:12 np0005548789.localdomain podman[75950]: 2025-12-06 08:34:12.597101043 +0000 UTC m=+0.294367927 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:34:12 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:34:12 np0005548789.localdomain sudo[76010]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:12 np0005548789.localdomain sudo[76040]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzbclgoqybvkmwwxkzivcyttlrdolksp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:12 np0005548789.localdomain sudo[76040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:13 np0005548789.localdomain python3[76042]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:34:13 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:34:13 np0005548789.localdomain systemd-rc-local-generator[76064]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:34:13 np0005548789.localdomain systemd-sysv-generator[76070]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:34:13 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:34:13 np0005548789.localdomain systemd[1]: Starting Create netns directory...
Dec 06 08:34:13 np0005548789.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 08:34:13 np0005548789.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 08:34:13 np0005548789.localdomain systemd[1]: Finished Create netns directory.
Dec 06 08:34:13 np0005548789.localdomain sudo[76040]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:13 np0005548789.localdomain sudo[76097]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvtrnownacugvhoctuqinshlcougpjjz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:13 np0005548789.localdomain sudo[76097]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:14 np0005548789.localdomain python3[76099]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 06 08:34:14 np0005548789.localdomain sudo[76097]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:14 np0005548789.localdomain sudo[76113]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjmqhfeuuwaviwgfpukfmbefcvhjnpsg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:14 np0005548789.localdomain sudo[76113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:15 np0005548789.localdomain sudo[76113]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:15 np0005548789.localdomain sudo[76155]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-renbdcbpeznwyguyldwcmmhuowuotiji ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:34:15 np0005548789.localdomain sudo[76155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:34:16 np0005548789.localdomain python3[76157]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 06 08:34:16 np0005548789.localdomain podman[76196]: 2025-12-06 08:34:16.479312631 +0000 UTC m=+0.087398358 container create 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=nova_compute, name=rhosp17/openstack-nova-compute, version=17.1.12, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:34:16 np0005548789.localdomain systemd[1]: Started libpod-conmon-41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.scope.
Dec 06 08:34:16 np0005548789.localdomain podman[76196]: 2025-12-06 08:34:16.431069529 +0000 UTC m=+0.039155286 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:34:16 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:34:16 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fa26257b93de11d5a1a515bc1294a83a2d0558581107701a6e94f33d44fcb5e/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 08:34:16 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fa26257b93de11d5a1a515bc1294a83a2d0558581107701a6e94f33d44fcb5e/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 08:34:16 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fa26257b93de11d5a1a515bc1294a83a2d0558581107701a6e94f33d44fcb5e/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:34:16 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fa26257b93de11d5a1a515bc1294a83a2d0558581107701a6e94f33d44fcb5e/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:34:16 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fa26257b93de11d5a1a515bc1294a83a2d0558581107701a6e94f33d44fcb5e/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 08:34:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:34:16 np0005548789.localdomain podman[76196]: 2025-12-06 08:34:16.595422205 +0000 UTC m=+0.203507942 container init 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, container_name=nova_compute)
Dec 06 08:34:16 np0005548789.localdomain sudo[76216]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:34:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:34:16 np0005548789.localdomain systemd-logind[766]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:34:16 np0005548789.localdomain podman[76196]: 2025-12-06 08:34:16.643647208 +0000 UTC m=+0.251732935 container start 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:34:16 np0005548789.localdomain systemd[1]: Created slice User Slice of UID 0.
Dec 06 08:34:16 np0005548789.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 06 08:34:16 np0005548789.localdomain python3[76157]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:34:16 np0005548789.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 06 08:34:16 np0005548789.localdomain systemd[1]: Starting User Manager for UID 0...
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:34:16 np0005548789.localdomain podman[76217]: 2025-12-06 08:34:16.753411659 +0000 UTC m=+0.104791241 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container)
Dec 06 08:34:16 np0005548789.localdomain podman[76217]: 2025-12-06 08:34:16.810074398 +0000 UTC m=+0.161453920 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1)
Dec 06 08:34:16 np0005548789.localdomain podman[76217]: unhealthy
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: Queued start job for default target Main User Target.
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: Created slice User Application Slice.
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: Reached target Paths.
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: Reached target Timers.
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: Starting D-Bus User Message Bus Socket...
Dec 06 08:34:16 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:34:16 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed with result 'exit-code'.
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: Starting Create User's Volatile Files and Directories...
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: Listening on D-Bus User Message Bus Socket.
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: Reached target Sockets.
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: Finished Create User's Volatile Files and Directories.
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: Reached target Basic System.
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: Reached target Main User Target.
Dec 06 08:34:16 np0005548789.localdomain systemd[76225]: Startup finished in 138ms.
Dec 06 08:34:16 np0005548789.localdomain systemd[1]: Started User Manager for UID 0.
Dec 06 08:34:16 np0005548789.localdomain systemd[1]: Started Session c10 of User root.
Dec 06 08:34:16 np0005548789.localdomain sudo[76216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Dec 06 08:34:16 np0005548789.localdomain sudo[76216]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:16 np0005548789.localdomain systemd[1]: session-c10.scope: Deactivated successfully.
Dec 06 08:34:17 np0005548789.localdomain podman[76317]: 2025-12-06 08:34:17.201351862 +0000 UTC m=+0.105407169 container create b526ff5c18e8d1e347c23ef14b01140c0e908977002aa81cbae031036fc67931 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, container_name=nova_wait_for_compute_service, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Dec 06 08:34:17 np0005548789.localdomain systemd[1]: Started libpod-conmon-b526ff5c18e8d1e347c23ef14b01140c0e908977002aa81cbae031036fc67931.scope.
Dec 06 08:34:17 np0005548789.localdomain podman[76317]: 2025-12-06 08:34:17.150561262 +0000 UTC m=+0.054616579 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:34:17 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:34:17 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55528a480405885214081d53ccb18f2317128ddef63aa2fac6c624569bda37fd/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Dec 06 08:34:17 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55528a480405885214081d53ccb18f2317128ddef63aa2fac6c624569bda37fd/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 08:34:17 np0005548789.localdomain podman[76317]: 2025-12-06 08:34:17.278101505 +0000 UTC m=+0.182156782 container init b526ff5c18e8d1e347c23ef14b01140c0e908977002aa81cbae031036fc67931 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, release=1761123044, io.buildah.version=1.41.4, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, distribution-scope=public, config_id=tripleo_step5, batch=17.1_20251118.1, container_name=nova_wait_for_compute_service, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:34:17 np0005548789.localdomain podman[76317]: 2025-12-06 08:34:17.288161822 +0000 UTC m=+0.192217069 container start b526ff5c18e8d1e347c23ef14b01140c0e908977002aa81cbae031036fc67931 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.openshift.expose-services=, release=1761123044, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, container_name=nova_wait_for_compute_service, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Dec 06 08:34:17 np0005548789.localdomain podman[76317]: 2025-12-06 08:34:17.288464692 +0000 UTC m=+0.192519949 container attach b526ff5c18e8d1e347c23ef14b01140c0e908977002aa81cbae031036fc67931 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, container_name=nova_wait_for_compute_service, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, version=17.1.12, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public)
Dec 06 08:34:17 np0005548789.localdomain sudo[76336]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 08:34:17 np0005548789.localdomain sudo[76336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Dec 06 08:34:17 np0005548789.localdomain sudo[76336]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:27 np0005548789.localdomain systemd[1]: Stopping User Manager for UID 0...
Dec 06 08:34:27 np0005548789.localdomain systemd[76225]: Activating special unit Exit the Session...
Dec 06 08:34:27 np0005548789.localdomain systemd[76225]: Stopped target Main User Target.
Dec 06 08:34:27 np0005548789.localdomain systemd[76225]: Stopped target Basic System.
Dec 06 08:34:27 np0005548789.localdomain systemd[76225]: Stopped target Paths.
Dec 06 08:34:27 np0005548789.localdomain systemd[76225]: Stopped target Sockets.
Dec 06 08:34:27 np0005548789.localdomain systemd[76225]: Stopped target Timers.
Dec 06 08:34:27 np0005548789.localdomain systemd[76225]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 08:34:27 np0005548789.localdomain systemd[76225]: Closed D-Bus User Message Bus Socket.
Dec 06 08:34:27 np0005548789.localdomain systemd[76225]: Stopped Create User's Volatile Files and Directories.
Dec 06 08:34:27 np0005548789.localdomain systemd[76225]: Removed slice User Application Slice.
Dec 06 08:34:27 np0005548789.localdomain systemd[76225]: Reached target Shutdown.
Dec 06 08:34:27 np0005548789.localdomain systemd[76225]: Finished Exit the Session.
Dec 06 08:34:27 np0005548789.localdomain systemd[76225]: Reached target Exit the Session.
Dec 06 08:34:27 np0005548789.localdomain systemd[1]: user@0.service: Deactivated successfully.
Dec 06 08:34:27 np0005548789.localdomain systemd[1]: Stopped User Manager for UID 0.
Dec 06 08:34:27 np0005548789.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 06 08:34:27 np0005548789.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 06 08:34:27 np0005548789.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 06 08:34:27 np0005548789.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 06 08:34:27 np0005548789.localdomain systemd[1]: Removed slice User Slice of UID 0.
Dec 06 08:34:30 np0005548789.localdomain sudo[76341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:34:30 np0005548789.localdomain sudo[76341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:34:30 np0005548789.localdomain sudo[76341]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:30 np0005548789.localdomain sudo[76356]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:34:30 np0005548789.localdomain sudo[76356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:34:31 np0005548789.localdomain sudo[76356]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:31 np0005548789.localdomain sudo[76403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:34:31 np0005548789.localdomain sudo[76403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:34:31 np0005548789.localdomain sudo[76403]: pam_unix(sudo:session): session closed for user root
Dec 06 08:34:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:34:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:34:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:34:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:34:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:34:37 np0005548789.localdomain podman[76420]: 2025-12-06 08:34:37.956700492 +0000 UTC m=+0.106077329 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute)
Dec 06 08:34:37 np0005548789.localdomain systemd[1]: tmp-crun.68PmCD.mount: Deactivated successfully.
Dec 06 08:34:38 np0005548789.localdomain podman[76421]: 2025-12-06 08:34:38.003337715 +0000 UTC m=+0.153696833 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4)
Dec 06 08:34:38 np0005548789.localdomain podman[76420]: 2025-12-06 08:34:38.01231427 +0000 UTC m=+0.161691167 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044)
Dec 06 08:34:38 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:34:38 np0005548789.localdomain podman[76421]: 2025-12-06 08:34:38.047178094 +0000 UTC m=+0.197537202 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, release=1761123044)
Dec 06 08:34:38 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:34:38 np0005548789.localdomain podman[76429]: 2025-12-06 08:34:38.100861762 +0000 UTC m=+0.242336888 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:34:38 np0005548789.localdomain podman[76419]: 2025-12-06 08:34:38.152906541 +0000 UTC m=+0.305578139 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step3, container_name=collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com)
Dec 06 08:34:38 np0005548789.localdomain podman[76419]: 2025-12-06 08:34:38.168060564 +0000 UTC m=+0.320732132 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20251118.1, container_name=collectd, managed_by=tripleo_ansible)
Dec 06 08:34:38 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:34:38 np0005548789.localdomain podman[76429]: 2025-12-06 08:34:38.184126904 +0000 UTC m=+0.325602090 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, distribution-scope=public, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi)
Dec 06 08:34:38 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:34:38 np0005548789.localdomain podman[76418]: 2025-12-06 08:34:38.25738294 +0000 UTC m=+0.412046969 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:34:38 np0005548789.localdomain podman[76418]: 2025-12-06 08:34:38.294182704 +0000 UTC m=+0.448846803 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:34:38 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:34:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:34:39 np0005548789.localdomain podman[76530]: 2025-12-06 08:34:39.05604498 +0000 UTC m=+0.083102758 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=nova_migration_target, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vcs-type=git, version=17.1.12, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4)
Dec 06 08:34:39 np0005548789.localdomain podman[76530]: 2025-12-06 08:34:39.432152331 +0000 UTC m=+0.459210089 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:34:39 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:34:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:34:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:34:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:34:42 np0005548789.localdomain systemd[1]: tmp-crun.FPuwRg.mount: Deactivated successfully.
Dec 06 08:34:42 np0005548789.localdomain podman[76555]: 2025-12-06 08:34:42.928587013 +0000 UTC m=+0.090744820 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:34:42 np0005548789.localdomain podman[76554]: 2025-12-06 08:34:42.970832034 +0000 UTC m=+0.132880608 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, tcib_managed=true)
Dec 06 08:34:42 np0005548789.localdomain podman[76555]: 2025-12-06 08:34:42.975164536 +0000 UTC m=+0.137322333 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, release=1761123044, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git)
Dec 06 08:34:42 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:34:43 np0005548789.localdomain podman[76553]: 2025-12-06 08:34:43.059470929 +0000 UTC m=+0.224228435 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z)
Dec 06 08:34:43 np0005548789.localdomain podman[76553]: 2025-12-06 08:34:43.111120916 +0000 UTC m=+0.275878412 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, version=17.1.12, io.openshift.expose-services=, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, release=1761123044, vcs-type=git, architecture=x86_64)
Dec 06 08:34:43 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:34:43 np0005548789.localdomain podman[76554]: 2025-12-06 08:34:43.212318045 +0000 UTC m=+0.374366569 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.buildah.version=1.41.4, config_id=tripleo_step1, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:34:43 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:34:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:34:47 np0005548789.localdomain podman[76630]: 2025-12-06 08:34:47.890480121 +0000 UTC m=+0.050229904 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, vcs-type=git, build-date=2025-11-19T00:36:58Z, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:34:47 np0005548789.localdomain podman[76630]: 2025-12-06 08:34:47.923796938 +0000 UTC m=+0.083546741 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z)
Dec 06 08:34:47 np0005548789.localdomain podman[76630]: unhealthy
Dec 06 08:34:47 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:34:47 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed with result 'exit-code'.
Dec 06 08:35:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:35:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:35:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:35:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:35:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:35:08 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:35:08 np0005548789.localdomain recover_tripleo_nova_virtqemud[76683]: 61814
Dec 06 08:35:08 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:35:08 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:35:08 np0005548789.localdomain podman[76663]: 2025-12-06 08:35:08.955479871 +0000 UTC m=+0.096122645 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, tcib_managed=true, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:35:08 np0005548789.localdomain systemd[1]: tmp-crun.RFfAvf.mount: Deactivated successfully.
Dec 06 08:35:09 np0005548789.localdomain podman[76651]: 2025-12-06 08:35:09.00458582 +0000 UTC m=+0.160368686 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-cron-container, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:35:09 np0005548789.localdomain podman[76663]: 2025-12-06 08:35:09.010297555 +0000 UTC m=+0.150940389 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12)
Dec 06 08:35:09 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:35:09 np0005548789.localdomain podman[76662]: 2025-12-06 08:35:09.05372639 +0000 UTC m=+0.198618784 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Dec 06 08:35:09 np0005548789.localdomain podman[76662]: 2025-12-06 08:35:09.063653313 +0000 UTC m=+0.208545697 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, container_name=iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step3, distribution-scope=public)
Dec 06 08:35:09 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:35:09 np0005548789.localdomain podman[76653]: 2025-12-06 08:35:09.109308097 +0000 UTC m=+0.257058528 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, distribution-scope=public, managed_by=tripleo_ansible)
Dec 06 08:35:09 np0005548789.localdomain podman[76651]: 2025-12-06 08:35:09.114875647 +0000 UTC m=+0.270658513 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:35:09 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:35:09 np0005548789.localdomain podman[76652]: 2025-12-06 08:35:09.156735715 +0000 UTC m=+0.302574107 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-collectd-container, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=)
Dec 06 08:35:09 np0005548789.localdomain podman[76653]: 2025-12-06 08:35:09.163189011 +0000 UTC m=+0.310939512 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, container_name=ceilometer_agent_compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com)
Dec 06 08:35:09 np0005548789.localdomain podman[76652]: 2025-12-06 08:35:09.169163394 +0000 UTC m=+0.315001786 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, version=17.1.12, architecture=x86_64, 
konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1)
Dec 06 08:35:09 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:35:09 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:35:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:35:09 np0005548789.localdomain podman[76760]: 2025-12-06 08:35:09.916647391 +0000 UTC m=+0.079596390 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Dec 06 08:35:10 np0005548789.localdomain podman[76760]: 2025-12-06 08:35:10.285207402 +0000 UTC m=+0.448156441 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute)
Dec 06 08:35:10 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:35:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:35:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:35:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:35:13 np0005548789.localdomain podman[76785]: 2025-12-06 08:35:13.927833574 +0000 UTC m=+0.081526948 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, tcib_managed=true, container_name=ovn_metadata_agent, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Dec 06 08:35:13 np0005548789.localdomain podman[76785]: 2025-12-06 08:35:13.984096707 +0000 UTC m=+0.137790011 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12)
Dec 06 08:35:13 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:35:14 np0005548789.localdomain systemd[1]: tmp-crun.bcrFLn.mount: Deactivated successfully.
Dec 06 08:35:14 np0005548789.localdomain podman[76783]: 2025-12-06 08:35:14.033129589 +0000 UTC m=+0.190655910 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Dec 06 08:35:14 np0005548789.localdomain podman[76784]: 2025-12-06 08:35:13.986354866 +0000 UTC m=+0.139452752 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-qdrouterd, 
config_id=tripleo_step1, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z)
Dec 06 08:35:14 np0005548789.localdomain podman[76783]: 2025-12-06 08:35:14.092682553 +0000 UTC m=+0.250208874 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
architecture=x86_64, vcs-type=git, version=17.1.12, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:35:14 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:35:14 np0005548789.localdomain podman[76784]: 2025-12-06 08:35:14.202840236 +0000 UTC m=+0.355938122 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, release=1761123044, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible)
Dec 06 08:35:14 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:35:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:35:18 np0005548789.localdomain podman[76860]: 2025-12-06 08:35:18.920576636 +0000 UTC m=+0.081622211 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:35:18 np0005548789.localdomain podman[76860]: 2025-12-06 08:35:18.97816856 +0000 UTC m=+0.139214105 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Dec 06 08:35:18 np0005548789.localdomain podman[76860]: unhealthy
Dec 06 08:35:18 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:35:18 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed with result 'exit-code'.
Dec 06 08:35:19 np0005548789.localdomain sshd[35595]: Received disconnect from 192.168.122.100 port 36432:11: disconnected by user
Dec 06 08:35:19 np0005548789.localdomain sshd[35595]: Disconnected from user zuul 192.168.122.100 port 36432
Dec 06 08:35:19 np0005548789.localdomain sshd[35592]: pam_unix(sshd:session): session closed for user zuul
Dec 06 08:35:19 np0005548789.localdomain systemd[1]: session-27.scope: Deactivated successfully.
Dec 06 08:35:19 np0005548789.localdomain systemd[1]: session-27.scope: Consumed 3.071s CPU time.
Dec 06 08:35:19 np0005548789.localdomain systemd-logind[766]: Session 27 logged out. Waiting for processes to exit.
Dec 06 08:35:19 np0005548789.localdomain systemd-logind[766]: Removed session 27.
Dec 06 08:35:21 np0005548789.localdomain sshd[76883]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:35:21 np0005548789.localdomain sshd[76883]: Invalid user solv from 92.118.39.95 port 34542
Dec 06 08:35:22 np0005548789.localdomain sshd[76883]: Connection closed by invalid user solv 92.118.39.95 port 34542 [preauth]
Dec 06 08:35:31 np0005548789.localdomain sudo[76885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:35:31 np0005548789.localdomain sudo[76885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:35:31 np0005548789.localdomain sudo[76885]: pam_unix(sudo:session): session closed for user root
Dec 06 08:35:32 np0005548789.localdomain sudo[76900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:35:32 np0005548789.localdomain sudo[76900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:35:32 np0005548789.localdomain sudo[76900]: pam_unix(sudo:session): session closed for user root
Dec 06 08:35:33 np0005548789.localdomain sudo[76946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:35:33 np0005548789.localdomain sudo[76946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:35:33 np0005548789.localdomain sudo[76946]: pam_unix(sudo:session): session closed for user root
Dec 06 08:35:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:35:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:35:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:35:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:35:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:35:39 np0005548789.localdomain systemd[1]: tmp-crun.OJKSYn.mount: Deactivated successfully.
Dec 06 08:35:39 np0005548789.localdomain podman[76961]: 2025-12-06 08:35:39.928200603 +0000 UTC m=+0.086828920 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., 
version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Dec 06 08:35:39 np0005548789.localdomain podman[76963]: 2025-12-06 08:35:39.937299812 +0000 UTC m=+0.086674945 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute)
Dec 06 08:35:39 np0005548789.localdomain systemd[1]: tmp-crun.CBAl2S.mount: Deactivated successfully.
Dec 06 08:35:39 np0005548789.localdomain podman[76961]: 2025-12-06 08:35:39.9431228 +0000 UTC m=+0.101751127 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 
17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-cron)
Dec 06 08:35:39 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:35:40 np0005548789.localdomain podman[76962]: 2025-12-06 08:35:40.002668144 +0000 UTC m=+0.157873856 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:35:40 np0005548789.localdomain podman[76962]: 2025-12-06 08:35:40.013394162 +0000 UTC m=+0.168599904 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, architecture=x86_64, container_name=collectd, version=17.1.12, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:35:40 np0005548789.localdomain podman[76963]: 2025-12-06 08:35:40.013806425 +0000 UTC m=+0.163181608 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true)
Dec 06 08:35:40 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:35:40 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:35:40 np0005548789.localdomain podman[76964]: 2025-12-06 08:35:40.067390596 +0000 UTC m=+0.219136122 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 
iscsid, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, version=17.1.12, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, release=1761123044)
Dec 06 08:35:40 np0005548789.localdomain podman[76970]: 2025-12-06 08:35:40.141862327 +0000 UTC m=+0.287874527 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:35:40 np0005548789.localdomain podman[76964]: 2025-12-06 08:35:40.202142302 +0000 UTC m=+0.353887848 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, container_name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:35:40 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:35:40 np0005548789.localdomain podman[76970]: 2025-12-06 08:35:40.223206918 +0000 UTC m=+0.369219118 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, version=17.1.12, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com)
Dec 06 08:35:40 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:35:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:35:40 np0005548789.localdomain podman[77069]: 2025-12-06 08:35:40.911352582 +0000 UTC m=+0.077895127 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, vcs-type=git, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12)
Dec 06 08:35:41 np0005548789.localdomain podman[77069]: 2025-12-06 08:35:41.31815871 +0000 UTC m=+0.484701215 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:35:41 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:35:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:35:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:35:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:35:44 np0005548789.localdomain podman[77092]: 2025-12-06 08:35:44.919032587 +0000 UTC m=+0.083505338 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:35:44 np0005548789.localdomain podman[77092]: 2025-12-06 08:35:44.963673574 +0000 UTC m=+0.128146275 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:35:44 np0005548789.localdomain systemd[1]: tmp-crun.itCvUa.mount: Deactivated successfully.
Dec 06 08:35:44 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:35:44 np0005548789.localdomain podman[77093]: 2025-12-06 08:35:44.993544968 +0000 UTC m=+0.154319847 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, tcib_managed=true, config_id=tripleo_step1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044)
Dec 06 08:35:45 np0005548789.localdomain podman[77094]: 2025-12-06 08:35:45.034814793 +0000 UTC m=+0.190611749 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64)
Dec 06 08:35:45 np0005548789.localdomain podman[77094]: 2025-12-06 08:35:45.091055825 +0000 UTC m=+0.246852751 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, architecture=x86_64, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12)
Dec 06 08:35:45 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:35:45 np0005548789.localdomain podman[77093]: 2025-12-06 08:35:45.217292661 +0000 UTC m=+0.378067620 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vcs-type=git)
Dec 06 08:35:45 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:35:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:35:49 np0005548789.localdomain systemd[1]: tmp-crun.wzsXga.mount: Deactivated successfully.
Dec 06 08:35:49 np0005548789.localdomain podman[77168]: 2025-12-06 08:35:49.91634894 +0000 UTC m=+0.073522384 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, distribution-scope=public, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:35:49 np0005548789.localdomain podman[77168]: 2025-12-06 08:35:49.99603529 +0000 UTC m=+0.153208724 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc.)
Dec 06 08:35:50 np0005548789.localdomain podman[77168]: unhealthy
Dec 06 08:35:50 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:35:50 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed with result 'exit-code'.
Dec 06 08:36:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:36:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:36:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:36:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:36:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:36:10 np0005548789.localdomain podman[77192]: 2025-12-06 08:36:10.911171534 +0000 UTC m=+0.073595795 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.)
Dec 06 08:36:10 np0005548789.localdomain podman[77191]: 2025-12-06 08:36:10.923925995 +0000 UTC m=+0.085894093 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:36:10 np0005548789.localdomain podman[77191]: 2025-12-06 08:36:10.928410722 +0000 UTC m=+0.090378860 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, vcs-type=git, version=17.1.12, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, managed_by=tripleo_ansible)
Dec 06 08:36:10 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:36:10 np0005548789.localdomain podman[77190]: 2025-12-06 08:36:10.975115073 +0000 UTC m=+0.137669378 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container)
Dec 06 08:36:10 np0005548789.localdomain podman[77192]: 2025-12-06 08:36:10.982970204 +0000 UTC m=+0.145394435 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z)
Dec 06 08:36:11 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:36:11 np0005548789.localdomain podman[77190]: 2025-12-06 08:36:11.05469943 +0000 UTC m=+0.217253765 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, release=1761123044, container_name=logrotate_crond)
Dec 06 08:36:11 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:36:11 np0005548789.localdomain podman[77194]: 2025-12-06 08:36:11.037501643 +0000 UTC m=+0.193010363 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, version=17.1.12, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi)
Dec 06 08:36:11 np0005548789.localdomain podman[77193]: 2025-12-06 08:36:11.05893533 +0000 UTC m=+0.218553265 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step3, distribution-scope=public, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com)
Dec 06 08:36:11 np0005548789.localdomain podman[77194]: 2025-12-06 08:36:11.117944737 +0000 UTC m=+0.273453477 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044)
Dec 06 08:36:11 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:36:11 np0005548789.localdomain podman[77193]: 2025-12-06 08:36:11.138205107 +0000 UTC m=+0.297823042 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, container_name=iscsid, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:36:11 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:36:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:36:11 np0005548789.localdomain podman[77301]: 2025-12-06 08:36:11.895616673 +0000 UTC m=+0.060760492 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1761123044, distribution-scope=public, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:36:11 np0005548789.localdomain systemd[1]: tmp-crun.wZl3yy.mount: Deactivated successfully.
Dec 06 08:36:12 np0005548789.localdomain podman[77301]: 2025-12-06 08:36:12.277154418 +0000 UTC m=+0.442298247 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git)
Dec 06 08:36:12 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:36:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:36:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:36:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:36:15 np0005548789.localdomain podman[77325]: 2025-12-06 08:36:15.924514307 +0000 UTC m=+0.080446476 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:36:15 np0005548789.localdomain systemd[1]: tmp-crun.wtiIF8.mount: Deactivated successfully.
Dec 06 08:36:15 np0005548789.localdomain podman[77324]: 2025-12-06 08:36:15.985727472 +0000 UTC m=+0.146348895 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, tcib_managed=true, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 08:36:16 np0005548789.localdomain podman[77324]: 2025-12-06 08:36:16.023136388 +0000 UTC m=+0.183757861 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team)
Dec 06 08:36:16 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:36:16 np0005548789.localdomain podman[77326]: 2025-12-06 08:36:16.07283566 +0000 UTC m=+0.225445917 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 08:36:16 np0005548789.localdomain podman[77325]: 2025-12-06 08:36:16.106005926 +0000 UTC m=+0.261938015 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, container_name=metrics_qdr, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:36:16 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:36:16 np0005548789.localdomain podman[77326]: 2025-12-06 08:36:16.127081321 +0000 UTC m=+0.279691538 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12)
Dec 06 08:36:16 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:36:16 np0005548789.localdomain systemd[1]: tmp-crun.UcU9Gg.mount: Deactivated successfully.
Dec 06 08:36:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:36:20 np0005548789.localdomain podman[77397]: 2025-12-06 08:36:20.909583513 +0000 UTC m=+0.075968268 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Dec 06 08:36:20 np0005548789.localdomain podman[77397]: 2025-12-06 08:36:20.973221943 +0000 UTC m=+0.139606728 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4)
Dec 06 08:36:20 np0005548789.localdomain podman[77397]: unhealthy
Dec 06 08:36:20 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:36:20 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed with result 'exit-code'.
Dec 06 08:36:33 np0005548789.localdomain sudo[77419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:36:33 np0005548789.localdomain sudo[77419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:36:33 np0005548789.localdomain sudo[77419]: pam_unix(sudo:session): session closed for user root
Dec 06 08:36:33 np0005548789.localdomain sudo[77434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:36:33 np0005548789.localdomain sudo[77434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:36:34 np0005548789.localdomain sudo[77434]: pam_unix(sudo:session): session closed for user root
Dec 06 08:36:35 np0005548789.localdomain sudo[77482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:36:35 np0005548789.localdomain sudo[77482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:36:35 np0005548789.localdomain sudo[77482]: pam_unix(sudo:session): session closed for user root
Dec 06 08:36:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:36:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:36:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:36:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:36:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:36:41 np0005548789.localdomain podman[77497]: 2025-12-06 08:36:41.94260488 +0000 UTC m=+0.097005122 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z)
Dec 06 08:36:41 np0005548789.localdomain podman[77499]: 2025-12-06 08:36:41.991626862 +0000 UTC m=+0.141633760 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=ceilometer_agent_compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:36:42 np0005548789.localdomain podman[77499]: 2025-12-06 08:36:42.044219552 +0000 UTC m=+0.194226450 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, release=1761123044)
Dec 06 08:36:42 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:36:42 np0005548789.localdomain podman[77498]: 2025-12-06 08:36:42.049577296 +0000 UTC m=+0.203591377 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com)
Dec 06 08:36:42 np0005548789.localdomain podman[77511]: 2025-12-06 08:36:42.100318401 +0000 UTC m=+0.243056896 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:36:42 np0005548789.localdomain podman[77500]: 2025-12-06 08:36:42.146598157 +0000 UTC m=+0.291905381 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z)
Dec 06 08:36:42 np0005548789.localdomain podman[77500]: 2025-12-06 08:36:42.154239572 +0000 UTC m=+0.299546866 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:36:42 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:36:42 np0005548789.localdomain podman[77497]: 2025-12-06 08:36:42.175403289 +0000 UTC m=+0.329803541 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:36:42 np0005548789.localdomain podman[77498]: 2025-12-06 08:36:42.184087695 +0000 UTC m=+0.338101756 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container)
Dec 06 08:36:42 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:36:42 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:36:42 np0005548789.localdomain podman[77511]: 2025-12-06 08:36:42.205734628 +0000 UTC m=+0.348473083 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true)
Dec 06 08:36:42 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:36:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:36:42 np0005548789.localdomain podman[77612]: 2025-12-06 08:36:42.918367973 +0000 UTC m=+0.079704652 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:36:43 np0005548789.localdomain podman[77612]: 2025-12-06 08:36:43.278197612 +0000 UTC m=+0.439534231 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, 
build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.expose-services=)
Dec 06 08:36:43 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:36:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:36:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:36:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:36:46 np0005548789.localdomain systemd[1]: tmp-crun.wz9oN7.mount: Deactivated successfully.
Dec 06 08:36:46 np0005548789.localdomain podman[77638]: 2025-12-06 08:36:46.918102153 +0000 UTC m=+0.073592534 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4)
Dec 06 08:36:46 np0005548789.localdomain systemd[1]: tmp-crun.hhWT3d.mount: Deactivated successfully.
Dec 06 08:36:46 np0005548789.localdomain podman[77638]: 2025-12-06 08:36:46.975397778 +0000 UTC m=+0.130888159 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible)
Dec 06 08:36:46 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:36:47 np0005548789.localdomain podman[77636]: 2025-12-06 08:36:47.022710617 +0000 UTC m=+0.183359036 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044)
Dec 06 08:36:47 np0005548789.localdomain podman[77637]: 2025-12-06 08:36:46.97772398 +0000 UTC m=+0.134477440 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, io.buildah.version=1.41.4, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
tcib_managed=true, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd)
Dec 06 08:36:47 np0005548789.localdomain podman[77636]: 2025-12-06 08:36:47.067477478 +0000 UTC m=+0.228125867 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:36:47 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:36:47 np0005548789.localdomain podman[77637]: 2025-12-06 08:36:47.186171073 +0000 UTC m=+0.342924543 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:36:47 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:36:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:36:51 np0005548789.localdomain podman[77711]: 2025-12-06 08:36:51.906087741 +0000 UTC m=+0.068579552 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible)
Dec 06 08:36:51 np0005548789.localdomain podman[77711]: 2025-12-06 08:36:51.975103575 +0000 UTC m=+0.137595396 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Dec 06 08:36:51 np0005548789.localdomain podman[77711]: unhealthy
Dec 06 08:36:51 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:36:51 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed with result 'exit-code'.
Dec 06 08:37:06 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:37:06 np0005548789.localdomain recover_tripleo_nova_virtqemud[77735]: 61814
Dec 06 08:37:06 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:37:06 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:37:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:37:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:37:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:37:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:37:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:37:12 np0005548789.localdomain podman[77738]: 2025-12-06 08:37:12.929331235 +0000 UTC m=+0.082118336 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:37:12 np0005548789.localdomain podman[77738]: 2025-12-06 08:37:12.952619379 +0000 UTC m=+0.105406490 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container)
Dec 06 08:37:12 np0005548789.localdomain podman[77746]: 2025-12-06 08:37:12.99348244 +0000 UTC m=+0.138598385 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com)
Dec 06 08:37:13 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:37:13 np0005548789.localdomain podman[77746]: 2025-12-06 08:37:13.054920061 +0000 UTC m=+0.200035996 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:12:45Z)
Dec 06 08:37:13 np0005548789.localdomain systemd[1]: tmp-crun.8w1b5r.mount: Deactivated successfully.
Dec 06 08:37:13 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:37:13 np0005548789.localdomain podman[77739]: 2025-12-06 08:37:13.085639662 +0000 UTC m=+0.234828453 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, container_name=iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:37:13 np0005548789.localdomain podman[77739]: 2025-12-06 08:37:13.139071509 +0000 UTC m=+0.288260380 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com)
Dec 06 08:37:13 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:37:13 np0005548789.localdomain podman[77737]: 2025-12-06 08:37:13.066845907 +0000 UTC m=+0.222263438 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 06 08:37:13 np0005548789.localdomain podman[77736]: 2025-12-06 08:37:13.142331008 +0000 UTC m=+0.298833122 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 06 08:37:13 np0005548789.localdomain podman[77737]: 2025-12-06 08:37:13.200105348 +0000 UTC m=+0.355522879 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, release=1761123044, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd)
Dec 06 08:37:13 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:37:13 np0005548789.localdomain podman[77736]: 2025-12-06 08:37:13.226352331 +0000 UTC m=+0.382854425 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:37:13 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:37:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:37:13 np0005548789.localdomain systemd[1]: tmp-crun.nRe4UZ.mount: Deactivated successfully.
Dec 06 08:37:13 np0005548789.localdomain podman[77847]: 2025-12-06 08:37:13.952967305 +0000 UTC m=+0.078906088 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=nova_migration_target, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:37:14 np0005548789.localdomain podman[77847]: 2025-12-06 08:37:14.36294675 +0000 UTC m=+0.488885533 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z)
Dec 06 08:37:14 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:37:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:37:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:37:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:37:17 np0005548789.localdomain podman[77872]: 2025-12-06 08:37:17.908473062 +0000 UTC m=+0.068963254 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, distribution-scope=public, config_id=tripleo_step4)
Dec 06 08:37:17 np0005548789.localdomain podman[77871]: 2025-12-06 08:37:17.975501835 +0000 UTC m=+0.136231864 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
config_id=tripleo_step1, distribution-scope=public, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr)
Dec 06 08:37:17 np0005548789.localdomain podman[77872]: 2025-12-06 08:37:17.980225369 +0000 UTC m=+0.140715581 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, managed_by=tripleo_ansible, 
maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z)
Dec 06 08:37:17 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:37:18 np0005548789.localdomain podman[77870]: 2025-12-06 08:37:18.035663337 +0000 UTC m=+0.198649335 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, tcib_managed=true, vcs-type=git)
Dec 06 08:37:18 np0005548789.localdomain podman[77870]: 2025-12-06 08:37:18.058829377 +0000 UTC m=+0.221815445 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12)
Dec 06 08:37:18 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:37:18 np0005548789.localdomain podman[77871]: 2025-12-06 08:37:18.157163658 +0000 UTC m=+0.317893747 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public)
Dec 06 08:37:18 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:37:18 np0005548789.localdomain systemd[1]: tmp-crun.U44YQA.mount: Deactivated successfully.
Dec 06 08:37:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:37:22 np0005548789.localdomain podman[78035]: 2025-12-06 08:37:22.909079945 +0000 UTC m=+0.075759701 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 08:37:22 np0005548789.localdomain podman[78035]: 2025-12-06 08:37:22.962469 +0000 UTC m=+0.129148776 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, container_name=nova_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:37:22 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:37:29 np0005548789.localdomain systemd[1]: libpod-b526ff5c18e8d1e347c23ef14b01140c0e908977002aa81cbae031036fc67931.scope: Deactivated successfully.
Dec 06 08:37:29 np0005548789.localdomain podman[76317]: 2025-12-06 08:37:29.894614364 +0000 UTC m=+192.798669611 container died b526ff5c18e8d1e347c23ef14b01140c0e908977002aa81cbae031036fc67931 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, release=1761123044, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=nova_wait_for_compute_service, vcs-type=git)
Dec 06 08:37:29 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b526ff5c18e8d1e347c23ef14b01140c0e908977002aa81cbae031036fc67931-userdata-shm.mount: Deactivated successfully.
Dec 06 08:37:29 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-55528a480405885214081d53ccb18f2317128ddef63aa2fac6c624569bda37fd-merged.mount: Deactivated successfully.
Dec 06 08:37:30 np0005548789.localdomain podman[78061]: 2025-12-06 08:37:29.997656151 +0000 UTC m=+0.088294675 container cleanup b526ff5c18e8d1e347c23ef14b01140c0e908977002aa81cbae031036fc67931 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, container_name=nova_wait_for_compute_service, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_id=tripleo_step5, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:37:30 np0005548789.localdomain systemd[1]: libpod-conmon-b526ff5c18e8d1e347c23ef14b01140c0e908977002aa81cbae031036fc67931.scope: Deactivated successfully.
Dec 06 08:37:30 np0005548789.localdomain python3[76157]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=179caa3982511c1fd3314b961771f96c --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 06 08:37:30 np0005548789.localdomain sudo[76155]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:30 np0005548789.localdomain sudo[78114]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkeplukzqygvtycfmevjudmwmbjhgkjj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:37:30 np0005548789.localdomain sudo[78114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:30 np0005548789.localdomain python3[78116]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:37:30 np0005548789.localdomain sudo[78114]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:30 np0005548789.localdomain sudo[78130]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-buyzuwzdodxynufrjubejhgkhkelyeau ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:37:30 np0005548789.localdomain sudo[78130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:30 np0005548789.localdomain python3[78132]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 08:37:30 np0005548789.localdomain sudo[78130]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:31 np0005548789.localdomain sudo[78191]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhxnoyvjcccqrlallfxoanlbwbvocllv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:37:31 np0005548789.localdomain sudo[78191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:31 np0005548789.localdomain python3[78193]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765010250.996636-118350-4166422878226/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:37:31 np0005548789.localdomain sudo[78191]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:31 np0005548789.localdomain sudo[78207]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tecgktgdiimfakhvsegacmuiovjiowcr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:37:31 np0005548789.localdomain sudo[78207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:32 np0005548789.localdomain python3[78209]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 08:37:32 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:37:32 np0005548789.localdomain systemd-sysv-generator[78241]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:37:32 np0005548789.localdomain systemd-rc-local-generator[78238]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:37:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:37:32 np0005548789.localdomain sudo[78207]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:32 np0005548789.localdomain sudo[78260]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzdhjsksvkhvfvdhtgrvqlppuynjqgoz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Dec 06 08:37:32 np0005548789.localdomain sudo[78260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:33 np0005548789.localdomain python3[78262]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 08:37:33 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:37:33 np0005548789.localdomain systemd-rc-local-generator[78289]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:37:33 np0005548789.localdomain systemd-sysv-generator[78294]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:37:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:37:33 np0005548789.localdomain systemd[1]: Starting nova_compute container...
Dec 06 08:37:33 np0005548789.localdomain tripleo-start-podman-container[78302]: Creating additional drop-in dependency for "nova_compute" (41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007)
Dec 06 08:37:33 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 08:37:33 np0005548789.localdomain systemd-rc-local-generator[78362]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 08:37:33 np0005548789.localdomain systemd-sysv-generator[78365]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 08:37:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 08:37:33 np0005548789.localdomain systemd[1]: Started nova_compute container.
Dec 06 08:37:33 np0005548789.localdomain sudo[78260]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:34 np0005548789.localdomain sudo[78398]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlivwzghlqyzbmbllciqzmjgzcgmuzrm ; /usr/bin/python3
Dec 06 08:37:34 np0005548789.localdomain sudo[78398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:34 np0005548789.localdomain python3[78400]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:37:34 np0005548789.localdomain sudo[78398]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:34 np0005548789.localdomain sudo[78446]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywesglqgwewgkcoypsihglqzofuyhurj ; /usr/bin/python3
Dec 06 08:37:35 np0005548789.localdomain sudo[78446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:35 np0005548789.localdomain sudo[78446]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:35 np0005548789.localdomain sudo[78462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:37:35 np0005548789.localdomain sudo[78462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:37:35 np0005548789.localdomain sudo[78462]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:35 np0005548789.localdomain sudo[78491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:37:35 np0005548789.localdomain sudo[78491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:37:35 np0005548789.localdomain sudo[78518]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pocvpujqcdxybyqbgkpoghwciwopswnm ; /usr/bin/python3
Dec 06 08:37:35 np0005548789.localdomain sudo[78518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:35 np0005548789.localdomain sudo[78518]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:35 np0005548789.localdomain sudo[78563]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqopjkdubtbftjbcsqkqgzypatbwmrmm ; /usr/bin/python3
Dec 06 08:37:35 np0005548789.localdomain sudo[78563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:35 np0005548789.localdomain python3[78568]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005548789 step=5 update_config_hash_only=False
Dec 06 08:37:35 np0005548789.localdomain sudo[78563]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:36 np0005548789.localdomain sudo[78491]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:36 np0005548789.localdomain sudo[78596]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzkplsazzomucntetcduexlikxnlqiqw ; /usr/bin/python3
Dec 06 08:37:36 np0005548789.localdomain sudo[78596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:36 np0005548789.localdomain python3[78598]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 08:37:36 np0005548789.localdomain sudo[78596]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:36 np0005548789.localdomain sudo[78599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:37:36 np0005548789.localdomain sudo[78599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:37:36 np0005548789.localdomain sudo[78599]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:36 np0005548789.localdomain sudo[78627]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txenvopihqludblcjxhttffnqsscajgr ; /usr/bin/python3
Dec 06 08:37:36 np0005548789.localdomain sudo[78627]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Dec 06 08:37:36 np0005548789.localdomain python3[78629]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True
Dec 06 08:37:36 np0005548789.localdomain sudo[78627]: pam_unix(sudo:session): session closed for user root
Dec 06 08:37:37 np0005548789.localdomain sshd[78630]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:37:37 np0005548789.localdomain sshd[78630]: Invalid user sol from 92.118.39.95 port 49520
Dec 06 08:37:37 np0005548789.localdomain sshd[78630]: Connection closed by invalid user sol 92.118.39.95 port 49520 [preauth]
Dec 06 08:37:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:37:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:37:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:37:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:37:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:37:43 np0005548789.localdomain podman[78632]: 2025-12-06 08:37:43.910526393 +0000 UTC m=+0.068683475 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, config_id=tripleo_step4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 08:37:43 np0005548789.localdomain podman[78632]: 2025-12-06 08:37:43.923611613 +0000 UTC m=+0.081768685 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 06 08:37:43 np0005548789.localdomain podman[78634]: 2025-12-06 08:37:43.925664606 +0000 UTC m=+0.079579857 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:37:43 np0005548789.localdomain systemd[1]: tmp-crun.BtMOLN.mount: Deactivated successfully.
Dec 06 08:37:43 np0005548789.localdomain podman[78634]: 2025-12-06 08:37:43.980286669 +0000 UTC m=+0.134201951 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1)
Dec 06 08:37:43 np0005548789.localdomain podman[78635]: 2025-12-06 08:37:43.980381122 +0000 UTC m=+0.130196068 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, version=17.1.12)
Dec 06 08:37:43 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:37:43 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:37:44 np0005548789.localdomain podman[78633]: 2025-12-06 08:37:44.015712304 +0000 UTC m=+0.169719708 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com)
Dec 06 08:37:44 np0005548789.localdomain podman[78635]: 2025-12-06 08:37:44.078211789 +0000 UTC m=+0.228026715 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-iscsid-container, version=17.1.12, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:37:44 np0005548789.localdomain podman[78641]: 2025-12-06 08:37:44.078869658 +0000 UTC m=+0.226057184 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public)
Dec 06 08:37:44 np0005548789.localdomain podman[78633]: 2025-12-06 08:37:44.078615291 +0000 UTC m=+0.232622645 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vcs-type=git)
Dec 06 08:37:44 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:37:44 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:37:44 np0005548789.localdomain podman[78641]: 2025-12-06 08:37:44.213483431 +0000 UTC m=+0.360670957 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:37:44 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:37:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:37:44 np0005548789.localdomain podman[78745]: 2025-12-06 08:37:44.915107688 +0000 UTC m=+0.078713322 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64)
Dec 06 08:37:45 np0005548789.localdomain podman[78745]: 2025-12-06 08:37:45.280047725 +0000 UTC m=+0.443653329 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible)
Dec 06 08:37:45 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:37:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:37:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:37:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:37:48 np0005548789.localdomain podman[78770]: 2025-12-06 08:37:48.921455882 +0000 UTC m=+0.082543199 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=)
Dec 06 08:37:48 np0005548789.localdomain systemd[1]: tmp-crun.XVCYYi.mount: Deactivated successfully.
Dec 06 08:37:48 np0005548789.localdomain podman[78769]: 2025-12-06 08:37:48.978068327 +0000 UTC m=+0.142375293 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1)
Dec 06 08:37:49 np0005548789.localdomain podman[78771]: 2025-12-06 08:37:49.02064243 +0000 UTC m=+0.178575301 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, vcs-type=git, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 08:37:49 np0005548789.localdomain podman[78769]: 2025-12-06 08:37:49.031160643 +0000 UTC m=+0.195467599 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container)
Dec 06 08:37:49 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:37:49 np0005548789.localdomain podman[78771]: 2025-12-06 08:37:49.068143695 +0000 UTC m=+0.226076546 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, release=1761123044, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:37:49 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:37:49 np0005548789.localdomain podman[78770]: 2025-12-06 08:37:49.137151278 +0000 UTC m=+0.298238595 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, container_name=metrics_qdr, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, tcib_managed=true, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible)
Dec 06 08:37:49 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:37:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:37:53 np0005548789.localdomain podman[78842]: 2025-12-06 08:37:53.913257926 +0000 UTC m=+0.069529991 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, container_name=nova_compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 
'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 08:37:53 np0005548789.localdomain podman[78842]: 2025-12-06 08:37:53.945138303 +0000 UTC m=+0.101410428 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, tcib_managed=true, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:37:53 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:38:01 np0005548789.localdomain sshd[78868]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:38:02 np0005548789.localdomain sshd[78868]: Accepted publickey for zuul from 192.168.122.100 port 49540 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 08:38:02 np0005548789.localdomain systemd-logind[766]: New session 33 of user zuul.
Dec 06 08:38:02 np0005548789.localdomain systemd[1]: Started Session 33 of User zuul.
Dec 06 08:38:02 np0005548789.localdomain sshd[78868]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 08:38:02 np0005548789.localdomain sudo[78975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfirpayxitaaxeufjdoysxbplvlejmub ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765010282.209431-40283-49607842747777/AnsiballZ_setup.py
Dec 06 08:38:02 np0005548789.localdomain sudo[78975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 08:38:02 np0005548789.localdomain python3[78977]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 08:38:05 np0005548789.localdomain sudo[78975]: pam_unix(sudo:session): session closed for user root
Dec 06 08:38:10 np0005548789.localdomain sudo[79238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpkyyqunzscpmvcqttlnffvlcpfhexvu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765010290.075413-40370-195528229124565/AnsiballZ_dnf.py
Dec 06 08:38:10 np0005548789.localdomain sudo[79238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 08:38:10 np0005548789.localdomain python3[79240]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Dec 06 08:38:14 np0005548789.localdomain sudo[79238]: pam_unix(sudo:session): session closed for user root
Dec 06 08:38:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:38:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:38:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:38:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:38:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:38:14 np0005548789.localdomain systemd[1]: tmp-crun.5MW6bi.mount: Deactivated successfully.
Dec 06 08:38:14 np0005548789.localdomain podman[79258]: 2025-12-06 08:38:14.943706651 +0000 UTC m=+0.100184648 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, config_id=tripleo_step3, batch=17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container)
Dec 06 08:38:14 np0005548789.localdomain podman[79258]: 2025-12-06 08:38:14.957124792 +0000 UTC m=+0.113602789 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z)
Dec 06 08:38:14 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:38:15 np0005548789.localdomain systemd[1]: tmp-crun.dcmf7l.mount: Deactivated successfully.
Dec 06 08:38:15 np0005548789.localdomain podman[79259]: 2025-12-06 08:38:15.04650357 +0000 UTC m=+0.201224304 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container)
Dec 06 08:38:15 np0005548789.localdomain podman[79260]: 2025-12-06 08:38:15.091123336 +0000 UTC m=+0.242716274 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:38:15 np0005548789.localdomain podman[79259]: 2025-12-06 08:38:15.102117623 +0000 UTC m=+0.256838397 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible)
Dec 06 08:38:15 np0005548789.localdomain podman[79257]: 2025-12-06 08:38:15.011203279 +0000 UTC m=+0.167456939 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, 
managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Dec 06 08:38:15 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:38:15 np0005548789.localdomain podman[79260]: 2025-12-06 08:38:15.128200571 +0000 UTC m=+0.279793529 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:38:15 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:38:15 np0005548789.localdomain podman[79257]: 2025-12-06 08:38:15.148523174 +0000 UTC m=+0.304776794 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, architecture=x86_64, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-cron, io.openshift.expose-services=, summary=Red Hat OpenStack 
Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container)
Dec 06 08:38:15 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:38:15 np0005548789.localdomain podman[79261]: 2025-12-06 08:38:15.204932332 +0000 UTC m=+0.351005561 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:38:15 np0005548789.localdomain podman[79261]: 2025-12-06 08:38:15.242166612 +0000 UTC m=+0.388239801 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, release=1761123044, container_name=ceilometer_agent_ipmi)
Dec 06 08:38:15 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:38:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:38:15 np0005548789.localdomain podman[79364]: 2025-12-06 08:38:15.919130884 +0000 UTC m=+0.076839054 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_migration_target, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git)
Dec 06 08:38:16 np0005548789.localdomain podman[79364]: 2025-12-06 08:38:16.285853845 +0000 UTC m=+0.443562075 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target)
Dec 06 08:38:16 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:38:17 np0005548789.localdomain sudo[79460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdtccfhqlixncwffkzvyueplqpctwftl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765010297.4227784-40427-138133459613830/AnsiballZ_iptables.py
Dec 06 08:38:17 np0005548789.localdomain sudo[79460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 08:38:17 np0005548789.localdomain python3[79462]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None
Dec 06 08:38:17 np0005548789.localdomain kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 06 08:38:17 np0005548789.localdomain systemd-journald[47810]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation.
Dec 06 08:38:17 np0005548789.localdomain systemd-journald[47810]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 08:38:17 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 08:38:17 np0005548789.localdomain sudo[79460]: pam_unix(sudo:session): session closed for user root
Dec 06 08:38:17 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 08:38:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:38:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:38:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:38:19 np0005548789.localdomain systemd[1]: tmp-crun.odyxZe.mount: Deactivated successfully.
Dec 06 08:38:19 np0005548789.localdomain podman[79509]: 2025-12-06 08:38:19.948832643 +0000 UTC m=+0.099701445 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.buildah.version=1.41.4, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:38:20 np0005548789.localdomain podman[79508]: 2025-12-06 08:38:20.002265879 +0000 UTC m=+0.151873992 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.41.4, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, distribution-scope=public)
Dec 06 08:38:20 np0005548789.localdomain podman[79510]: 2025-12-06 08:38:20.055158809 +0000 UTC m=+0.205934537 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 06 08:38:20 np0005548789.localdomain podman[79508]: 2025-12-06 08:38:20.083630531 +0000 UTC m=+0.233238644 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, vendor=Red Hat, Inc.)
Dec 06 08:38:20 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:38:20 np0005548789.localdomain podman[79510]: 2025-12-06 08:38:20.144342541 +0000 UTC m=+0.295118269 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:38:20 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:38:20 np0005548789.localdomain podman[79509]: 2025-12-06 08:38:20.182546501 +0000 UTC m=+0.333415303 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:38:20 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:38:20 np0005548789.localdomain systemd[1]: tmp-crun.6yFv94.mount: Deactivated successfully.
Dec 06 08:38:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:38:24 np0005548789.localdomain podman[79606]: 2025-12-06 08:38:24.933856059 +0000 UTC m=+0.090873505 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Dec 06 08:38:24 np0005548789.localdomain podman[79606]: 2025-12-06 08:38:24.965874779 +0000 UTC m=+0.122892215 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:38:24 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:38:36 np0005548789.localdomain sudo[79633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:38:36 np0005548789.localdomain sudo[79633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:38:36 np0005548789.localdomain sudo[79633]: pam_unix(sudo:session): session closed for user root
Dec 06 08:38:36 np0005548789.localdomain sudo[79648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:38:36 np0005548789.localdomain sudo[79648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:38:37 np0005548789.localdomain sudo[79648]: pam_unix(sudo:session): session closed for user root
Dec 06 08:38:38 np0005548789.localdomain sudo[79695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:38:38 np0005548789.localdomain sudo[79695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:38:38 np0005548789.localdomain sudo[79695]: pam_unix(sudo:session): session closed for user root
Dec 06 08:38:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:38:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:38:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:38:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:38:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:38:45 np0005548789.localdomain podman[79710]: 2025-12-06 08:38:45.962338465 +0000 UTC m=+0.106505443 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-cron)
Dec 06 08:38:45 np0005548789.localdomain podman[79714]: 2025-12-06 08:38:45.99877171 +0000 UTC m=+0.137677937 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, 
tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:38:46 np0005548789.localdomain podman[79710]: 2025-12-06 08:38:46.045350097 +0000 UTC m=+0.189517105 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=logrotate_crond, description=Red Hat 
OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, com.redhat.component=openstack-cron-container, tcib_managed=true, name=rhosp17/openstack-cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:38:46 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:38:46 np0005548789.localdomain podman[79711]: 2025-12-06 08:38:46.067667671 +0000 UTC m=+0.210377615 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, container_name=collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-collectd-container)
Dec 06 08:38:46 np0005548789.localdomain podman[79714]: 2025-12-06 08:38:46.082437873 +0000 UTC m=+0.221344120 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Dec 06 08:38:46 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:38:46 np0005548789.localdomain podman[79712]: 2025-12-06 08:38:46.103064754 +0000 UTC m=+0.247279194 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, distribution-scope=public)
Dec 06 08:38:46 np0005548789.localdomain podman[79712]: 2025-12-06 08:38:46.162189355 +0000 UTC m=+0.306403795 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:38:46 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:38:46 np0005548789.localdomain podman[79711]: 2025-12-06 08:38:46.178598017 +0000 UTC m=+0.321307961 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=collectd, distribution-scope=public)
Dec 06 08:38:46 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:38:46 np0005548789.localdomain podman[79713]: 2025-12-06 08:38:46.166299691 +0000 UTC m=+0.305833557 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step3, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, container_name=iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:38:46 np0005548789.localdomain podman[79713]: 2025-12-06 08:38:46.2466084 +0000 UTC m=+0.386142206 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com)
Dec 06 08:38:46 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:38:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:38:46 np0005548789.localdomain podman[79822]: 2025-12-06 08:38:46.915928488 +0000 UTC m=+0.080920400 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1)
Dec 06 08:38:46 np0005548789.localdomain systemd[1]: tmp-crun.pv4GIQ.mount: Deactivated successfully.
Dec 06 08:38:47 np0005548789.localdomain podman[79822]: 2025-12-06 08:38:47.29347553 +0000 UTC m=+0.458467422 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Dec 06 08:38:47 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:38:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:38:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:38:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:38:50 np0005548789.localdomain podman[79843]: 2025-12-06 08:38:50.910189182 +0000 UTC m=+0.075822884 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, config_id=tripleo_step4, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com)
Dec 06 08:38:50 np0005548789.localdomain podman[79843]: 2025-12-06 08:38:50.960270996 +0000 UTC m=+0.125904698 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4)
Dec 06 08:38:50 np0005548789.localdomain systemd[1]: tmp-crun.Pon1pI.mount: Deactivated successfully.
Dec 06 08:38:50 np0005548789.localdomain podman[79845]: 2025-12-06 08:38:50.978983829 +0000 UTC m=+0.135702568 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, container_name=ovn_metadata_agent, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12)
Dec 06 08:38:50 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:38:51 np0005548789.localdomain podman[79844]: 2025-12-06 08:38:51.039232274 +0000 UTC m=+0.198934884 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:38:51 np0005548789.localdomain podman[79845]: 2025-12-06 08:38:51.049193719 +0000 UTC m=+0.205912498 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent)
Dec 06 08:38:51 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:38:51 np0005548789.localdomain podman[79844]: 2025-12-06 08:38:51.263057628 +0000 UTC m=+0.422760208 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z)
Dec 06 08:38:51 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:38:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:38:55 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:38:55 np0005548789.localdomain recover_tripleo_nova_virtqemud[79922]: 61814
Dec 06 08:38:55 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:38:55 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:38:55 np0005548789.localdomain systemd[1]: tmp-crun.fBfC7z.mount: Deactivated successfully.
Dec 06 08:38:55 np0005548789.localdomain podman[79920]: 2025-12-06 08:38:55.916050196 +0000 UTC m=+0.082088055 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:38:55 np0005548789.localdomain podman[79920]: 2025-12-06 08:38:55.946748926 +0000 UTC m=+0.112786765 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_compute, config_id=tripleo_step5, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:38:55 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:39:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:39:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:39:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:39:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:39:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:39:16 np0005548789.localdomain systemd[1]: tmp-crun.LE9irJ.mount: Deactivated successfully.
Dec 06 08:39:16 np0005548789.localdomain podman[79949]: 2025-12-06 08:39:16.927676604 +0000 UTC m=+0.088310016 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:39:16 np0005548789.localdomain systemd[1]: tmp-crun.bfnr6s.mount: Deactivated successfully.
Dec 06 08:39:16 np0005548789.localdomain podman[79949]: 2025-12-06 08:39:16.940152745 +0000 UTC m=+0.100786107 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-cron, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Dec 06 08:39:16 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:39:16 np0005548789.localdomain podman[79950]: 2025-12-06 08:39:16.979675186 +0000 UTC m=+0.131750146 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 08:39:16 np0005548789.localdomain podman[79950]: 2025-12-06 08:39:16.990002812 +0000 UTC m=+0.142077762 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, batch=17.1_20251118.1, config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, version=17.1.12)
Dec 06 08:39:17 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:39:17 np0005548789.localdomain podman[79957]: 2025-12-06 08:39:17.034618289 +0000 UTC m=+0.183715288 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, architecture=x86_64, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1761123044, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:39:17 np0005548789.localdomain podman[79958]: 2025-12-06 08:39:16.941648892 +0000 UTC m=+0.083928902 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, build-date=2025-11-19T00:12:45Z, 
url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi)
Dec 06 08:39:17 np0005548789.localdomain podman[79957]: 2025-12-06 08:39:17.071861349 +0000 UTC m=+0.220958318 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, container_name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:39:17 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:39:17 np0005548789.localdomain podman[79951]: 2025-12-06 08:39:17.085126035 +0000 UTC m=+0.236983448 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:39:17 np0005548789.localdomain podman[79958]: 2025-12-06 08:39:17.123079267 +0000 UTC m=+0.265359257 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, config_id=tripleo_step4)
Dec 06 08:39:17 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:39:17 np0005548789.localdomain podman[79951]: 2025-12-06 08:39:17.139093758 +0000 UTC m=+0.290951171 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, url=https://www.redhat.com)
Dec 06 08:39:17 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:39:17 np0005548789.localdomain sshd[78868]: pam_unix(sshd:session): session closed for user zuul
Dec 06 08:39:17 np0005548789.localdomain systemd[1]: session-33.scope: Deactivated successfully.
Dec 06 08:39:17 np0005548789.localdomain systemd[1]: session-33.scope: Consumed 5.637s CPU time.
Dec 06 08:39:17 np0005548789.localdomain systemd-logind[766]: Session 33 logged out. Waiting for processes to exit.
Dec 06 08:39:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:39:17 np0005548789.localdomain systemd-logind[766]: Removed session 33.
Dec 06 08:39:17 np0005548789.localdomain podman[80062]: 2025-12-06 08:39:17.546455003 +0000 UTC m=+0.086772659 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, name=rhosp17/openstack-nova-compute)
Dec 06 08:39:17 np0005548789.localdomain podman[80062]: 2025-12-06 08:39:17.901586439 +0000 UTC m=+0.441904045 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Dec 06 08:39:17 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:39:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:39:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:39:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:39:21 np0005548789.localdomain podman[80131]: 2025-12-06 08:39:21.913896566 +0000 UTC m=+0.076775442 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=)
Dec 06 08:39:21 np0005548789.localdomain systemd[1]: tmp-crun.K2gCo9.mount: Deactivated successfully.
Dec 06 08:39:21 np0005548789.localdomain podman[80130]: 2025-12-06 08:39:21.969012274 +0000 UTC m=+0.132261951 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, container_name=ovn_controller, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:39:22 np0005548789.localdomain podman[80130]: 2025-12-06 08:39:22.023189983 +0000 UTC m=+0.186439670 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com)
Dec 06 08:39:22 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:39:22 np0005548789.localdomain podman[80132]: 2025-12-06 08:39:22.023551095 +0000 UTC m=+0.184170712 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:14:25Z)
Dec 06 08:39:22 np0005548789.localdomain podman[80131]: 2025-12-06 08:39:22.102130681 +0000 UTC m=+0.265009547 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, release=1761123044, vendor=Red Hat, Inc., 
description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:39:22 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:39:22 np0005548789.localdomain podman[80132]: 2025-12-06 08:39:22.160740466 +0000 UTC m=+0.321360063 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, distribution-scope=public, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true)
Dec 06 08:39:22 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:39:22 np0005548789.localdomain systemd[1]: tmp-crun.OL1NaM.mount: Deactivated successfully.
Dec 06 08:39:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:39:26 np0005548789.localdomain podman[80203]: 2025-12-06 08:39:26.915683566 +0000 UTC m=+0.080947471 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=nova_compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, distribution-scope=public)
Dec 06 08:39:26 np0005548789.localdomain podman[80203]: 2025-12-06 08:39:26.94425154 +0000 UTC m=+0.109515415 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:39:26 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:39:29 np0005548789.localdomain sshd[80229]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:39:29 np0005548789.localdomain sshd[80229]: Accepted publickey for zuul from 38.102.83.114 port 37538 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 08:39:29 np0005548789.localdomain systemd-logind[766]: New session 34 of user zuul.
Dec 06 08:39:29 np0005548789.localdomain systemd[1]: Started Session 34 of User zuul.
Dec 06 08:39:29 np0005548789.localdomain sshd[80229]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 08:39:29 np0005548789.localdomain sudo[80246]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgprwsbqlwvbcgcywsobnacthznymzsx ; /usr/bin/python3
Dec 06 08:39:29 np0005548789.localdomain sudo[80246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:39:30 np0005548789.localdomain python3[80248]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:39:32 np0005548789.localdomain sudo[80246]: pam_unix(sudo:session): session closed for user root
Dec 06 08:39:38 np0005548789.localdomain sudo[80250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:39:38 np0005548789.localdomain sudo[80250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:39:38 np0005548789.localdomain sudo[80250]: pam_unix(sudo:session): session closed for user root
Dec 06 08:39:38 np0005548789.localdomain sudo[80265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:39:38 np0005548789.localdomain sudo[80265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:39:39 np0005548789.localdomain sudo[80265]: pam_unix(sudo:session): session closed for user root
Dec 06 08:39:40 np0005548789.localdomain sshd[80312]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:39:40 np0005548789.localdomain sshd[80312]: Invalid user sol from 92.118.39.95 port 36270
Dec 06 08:39:40 np0005548789.localdomain sshd[80312]: Connection closed by invalid user sol 92.118.39.95 port 36270 [preauth]
Dec 06 08:39:41 np0005548789.localdomain sudo[80314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:39:41 np0005548789.localdomain sudo[80314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:39:41 np0005548789.localdomain sudo[80314]: pam_unix(sudo:session): session closed for user root
Dec 06 08:39:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:39:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:39:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:39:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:39:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:39:47 np0005548789.localdomain podman[80331]: 2025-12-06 08:39:47.925234011 +0000 UTC m=+0.081629717 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, container_name=ceilometer_agent_compute, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:39:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:39:47 np0005548789.localdomain podman[80331]: 2025-12-06 08:39:47.960116078 +0000 UTC m=+0.116511814 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z)
Dec 06 08:39:47 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:39:47 np0005548789.localdomain podman[80333]: 2025-12-06 08:39:47.980989981 +0000 UTC m=+0.133474348 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, container_name=ceilometer_agent_ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 08:39:48 np0005548789.localdomain podman[80329]: 2025-12-06 08:39:48.041073074 +0000 UTC m=+0.197219212 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:32Z, distribution-scope=public, name=rhosp17/openstack-cron, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=)
Dec 06 08:39:48 np0005548789.localdomain podman[80333]: 2025-12-06 08:39:48.044116776 +0000 UTC m=+0.196601193 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:39:48 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:39:48 np0005548789.localdomain podman[80329]: 2025-12-06 08:39:48.078084626 +0000 UTC m=+0.234230754 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, batch=17.1_20251118.1)
Dec 06 08:39:48 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:39:48 np0005548789.localdomain podman[80388]: 2025-12-06 08:39:48.090151481 +0000 UTC m=+0.141713988 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:39:48 np0005548789.localdomain podman[80330]: 2025-12-06 08:39:48.135207029 +0000 UTC m=+0.291597315 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 08:39:48 np0005548789.localdomain podman[80332]: 2025-12-06 08:39:48.205928703 +0000 UTC m=+0.357551114 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, 
architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:39:48 np0005548789.localdomain podman[80332]: 2025-12-06 08:39:48.21803938 +0000 UTC m=+0.369661811 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, container_name=iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:39:48 np0005548789.localdomain podman[80330]: 2025-12-06 08:39:48.227814457 +0000 UTC m=+0.384204793 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3)
Dec 06 08:39:48 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:39:48 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:39:48 np0005548789.localdomain podman[80388]: 2025-12-06 08:39:48.477793277 +0000 UTC m=+0.529355804 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:39:48 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:39:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:39:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:39:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:39:52 np0005548789.localdomain systemd[1]: tmp-crun.Ilslfu.mount: Deactivated successfully.
Dec 06 08:39:52 np0005548789.localdomain podman[80463]: 2025-12-06 08:39:52.932159418 +0000 UTC m=+0.096591841 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, release=1761123044)
Dec 06 08:39:52 np0005548789.localdomain podman[80463]: 2025-12-06 08:39:52.959173948 +0000 UTC m=+0.123606371 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller)
Dec 06 08:39:52 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:39:53 np0005548789.localdomain systemd[1]: tmp-crun.UhnBlX.mount: Deactivated successfully.
Dec 06 08:39:53 np0005548789.localdomain podman[80464]: 2025-12-06 08:39:53.021214349 +0000 UTC m=+0.182288829 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, 
maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=metrics_qdr, architecture=x86_64, version=17.1.12, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:39:53 np0005548789.localdomain podman[80465]: 2025-12-06 08:39:53.079741904 +0000 UTC m=+0.237685099 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Dec 06 08:39:53 np0005548789.localdomain podman[80465]: 2025-12-06 08:39:53.123071417 +0000 UTC m=+0.281014572 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git)
Dec 06 08:39:53 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:39:53 np0005548789.localdomain podman[80464]: 2025-12-06 08:39:53.249058028 +0000 UTC m=+0.410132438 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044)
Dec 06 08:39:53 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:39:56 np0005548789.localdomain sudo[80551]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwzqtlamdfqbouembkayeagtqwdvugtd ; /usr/bin/python3
Dec 06 08:39:56 np0005548789.localdomain sudo[80551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:39:57 np0005548789.localdomain python3[80553]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 08:39:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:39:57 np0005548789.localdomain podman[80555]: 2025-12-06 08:39:57.895130213 +0000 UTC m=+0.062433134 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, managed_by=tripleo_ansible, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12)
Dec 06 08:39:57 np0005548789.localdomain podman[80555]: 2025-12-06 08:39:57.923099081 +0000 UTC m=+0.090402022 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, architecture=x86_64)
Dec 06 08:39:57 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:40:00 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:40:00 np0005548789.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 08:40:00 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 08:40:01 np0005548789.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 08:40:01 np0005548789.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 08:40:01 np0005548789.localdomain systemd[1]: run-rea2e02d7346144b99da475d0d939a2a4.service: Deactivated successfully.
Dec 06 08:40:01 np0005548789.localdomain systemd[1]: run-reab5424f6eeb43719b36a37c932c6b3a.service: Deactivated successfully.
Dec 06 08:40:01 np0005548789.localdomain sudo[80551]: pam_unix(sudo:session): session closed for user root
Dec 06 08:40:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:40:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.1 total, 600.0 interval
                                                          Cumulative writes: 5168 writes, 22K keys, 5168 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5168 writes, 575 syncs, 8.99 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 08:40:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:40:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.2 total, 600.0 interval
                                                          Cumulative writes: 4467 writes, 20K keys, 4467 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4467 writes, 521 syncs, 8.57 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 08:40:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:40:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:40:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:40:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:40:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:40:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:40:18 np0005548789.localdomain podman[80732]: 2025-12-06 08:40:18.953137169 +0000 UTC m=+0.106949815 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, 
name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=)
Dec 06 08:40:18 np0005548789.localdomain systemd[1]: tmp-crun.QvOZzq.mount: Deactivated successfully.
Dec 06 08:40:18 np0005548789.localdomain podman[80730]: 2025-12-06 08:40:18.991924295 +0000 UTC m=+0.149122103 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:40:19 np0005548789.localdomain systemd[1]: tmp-crun.UwHNXJ.mount: Deactivated successfully.
Dec 06 08:40:19 np0005548789.localdomain podman[80731]: 2025-12-06 08:40:19.048034627 +0000 UTC m=+0.203545633 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd)
Dec 06 08:40:19 np0005548789.localdomain podman[80731]: 2025-12-06 08:40:19.056475093 +0000 UTC m=+0.211986099 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, container_name=collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git)
Dec 06 08:40:19 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:40:19 np0005548789.localdomain podman[80730]: 2025-12-06 08:40:19.075915192 +0000 UTC m=+0.233113010 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-cron-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:40:19 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:40:19 np0005548789.localdomain podman[80739]: 2025-12-06 08:40:19.139271274 +0000 UTC m=+0.285029055 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1761123044)
Dec 06 08:40:19 np0005548789.localdomain podman[80739]: 2025-12-06 08:40:19.171343376 +0000 UTC m=+0.317101137 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true)
Dec 06 08:40:19 np0005548789.localdomain podman[80733]: 2025-12-06 08:40:19.197441318 +0000 UTC m=+0.347797978 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, 
com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute)
Dec 06 08:40:19 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:40:19 np0005548789.localdomain podman[80749]: 2025-12-06 08:40:19.241875786 +0000 UTC m=+0.384223804 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, vcs-type=git, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:40:19 np0005548789.localdomain podman[80749]: 2025-12-06 08:40:19.267259455 +0000 UTC m=+0.409607483 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:40:19 np0005548789.localdomain podman[80733]: 2025-12-06 08:40:19.276448004 +0000 UTC m=+0.426804634 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044)
Dec 06 08:40:19 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:40:19 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:40:19 np0005548789.localdomain podman[80732]: 2025-12-06 08:40:19.32579987 +0000 UTC m=+0.479612446 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, 
distribution-scope=public, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target)
Dec 06 08:40:19 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:40:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:40:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:40:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:40:23 np0005548789.localdomain systemd[1]: tmp-crun.5vnnM5.mount: Deactivated successfully.
Dec 06 08:40:23 np0005548789.localdomain podman[80905]: 2025-12-06 08:40:23.917894169 +0000 UTC m=+0.074554002 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z)
Dec 06 08:40:23 np0005548789.localdomain podman[80905]: 2025-12-06 08:40:23.935031479 +0000 UTC m=+0.091691342 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4)
Dec 06 08:40:23 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:40:23 np0005548789.localdomain podman[80906]: 2025-12-06 08:40:23.982240881 +0000 UTC m=+0.138540163 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, 
version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Dec 06 08:40:24 np0005548789.localdomain podman[80907]: 2025-12-06 08:40:24.032273078 +0000 UTC m=+0.186473166 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, container_name=ovn_metadata_agent, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4)
Dec 06 08:40:24 np0005548789.localdomain podman[80907]: 2025-12-06 08:40:24.097099314 +0000 UTC m=+0.251299392 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 08:40:24 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:40:24 np0005548789.localdomain podman[80906]: 2025-12-06 08:40:24.185476193 +0000 UTC m=+0.341775435 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, container_name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, 
Inc., tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z)
Dec 06 08:40:24 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:40:26 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:40:26 np0005548789.localdomain recover_tripleo_nova_virtqemud[80981]: 61814
Dec 06 08:40:26 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:40:26 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:40:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:40:28 np0005548789.localdomain podman[80982]: 2025-12-06 08:40:28.908485571 +0000 UTC m=+0.070590151 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5)
Dec 06 08:40:28 np0005548789.localdomain podman[80982]: 2025-12-06 08:40:28.961294653 +0000 UTC m=+0.123399333 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc.)
Dec 06 08:40:28 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:40:42 np0005548789.localdomain sudo[81010]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:40:42 np0005548789.localdomain sudo[81010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:40:42 np0005548789.localdomain sudo[81010]: pam_unix(sudo:session): session closed for user root
Dec 06 08:40:42 np0005548789.localdomain sudo[81025]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 08:40:42 np0005548789.localdomain sudo[81025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:40:42 np0005548789.localdomain sudo[81025]: pam_unix(sudo:session): session closed for user root
Dec 06 08:40:42 np0005548789.localdomain sudo[81061]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:40:42 np0005548789.localdomain sudo[81061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:40:42 np0005548789.localdomain sudo[81061]: pam_unix(sudo:session): session closed for user root
Dec 06 08:40:42 np0005548789.localdomain sudo[81076]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:40:42 np0005548789.localdomain sudo[81076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:40:43 np0005548789.localdomain sudo[81076]: pam_unix(sudo:session): session closed for user root
Dec 06 08:40:44 np0005548789.localdomain sudo[81124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:40:44 np0005548789.localdomain sudo[81124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:40:44 np0005548789.localdomain sudo[81124]: pam_unix(sudo:session): session closed for user root
Dec 06 08:40:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:40:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:40:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:40:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:40:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:40:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:40:49 np0005548789.localdomain podman[81154]: 2025-12-06 08:40:49.95900606 +0000 UTC m=+0.101220150 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:40:49 np0005548789.localdomain systemd[1]: tmp-crun.ce7y2a.mount: Deactivated successfully.
Dec 06 08:40:49 np0005548789.localdomain podman[81140]: 2025-12-06 08:40:49.979067319 +0000 UTC m=+0.135789139 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, container_name=collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd)
Dec 06 08:40:50 np0005548789.localdomain podman[81142]: 2025-12-06 08:40:50.01801435 +0000 UTC m=+0.173679788 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:40:50 np0005548789.localdomain podman[81139]: 2025-12-06 08:40:50.024461686 +0000 UTC m=+0.180624259 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, name=rhosp17/openstack-cron)
Dec 06 08:40:50 np0005548789.localdomain podman[81139]: 2025-12-06 08:40:50.033989244 +0000 UTC m=+0.190151827 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, distribution-scope=public, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:40:50 np0005548789.localdomain podman[81154]: 2025-12-06 08:40:50.036959934 +0000 UTC m=+0.179174044 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:40:50 np0005548789.localdomain podman[81142]: 2025-12-06 08:40:50.047329929 +0000 UTC m=+0.202995317 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:40:50 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:40:50 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:40:50 np0005548789.localdomain podman[81140]: 2025-12-06 08:40:50.09153896 +0000 UTC m=+0.248260760 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., container_name=collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-collectd-container)
Dec 06 08:40:50 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:40:50 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:40:50 np0005548789.localdomain podman[81143]: 2025-12-06 08:40:50.124976634 +0000 UTC m=+0.276776955 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, tcib_managed=true, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, 
vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public)
Dec 06 08:40:50 np0005548789.localdomain podman[81143]: 2025-12-06 08:40:50.137264866 +0000 UTC m=+0.289065197 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.)
Dec 06 08:40:50 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:40:50 np0005548789.localdomain podman[81141]: 2025-12-06 08:40:49.93821582 +0000 UTC m=+0.091695402 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, release=1761123044, distribution-scope=public, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64)
Dec 06 08:40:50 np0005548789.localdomain podman[81141]: 2025-12-06 08:40:50.293303538 +0000 UTC m=+0.446783120 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git)
Dec 06 08:40:50 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:40:52 np0005548789.localdomain sudo[81284]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfwcjfjwlnvuxaflkpmjsmbkuxxjseyq ; /usr/bin/python3
Dec 06 08:40:52 np0005548789.localdomain sudo[81284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:40:52 np0005548789.localdomain python3[81286]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:40:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:40:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:40:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:40:54 np0005548789.localdomain systemd[1]: tmp-crun.67iZVj.mount: Deactivated successfully.
Dec 06 08:40:55 np0005548789.localdomain podman[81291]: 2025-12-06 08:40:55.000922559 +0000 UTC m=+0.153660121 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=metrics_qdr)
Dec 06 08:40:55 np0005548789.localdomain podman[81290]: 2025-12-06 08:40:54.955865852 +0000 UTC m=+0.111077318 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, container_name=ovn_controller, io.openshift.expose-services=, distribution-scope=public, 
name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:40:55 np0005548789.localdomain podman[81290]: 2025-12-06 08:40:55.039425087 +0000 UTC m=+0.194636563 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:40:55 np0005548789.localdomain podman[81292]: 2025-12-06 08:40:55.053364419 +0000 UTC m=+0.198193211 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent)
Dec 06 08:40:55 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:40:55 np0005548789.localdomain podman[81292]: 2025-12-06 08:40:55.126385614 +0000 UTC m=+0.271214416 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, architecture=x86_64, vendor=Red Hat, Inc.)
Dec 06 08:40:55 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:40:55 np0005548789.localdomain podman[81291]: 2025-12-06 08:40:55.214421633 +0000 UTC m=+0.367159185 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, 
config_id=tripleo_step1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:40:55 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:40:56 np0005548789.localdomain rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 08:40:56 np0005548789.localdomain rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 08:40:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:40:59 np0005548789.localdomain podman[81492]: 2025-12-06 08:40:59.908892555 +0000 UTC m=+0.068994473 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, version=17.1.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:40:59 np0005548789.localdomain podman[81492]: 2025-12-06 08:40:59.941103192 +0000 UTC m=+0.101205110 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, release=1761123044, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z)
Dec 06 08:40:59 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:41:02 np0005548789.localdomain sudo[81284]: pam_unix(sudo:session): session closed for user root
Dec 06 08:41:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:41:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:41:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:41:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:41:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:41:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:41:20 np0005548789.localdomain systemd[1]: tmp-crun.8OjNL9.mount: Deactivated successfully.
Dec 06 08:41:20 np0005548789.localdomain podman[81577]: 2025-12-06 08:41:20.947247016 +0000 UTC m=+0.103269703 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-cron, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:41:20 np0005548789.localdomain podman[81577]: 2025-12-06 08:41:20.985103764 +0000 UTC m=+0.141126441 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, distribution-scope=public, vcs-type=git, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:41:20 np0005548789.localdomain systemd[1]: tmp-crun.cgMslG.mount: Deactivated successfully.
Dec 06 08:41:20 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:41:21 np0005548789.localdomain podman[81579]: 2025-12-06 08:41:21.001353797 +0000 UTC m=+0.153780694 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target)
Dec 06 08:41:21 np0005548789.localdomain podman[81578]: 2025-12-06 08:41:21.049089985 +0000 UTC m=+0.203105451 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:41:21 np0005548789.localdomain podman[81578]: 2025-12-06 08:41:21.090970584 +0000 UTC m=+0.244986070 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack 
Platform 17.1 collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, tcib_managed=true)
Dec 06 08:41:21 np0005548789.localdomain podman[81580]: 2025-12-06 08:41:21.101503203 +0000 UTC m=+0.247783484 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 
17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z)
Dec 06 08:41:21 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:41:21 np0005548789.localdomain podman[81586]: 2025-12-06 08:41:21.159617776 +0000 UTC m=+0.304418303 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, release=1761123044, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container)
Dec 06 08:41:21 np0005548789.localdomain podman[81586]: 2025-12-06 08:41:21.173075904 +0000 UTC m=+0.317876451 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, 
url=https://www.redhat.com, container_name=iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container)
Dec 06 08:41:21 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:41:21 np0005548789.localdomain podman[81580]: 2025-12-06 08:41:21.189228834 +0000 UTC m=+0.335509125 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible)
Dec 06 08:41:21 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:41:21 np0005548789.localdomain podman[81599]: 2025-12-06 08:41:21.25769095 +0000 UTC m=+0.398331470 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:12:45Z)
Dec 06 08:41:21 np0005548789.localdomain podman[81599]: 2025-12-06 08:41:21.290935188 +0000 UTC m=+0.431575708 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, release=1761123044, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:41:21 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:41:21 np0005548789.localdomain podman[81579]: 2025-12-06 08:41:21.359152917 +0000 UTC m=+0.511579864 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:36:58Z, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Dec 06 08:41:21 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:41:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:41:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:41:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:41:25 np0005548789.localdomain podman[81751]: 2025-12-06 08:41:25.920619047 +0000 UTC m=+0.083516584 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team)
Dec 06 08:41:25 np0005548789.localdomain podman[81751]: 2025-12-06 08:41:25.975702836 +0000 UTC m=+0.138600373 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public)
Dec 06 08:41:25 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:41:26 np0005548789.localdomain podman[81753]: 2025-12-06 08:41:25.976709488 +0000 UTC m=+0.133824340 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12)
Dec 06 08:41:26 np0005548789.localdomain podman[81752]: 2025-12-06 08:41:26.033881181 +0000 UTC m=+0.193914271 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd)
Dec 06 08:41:26 np0005548789.localdomain podman[81753]: 2025-12-06 08:41:26.056113305 +0000 UTC m=+0.213228077 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, config_id=tripleo_step4, version=17.1.12)
Dec 06 08:41:26 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:41:26 np0005548789.localdomain podman[81752]: 2025-12-06 08:41:26.235111014 +0000 UTC m=+0.395144114 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team)
Dec 06 08:41:26 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:41:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:41:30 np0005548789.localdomain podman[81823]: 2025-12-06 08:41:30.915248651 +0000 UTC m=+0.077272244 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, vcs-type=git, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Dec 06 08:41:31 np0005548789.localdomain podman[81823]: 2025-12-06 08:41:31.016202262 +0000 UTC m=+0.178225805 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044)
Dec 06 08:41:31 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:41:44 np0005548789.localdomain sudo[81849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:41:44 np0005548789.localdomain sudo[81849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:41:44 np0005548789.localdomain sudo[81849]: pam_unix(sudo:session): session closed for user root
Dec 06 08:41:44 np0005548789.localdomain sudo[81864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:41:44 np0005548789.localdomain sudo[81864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:41:44 np0005548789.localdomain sshd[81879]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:41:44 np0005548789.localdomain sshd[81879]: Invalid user sol from 92.118.39.95 port 51282
Dec 06 08:41:44 np0005548789.localdomain sudo[81864]: pam_unix(sudo:session): session closed for user root
Dec 06 08:41:44 np0005548789.localdomain sshd[81879]: Connection closed by invalid user sol 92.118.39.95 port 51282 [preauth]
Dec 06 08:41:45 np0005548789.localdomain sshd[81914]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:41:45 np0005548789.localdomain sudo[81915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:41:45 np0005548789.localdomain sudo[81915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:41:45 np0005548789.localdomain sudo[81915]: pam_unix(sudo:session): session closed for user root
Dec 06 08:41:47 np0005548789.localdomain sshd[81914]: Connection reset by authenticating user root 45.135.232.92 port 51076 [preauth]
Dec 06 08:41:48 np0005548789.localdomain sshd[81931]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:41:49 np0005548789.localdomain sshd[81931]: Invalid user ubnt from 45.135.232.92 port 51088
Dec 06 08:41:49 np0005548789.localdomain sshd[81931]: Connection reset by invalid user ubnt 45.135.232.92 port 51088 [preauth]
Dec 06 08:41:49 np0005548789.localdomain sshd[81933]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:41:51 np0005548789.localdomain sshd[81933]: Invalid user admin from 45.135.232.92 port 51102
Dec 06 08:41:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:41:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:41:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:41:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:41:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:41:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:41:51 np0005548789.localdomain systemd[1]: tmp-crun.vAIxSi.mount: Deactivated successfully.
Dec 06 08:41:51 np0005548789.localdomain podman[81937]: 2025-12-06 08:41:51.753038358 +0000 UTC m=+0.124134206 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, vcs-type=git, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:41:51 np0005548789.localdomain podman[81936]: 2025-12-06 08:41:51.766017261 +0000 UTC m=+0.139231462 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-collectd-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 06 08:41:51 np0005548789.localdomain podman[81938]: 2025-12-06 08:41:51.809901132 +0000 UTC m=+0.171642255 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team)
Dec 06 08:41:51 np0005548789.localdomain podman[81936]: 2025-12-06 08:41:51.824073172 +0000 UTC m=+0.197287393 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step3, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true)
Dec 06 08:41:51 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:41:51 np0005548789.localdomain podman[81945]: 2025-12-06 08:41:51.873040938 +0000 UTC m=+0.232183632 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12)
Dec 06 08:41:51 np0005548789.localdomain podman[81945]: 2025-12-06 08:41:51.880750701 +0000 UTC m=+0.239893375 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, distribution-scope=public, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com)
Dec 06 08:41:51 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:41:51 np0005548789.localdomain podman[81955]: 2025-12-06 08:41:51.936643406 +0000 UTC m=+0.246122565 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12)
Dec 06 08:41:51 np0005548789.localdomain podman[81938]: 2025-12-06 08:41:51.945267917 +0000 UTC m=+0.307009030 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git)
Dec 06 08:41:51 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:41:51 np0005548789.localdomain podman[81935]: 2025-12-06 08:41:51.85628376 +0000 UTC m=+0.228414569 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=logrotate_crond, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:41:51 np0005548789.localdomain podman[81935]: 2025-12-06 08:41:51.989434867 +0000 UTC m=+0.361565706 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red 
Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:41:52 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:41:52 np0005548789.localdomain podman[81955]: 2025-12-06 08:41:52.04560934 +0000 UTC m=+0.355088549 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, release=1761123044, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:41:52 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:41:52 np0005548789.localdomain podman[81937]: 2025-12-06 08:41:52.110276261 +0000 UTC m=+0.481372159 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, vcs-type=git, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Dec 06 08:41:52 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:41:52 np0005548789.localdomain sshd[81933]: Connection reset by invalid user admin 45.135.232.92 port 51102 [preauth]
Dec 06 08:41:52 np0005548789.localdomain sshd[82066]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:41:54 np0005548789.localdomain sshd[82066]: Connection reset by authenticating user ftp 45.135.232.92 port 51104 [preauth]
Dec 06 08:41:54 np0005548789.localdomain sshd[82069]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:41:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:41:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:41:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:41:56 np0005548789.localdomain systemd[1]: tmp-crun.8zuOwL.mount: Deactivated successfully.
Dec 06 08:41:56 np0005548789.localdomain podman[82071]: 2025-12-06 08:41:56.944319727 +0000 UTC m=+0.101078957 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, 
name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public)
Dec 06 08:41:56 np0005548789.localdomain podman[82071]: 2025-12-06 08:41:56.990328712 +0000 UTC m=+0.147087912 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=)
Dec 06 08:41:56 np0005548789.localdomain systemd[1]: tmp-crun.KuPQqh.mount: Deactivated successfully.
Dec 06 08:41:57 np0005548789.localdomain podman[82072]: 2025-12-06 08:41:56.999667185 +0000 UTC m=+0.154792956 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true)
Dec 06 08:41:57 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:41:57 np0005548789.localdomain podman[82073]: 2025-12-06 08:41:57.052977612 +0000 UTC m=+0.204542164 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 08:41:57 np0005548789.localdomain podman[82073]: 2025-12-06 08:41:57.106125314 +0000 UTC m=+0.257689846 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible)
Dec 06 08:41:57 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:41:57 np0005548789.localdomain sshd[82069]: Invalid user admin from 45.135.232.92 port 21508
Dec 06 08:41:57 np0005548789.localdomain podman[82072]: 2025-12-06 08:41:57.222162202 +0000 UTC m=+0.377288023 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, batch=17.1_20251118.1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, 
build-date=2025-11-18T22:49:46Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:41:57 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:41:57 np0005548789.localdomain sshd[82069]: Connection reset by invalid user admin 45.135.232.92 port 21508 [preauth]
Dec 06 08:42:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:42:02 np0005548789.localdomain podman[82148]: 2025-12-06 08:42:02.045137031 +0000 UTC m=+0.051017059 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, container_name=nova_compute, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team)
Dec 06 08:42:02 np0005548789.localdomain podman[82148]: 2025-12-06 08:42:02.069050046 +0000 UTC m=+0.074930074 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, batch=17.1_20251118.1)
Dec 06 08:42:02 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:42:02 np0005548789.localdomain sshd[80232]: Received disconnect from 38.102.83.114 port 37538:11: disconnected by user
Dec 06 08:42:02 np0005548789.localdomain sshd[80232]: Disconnected from user zuul 38.102.83.114 port 37538
Dec 06 08:42:02 np0005548789.localdomain sshd[80229]: pam_unix(sshd:session): session closed for user zuul
Dec 06 08:42:02 np0005548789.localdomain systemd[1]: session-34.scope: Deactivated successfully.
Dec 06 08:42:02 np0005548789.localdomain systemd[1]: session-34.scope: Consumed 13.946s CPU time.
Dec 06 08:42:02 np0005548789.localdomain systemd-logind[766]: Session 34 logged out. Waiting for processes to exit.
Dec 06 08:42:02 np0005548789.localdomain systemd-logind[766]: Removed session 34.
Dec 06 08:42:04 np0005548789.localdomain sshd[82174]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:42:04 np0005548789.localdomain sshd[82174]: Accepted publickey for zuul from 38.102.83.114 port 42448 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 08:42:04 np0005548789.localdomain systemd-logind[766]: New session 35 of user zuul.
Dec 06 08:42:04 np0005548789.localdomain systemd[1]: Started Session 35 of User zuul.
Dec 06 08:42:04 np0005548789.localdomain sshd[82174]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 08:42:04 np0005548789.localdomain sudo[82191]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsuzypcfswyjqxbmjnlwqssumzhovuxk ; /usr/bin/python3
Dec 06 08:42:04 np0005548789.localdomain sudo[82191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 08:42:04 np0005548789.localdomain python3[82193]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 08:42:08 np0005548789.localdomain rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 08:42:08 np0005548789.localdomain rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 08:42:11 np0005548789.localdomain sudo[82191]: pam_unix(sudo:session): session closed for user root
Dec 06 08:42:16 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:42:16 np0005548789.localdomain recover_tripleo_nova_virtqemud[82382]: 61814
Dec 06 08:42:16 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:42:16 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:42:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:42:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:42:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:42:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:42:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:42:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:42:22 np0005548789.localdomain podman[82406]: 2025-12-06 08:42:22.929547802 +0000 UTC m=+0.090593779 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, release=1761123044, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:42:22 np0005548789.localdomain podman[82406]: 2025-12-06 08:42:22.942252136 +0000 UTC m=+0.103298093 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, container_name=logrotate_crond, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Dec 06 08:42:22 np0005548789.localdomain systemd[1]: tmp-crun.osZj4M.mount: Deactivated successfully.
Dec 06 08:42:22 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:42:22 np0005548789.localdomain podman[82426]: 2025-12-06 08:42:22.955378075 +0000 UTC m=+0.098025664 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Dec 06 08:42:23 np0005548789.localdomain podman[82408]: 2025-12-06 08:42:22.999425581 +0000 UTC m=+0.155196438 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:42:23 np0005548789.localdomain podman[82409]: 2025-12-06 08:42:23.038450414 +0000 UTC m=+0.189958041 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public)
Dec 06 08:42:23 np0005548789.localdomain podman[82407]: 2025-12-06 08:42:23.048864959 +0000 UTC m=+0.206184003 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, config_id=tripleo_step3, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:42:23 np0005548789.localdomain podman[82426]: 2025-12-06 08:42:23.080265663 +0000 UTC m=+0.222913282 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 
17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi)
Dec 06 08:42:23 np0005548789.localdomain podman[82409]: 2025-12-06 08:42:23.089451681 +0000 UTC m=+0.240959328 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git)
Dec 06 08:42:23 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:42:23 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:42:23 np0005548789.localdomain podman[82407]: 2025-12-06 08:42:23.136216448 +0000 UTC m=+0.293535512 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public)
Dec 06 08:42:23 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:42:23 np0005548789.localdomain podman[82415]: 2025-12-06 08:42:23.091347298 +0000 UTC m=+0.238203165 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-type=git, container_name=iscsid, description=Red Hat OpenStack 
Platform 17.1 iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:42:23 np0005548789.localdomain podman[82415]: 2025-12-06 08:42:23.220929118 +0000 UTC m=+0.367784935 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, vcs-type=git, 
maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, tcib_managed=true, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044)
Dec 06 08:42:23 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:42:23 np0005548789.localdomain podman[82408]: 2025-12-06 08:42:23.338544845 +0000 UTC m=+0.494315712 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, container_name=nova_migration_target, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, release=1761123044)
Dec 06 08:42:23 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:42:23 np0005548789.localdomain systemd[1]: tmp-crun.glsk2O.mount: Deactivated successfully.
Dec 06 08:42:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:42:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:42:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:42:27 np0005548789.localdomain podman[82562]: 2025-12-06 08:42:27.916872714 +0000 UTC m=+0.077180642 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 
qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4)
Dec 06 08:42:27 np0005548789.localdomain podman[82561]: 2025-12-06 08:42:27.980116282 +0000 UTC m=+0.141319267 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 08:42:28 np0005548789.localdomain podman[82561]: 2025-12-06 08:42:28.001062497 +0000 UTC m=+0.162265462 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, container_name=ovn_controller)
Dec 06 08:42:28 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:42:28 np0005548789.localdomain podman[82563]: 2025-12-06 08:42:28.09611309 +0000 UTC m=+0.250434546 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
tcib_managed=true, release=1761123044, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:42:28 np0005548789.localdomain podman[82562]: 2025-12-06 08:42:28.11031755 +0000 UTC m=+0.270625508 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, url=https://www.redhat.com)
Dec 06 08:42:28 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:42:28 np0005548789.localdomain podman[82563]: 2025-12-06 08:42:28.178311412 +0000 UTC m=+0.332632808 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z)
Dec 06 08:42:28 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:42:28 np0005548789.localdomain systemd[1]: tmp-crun.r660r9.mount: Deactivated successfully.
Dec 06 08:42:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:42:32 np0005548789.localdomain podman[82634]: 2025-12-06 08:42:32.921632736 +0000 UTC m=+0.082637158 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, vcs-type=git, version=17.1.12, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044)
Dec 06 08:42:32 np0005548789.localdomain podman[82634]: 2025-12-06 08:42:32.975260162 +0000 UTC m=+0.136264514 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute)
Dec 06 08:42:32 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:42:33 np0005548789.localdomain python3[82673]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Dec 06 08:42:45 np0005548789.localdomain sudo[82674]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:42:45 np0005548789.localdomain sudo[82674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:42:45 np0005548789.localdomain sudo[82674]: pam_unix(sudo:session): session closed for user root
Dec 06 08:42:45 np0005548789.localdomain sudo[82689]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:42:45 np0005548789.localdomain sudo[82689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:42:46 np0005548789.localdomain podman[82777]: 2025-12-06 08:42:46.62804494 +0000 UTC m=+0.116337628 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, com.redhat.component=rhceph-container, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, GIT_BRANCH=main, name=rhceph, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.expose-services=, RELEASE=main, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 08:42:46 np0005548789.localdomain podman[82777]: 2025-12-06 08:42:46.759130466 +0000 UTC m=+0.247423174 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.buildah.version=1.41.4, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vcs-type=git, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=1763362218, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 08:42:47 np0005548789.localdomain sudo[82689]: pam_unix(sudo:session): session closed for user root
Dec 06 08:42:47 np0005548789.localdomain sudo[82843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:42:47 np0005548789.localdomain sudo[82843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:42:47 np0005548789.localdomain sudo[82843]: pam_unix(sudo:session): session closed for user root
Dec 06 08:42:47 np0005548789.localdomain sudo[82858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:42:47 np0005548789.localdomain sudo[82858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:42:47 np0005548789.localdomain sudo[82858]: pam_unix(sudo:session): session closed for user root
Dec 06 08:42:48 np0005548789.localdomain sudo[82905]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:42:48 np0005548789.localdomain sudo[82905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:42:48 np0005548789.localdomain sudo[82905]: pam_unix(sudo:session): session closed for user root
Dec 06 08:42:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:42:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:42:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:42:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:42:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:42:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:42:53 np0005548789.localdomain podman[82924]: 2025-12-06 08:42:53.959347586 +0000 UTC m=+0.101532370 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.component=openstack-iscsid-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:42:53 np0005548789.localdomain podman[82924]: 2025-12-06 08:42:53.997162713 +0000 UTC m=+0.139347547 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:42:54 np0005548789.localdomain podman[82922]: 2025-12-06 08:42:54.00528771 +0000 UTC m=+0.158980863 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1761123044, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_migration_target)
Dec 06 08:42:54 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:42:54 np0005548789.localdomain podman[82920]: 2025-12-06 08:42:53.941879916 +0000 UTC m=+0.099153817 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:42:54 np0005548789.localdomain podman[82923]: 2025-12-06 08:42:54.083230862 +0000 UTC m=+0.234571363 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:42:54 np0005548789.localdomain podman[82936]: 2025-12-06 08:42:54.108465818 +0000 UTC m=+0.249758335 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, vcs-type=git, 
release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 06 08:42:54 np0005548789.localdomain podman[82936]: 2025-12-06 08:42:54.192362782 +0000 UTC m=+0.333655319 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:42:54 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:42:54 np0005548789.localdomain podman[82921]: 2025-12-06 08:42:54.161869518 +0000 UTC m=+0.319689537 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, maintainer=OpenStack TripleO Team)
Dec 06 08:42:54 np0005548789.localdomain podman[82923]: 2025-12-06 08:42:54.217233116 +0000 UTC m=+0.368573587 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Dec 06 08:42:54 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:42:54 np0005548789.localdomain podman[82920]: 2025-12-06 08:42:54.239176042 +0000 UTC m=+0.396449953 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, version=17.1.12, tcib_managed=true, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=)
Dec 06 08:42:54 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:42:54 np0005548789.localdomain podman[82921]: 2025-12-06 08:42:54.292693185 +0000 UTC m=+0.450513234 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-type=git, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64)
Dec 06 08:42:54 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:42:54 np0005548789.localdomain podman[82922]: 2025-12-06 08:42:54.406231858 +0000 UTC m=+0.559925041 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1)
Dec 06 08:42:54 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:42:54 np0005548789.localdomain systemd[1]: tmp-crun.2RJIjD.mount: Deactivated successfully.
Dec 06 08:42:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:42:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:42:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:42:58 np0005548789.localdomain systemd[1]: tmp-crun.aRP2ZR.mount: Deactivated successfully.
Dec 06 08:42:58 np0005548789.localdomain podman[83053]: 2025-12-06 08:42:58.985615461 +0000 UTC m=+0.142970627 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vcs-type=git, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:42:59 np0005548789.localdomain podman[83053]: 2025-12-06 08:42:59.041230928 +0000 UTC m=+0.198586124 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T23:34:05Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Dec 06 08:42:59 np0005548789.localdomain podman[83054]: 2025-12-06 08:42:58.951460705 +0000 UTC m=+0.103503840 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:42:59 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:42:59 np0005548789.localdomain podman[83055]: 2025-12-06 08:42:59.0452449 +0000 UTC m=+0.194086418 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4)
Dec 06 08:42:59 np0005548789.localdomain podman[83055]: 2025-12-06 08:42:59.126535125 +0000 UTC m=+0.275376573 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Dec 06 08:42:59 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:42:59 np0005548789.localdomain podman[83054]: 2025-12-06 08:42:59.151220304 +0000 UTC m=+0.303263439 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=metrics_qdr, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:42:59 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:43:00 np0005548789.localdomain sshd[83127]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:43:00 np0005548789.localdomain sshd[83129]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:43:00 np0005548789.localdomain sshd[83129]: error: kex_exchange_identification: client sent invalid protocol identifier "MGLNDD_38.102.83.150_22"
Dec 06 08:43:00 np0005548789.localdomain sshd[83129]: banner exchange: Connection from 20.64.105.32 port 39292: invalid format
Dec 06 08:43:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:43:03 np0005548789.localdomain podman[83130]: 2025-12-06 08:43:03.90781954 +0000 UTC m=+0.069480268 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:43:03 np0005548789.localdomain podman[83130]: 2025-12-06 08:43:03.954373111 +0000 UTC m=+0.116033819 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, managed_by=tripleo_ansible, config_id=tripleo_step5, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:43:03 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:43:10 np0005548789.localdomain sshd[83127]: Connection closed by 20.64.105.32 port 39284 [preauth]
Dec 06 08:43:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:43:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:43:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:43:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:43:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:43:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:43:24 np0005548789.localdomain systemd[1]: tmp-crun.PNoK57.mount: Deactivated successfully.
Dec 06 08:43:24 np0005548789.localdomain podman[83202]: 2025-12-06 08:43:24.927463995 +0000 UTC m=+0.086847425 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, tcib_managed=true)
Dec 06 08:43:24 np0005548789.localdomain podman[83203]: 2025-12-06 08:43:24.986071022 +0000 UTC m=+0.138582984 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, vcs-type=git, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:43:25 np0005548789.localdomain podman[83202]: 2025-12-06 08:43:25.010405389 +0000 UTC m=+0.169788829 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, name=rhosp17/openstack-collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:43:25 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:43:25 np0005548789.localdomain podman[83204]: 2025-12-06 08:43:25.103066229 +0000 UTC m=+0.255840799 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=ceilometer_agent_compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044)
Dec 06 08:43:25 np0005548789.localdomain podman[83204]: 2025-12-06 08:43:25.133986838 +0000 UTC m=+0.286761438 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1)
Dec 06 08:43:25 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:43:25 np0005548789.localdomain podman[83217]: 2025-12-06 08:43:25.148524669 +0000 UTC m=+0.293095660 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4)
Dec 06 08:43:25 np0005548789.localdomain podman[83201]: 2025-12-06 08:43:25.19901489 +0000 UTC m=+0.360144873 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, container_name=logrotate_crond, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:43:25 np0005548789.localdomain podman[83211]: 2025-12-06 08:43:24.965656102 +0000 UTC m=+0.111384558 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, container_name=iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:43:25 np0005548789.localdomain podman[83217]: 2025-12-06 08:43:25.225504313 +0000 UTC m=+0.370075304 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:43:25 np0005548789.localdomain podman[83201]: 2025-12-06 08:43:25.23433658 +0000 UTC m=+0.395466633 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:43:25 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:43:25 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:43:25 np0005548789.localdomain podman[83211]: 2025-12-06 08:43:25.25213149 +0000 UTC m=+0.397859876 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044)
Dec 06 08:43:25 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:43:25 np0005548789.localdomain podman[83203]: 2025-12-06 08:43:25.376074049 +0000 UTC m=+0.528586011 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com)
Dec 06 08:43:25 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:43:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:43:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:43:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:43:29 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:43:29 np0005548789.localdomain recover_tripleo_nova_virtqemud[83342]: 61814
Dec 06 08:43:29 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:43:29 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:43:29 np0005548789.localdomain podman[83329]: 2025-12-06 08:43:29.928224417 +0000 UTC m=+0.085540195 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:43:29 np0005548789.localdomain podman[83328]: 2025-12-06 08:43:29.989278519 +0000 UTC m=+0.148719651 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, 
distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 08:43:30 np0005548789.localdomain podman[83328]: 2025-12-06 08:43:30.021124155 +0000 UTC m=+0.180565267 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_id=tripleo_step4, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 08:43:30 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:43:30 np0005548789.localdomain podman[83330]: 2025-12-06 08:43:30.092880881 +0000 UTC m=+0.245210938 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:43:30 np0005548789.localdomain podman[83330]: 2025-12-06 08:43:30.120139117 +0000 UTC m=+0.272469184 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64)
Dec 06 08:43:30 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:43:30 np0005548789.localdomain podman[83329]: 2025-12-06 08:43:30.155221741 +0000 UTC m=+0.312537439 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc.)
Dec 06 08:43:30 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:43:33 np0005548789.localdomain sshd[82177]: Received disconnect from 38.102.83.114 port 42448:11: disconnected by user
Dec 06 08:43:33 np0005548789.localdomain sshd[82177]: Disconnected from user zuul 38.102.83.114 port 42448
Dec 06 08:43:33 np0005548789.localdomain sshd[82174]: pam_unix(sshd:session): session closed for user zuul
Dec 06 08:43:33 np0005548789.localdomain systemd[1]: session-35.scope: Deactivated successfully.
Dec 06 08:43:33 np0005548789.localdomain systemd[1]: session-35.scope: Consumed 5.876s CPU time.
Dec 06 08:43:33 np0005548789.localdomain systemd-logind[766]: Session 35 logged out. Waiting for processes to exit.
Dec 06 08:43:33 np0005548789.localdomain systemd-logind[766]: Removed session 35.
Dec 06 08:43:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:43:34 np0005548789.localdomain podman[83404]: 2025-12-06 08:43:34.913391285 +0000 UTC m=+0.075191921 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=nova_compute, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.buildah.version=1.41.4)
Dec 06 08:43:34 np0005548789.localdomain podman[83404]: 2025-12-06 08:43:34.944187009 +0000 UTC m=+0.105987715 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, architecture=x86_64, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, version=17.1.12)
Dec 06 08:43:34 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:43:48 np0005548789.localdomain sudo[83431]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:43:48 np0005548789.localdomain sudo[83431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:43:48 np0005548789.localdomain sudo[83431]: pam_unix(sudo:session): session closed for user root
Dec 06 08:43:48 np0005548789.localdomain sudo[83446]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:43:48 np0005548789.localdomain sudo[83446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:43:49 np0005548789.localdomain sudo[83446]: pam_unix(sudo:session): session closed for user root
Dec 06 08:43:50 np0005548789.localdomain sudo[83493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:43:50 np0005548789.localdomain sudo[83493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:43:50 np0005548789.localdomain sudo[83493]: pam_unix(sudo:session): session closed for user root
Dec 06 08:43:50 np0005548789.localdomain sshd[83508]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:43:51 np0005548789.localdomain sshd[83508]: Invalid user validator from 92.118.39.95 port 37990
Dec 06 08:43:51 np0005548789.localdomain sshd[83508]: Connection closed by invalid user validator 92.118.39.95 port 37990 [preauth]
Dec 06 08:43:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:43:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:43:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:43:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:43:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:43:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:43:55 np0005548789.localdomain systemd[1]: tmp-crun.JtGThZ.mount: Deactivated successfully.
Dec 06 08:43:55 np0005548789.localdomain podman[83531]: 2025-12-06 08:43:55.986295755 +0000 UTC m=+0.104614440 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, vcs-type=git, io.buildah.version=1.41.4, 
managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044)
Dec 06 08:43:56 np0005548789.localdomain podman[83524]: 2025-12-06 08:43:56.020998036 +0000 UTC m=+0.142619032 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1761123044, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com)
Dec 06 08:43:56 np0005548789.localdomain podman[83524]: 2025-12-06 08:43:56.055402418 +0000 UTC m=+0.177023444 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:43:56 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:43:56 np0005548789.localdomain podman[83531]: 2025-12-06 08:43:56.065983752 +0000 UTC m=+0.184302427 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:43:56 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:43:56 np0005548789.localdomain podman[83510]: 2025-12-06 08:43:56.072409268 +0000 UTC m=+0.206704151 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1)
Dec 06 08:43:56 np0005548789.localdomain podman[83511]: 2025-12-06 08:43:56.1309997 +0000 UTC m=+0.263155679 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, container_name=collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:43:56 np0005548789.localdomain podman[83511]: 2025-12-06 08:43:56.139326545 +0000 UTC m=+0.271482574 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:51:28Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com)
Dec 06 08:43:56 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:43:56 np0005548789.localdomain podman[83510]: 2025-12-06 08:43:56.203004422 +0000 UTC m=+0.337299335 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044)
Dec 06 08:43:56 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:43:56 np0005548789.localdomain podman[83513]: 2025-12-06 08:43:56.193619335 +0000 UTC m=+0.318932984 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:43:56 np0005548789.localdomain podman[83512]: 2025-12-06 08:43:56.274794438 +0000 UTC m=+0.404638166 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:43:56 np0005548789.localdomain podman[83513]: 2025-12-06 08:43:56.327557751 +0000 UTC m=+0.452871440 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, distribution-scope=public)
Dec 06 08:43:56 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:43:56 np0005548789.localdomain podman[83512]: 2025-12-06 08:43:56.652175977 +0000 UTC m=+0.782019715 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:43:56 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:43:56 np0005548789.localdomain systemd[1]: tmp-crun.nqTMms.mount: Deactivated successfully.
Dec 06 08:44:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:44:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:44:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:44:00 np0005548789.localdomain systemd[1]: tmp-crun.dz3wuw.mount: Deactivated successfully.
Dec 06 08:44:00 np0005548789.localdomain podman[83645]: 2025-12-06 08:44:00.928534156 +0000 UTC m=+0.089530039 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:44:00 np0005548789.localdomain podman[83644]: 2025-12-06 08:44:00.986349554 +0000 UTC m=+0.148680108 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:44:01 np0005548789.localdomain podman[83645]: 2025-12-06 08:44:01.016057852 +0000 UTC m=+0.177053725 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack 
TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:44:01 np0005548789.localdomain podman[83643]: 2025-12-06 08:44:00.965700882 +0000 UTC m=+0.129155700 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, container_name=ovn_controller, architecture=x86_64)
Dec 06 08:44:01 np0005548789.localdomain podman[83643]: 2025-12-06 08:44:01.048155124 +0000 UTC m=+0.211609892 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z)
Dec 06 08:44:01 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:44:01 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:44:01 np0005548789.localdomain podman[83644]: 2025-12-06 08:44:01.247249522 +0000 UTC m=+0.409580106 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, container_name=metrics_qdr, version=17.1.12, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 08:44:01 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:44:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:44:05 np0005548789.localdomain podman[83720]: 2025-12-06 08:44:05.945713198 +0000 UTC m=+0.108740447 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:44:05 np0005548789.localdomain podman[83720]: 2025-12-06 08:44:05.971326111 +0000 UTC m=+0.134353400 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step5, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-type=git, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:44:05 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:44:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:44:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:44:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:44:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:44:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:44:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:44:26 np0005548789.localdomain podman[83805]: 2025-12-06 08:44:26.932533883 +0000 UTC m=+0.081640417 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:44:26 np0005548789.localdomain podman[83791]: 2025-12-06 08:44:26.97721368 +0000 UTC m=+0.133405180 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1761123044, vcs-type=git, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step3, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64)
Dec 06 08:44:26 np0005548789.localdomain podman[83791]: 2025-12-06 08:44:26.993942041 +0000 UTC m=+0.150133531 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, 
managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd)
Dec 06 08:44:27 np0005548789.localdomain podman[83797]: 2025-12-06 08:44:27.003089452 +0000 UTC m=+0.155770115 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true)
Dec 06 08:44:27 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:44:27 np0005548789.localdomain podman[83805]: 2025-12-06 08:44:27.011998814 +0000 UTC m=+0.161105418 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:44:27 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:44:27 np0005548789.localdomain podman[83799]: 2025-12-06 08:44:27.098936143 +0000 UTC m=+0.246484089 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step3, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Dec 06 08:44:27 np0005548789.localdomain podman[83790]: 2025-12-06 08:44:27.076981911 +0000 UTC m=+0.240469275 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, architecture=x86_64, release=1761123044, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-cron)
Dec 06 08:44:27 np0005548789.localdomain podman[83799]: 2025-12-06 08:44:27.138478802 +0000 UTC m=+0.286026808 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, container_name=iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_id=tripleo_step3, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:44:27 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:44:27 np0005548789.localdomain podman[83798]: 2025-12-06 08:44:27.189302115 +0000 UTC m=+0.337867143 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4)
Dec 06 08:44:27 np0005548789.localdomain podman[83790]: 2025-12-06 08:44:27.210664099 +0000 UTC m=+0.374151503 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Dec 06 08:44:27 np0005548789.localdomain podman[83798]: 2025-12-06 08:44:27.223251783 +0000 UTC m=+0.371816891 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, 
name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, container_name=ceilometer_agent_compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:44:27 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:44:27 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:44:27 np0005548789.localdomain podman[83797]: 2025-12-06 08:44:27.382111041 +0000 UTC m=+0.534791614 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible)
Dec 06 08:44:27 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:44:27 np0005548789.localdomain systemd[1]: tmp-crun.Vc2xLx.mount: Deactivated successfully.
Dec 06 08:44:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:44:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:44:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:44:31 np0005548789.localdomain podman[83921]: 2025-12-06 08:44:31.908471925 +0000 UTC m=+0.076823751 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, container_name=ovn_controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com)
Dec 06 08:44:31 np0005548789.localdomain podman[83928]: 2025-12-06 08:44:31.917995526 +0000 UTC m=+0.076182811 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, distribution-scope=public, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:44:31 np0005548789.localdomain systemd[1]: tmp-crun.cso6LJ.mount: Deactivated successfully.
Dec 06 08:44:31 np0005548789.localdomain podman[83921]: 2025-12-06 08:44:31.966970004 +0000 UTC m=+0.135321810 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, 
name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, io.openshift.expose-services=)
Dec 06 08:44:31 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:44:31 np0005548789.localdomain podman[83928]: 2025-12-06 08:44:31.998646062 +0000 UTC m=+0.156833317 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 08:44:32 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:44:32 np0005548789.localdomain podman[83922]: 2025-12-06 08:44:31.968321885 +0000 UTC m=+0.129623635 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git)
Dec 06 08:44:32 np0005548789.localdomain podman[83922]: 2025-12-06 08:44:32.154979913 +0000 UTC m=+0.316281713 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12)
Dec 06 08:44:32 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:44:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:44:36 np0005548789.localdomain systemd[1]: tmp-crun.NtiYjc.mount: Deactivated successfully.
Dec 06 08:44:36 np0005548789.localdomain podman[84105]: 2025-12-06 08:44:36.936899481 +0000 UTC m=+0.094861742 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, container_name=nova_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 08:44:36 np0005548789.localdomain podman[84105]: 2025-12-06 08:44:36.961493392 +0000 UTC m=+0.119455703 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:44:36 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:44:40 np0005548789.localdomain sudo[84398]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmpy9adiokr/privsep.sock
Dec 06 08:44:40 np0005548789.localdomain systemd-logind[766]: Existing logind session ID 28 used by new audit session, ignoring.
Dec 06 08:44:40 np0005548789.localdomain systemd[1]: Created slice User Slice of UID 0.
Dec 06 08:44:41 np0005548789.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 06 08:44:41 np0005548789.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 06 08:44:41 np0005548789.localdomain systemd[1]: Starting User Manager for UID 0...
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: Queued start job for default target Main User Target.
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: Created slice User Application Slice.
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: Reached target Paths.
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: Reached target Timers.
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: Starting D-Bus User Message Bus Socket...
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: Starting Create User's Volatile Files and Directories...
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: Listening on D-Bus User Message Bus Socket.
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: Reached target Sockets.
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: Finished Create User's Volatile Files and Directories.
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: Reached target Basic System.
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: Reached target Main User Target.
Dec 06 08:44:41 np0005548789.localdomain systemd[84400]: Startup finished in 155ms.
Dec 06 08:44:41 np0005548789.localdomain systemd[1]: Started User Manager for UID 0.
Dec 06 08:44:41 np0005548789.localdomain systemd[1]: Started Session c11 of User root.
Dec 06 08:44:41 np0005548789.localdomain sudo[84398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Dec 06 08:44:41 np0005548789.localdomain sudo[84398]: pam_unix(sudo:session): session closed for user root
Dec 06 08:44:42 np0005548789.localdomain kernel: tun: Universal TUN/TAP device driver, 1.6
Dec 06 08:44:42 np0005548789.localdomain kernel: device tap86fc0b7a-fb entered promiscuous mode
Dec 06 08:44:42 np0005548789.localdomain NetworkManager[5973]: <info>  [1765010682.3705] manager: (tap86fc0b7a-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/13)
Dec 06 08:44:42 np0005548789.localdomain systemd-udevd[84436]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 08:44:42 np0005548789.localdomain NetworkManager[5973]: <info>  [1765010682.3850] device (tap86fc0b7a-fb): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 06 08:44:42 np0005548789.localdomain NetworkManager[5973]: <info>  [1765010682.3859] device (tap86fc0b7a-fb): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Dec 06 08:44:42 np0005548789.localdomain systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 06 08:44:42 np0005548789.localdomain systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 06 08:44:42 np0005548789.localdomain systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 06 08:44:42 np0005548789.localdomain systemd-machined[84444]: New machine qemu-1-instance-00000002.
Dec 06 08:44:42 np0005548789.localdomain systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Dec 06 08:44:42 np0005548789.localdomain NetworkManager[5973]: <info>  [1765010682.6084] manager: (tap652b6bdc-40): new Veth device (/org/freedesktop/NetworkManager/Devices/14)
Dec 06 08:44:42 np0005548789.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap652b6bdc-41: link becomes ready
Dec 06 08:44:42 np0005548789.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap652b6bdc-40: link becomes ready
Dec 06 08:44:42 np0005548789.localdomain NetworkManager[5973]: <info>  [1765010682.6733] device (tap652b6bdc-40): carrier: link connected
Dec 06 08:44:42 np0005548789.localdomain kernel: device tap652b6bdc-40 entered promiscuous mode
Dec 06 08:44:44 np0005548789.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 06 08:44:44 np0005548789.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 06 08:44:44 np0005548789.localdomain systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 06 08:44:44 np0005548789.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 06 08:44:44 np0005548789.localdomain sudo[84555]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf ip netns exec ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993 haproxy -f /var/lib/neutron/ovn-metadata-proxy/652b6bdc-40ce-45b7-8aa5-3bca79987993.conf
Dec 06 08:44:44 np0005548789.localdomain sudo[84555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Dec 06 08:44:45 np0005548789.localdomain podman[84581]: 2025-12-06 08:44:45.034213222 +0000 UTC m=+0.096904354 container create 12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:44:45 np0005548789.localdomain podman[84581]: 2025-12-06 08:44:44.989467494 +0000 UTC m=+0.052158666 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Dec 06 08:44:45 np0005548789.localdomain systemd[1]: Started libpod-conmon-12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445.scope.
Dec 06 08:44:45 np0005548789.localdomain systemd[1]: tmp-crun.tFTTaw.mount: Deactivated successfully.
Dec 06 08:44:45 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 08:44:45 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31fbdb956fdb20faf0121dfd2c519c9e748cc292d5fc54ebad7f5d80f477ded1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 08:44:45 np0005548789.localdomain podman[84581]: 2025-12-06 08:44:45.159293967 +0000 UTC m=+0.221985109 container init 12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64)
Dec 06 08:44:45 np0005548789.localdomain podman[84581]: 2025-12-06 08:44:45.176365619 +0000 UTC m=+0.239056761 container start 12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993, distribution-scope=public, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 08:44:45 np0005548789.localdomain sudo[84555]: pam_unix(sudo:session): session closed for user root
Dec 06 08:44:45 np0005548789.localdomain setroubleshoot[84539]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. For complete SELinux messages run: sealert -l 58e2bb45-d8cf-42a0-b321-404a4f96b4c3
Dec 06 08:44:45 np0005548789.localdomain setroubleshoot[84539]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.
                                                                
                                                                *****  Plugin qemu_file_image (98.8 confidence) suggests   *******************
                                                                
                                                                If max_map_count is a virtualization target
                                                                Then you need to change the label on 'max_map_count'
                                                                Do
                                                                # semanage fcontext -a -t virt_image_t 'max_map_count'
                                                                # restorecon -v 'max_map_count'
                                                                
                                                                *****  Plugin catchall (2.13 confidence) suggests   **************************
                                                                
                                                                If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.
                                                                Then you should report this as a bug.
                                                                You can generate a local policy module to allow this access.
                                                                Do
                                                                allow this access for now by executing:
                                                                # ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm
                                                                # semodule -X 300 -i my-qemukvm.pp
                                                                
Dec 06 08:44:50 np0005548789.localdomain sudo[84606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:44:50 np0005548789.localdomain sudo[84606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:44:50 np0005548789.localdomain sudo[84606]: pam_unix(sudo:session): session closed for user root
Dec 06 08:44:50 np0005548789.localdomain sudo[84621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:44:50 np0005548789.localdomain sudo[84621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:44:50 np0005548789.localdomain sudo[84621]: pam_unix(sudo:session): session closed for user root
Dec 06 08:44:51 np0005548789.localdomain sudo[84669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:44:51 np0005548789.localdomain sudo[84669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:44:51 np0005548789.localdomain sudo[84669]: pam_unix(sudo:session): session closed for user root
Dec 06 08:44:54 np0005548789.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 06 08:44:55 np0005548789.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 06 08:44:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:44:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:44:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:44:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:44:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:44:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:44:57 np0005548789.localdomain systemd[1]: tmp-crun.rtckzU.mount: Deactivated successfully.
Dec 06 08:44:57 np0005548789.localdomain podman[84686]: 2025-12-06 08:44:57.94903043 +0000 UTC m=+0.105634440 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, name=rhosp17/openstack-cron, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:44:58 np0005548789.localdomain podman[84700]: 2025-12-06 08:44:57.98597217 +0000 UTC m=+0.130704768 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Dec 06 08:44:58 np0005548789.localdomain podman[84687]: 2025-12-06 08:44:58.053544047 +0000 UTC m=+0.210408206 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, container_name=collectd, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public)
Dec 06 08:44:58 np0005548789.localdomain podman[84700]: 2025-12-06 08:44:58.06805636 +0000 UTC m=+0.212788968 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:44:58 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:44:58 np0005548789.localdomain podman[84688]: 2025-12-06 08:44:58.026608273 +0000 UTC m=+0.181298886 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:44:58 np0005548789.localdomain podman[84686]: 2025-12-06 08:44:58.088457654 +0000 UTC m=+0.245061644 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 
17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible)
Dec 06 08:44:58 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:44:58 np0005548789.localdomain podman[84687]: 2025-12-06 08:44:58.143496958 +0000 UTC m=+0.300361087 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., 
com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:44:58 np0005548789.localdomain podman[84689]: 2025-12-06 08:44:58.150934474 +0000 UTC m=+0.299915332 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:44:58 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:44:58 np0005548789.localdomain podman[84690]: 2025-12-06 08:44:58.205913916 +0000 UTC m=+0.354895074 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, 
release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-iscsid-container)
Dec 06 08:44:58 np0005548789.localdomain podman[84689]: 2025-12-06 08:44:58.211164487 +0000 UTC m=+0.360145315 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:44:58 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:44:58 np0005548789.localdomain podman[84690]: 2025-12-06 08:44:58.241017459 +0000 UTC m=+0.389998577 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, config_id=tripleo_step3, distribution-scope=public)
Dec 06 08:44:58 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:44:58 np0005548789.localdomain podman[84688]: 2025-12-06 08:44:58.356689927 +0000 UTC m=+0.511380460 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4)
Dec 06 08:44:58 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:44:58 np0005548789.localdomain systemd[1]: tmp-crun.swmenC.mount: Deactivated successfully.
Dec 06 08:45:01 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33064 [06/Dec/2025:08:45:00.308] listener listener/metadata 0/0/0/1660/1660 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Dec 06 08:45:02 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33068 [06/Dec/2025:08:45:02.067] listener listener/metadata 0/0/0/14/14 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Dec 06 08:45:02 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33080 [06/Dec/2025:08:45:02.678] listener listener/metadata 0/0/0/12/12 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Dec 06 08:45:02 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33088 [06/Dec/2025:08:45:02.762] listener listener/metadata 0/0/0/12/12 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Dec 06 08:45:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:45:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:45:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:45:02 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33098 [06/Dec/2025:08:45:02.829] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Dec 06 08:45:02 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33110 [06/Dec/2025:08:45:02.883] listener listener/metadata 0/0/0/17/17 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Dec 06 08:45:02 np0005548789.localdomain podman[84820]: 2025-12-06 08:45:02.907168585 +0000 UTC m=+0.074124067 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 08:45:02 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33118 [06/Dec/2025:08:45:02.940] listener listener/metadata 0/0/0/13/13 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Dec 06 08:45:02 np0005548789.localdomain systemd[1]: tmp-crun.ry8JGC.mount: Deactivated successfully.
Dec 06 08:45:02 np0005548789.localdomain podman[84822]: 2025-12-06 08:45:02.963653532 +0000 UTC m=+0.124062614 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, version=17.1.12, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:45:02 np0005548789.localdomain podman[84820]: 2025-12-06 08:45:02.984942324 +0000 UTC m=+0.151897836 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Dec 06 08:45:02 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:45:03 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33128 [06/Dec/2025:08:45:02.999] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Dec 06 08:45:03 np0005548789.localdomain podman[84822]: 2025-12-06 08:45:03.013722453 +0000 UTC m=+0.174131546 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 08:45:03 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:45:03 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33130 [06/Dec/2025:08:45:03.052] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Dec 06 08:45:03 np0005548789.localdomain podman[84821]: 2025-12-06 08:45:03.072994846 +0000 UTC m=+0.234356867 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
version=17.1.12, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z)
Dec 06 08:45:03 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33140 [06/Dec/2025:08:45:03.130] listener listener/metadata 0/0/0/9/9 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Dec 06 08:45:03 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33146 [06/Dec/2025:08:45:03.192] listener listener/metadata 0/0/0/13/13 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Dec 06 08:45:03 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33148 [06/Dec/2025:08:45:03.234] listener listener/metadata 0/0/0/13/13 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Dec 06 08:45:03 np0005548789.localdomain podman[84821]: 2025-12-06 08:45:03.284174674 +0000 UTC m=+0.445536745 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr)
Dec 06 08:45:03 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33152 [06/Dec/2025:08:45:03.281] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Dec 06 08:45:03 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:45:03 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33158 [06/Dec/2025:08:45:03.331] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Dec 06 08:45:03 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33160 [06/Dec/2025:08:45:03.385] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Dec 06 08:45:03 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[84602]: 192.168.0.162:33170 [06/Dec/2025:08:45:03.440] listener listener/metadata 0/0/0/10/10 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Dec 06 08:45:06 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 06 08:45:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:45:07 np0005548789.localdomain podman[84896]: 2025-12-06 08:45:07.922190483 +0000 UTC m=+0.079559243 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:45:07 np0005548789.localdomain podman[84896]: 2025-12-06 08:45:07.984260161 +0000 UTC m=+0.141628911 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=nova_compute, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, architecture=x86_64, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:45:07 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:45:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 08:45:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 08:45:26 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:45:26 np0005548789.localdomain recover_tripleo_nova_virtqemud[84969]: 61814
Dec 06 08:45:26 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:45:26 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:45:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:45:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:45:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:45:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:45:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:45:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:45:28 np0005548789.localdomain systemd[1]: tmp-crun.pJDxPC.mount: Deactivated successfully.
Dec 06 08:45:28 np0005548789.localdomain podman[84972]: 2025-12-06 08:45:28.967039284 +0000 UTC m=+0.116965048 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git)
Dec 06 08:45:29 np0005548789.localdomain podman[84986]: 2025-12-06 08:45:29.031731112 +0000 UTC m=+0.166530784 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Dec 06 08:45:29 np0005548789.localdomain podman[84970]: 2025-12-06 08:45:29.073240192 +0000 UTC m=+0.227008684 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, container_name=logrotate_crond, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, distribution-scope=public)
Dec 06 08:45:29 np0005548789.localdomain podman[84986]: 2025-12-06 08:45:29.087068453 +0000 UTC m=+0.221868095 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:45:29 np0005548789.localdomain podman[84971]: 2025-12-06 08:45:29.108148339 +0000 UTC m=+0.258250598 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step3, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, 
version=17.1.12, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 06 08:45:29 np0005548789.localdomain podman[84971]: 2025-12-06 08:45:29.122160437 +0000 UTC m=+0.272262666 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:45:29 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:45:29 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:45:29 np0005548789.localdomain podman[84970]: 2025-12-06 08:45:29.161060657 +0000 UTC m=+0.314829159 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=logrotate_crond, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T22:49:32Z)
Dec 06 08:45:29 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:45:29 np0005548789.localdomain podman[84979]: 2025-12-06 08:45:29.123188078 +0000 UTC m=+0.262329222 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:45:29 np0005548789.localdomain podman[84979]: 2025-12-06 08:45:29.207268339 +0000 UTC m=+0.346409533 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:45:29 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:45:29 np0005548789.localdomain podman[84973]: 2025-12-06 08:45:29.111040487 +0000 UTC m=+0.255344659 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, tcib_managed=true, container_name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=)
Dec 06 08:45:29 np0005548789.localdomain podman[84973]: 2025-12-06 08:45:29.291273308 +0000 UTC m=+0.435577470 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=)
Dec 06 08:45:29 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:45:29 np0005548789.localdomain podman[84972]: 2025-12-06 08:45:29.339410081 +0000 UTC m=+0.489335825 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:45:29 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:45:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:45:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:45:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:45:33 np0005548789.localdomain podman[85101]: 2025-12-06 08:45:33.924272122 +0000 UTC m=+0.082662798 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Dec 06 08:45:33 np0005548789.localdomain podman[85103]: 2025-12-06 08:45:33.968143674 +0000 UTC m=+0.124611462 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Dec 06 08:45:33 np0005548789.localdomain podman[85101]: 2025-12-06 08:45:33.971214947 +0000 UTC m=+0.129605653 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 08:45:33 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:45:34 np0005548789.localdomain podman[85103]: 2025-12-06 08:45:34.026838778 +0000 UTC m=+0.183306516 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Dec 06 08:45:34 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:45:34 np0005548789.localdomain podman[85102]: 2025-12-06 08:45:34.027741586 +0000 UTC m=+0.181482541 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:45:34 np0005548789.localdomain podman[85102]: 2025-12-06 08:45:34.311216244 +0000 UTC m=+0.464957189 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:45:34 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:45:34 np0005548789.localdomain systemd[1]: tmp-crun.M1ph9e.mount: Deactivated successfully.
Dec 06 08:45:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:45:38 np0005548789.localdomain systemd[1]: tmp-crun.Q2WV0o.mount: Deactivated successfully.
Dec 06 08:45:38 np0005548789.localdomain podman[85175]: 2025-12-06 08:45:38.945036413 +0000 UTC m=+0.099592055 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, vcs-type=git, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Dec 06 08:45:38 np0005548789.localdomain podman[85175]: 2025-12-06 08:45:38.979330043 +0000 UTC m=+0.133885685 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, managed_by=tripleo_ansible, container_name=nova_compute, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step5, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute)
Dec 06 08:45:38 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:45:51 np0005548789.localdomain sudo[85201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:45:51 np0005548789.localdomain sudo[85201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:45:51 np0005548789.localdomain sudo[85201]: pam_unix(sudo:session): session closed for user root
Dec 06 08:45:51 np0005548789.localdomain sudo[85216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:45:51 np0005548789.localdomain sudo[85216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:45:52 np0005548789.localdomain sudo[85216]: pam_unix(sudo:session): session closed for user root
Dec 06 08:45:53 np0005548789.localdomain sudo[85264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:45:53 np0005548789.localdomain sudo[85264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:45:53 np0005548789.localdomain sudo[85264]: pam_unix(sudo:session): session closed for user root
Dec 06 08:45:53 np0005548789.localdomain sshd[85279]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:45:53 np0005548789.localdomain sshd[85279]: Invalid user node from 92.118.39.95 port 53060
Dec 06 08:45:54 np0005548789.localdomain sshd[85279]: Connection closed by invalid user node 92.118.39.95 port 53060 [preauth]
Dec 06 08:45:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:45:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:45:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:45:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:45:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:45:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:45:59 np0005548789.localdomain podman[85282]: 2025-12-06 08:45:59.944137221 +0000 UTC m=+0.098787852 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible)
Dec 06 08:45:59 np0005548789.localdomain podman[85297]: 2025-12-06 08:45:59.987447676 +0000 UTC m=+0.131440531 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Dec 06 08:46:00 np0005548789.localdomain podman[85282]: 2025-12-06 08:46:00.057843108 +0000 UTC m=+0.212493729 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1)
Dec 06 08:46:00 np0005548789.localdomain podman[85297]: 2025-12-06 08:46:00.06801673 +0000 UTC m=+0.212009625 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12)
Dec 06 08:46:00 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:46:00 np0005548789.localdomain podman[85285]: 2025-12-06 08:46:00.074837288 +0000 UTC m=+0.220715251 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team)
Dec 06 08:46:00 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:46:00 np0005548789.localdomain podman[85285]: 2025-12-06 08:46:00.089226458 +0000 UTC m=+0.235104421 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, architecture=x86_64)
Dec 06 08:46:00 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:46:00 np0005548789.localdomain podman[85283]: 2025-12-06 08:46:00.039202828 +0000 UTC m=+0.190977251 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, container_name=nova_migration_target, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Dec 06 08:46:00 np0005548789.localdomain podman[85281]: 2025-12-06 08:46:00.152776301 +0000 UTC m=+0.307784413 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, 
com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:46:00 np0005548789.localdomain podman[85284]: 2025-12-06 08:46:00.199921403 +0000 UTC m=+0.348616862 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12)
Dec 06 08:46:00 np0005548789.localdomain podman[85281]: 2025-12-06 08:46:00.21387991 +0000 UTC m=+0.368888052 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-type=git, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, 
distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:46:00 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:46:00 np0005548789.localdomain podman[85284]: 2025-12-06 08:46:00.260202666 +0000 UTC m=+0.408898135 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, 
com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=)
Dec 06 08:46:00 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:46:00 np0005548789.localdomain podman[85283]: 2025-12-06 08:46:00.412265736 +0000 UTC m=+0.564040209 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, release=1761123044)
Dec 06 08:46:00 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:46:00 np0005548789.localdomain systemd[1]: tmp-crun.ywg8bs.mount: Deactivated successfully.
Dec 06 08:46:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:46:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:46:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:46:04 np0005548789.localdomain systemd[1]: tmp-crun.cI89pB.mount: Deactivated successfully.
Dec 06 08:46:04 np0005548789.localdomain podman[85418]: 2025-12-06 08:46:04.953053703 +0000 UTC m=+0.104399703 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4)
Dec 06 08:46:04 np0005548789.localdomain podman[85419]: 2025-12-06 08:46:04.97648926 +0000 UTC m=+0.123077774 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-type=git, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com)
Dec 06 08:46:05 np0005548789.localdomain podman[85417]: 2025-12-06 08:46:05.047919444 +0000 UTC m=+0.201982747 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, release=1761123044)
Dec 06 08:46:05 np0005548789.localdomain podman[85419]: 2025-12-06 08:46:05.055127545 +0000 UTC m=+0.201716009 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:46:05 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:46:05 np0005548789.localdomain podman[85417]: 2025-12-06 08:46:05.098337266 +0000 UTC m=+0.252400539 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Dec 06 08:46:05 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:46:05 np0005548789.localdomain podman[85418]: 2025-12-06 08:46:05.152337567 +0000 UTC m=+0.303683577 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, container_name=metrics_qdr, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:46:05 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:46:05 np0005548789.localdomain systemd[1]: tmp-crun.bft6bg.mount: Deactivated successfully.
Dec 06 08:46:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:46:09 np0005548789.localdomain systemd[1]: tmp-crun.Kd077u.mount: Deactivated successfully.
Dec 06 08:46:09 np0005548789.localdomain podman[85493]: 2025-12-06 08:46:09.936075851 +0000 UTC m=+0.095500821 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_compute, url=https://www.redhat.com, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5)
Dec 06 08:46:09 np0005548789.localdomain podman[85493]: 2025-12-06 08:46:09.971146864 +0000 UTC m=+0.130571854 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:46:09 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:46:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:46:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:46:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:46:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:46:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:46:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:46:30 np0005548789.localdomain systemd[1]: tmp-crun.hxLiSj.mount: Deactivated successfully.
Dec 06 08:46:30 np0005548789.localdomain podman[85565]: 2025-12-06 08:46:30.957927297 +0000 UTC m=+0.110732047 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:46:30 np0005548789.localdomain podman[85565]: 2025-12-06 08:46:30.964493368 +0000 UTC m=+0.117298058 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 06 08:46:30 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:46:31 np0005548789.localdomain podman[85581]: 2025-12-06 08:46:31.011612788 +0000 UTC m=+0.148051438 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:46:31 np0005548789.localdomain podman[85581]: 2025-12-06 08:46:31.045969599 +0000 UTC m=+0.182408299 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, architecture=x86_64, version=17.1.12, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=)
Dec 06 08:46:31 np0005548789.localdomain podman[85579]: 2025-12-06 08:46:31.05550082 +0000 UTC m=+0.193664312 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
release=1761123044, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z)
Dec 06 08:46:31 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:46:31 np0005548789.localdomain podman[85567]: 2025-12-06 08:46:31.106156089 +0000 UTC m=+0.254237195 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=)
Dec 06 08:46:31 np0005548789.localdomain podman[85566]: 2025-12-06 08:46:31.159615684 +0000 UTC m=+0.312069723 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp17/openstack-collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_step3)
Dec 06 08:46:31 np0005548789.localdomain podman[85579]: 2025-12-06 08:46:31.171118406 +0000 UTC m=+0.309281948 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, container_name=iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com)
Dec 06 08:46:31 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:46:31 np0005548789.localdomain podman[85566]: 2025-12-06 08:46:31.222449026 +0000 UTC m=+0.374903045 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., 
name=rhosp17/openstack-collectd, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, container_name=collectd, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Dec 06 08:46:31 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:46:31 np0005548789.localdomain podman[85568]: 2025-12-06 08:46:31.309853339 +0000 UTC m=+0.455112479 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:11:48Z)
Dec 06 08:46:31 np0005548789.localdomain podman[85568]: 2025-12-06 08:46:31.360147537 +0000 UTC m=+0.505406707 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Dec 06 08:46:31 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:46:31 np0005548789.localdomain podman[85567]: 2025-12-06 08:46:31.565355972 +0000 UTC m=+0.713437048 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:46:31 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:46:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:46:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:46:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:46:35 np0005548789.localdomain systemd[1]: tmp-crun.i4oDh0.mount: Deactivated successfully.
Dec 06 08:46:35 np0005548789.localdomain podman[85698]: 2025-12-06 08:46:35.936106037 +0000 UTC m=+0.092988875 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, version=17.1.12, maintainer=OpenStack TripleO Team)
Dec 06 08:46:35 np0005548789.localdomain podman[85697]: 2025-12-06 08:46:35.983941519 +0000 UTC m=+0.140568659 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, config_id=tripleo_step1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc.)
Dec 06 08:46:35 np0005548789.localdomain podman[85698]: 2025-12-06 08:46:35.990208012 +0000 UTC m=+0.147090840 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, architecture=x86_64, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 06 08:46:35 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:46:36 np0005548789.localdomain podman[85696]: 2025-12-06 08:46:36.070883338 +0000 UTC m=+0.230896221 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=ovn_controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 
ovn-controller, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044)
Dec 06 08:46:36 np0005548789.localdomain podman[85696]: 2025-12-06 08:46:36.097553854 +0000 UTC m=+0.257566757 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 08:46:36 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:46:36 np0005548789.localdomain podman[85697]: 2025-12-06 08:46:36.205380821 +0000 UTC m=+0.362007981 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=metrics_qdr, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true)
Dec 06 08:46:36 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:46:36 np0005548789.localdomain systemd[1]: tmp-crun.jL8H3R.mount: Deactivated successfully.
Dec 06 08:46:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:46:40 np0005548789.localdomain podman[85772]: 2025-12-06 08:46:40.924026804 +0000 UTC m=+0.083948198 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step5, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:46:40 np0005548789.localdomain podman[85772]: 2025-12-06 08:46:40.954197206 +0000 UTC m=+0.114118560 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=nova_compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:46:40 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:46:53 np0005548789.localdomain sudo[85799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:46:53 np0005548789.localdomain sudo[85799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:46:53 np0005548789.localdomain sudo[85799]: pam_unix(sudo:session): session closed for user root
Dec 06 08:46:53 np0005548789.localdomain sudo[85814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:46:53 np0005548789.localdomain sudo[85814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:46:53 np0005548789.localdomain sudo[85814]: pam_unix(sudo:session): session closed for user root
Dec 06 08:46:54 np0005548789.localdomain sudo[85860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:46:54 np0005548789.localdomain sudo[85860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:46:54 np0005548789.localdomain sudo[85860]: pam_unix(sudo:session): session closed for user root
Dec 06 08:47:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:47:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:47:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:47:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:47:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:47:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:47:01 np0005548789.localdomain systemd[1]: tmp-crun.uzvnxT.mount: Deactivated successfully.
Dec 06 08:47:01 np0005548789.localdomain podman[85877]: 2025-12-06 08:47:01.989251935 +0000 UTC m=+0.145325965 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, container_name=nova_migration_target, io.openshift.expose-services=, vcs-type=git)
Dec 06 08:47:01 np0005548789.localdomain podman[85876]: 2025-12-06 08:47:01.939261086 +0000 UTC m=+0.096823851 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, vcs-type=git, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, 
managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 08:47:02 np0005548789.localdomain podman[85896]: 2025-12-06 08:47:01.95540947 +0000 UTC m=+0.100002549 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 08:47:02 np0005548789.localdomain podman[85876]: 2025-12-06 08:47:02.022159722 +0000 UTC m=+0.179722577 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, container_name=collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Dec 06 08:47:02 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:47:02 np0005548789.localdomain podman[85875]: 2025-12-06 08:47:02.035218481 +0000 UTC m=+0.194598792 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-18T22:49:32Z)
Dec 06 08:47:02 np0005548789.localdomain podman[85896]: 2025-12-06 08:47:02.03810865 +0000 UTC m=+0.182701669 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, vendor=Red Hat, Inc.)
Dec 06 08:47:02 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:47:02 np0005548789.localdomain podman[85884]: 2025-12-06 08:47:02.100507978 +0000 UTC m=+0.245980314 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:47:02 np0005548789.localdomain podman[85875]: 2025-12-06 08:47:02.121984935 +0000 UTC m=+0.281365296 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step4, container_name=logrotate_crond, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z)
Dec 06 08:47:02 np0005548789.localdomain podman[85878]: 2025-12-06 08:47:01.968616384 +0000 UTC m=+0.117626748 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-19T00:11:48Z, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:47:02 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:47:02 np0005548789.localdomain podman[85884]: 2025-12-06 08:47:02.138040295 +0000 UTC m=+0.283512641 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, release=1761123044, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, 
distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com)
Dec 06 08:47:02 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:47:02 np0005548789.localdomain podman[85878]: 2025-12-06 08:47:02.204337943 +0000 UTC m=+0.353348317 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container)
Dec 06 08:47:02 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:47:02 np0005548789.localdomain podman[85877]: 2025-12-06 08:47:02.353196394 +0000 UTC m=+0.509270424 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:47:02 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:47:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:47:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:47:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:47:06 np0005548789.localdomain podman[86004]: 2025-12-06 08:47:06.920465959 +0000 UTC m=+0.080040629 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller)
Dec 06 08:47:06 np0005548789.localdomain systemd[1]: tmp-crun.xtWjUi.mount: Deactivated successfully.
Dec 06 08:47:06 np0005548789.localdomain podman[86004]: 2025-12-06 08:47:06.970059445 +0000 UTC m=+0.129634065 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1)
Dec 06 08:47:06 np0005548789.localdomain podman[86005]: 2025-12-06 08:47:06.97546032 +0000 UTC m=+0.134678219 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_id=tripleo_step1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:47:06 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:47:07 np0005548789.localdomain podman[86006]: 2025-12-06 08:47:07.027954906 +0000 UTC m=+0.182203393 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent)
Dec 06 08:47:07 np0005548789.localdomain podman[86006]: 2025-12-06 08:47:07.094215372 +0000 UTC m=+0.248463859 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-19T00:14:25Z)
Dec 06 08:47:07 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:47:07 np0005548789.localdomain podman[86005]: 2025-12-06 08:47:07.158016842 +0000 UTC m=+0.317234721 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:47:07 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:47:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:47:11 np0005548789.localdomain podman[86079]: 2025-12-06 08:47:11.914841633 +0000 UTC m=+0.071304861 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044)
Dec 06 08:47:11 np0005548789.localdomain podman[86079]: 2025-12-06 08:47:11.974288791 +0000 UTC m=+0.130752039 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:47:11 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:47:16 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:47:16 np0005548789.localdomain recover_tripleo_nova_virtqemud[86106]: 61814
Dec 06 08:47:16 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:47:16 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:47:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:47:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:47:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:47:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:47:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:47:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:47:32 np0005548789.localdomain podman[86153]: 2025-12-06 08:47:32.961227571 +0000 UTC m=+0.107596711 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-collectd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:47:33 np0005548789.localdomain podman[86154]: 2025-12-06 08:47:33.003443062 +0000 UTC m=+0.143554811 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:47:33 np0005548789.localdomain podman[86152]: 2025-12-06 08:47:33.050999646 +0000 UTC m=+0.197462309 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, container_name=logrotate_crond, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:47:33 np0005548789.localdomain podman[86152]: 2025-12-06 08:47:33.063093826 +0000 UTC m=+0.209556529 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, distribution-scope=public, build-date=2025-11-18T22:49:32Z, version=17.1.12, io.buildah.version=1.41.4)
Dec 06 08:47:33 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:47:33 np0005548789.localdomain podman[86165]: 2025-12-06 08:47:33.112736774 +0000 UTC m=+0.244084325 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red 
Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Dec 06 08:47:33 np0005548789.localdomain podman[86165]: 2025-12-06 08:47:33.122664588 +0000 UTC m=+0.254012129 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, distribution-scope=public, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1)
Dec 06 08:47:33 np0005548789.localdomain podman[86153]: 2025-12-06 08:47:33.142002749 +0000 UTC m=+0.288371899 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step3, vcs-type=git)
Dec 06 08:47:33 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:47:33 np0005548789.localdomain podman[86155]: 2025-12-06 08:47:33.162492196 +0000 UTC m=+0.301089478 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, url=https://www.redhat.com)
Dec 06 08:47:33 np0005548789.localdomain podman[86155]: 2025-12-06 08:47:33.202446827 +0000 UTC m=+0.341044159 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, architecture=x86_64, 
config_id=tripleo_step4, version=17.1.12, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:47:33 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:47:33 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:47:33 np0005548789.localdomain podman[86170]: 2025-12-06 08:47:33.125327789 +0000 UTC m=+0.250526312 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi)
Dec 06 08:47:33 np0005548789.localdomain podman[86170]: 2025-12-06 08:47:33.308203901 +0000 UTC m=+0.433402434 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z)
Dec 06 08:47:33 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:47:33 np0005548789.localdomain podman[86154]: 2025-12-06 08:47:33.35918257 +0000 UTC m=+0.499294249 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true)
Dec 06 08:47:33 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:47:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:47:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:47:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:47:37 np0005548789.localdomain systemd[1]: tmp-crun.Vya2b5.mount: Deactivated successfully.
Dec 06 08:47:37 np0005548789.localdomain podman[86285]: 2025-12-06 08:47:37.922302126 +0000 UTC m=+0.084402461 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step4, version=17.1.12, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com)
Dec 06 08:47:37 np0005548789.localdomain systemd[1]: tmp-crun.z7fXYI.mount: Deactivated successfully.
Dec 06 08:47:37 np0005548789.localdomain podman[86283]: 2025-12-06 08:47:37.944605758 +0000 UTC m=+0.104430634 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, container_name=ovn_controller, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:47:37 np0005548789.localdomain podman[86285]: 2025-12-06 08:47:37.99435426 +0000 UTC m=+0.156454535 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4)
Dec 06 08:47:38 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:47:38 np0005548789.localdomain podman[86284]: 2025-12-06 08:47:38.008522973 +0000 UTC m=+0.171170285 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12)
Dec 06 08:47:38 np0005548789.localdomain podman[86283]: 2025-12-06 08:47:38.045329549 +0000 UTC m=+0.205154435 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.12, container_name=ovn_controller, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, url=https://www.redhat.com)
Dec 06 08:47:38 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:47:38 np0005548789.localdomain podman[86284]: 2025-12-06 08:47:38.215135562 +0000 UTC m=+0.377782814 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., 
konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, url=https://www.redhat.com, release=1761123044, container_name=metrics_qdr, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step1, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 06 08:47:38 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:47:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:47:42 np0005548789.localdomain podman[86356]: 2025-12-06 08:47:42.919786457 +0000 UTC m=+0.081714330 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:47:42 np0005548789.localdomain podman[86356]: 2025-12-06 08:47:42.974999405 +0000 UTC m=+0.136927258 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 08:47:42 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:47:54 np0005548789.localdomain sudo[86382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:47:54 np0005548789.localdomain sudo[86382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:47:54 np0005548789.localdomain sudo[86382]: pam_unix(sudo:session): session closed for user root
Dec 06 08:47:54 np0005548789.localdomain sudo[86397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:47:54 np0005548789.localdomain sudo[86397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:47:55 np0005548789.localdomain sudo[86397]: pam_unix(sudo:session): session closed for user root
Dec 06 08:47:56 np0005548789.localdomain sudo[86444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:47:56 np0005548789.localdomain sudo[86444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:47:56 np0005548789.localdomain sudo[86444]: pam_unix(sudo:session): session closed for user root
Dec 06 08:48:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:48:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:48:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:48:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:48:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:48:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:48:03 np0005548789.localdomain podman[86460]: 2025-12-06 08:48:03.948721119 +0000 UTC m=+0.096720079 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 cron, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:48:03 np0005548789.localdomain podman[86460]: 2025-12-06 08:48:03.985155413 +0000 UTC m=+0.133154333 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, tcib_managed=true)
Dec 06 08:48:03 np0005548789.localdomain systemd[1]: tmp-crun.GyAkv1.mount: Deactivated successfully.
Dec 06 08:48:04 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:48:04 np0005548789.localdomain podman[86462]: 2025-12-06 08:48:04.003209365 +0000 UTC m=+0.148323627 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Dec 06 08:48:04 np0005548789.localdomain podman[86461]: 2025-12-06 08:48:04.050305645 +0000 UTC m=+0.198282214 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com)
Dec 06 08:48:04 np0005548789.localdomain podman[86461]: 2025-12-06 08:48:04.060039523 +0000 UTC m=+0.208016032 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=collectd, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 
17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:48:04 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:48:04 np0005548789.localdomain podman[86480]: 2025-12-06 08:48:04.101773089 +0000 UTC m=+0.239128503 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat 
OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team)
Dec 06 08:48:04 np0005548789.localdomain podman[86480]: 2025-12-06 08:48:04.109714422 +0000 UTC m=+0.247069826 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-iscsid-container)
Dec 06 08:48:04 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:48:04 np0005548789.localdomain podman[86465]: 2025-12-06 08:48:04.159827485 +0000 UTC m=+0.294794626 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:11:48Z, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:48:04 np0005548789.localdomain podman[86465]: 2025-12-06 08:48:04.200117786 +0000 UTC m=+0.335084867 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Dec 06 08:48:04 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:48:04 np0005548789.localdomain podman[86481]: 2025-12-06 08:48:04.216405734 +0000 UTC m=+0.343436792 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64)
Dec 06 08:48:04 np0005548789.localdomain podman[86481]: 2025-12-06 08:48:04.275274445 +0000 UTC m=+0.402305453 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Dec 06 08:48:04 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:48:04 np0005548789.localdomain podman[86462]: 2025-12-06 08:48:04.367825995 +0000 UTC m=+0.512940217 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target)
Dec 06 08:48:04 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:48:04 np0005548789.localdomain sshd[86591]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:48:05 np0005548789.localdomain sshd[86591]: Invalid user minima from 92.118.39.95 port 39820
Dec 06 08:48:05 np0005548789.localdomain sshd[86591]: Connection closed by invalid user minima 92.118.39.95 port 39820 [preauth]
Dec 06 08:48:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:48:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:48:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:48:08 np0005548789.localdomain podman[86595]: 2025-12-06 08:48:08.951742378 +0000 UTC m=+0.089333893 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:48:08 np0005548789.localdomain podman[86595]: 2025-12-06 08:48:08.988484672 +0000 UTC m=+0.126076207 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team)
Dec 06 08:48:09 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:48:09 np0005548789.localdomain podman[86594]: 2025-12-06 08:48:09.010182515 +0000 UTC m=+0.150037019 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, release=1761123044, distribution-scope=public, 
io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:48:09 np0005548789.localdomain podman[86593]: 2025-12-06 08:48:08.985070437 +0000 UTC m=+0.130115440 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, container_name=ovn_controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4)
Dec 06 08:48:09 np0005548789.localdomain podman[86593]: 2025-12-06 08:48:09.063830536 +0000 UTC m=+0.208875509 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, 
container_name=ovn_controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, release=1761123044, version=17.1.12, architecture=x86_64)
Dec 06 08:48:09 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:48:09 np0005548789.localdomain podman[86594]: 2025-12-06 08:48:09.204199898 +0000 UTC m=+0.344054322 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=metrics_qdr, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, vcs-type=git, architecture=x86_64, description=Red 
Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:48:09 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:48:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:48:13 np0005548789.localdomain podman[86670]: 2025-12-06 08:48:13.969572312 +0000 UTC m=+0.085303500 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, architecture=x86_64)
Dec 06 08:48:14 np0005548789.localdomain podman[86670]: 2025-12-06 08:48:14.027242475 +0000 UTC m=+0.142973663 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., container_name=nova_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:48:14 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:48:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:48:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:48:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:48:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:48:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:48:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:48:34 np0005548789.localdomain systemd[1]: tmp-crun.6xdcGY.mount: Deactivated successfully.
Dec 06 08:48:34 np0005548789.localdomain podman[86757]: 2025-12-06 08:48:34.944421 +0000 UTC m=+0.083208896 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:48:35 np0005548789.localdomain podman[86757]: 2025-12-06 08:48:35.0062249 +0000 UTC m=+0.145012796 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64)
Dec 06 08:48:35 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:48:35 np0005548789.localdomain podman[86743]: 2025-12-06 08:48:35.047396109 +0000 UTC m=+0.197225043 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, config_id=tripleo_step3, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd)
Dec 06 08:48:35 np0005548789.localdomain podman[86751]: 2025-12-06 08:48:35.007459518 +0000 UTC m=+0.147109500 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:48:35 np0005548789.localdomain podman[86751]: 2025-12-06 08:48:35.08767702 +0000 UTC m=+0.227326962 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, architecture=x86_64, container_name=iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1761123044, version=17.1.12, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:48:35 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:48:35 np0005548789.localdomain podman[86744]: 2025-12-06 08:48:35.108471827 +0000 UTC m=+0.254185804 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:48:35 np0005548789.localdomain podman[86743]: 2025-12-06 08:48:35.136541515 +0000 UTC m=+0.286370439 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-collectd-container, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible)
Dec 06 08:48:35 np0005548789.localdomain podman[86742]: 2025-12-06 08:48:35.147477609 +0000 UTC m=+0.300376047 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible)
Dec 06 08:48:35 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:48:35 np0005548789.localdomain podman[86742]: 2025-12-06 08:48:35.182672585 +0000 UTC m=+0.335570993 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:48:35 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:48:35 np0005548789.localdomain podman[86746]: 2025-12-06 08:48:35.198149509 +0000 UTC m=+0.339609247 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container)
Dec 06 08:48:35 np0005548789.localdomain podman[86746]: 2025-12-06 08:48:35.253184921 +0000 UTC m=+0.394644649 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:48:35 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:48:35 np0005548789.localdomain podman[86744]: 2025-12-06 08:48:35.515377149 +0000 UTC m=+0.661091166 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, com.redhat.component=openstack-nova-compute-container, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, release=1761123044, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1)
Dec 06 08:48:35 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:48:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:48:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:48:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:48:39 np0005548789.localdomain systemd[1]: tmp-crun.jifAer.mount: Deactivated successfully.
Dec 06 08:48:39 np0005548789.localdomain podman[86876]: 2025-12-06 08:48:39.935629588 +0000 UTC m=+0.096063490 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
release=1761123044, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4)
Dec 06 08:48:39 np0005548789.localdomain podman[86876]: 2025-12-06 08:48:39.987180584 +0000 UTC m=+0.147614456 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64)
Dec 06 08:48:40 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:48:40 np0005548789.localdomain podman[86877]: 2025-12-06 08:48:39.98704699 +0000 UTC m=+0.142606722 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd)
Dec 06 08:48:40 np0005548789.localdomain podman[86878]: 2025-12-06 08:48:40.08384272 +0000 UTC m=+0.237111022 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 08:48:40 np0005548789.localdomain podman[86878]: 2025-12-06 08:48:40.12507131 +0000 UTC m=+0.278339622 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 08:48:40 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:48:40 np0005548789.localdomain podman[86877]: 2025-12-06 08:48:40.177549035 +0000 UTC m=+0.333108757 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, vcs-type=git, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4)
Dec 06 08:48:40 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:48:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:48:44 np0005548789.localdomain systemd[1]: tmp-crun.kPIxE1.mount: Deactivated successfully.
Dec 06 08:48:44 np0005548789.localdomain podman[86952]: 2025-12-06 08:48:44.946539739 +0000 UTC m=+0.103026840 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true)
Dec 06 08:48:44 np0005548789.localdomain podman[86952]: 2025-12-06 08:48:44.979854518 +0000 UTC m=+0.136341659 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Dec 06 08:48:44 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:48:56 np0005548789.localdomain sudo[86979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:48:56 np0005548789.localdomain sudo[86979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:48:56 np0005548789.localdomain sudo[86979]: pam_unix(sudo:session): session closed for user root
Dec 06 08:48:56 np0005548789.localdomain sudo[86994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:48:56 np0005548789.localdomain sudo[86994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:48:57 np0005548789.localdomain sudo[86994]: pam_unix(sudo:session): session closed for user root
Dec 06 08:48:57 np0005548789.localdomain sudo[87040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:48:57 np0005548789.localdomain sudo[87040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:48:57 np0005548789.localdomain sudo[87040]: pam_unix(sudo:session): session closed for user root
Dec 06 08:49:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:49:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:49:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:49:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:49:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:49:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:49:05 np0005548789.localdomain podman[87055]: 2025-12-06 08:49:05.958829493 +0000 UTC m=+0.106845268 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.expose-services=, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public)
Dec 06 08:49:05 np0005548789.localdomain podman[87058]: 2025-12-06 08:49:05.968933833 +0000 UTC m=+0.094098559 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:49:05 np0005548789.localdomain podman[87055]: 2025-12-06 08:49:05.998133635 +0000 UTC m=+0.146149400 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:49:06 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:49:06 np0005548789.localdomain podman[87056]: 2025-12-06 08:49:06.017481947 +0000 UTC m=+0.165547714 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, name=rhosp17/openstack-collectd, architecture=x86_64, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step3, distribution-scope=public)
Dec 06 08:49:06 np0005548789.localdomain podman[87058]: 2025-12-06 08:49:06.048269098 +0000 UTC m=+0.173433814 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute)
Dec 06 08:49:06 np0005548789.localdomain podman[87057]: 2025-12-06 08:49:06.054765976 +0000 UTC m=+0.201469791 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_migration_target, 
version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible)
Dec 06 08:49:06 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:49:06 np0005548789.localdomain podman[87077]: 2025-12-06 08:49:06.110818971 +0000 UTC m=+0.245403445 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, maintainer=OpenStack TripleO Team, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:49:06 np0005548789.localdomain podman[87056]: 2025-12-06 08:49:06.133410801 +0000 UTC m=+0.281476568 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, name=rhosp17/openstack-collectd, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:49:06 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:49:06 np0005548789.localdomain podman[87077]: 2025-12-06 08:49:06.185175265 +0000 UTC m=+0.319759689 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:49:06 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:49:06 np0005548789.localdomain podman[87059]: 2025-12-06 08:49:06.269934436 +0000 UTC m=+0.402270521 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, container_name=iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git)
Dec 06 08:49:06 np0005548789.localdomain podman[87059]: 2025-12-06 08:49:06.282111438 +0000 UTC m=+0.414447563 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T23:44:13Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64)
Dec 06 08:49:06 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:49:06 np0005548789.localdomain podman[87057]: 2025-12-06 08:49:06.417107957 +0000 UTC m=+0.563811752 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, version=17.1.12, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:49:06 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:49:06 np0005548789.localdomain systemd[1]: tmp-crun.c26Y8T.mount: Deactivated successfully.
Dec 06 08:49:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:49:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:49:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:49:10 np0005548789.localdomain podman[87187]: 2025-12-06 08:49:10.918675014 +0000 UTC m=+0.082191275 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1)
Dec 06 08:49:10 np0005548789.localdomain systemd[1]: tmp-crun.JFHs5J.mount: Deactivated successfully.
Dec 06 08:49:10 np0005548789.localdomain podman[87189]: 2025-12-06 08:49:10.983015702 +0000 UTC m=+0.140021163 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com)
Dec 06 08:49:11 np0005548789.localdomain podman[87188]: 2025-12-06 08:49:11.034593418 +0000 UTC m=+0.194671634 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.4)
Dec 06 08:49:11 np0005548789.localdomain podman[87187]: 2025-12-06 08:49:11.04544721 +0000 UTC m=+0.208963471 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, version=17.1.12, container_name=ovn_controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4)
Dec 06 08:49:11 np0005548789.localdomain podman[87189]: 2025-12-06 08:49:11.058315543 +0000 UTC m=+0.215320994 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, container_name=ovn_metadata_agent, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:49:11 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:49:11 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:49:11 np0005548789.localdomain podman[87188]: 2025-12-06 08:49:11.231273223 +0000 UTC m=+0.391351409 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, container_name=metrics_qdr, config_id=tripleo_step1, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, version=17.1.12)
Dec 06 08:49:11 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:49:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:49:15 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:49:15 np0005548789.localdomain recover_tripleo_nova_virtqemud[87268]: 61814
Dec 06 08:49:15 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:49:15 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:49:15 np0005548789.localdomain systemd[1]: tmp-crun.owH1NC.mount: Deactivated successfully.
Dec 06 08:49:15 np0005548789.localdomain podman[87266]: 2025-12-06 08:49:15.920241101 +0000 UTC m=+0.083855305 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Dec 06 08:49:15 np0005548789.localdomain podman[87266]: 2025-12-06 08:49:15.976280525 +0000 UTC m=+0.139894769 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:49:15 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:49:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:49:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:49:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:49:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:49:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:49:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:49:36 np0005548789.localdomain systemd[1]: tmp-crun.BTS0Dz.mount: Deactivated successfully.
Dec 06 08:49:36 np0005548789.localdomain podman[87341]: 2025-12-06 08:49:36.942418787 +0000 UTC m=+0.097169993 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:49:37 np0005548789.localdomain podman[87340]: 2025-12-06 08:49:37.00139076 +0000 UTC m=+0.155886067 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, container_name=logrotate_crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:49:37 np0005548789.localdomain podman[87341]: 2025-12-06 08:49:37.005460395 +0000 UTC m=+0.160211561 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, vcs-type=git, architecture=x86_64, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Dec 06 08:49:37 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:49:37 np0005548789.localdomain podman[87340]: 2025-12-06 08:49:37.031067588 +0000 UTC m=+0.185562885 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64)
Dec 06 08:49:37 np0005548789.localdomain podman[87357]: 2025-12-06 08:49:37.037217186 +0000 UTC m=+0.172797115 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4)
Dec 06 08:49:37 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:49:37 np0005548789.localdomain podman[87354]: 2025-12-06 08:49:36.959946793 +0000 UTC m=+0.101426023 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.12, container_name=iscsid, distribution-scope=public, url=https://www.redhat.com)
Dec 06 08:49:37 np0005548789.localdomain podman[87354]: 2025-12-06 08:49:37.089220707 +0000 UTC m=+0.230699967 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., 
com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, release=1761123044, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step3, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:49:37 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:49:37 np0005548789.localdomain podman[87357]: 2025-12-06 08:49:37.113963043 +0000 UTC m=+0.249542992 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, version=17.1.12, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:49:37 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:49:37 np0005548789.localdomain podman[87343]: 2025-12-06 08:49:37.096357955 +0000 UTC m=+0.241997981 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=)
Dec 06 08:49:37 np0005548789.localdomain podman[87342]: 2025-12-06 08:49:37.197094305 +0000 UTC m=+0.345365632 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4)
Dec 06 08:49:37 np0005548789.localdomain podman[87343]: 2025-12-06 08:49:37.225543005 +0000 UTC m=+0.371183091 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, 
io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:49:37 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:49:37 np0005548789.localdomain podman[87342]: 2025-12-06 08:49:37.568247125 +0000 UTC m=+0.716518462 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, version=17.1.12, release=1761123044, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, 
maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible)
Dec 06 08:49:37 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:49:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:49:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:49:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:49:41 np0005548789.localdomain systemd[84400]: Created slice User Background Tasks Slice.
Dec 06 08:49:41 np0005548789.localdomain podman[87472]: 2025-12-06 08:49:41.939858578 +0000 UTC m=+0.095628265 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, architecture=x86_64, build-date=2025-11-18T23:34:05Z)
Dec 06 08:49:41 np0005548789.localdomain systemd[84400]: Starting Cleanup of User's Temporary Files and Directories...
Dec 06 08:49:41 np0005548789.localdomain systemd[84400]: Finished Cleanup of User's Temporary Files and Directories.
Dec 06 08:49:42 np0005548789.localdomain podman[87472]: 2025-12-06 08:49:41.994189409 +0000 UTC m=+0.149959056 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:49:42 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:49:42 np0005548789.localdomain podman[87474]: 2025-12-06 08:49:42.053482283 +0000 UTC m=+0.203106723 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public)
Dec 06 08:49:42 np0005548789.localdomain podman[87474]: 2025-12-06 08:49:42.097934101 +0000 UTC m=+0.247558591 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 08:49:42 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:49:42 np0005548789.localdomain podman[87473]: 2025-12-06 08:49:42.01614324 +0000 UTC m=+0.168155203 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step1, container_name=metrics_qdr, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:49:42 np0005548789.localdomain podman[87473]: 2025-12-06 08:49:42.22803942 +0000 UTC m=+0.380051343 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, architecture=x86_64, container_name=metrics_qdr, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd)
Dec 06 08:49:42 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:49:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:49:46 np0005548789.localdomain systemd[1]: tmp-crun.P7gqiP.mount: Deactivated successfully.
Dec 06 08:49:46 np0005548789.localdomain podman[87547]: 2025-12-06 08:49:46.927702156 +0000 UTC m=+0.089564730 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, container_name=nova_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.openshift.expose-services=)
Dec 06 08:49:46 np0005548789.localdomain podman[87547]: 2025-12-06 08:49:46.984375078 +0000 UTC m=+0.146237682 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute)
Dec 06 08:49:46 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:49:57 np0005548789.localdomain sudo[87573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:49:57 np0005548789.localdomain sudo[87573]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:49:57 np0005548789.localdomain sudo[87573]: pam_unix(sudo:session): session closed for user root
Dec 06 08:49:57 np0005548789.localdomain sudo[87588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:49:57 np0005548789.localdomain sudo[87588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:49:58 np0005548789.localdomain sudo[87588]: pam_unix(sudo:session): session closed for user root
Dec 06 08:50:01 np0005548789.localdomain sudo[87634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:50:01 np0005548789.localdomain sudo[87634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:50:01 np0005548789.localdomain sudo[87634]: pam_unix(sudo:session): session closed for user root
Dec 06 08:50:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:50:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:50:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:50:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:50:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:50:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:50:07 np0005548789.localdomain podman[87650]: 2025-12-06 08:50:07.931384564 +0000 UTC m=+0.085575527 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 
collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Dec 06 08:50:07 np0005548789.localdomain podman[87650]: 2025-12-06 08:50:07.938031097 +0000 UTC m=+0.092222070 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, config_id=tripleo_step3, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd)
Dec 06 08:50:07 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:50:07 np0005548789.localdomain podman[87649]: 2025-12-06 08:50:07.946397853 +0000 UTC m=+0.095520002 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack 
Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:50:08 np0005548789.localdomain systemd[1]: tmp-crun.TcsEqh.mount: Deactivated successfully.
Dec 06 08:50:08 np0005548789.localdomain podman[87653]: 2025-12-06 08:50:08.006796231 +0000 UTC m=+0.151672699 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-iscsid, container_name=iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, 
release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:50:08 np0005548789.localdomain podman[87651]: 2025-12-06 08:50:08.010226175 +0000 UTC m=+0.162146989 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:50:08 np0005548789.localdomain podman[87649]: 2025-12-06 08:50:08.030298049 +0000 UTC m=+0.179420198 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-cron-container, release=1761123044, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z)
Dec 06 08:50:08 np0005548789.localdomain podman[87653]: 2025-12-06 08:50:08.042556774 +0000 UTC m=+0.187433282 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step3, release=1761123044, tcib_managed=true, url=https://www.redhat.com)
Dec 06 08:50:08 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:50:08 np0005548789.localdomain podman[87652]: 2025-12-06 08:50:08.053005764 +0000 UTC m=+0.201628307 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:11:48Z, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public)
Dec 06 08:50:08 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:50:08 np0005548789.localdomain podman[87662]: 2025-12-06 08:50:08.105091146 +0000 UTC m=+0.247534770 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, 
konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4)
Dec 06 08:50:08 np0005548789.localdomain podman[87652]: 2025-12-06 08:50:08.111244044 +0000 UTC m=+0.259866557 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:50:08 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:50:08 np0005548789.localdomain podman[87662]: 2025-12-06 08:50:08.133278378 +0000 UTC m=+0.275721962 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Dec 06 08:50:08 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:50:08 np0005548789.localdomain podman[87651]: 2025-12-06 08:50:08.367702227 +0000 UTC m=+0.519623081 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vcs-type=git)
Dec 06 08:50:08 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:50:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:50:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.1 total, 600.0 interval
                                                          Cumulative writes: 5761 writes, 25K keys, 5761 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5761 writes, 760 syncs, 7.58 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 593 writes, 2365 keys, 593 commit groups, 1.0 writes per commit group, ingest: 3.12 MB, 0.01 MB/s
                                                          Interval WAL: 593 writes, 185 syncs, 3.21 writes per sync, written: 0.00 GB, 0.01 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 08:50:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:50:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:50:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:50:12 np0005548789.localdomain systemd[1]: tmp-crun.otVyqC.mount: Deactivated successfully.
Dec 06 08:50:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 08:50:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.2 total, 600.0 interval
                                                          Cumulative writes: 4879 writes, 21K keys, 4879 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4879 writes, 669 syncs, 7.29 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 412 writes, 1624 keys, 412 commit groups, 1.0 writes per commit group, ingest: 1.78 MB, 0.00 MB/s
                                                          Interval WAL: 412 writes, 148 syncs, 2.78 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 08:50:12 np0005548789.localdomain podman[87780]: 2025-12-06 08:50:12.9708961 +0000 UTC m=+0.129192442 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4)
Dec 06 08:50:12 np0005548789.localdomain podman[87779]: 2025-12-06 08:50:12.938111027 +0000 UTC m=+0.099571335 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, config_id=tripleo_step1, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=)
Dec 06 08:50:13 np0005548789.localdomain podman[87780]: 2025-12-06 08:50:13.006124178 +0000 UTC m=+0.164420490 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true)
Dec 06 08:50:13 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:50:13 np0005548789.localdomain podman[87778]: 2025-12-06 08:50:13.084337069 +0000 UTC m=+0.245416296 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 08:50:13 np0005548789.localdomain podman[87779]: 2025-12-06 08:50:13.1023634 +0000 UTC m=+0.263823748 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, release=1761123044, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.12, batch=17.1_20251118.1, container_name=metrics_qdr, io.buildah.version=1.41.4)
Dec 06 08:50:13 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:50:13 np0005548789.localdomain podman[87778]: 2025-12-06 08:50:13.146428468 +0000 UTC m=+0.307507675 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, version=17.1.12, managed_by=tripleo_ansible)
Dec 06 08:50:13 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:50:13 np0005548789.localdomain systemd[1]: tmp-crun.CfUE3L.mount: Deactivated successfully.
Dec 06 08:50:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:50:17 np0005548789.localdomain podman[87856]: 2025-12-06 08:50:17.918402932 +0000 UTC m=+0.078118320 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc.)
Dec 06 08:50:17 np0005548789.localdomain podman[87856]: 2025-12-06 08:50:17.945862062 +0000 UTC m=+0.105577470 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step5, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:50:17 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:50:19 np0005548789.localdomain sshd[87882]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:50:19 np0005548789.localdomain sshd[87882]: Invalid user mina from 92.118.39.95 port 54798
Dec 06 08:50:19 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:50:19 np0005548789.localdomain recover_tripleo_nova_virtqemud[87885]: 61814
Dec 06 08:50:19 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:50:19 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:50:19 np0005548789.localdomain sshd[87882]: Connection closed by invalid user mina 92.118.39.95 port 54798 [preauth]
Dec 06 08:50:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:50:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:50:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:50:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:50:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:50:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:50:38 np0005548789.localdomain podman[87932]: 2025-12-06 08:50:38.948537273 +0000 UTC m=+0.099664858 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true)
Dec 06 08:50:39 np0005548789.localdomain systemd[1]: tmp-crun.4kosVR.mount: Deactivated successfully.
Dec 06 08:50:39 np0005548789.localdomain podman[87934]: 2025-12-06 08:50:39.006250658 +0000 UTC m=+0.152155884 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:50:39 np0005548789.localdomain podman[87934]: 2025-12-06 08:50:39.03804552 +0000 UTC m=+0.183950746 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible)
Dec 06 08:50:39 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:50:39 np0005548789.localdomain podman[87933]: 2025-12-06 08:50:39.053961747 +0000 UTC m=+0.202112561 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 08:50:39 np0005548789.localdomain podman[87941]: 2025-12-06 08:50:39.098502089 +0000 UTC m=+0.231769758 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step3, version=17.1.12, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true)
Dec 06 08:50:39 np0005548789.localdomain podman[87941]: 2025-12-06 08:50:39.110138205 +0000 UTC m=+0.243405914 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vcs-type=git, io.buildah.version=1.41.4, release=1761123044)
Dec 06 08:50:39 np0005548789.localdomain podman[87932]: 2025-12-06 08:50:39.119580503 +0000 UTC m=+0.270708138 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, container_name=collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.component=openstack-collectd-container)
Dec 06 08:50:39 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:50:39 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:50:39 np0005548789.localdomain podman[87951]: 2025-12-06 08:50:39.20872882 +0000 UTC m=+0.346329802 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:50:39 np0005548789.localdomain podman[87951]: 2025-12-06 08:50:39.2600805 +0000 UTC m=+0.397681452 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z)
Dec 06 08:50:39 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:50:39 np0005548789.localdomain podman[87931]: 2025-12-06 08:50:39.263616669 +0000 UTC m=+0.415990373 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-cron)
Dec 06 08:50:39 np0005548789.localdomain podman[87931]: 2025-12-06 08:50:39.347281076 +0000 UTC m=+0.499654730 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, tcib_managed=true, vcs-type=git)
Dec 06 08:50:39 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:50:39 np0005548789.localdomain podman[87933]: 2025-12-06 08:50:39.413171911 +0000 UTC m=+0.561322735 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, distribution-scope=public)
Dec 06 08:50:39 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:50:39 np0005548789.localdomain systemd[1]: tmp-crun.pOFKGu.mount: Deactivated successfully.
Dec 06 08:50:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:50:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:50:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:50:43 np0005548789.localdomain systemd[1]: tmp-crun.TOjdyo.mount: Deactivated successfully.
Dec 06 08:50:43 np0005548789.localdomain podman[88067]: 2025-12-06 08:50:43.900098528 +0000 UTC m=+0.061005906 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:50:43 np0005548789.localdomain systemd[1]: tmp-crun.ZD05SM.mount: Deactivated successfully.
Dec 06 08:50:43 np0005548789.localdomain podman[88066]: 2025-12-06 08:50:43.998677813 +0000 UTC m=+0.158928982 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, version=17.1.12, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:50:44 np0005548789.localdomain podman[88068]: 2025-12-06 08:50:43.967257392 +0000 UTC m=+0.119656110 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:50:44 np0005548789.localdomain podman[88068]: 2025-12-06 08:50:44.048102794 +0000 UTC m=+0.200501492 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:14:25Z, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, tcib_managed=true)
Dec 06 08:50:44 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:50:44 np0005548789.localdomain podman[88066]: 2025-12-06 08:50:44.07217674 +0000 UTC m=+0.232427929 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, 
vcs-type=git, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 08:50:44 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:50:44 np0005548789.localdomain podman[88067]: 2025-12-06 08:50:44.10913653 +0000 UTC m=+0.270043918 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:50:44 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:50:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:50:48 np0005548789.localdomain systemd[1]: tmp-crun.Y0F8aw.mount: Deactivated successfully.
Dec 06 08:50:48 np0005548789.localdomain podman[88142]: 2025-12-06 08:50:48.892221054 +0000 UTC m=+0.061112480 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, url=https://www.redhat.com, release=1761123044, distribution-scope=public, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, container_name=nova_compute, vcs-type=git)
Dec 06 08:50:48 np0005548789.localdomain podman[88142]: 2025-12-06 08:50:48.921091647 +0000 UTC m=+0.089983113 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:50:48 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:51:01 np0005548789.localdomain sudo[88167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:51:01 np0005548789.localdomain sudo[88167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:51:01 np0005548789.localdomain sudo[88167]: pam_unix(sudo:session): session closed for user root
Dec 06 08:51:01 np0005548789.localdomain sudo[88182]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 08:51:01 np0005548789.localdomain sudo[88182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:51:02 np0005548789.localdomain sudo[88182]: pam_unix(sudo:session): session closed for user root
Dec 06 08:51:02 np0005548789.localdomain sudo[88218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:51:02 np0005548789.localdomain sudo[88218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:51:02 np0005548789.localdomain sudo[88218]: pam_unix(sudo:session): session closed for user root
Dec 06 08:51:02 np0005548789.localdomain sudo[88233]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:51:02 np0005548789.localdomain sudo[88233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:51:02 np0005548789.localdomain sudo[88233]: pam_unix(sudo:session): session closed for user root
Dec 06 08:51:03 np0005548789.localdomain sudo[88280]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:51:03 np0005548789.localdomain sudo[88280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:51:03 np0005548789.localdomain sudo[88280]: pam_unix(sudo:session): session closed for user root
Dec 06 08:51:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:51:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:51:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:51:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:51:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:51:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:51:09 np0005548789.localdomain systemd[1]: tmp-crun.CmgBkp.mount: Deactivated successfully.
Dec 06 08:51:09 np0005548789.localdomain podman[88296]: 2025-12-06 08:51:09.943804958 +0000 UTC m=+0.103546207 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T22:51:28Z, container_name=collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 06 08:51:09 np0005548789.localdomain podman[88310]: 2025-12-06 08:51:09.977884371 +0000 UTC m=+0.121430235 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3)
Dec 06 08:51:09 np0005548789.localdomain podman[88296]: 2025-12-06 08:51:09.982039128 +0000 UTC m=+0.141780367 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, release=1761123044, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:51:09 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:51:10 np0005548789.localdomain podman[88310]: 2025-12-06 08:51:10.013983434 +0000 UTC m=+0.157529308 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 06 08:51:10 np0005548789.localdomain podman[88297]: 2025-12-06 08:51:10.0328161 +0000 UTC m=+0.181482021 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:51:10 np0005548789.localdomain podman[88316]: 2025-12-06 08:51:10.12927153 +0000 UTC m=+0.266843371 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12)
Dec 06 08:51:10 np0005548789.localdomain podman[88295]: 2025-12-06 08:51:10.151531861 +0000 UTC m=+0.310126675 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:51:10 np0005548789.localdomain podman[88295]: 2025-12-06 08:51:10.159109532 +0000 UTC m=+0.317704376 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, tcib_managed=true)
Dec 06 08:51:10 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:51:10 np0005548789.localdomain podman[88316]: 2025-12-06 08:51:10.187038996 +0000 UTC m=+0.324610817 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1)
Dec 06 08:51:10 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:51:10 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:51:10 np0005548789.localdomain podman[88303]: 2025-12-06 08:51:10.211736532 +0000 UTC m=+0.358477453 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute)
Dec 06 08:51:10 np0005548789.localdomain podman[88303]: 2025-12-06 08:51:10.239054127 +0000 UTC m=+0.385795038 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64)
Dec 06 08:51:10 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:51:10 np0005548789.localdomain podman[88297]: 2025-12-06 08:51:10.331052951 +0000 UTC m=+0.479718882 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:51:10 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:51:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:51:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:51:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:51:14 np0005548789.localdomain podman[88427]: 2025-12-06 08:51:14.918570712 +0000 UTC m=+0.082217205 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:51:14 np0005548789.localdomain systemd[1]: tmp-crun.q5HjcA.mount: Deactivated successfully.
Dec 06 08:51:14 np0005548789.localdomain podman[88429]: 2025-12-06 08:51:14.975230735 +0000 UTC m=+0.135371621 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:51:15 np0005548789.localdomain podman[88429]: 2025-12-06 08:51:15.021111088 +0000 UTC m=+0.181251954 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com)
Dec 06 08:51:15 np0005548789.localdomain systemd[1]: tmp-crun.hQFVcb.mount: Deactivated successfully.
Dec 06 08:51:15 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:51:15 np0005548789.localdomain podman[88428]: 2025-12-06 08:51:15.036864729 +0000 UTC m=+0.199161211 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, tcib_managed=true, build-date=2025-11-18T22:49:46Z)
Dec 06 08:51:15 np0005548789.localdomain podman[88427]: 2025-12-06 08:51:15.042179142 +0000 UTC m=+0.205825585 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, distribution-scope=public, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:51:15 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:51:15 np0005548789.localdomain podman[88428]: 2025-12-06 08:51:15.256233728 +0000 UTC m=+0.418530260 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 08:51:15 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:51:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:51:19 np0005548789.localdomain podman[88504]: 2025-12-06 08:51:19.916888958 +0000 UTC m=+0.072878190 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, distribution-scope=public, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, batch=17.1_20251118.1, io.openshift.expose-services=)
Dec 06 08:51:19 np0005548789.localdomain podman[88504]: 2025-12-06 08:51:19.945990418 +0000 UTC m=+0.101979590 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12)
Dec 06 08:51:19 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:51:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:51:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:51:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:51:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:51:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:51:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:51:40 np0005548789.localdomain podman[88577]: 2025-12-06 08:51:40.930584702 +0000 UTC m=+0.086153355 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=)
Dec 06 08:51:40 np0005548789.localdomain podman[88577]: 2025-12-06 08:51:40.93933879 +0000 UTC m=+0.094907463 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com)
Dec 06 08:51:40 np0005548789.localdomain systemd[1]: tmp-crun.BSFwMn.mount: Deactivated successfully.
Dec 06 08:51:41 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:51:41 np0005548789.localdomain podman[88579]: 2025-12-06 08:51:41.028128704 +0000 UTC m=+0.179312644 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc.)
Dec 06 08:51:41 np0005548789.localdomain podman[88578]: 2025-12-06 08:51:40.993831366 +0000 UTC m=+0.142787938 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64)
Dec 06 08:51:41 np0005548789.localdomain podman[88585]: 2025-12-06 08:51:41.083062685 +0000 UTC m=+0.231890253 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, vcs-type=git, architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Dec 06 08:51:41 np0005548789.localdomain podman[88576]: 2025-12-06 08:51:41.146911607 +0000 UTC m=+0.303449081 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:51:41 np0005548789.localdomain podman[88576]: 2025-12-06 08:51:41.156111758 +0000 UTC m=+0.312649262 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64)
Dec 06 08:51:41 np0005548789.localdomain podman[88579]: 2025-12-06 08:51:41.166783235 +0000 UTC m=+0.317967155 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:51:41 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:51:41 np0005548789.localdomain podman[88592]: 2025-12-06 08:51:41.119888781 +0000 UTC m=+0.261024093 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:51:41 np0005548789.localdomain podman[88592]: 2025-12-06 08:51:41.201210707 +0000 UTC m=+0.342345979 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=)
Dec 06 08:51:41 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:51:41 np0005548789.localdomain podman[88585]: 2025-12-06 08:51:41.220366443 +0000 UTC m=+0.369194001 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=)
Dec 06 08:51:41 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:51:41 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:51:41 np0005548789.localdomain podman[88578]: 2025-12-06 08:51:41.363702696 +0000 UTC m=+0.512659268 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, release=1761123044, vcs-type=git, build-date=2025-11-19T00:36:58Z, architecture=x86_64)
Dec 06 08:51:41 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:51:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:51:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:51:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:51:45 np0005548789.localdomain podman[88704]: 2025-12-06 08:51:45.914431066 +0000 UTC m=+0.075938983 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., 
vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 08:51:45 np0005548789.localdomain podman[88704]: 2025-12-06 08:51:45.941285067 +0000 UTC m=+0.102792984 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, 
io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:51:45 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:51:46 np0005548789.localdomain podman[88705]: 2025-12-06 08:51:46.021588673 +0000 UTC m=+0.179934713 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:51:46 np0005548789.localdomain podman[88706]: 2025-12-06 08:51:46.073849871 +0000 UTC m=+0.229133598 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, container_name=ovn_metadata_agent, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 08:51:46 np0005548789.localdomain podman[88706]: 2025-12-06 08:51:46.119063654 +0000 UTC m=+0.274347391 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
tcib_managed=true, batch=17.1_20251118.1, container_name=ovn_metadata_agent, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com)
Dec 06 08:51:46 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:51:46 np0005548789.localdomain podman[88705]: 2025-12-06 08:51:46.281162071 +0000 UTC m=+0.439508101 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z)
Dec 06 08:51:46 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:51:46 np0005548789.localdomain systemd[1]: tmp-crun.yseGHU.mount: Deactivated successfully.
Dec 06 08:51:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:51:50 np0005548789.localdomain podman[88778]: 2025-12-06 08:51:50.914159284 +0000 UTC m=+0.080369328 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, container_name=nova_compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, distribution-scope=public, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:51:50 np0005548789.localdomain podman[88778]: 2025-12-06 08:51:50.933880477 +0000 UTC m=+0.100090561 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-type=git, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:51:50 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:52:04 np0005548789.localdomain sudo[88804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:52:04 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:52:04 np0005548789.localdomain sudo[88804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:52:04 np0005548789.localdomain sudo[88804]: pam_unix(sudo:session): session closed for user root
Dec 06 08:52:04 np0005548789.localdomain recover_tripleo_nova_virtqemud[88820]: 61814
Dec 06 08:52:04 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:52:04 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:52:04 np0005548789.localdomain sudo[88821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:52:04 np0005548789.localdomain sudo[88821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:52:04 np0005548789.localdomain sudo[88821]: pam_unix(sudo:session): session closed for user root
Dec 06 08:52:05 np0005548789.localdomain sudo[88868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:52:05 np0005548789.localdomain sudo[88868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:52:05 np0005548789.localdomain sudo[88868]: pam_unix(sudo:session): session closed for user root
Dec 06 08:52:10 np0005548789.localdomain sshd[88883]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:52:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:52:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:52:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:52:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:52:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:52:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:52:11 np0005548789.localdomain systemd[1]: tmp-crun.RznnjS.mount: Deactivated successfully.
Dec 06 08:52:11 np0005548789.localdomain podman[88886]: 2025-12-06 08:52:11.930828554 +0000 UTC m=+0.088645002 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-collectd-container, release=1761123044, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:52:11 np0005548789.localdomain podman[88886]: 2025-12-06 08:52:11.940981484 +0000 UTC m=+0.098797952 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, name=rhosp17/openstack-collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:52:11 np0005548789.localdomain systemd[1]: tmp-crun.4n5bBq.mount: Deactivated successfully.
Dec 06 08:52:11 np0005548789.localdomain podman[88906]: 2025-12-06 08:52:11.955819327 +0000 UTC m=+0.093571552 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:52:11 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:52:11 np0005548789.localdomain podman[88906]: 2025-12-06 08:52:11.96702822 +0000 UTC m=+0.104780415 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vcs-type=git, container_name=iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:52:11 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:52:11 np0005548789.localdomain podman[88887]: 2025-12-06 08:52:11.9833928 +0000 UTC m=+0.136249896 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com)
Dec 06 08:52:12 np0005548789.localdomain podman[88885]: 2025-12-06 08:52:12.032531304 +0000 UTC m=+0.191426005 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:52:12 np0005548789.localdomain podman[88885]: 2025-12-06 08:52:12.03797432 +0000 UTC m=+0.196869061 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 
17.1 cron, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=)
Dec 06 08:52:12 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:52:12 np0005548789.localdomain podman[88898]: 2025-12-06 08:52:12.088395032 +0000 UTC m=+0.232572653 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12)
Dec 06 08:52:12 np0005548789.localdomain podman[88913]: 2025-12-06 08:52:12.140160294 +0000 UTC m=+0.275076632 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:52:12 np0005548789.localdomain podman[88898]: 2025-12-06 08:52:12.148356205 +0000 UTC m=+0.292533876 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, distribution-scope=public, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=)
Dec 06 08:52:12 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:52:12 np0005548789.localdomain podman[88913]: 2025-12-06 08:52:12.168297855 +0000 UTC m=+0.303214193 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, container_name=ceilometer_agent_ipmi)
Dec 06 08:52:12 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:52:12 np0005548789.localdomain podman[88887]: 2025-12-06 08:52:12.329983719 +0000 UTC m=+0.482840815 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target)
Dec 06 08:52:12 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:52:12 np0005548789.localdomain sshd[88883]: Connection reset by authenticating user root 91.202.233.33 port 59526 [preauth]
Dec 06 08:52:12 np0005548789.localdomain sshd[89009]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:52:14 np0005548789.localdomain sshd[89009]: Connection reset by authenticating user root 91.202.233.33 port 45146 [preauth]
Dec 06 08:52:14 np0005548789.localdomain sshd[89011]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:52:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:52:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:52:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:52:16 np0005548789.localdomain podman[89013]: 2025-12-06 08:52:16.944380945 +0000 UTC m=+0.098852414 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z)
Dec 06 08:52:16 np0005548789.localdomain systemd[1]: tmp-crun.Zf9pxE.mount: Deactivated successfully.
Dec 06 08:52:17 np0005548789.localdomain podman[89015]: 2025-12-06 08:52:17.000869792 +0000 UTC m=+0.148796631 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:52:17 np0005548789.localdomain podman[89014]: 2025-12-06 08:52:17.043824826 +0000 UTC m=+0.195506900 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, container_name=metrics_qdr, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:52:17 np0005548789.localdomain podman[89015]: 2025-12-06 08:52:17.063059264 +0000 UTC m=+0.210986053 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public)
Dec 06 08:52:17 np0005548789.localdomain podman[89013]: 2025-12-06 08:52:17.071328767 +0000 UTC m=+0.225800286 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044)
Dec 06 08:52:17 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:52:17 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:52:17 np0005548789.localdomain podman[89014]: 2025-12-06 08:52:17.226378508 +0000 UTC m=+0.378060652 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, release=1761123044, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=metrics_qdr, vcs-type=git, config_id=tripleo_step1, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:52:17 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:52:21 np0005548789.localdomain sshd[89011]: Connection reset by authenticating user root 91.202.233.33 port 45160 [preauth]
Dec 06 08:52:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:52:21 np0005548789.localdomain systemd[1]: tmp-crun.aKx9SB.mount: Deactivated successfully.
Dec 06 08:52:21 np0005548789.localdomain podman[89090]: 2025-12-06 08:52:21.350924705 +0000 UTC m=+0.099854444 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 08:52:21 np0005548789.localdomain podman[89090]: 2025-12-06 08:52:21.384217473 +0000 UTC m=+0.133147162 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step5)
Dec 06 08:52:21 np0005548789.localdomain sshd[89115]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:52:21 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:52:23 np0005548789.localdomain sshd[89115]: Invalid user 12345 from 91.202.233.33 port 60792
Dec 06 08:52:23 np0005548789.localdomain sshd[89115]: Connection reset by invalid user 12345 91.202.233.33 port 60792 [preauth]
Dec 06 08:52:24 np0005548789.localdomain sshd[89120]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:52:26 np0005548789.localdomain sshd[89120]: Invalid user postgres from 91.202.233.33 port 60814
Dec 06 08:52:27 np0005548789.localdomain sshd[89120]: Connection reset by invalid user postgres 91.202.233.33 port 60814 [preauth]
Dec 06 08:52:27 np0005548789.localdomain sshd[89122]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:52:28 np0005548789.localdomain sshd[89122]: Invalid user ethereum from 92.118.39.95 port 41548
Dec 06 08:52:28 np0005548789.localdomain sshd[89122]: Connection closed by invalid user ethereum 92.118.39.95 port 41548 [preauth]
Dec 06 08:52:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:52:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:52:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:52:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:52:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:52:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:52:42 np0005548789.localdomain podman[89148]: 2025-12-06 08:52:42.938725576 +0000 UTC m=+0.086637770 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 08:52:42 np0005548789.localdomain podman[89148]: 2025-12-06 08:52:42.950190276 +0000 UTC m=+0.098102510 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:51:28Z, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=collectd)
Dec 06 08:52:42 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:52:42 np0005548789.localdomain podman[89161]: 2025-12-06 08:52:42.992495441 +0000 UTC m=+0.129953385 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vcs-type=git, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, release=1761123044)
Dec 06 08:52:43 np0005548789.localdomain podman[89161]: 2025-12-06 08:52:43.00623315 +0000 UTC m=+0.143691144 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1)
Dec 06 08:52:43 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:52:43 np0005548789.localdomain podman[89162]: 2025-12-06 08:52:43.048276095 +0000 UTC m=+0.179740866 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, 
container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:52:43 np0005548789.localdomain podman[89149]: 2025-12-06 08:52:43.107624631 +0000 UTC m=+0.249057086 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc.)
Dec 06 08:52:43 np0005548789.localdomain podman[89162]: 2025-12-06 08:52:43.131972135 +0000 UTC m=+0.263436936 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, vcs-type=git)
Dec 06 08:52:43 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:52:43 np0005548789.localdomain podman[89147]: 2025-12-06 08:52:43.195655772 +0000 UTC m=+0.345072572 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond)
Dec 06 08:52:43 np0005548789.localdomain podman[89150]: 2025-12-06 08:52:43.206291497 +0000 UTC m=+0.342346348 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=ceilometer_agent_compute, 
com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, release=1761123044)
Dec 06 08:52:43 np0005548789.localdomain podman[89147]: 2025-12-06 08:52:43.207011629 +0000 UTC m=+0.356428429 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, batch=17.1_20251118.1, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc.)
Dec 06 08:52:43 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:52:43 np0005548789.localdomain podman[89150]: 2025-12-06 08:52:43.31202284 +0000 UTC m=+0.448077701 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, build-date=2025-11-19T00:11:48Z, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:52:43 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:52:43 np0005548789.localdomain podman[89149]: 2025-12-06 08:52:43.477297954 +0000 UTC m=+0.618730389 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, 
version=17.1.12, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:52:43 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:52:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:52:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:52:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:52:47 np0005548789.localdomain podman[89281]: 2025-12-06 08:52:47.926801933 +0000 UTC m=+0.082246666 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, release=1761123044, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step4, 
maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com)
Dec 06 08:52:47 np0005548789.localdomain podman[89281]: 2025-12-06 08:52:47.95122666 +0000 UTC m=+0.106671363 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, 
com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Dec 06 08:52:47 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:52:48 np0005548789.localdomain systemd[1]: tmp-crun.SiD1qJ.mount: Deactivated successfully.
Dec 06 08:52:48 np0005548789.localdomain podman[89282]: 2025-12-06 08:52:48.037198038 +0000 UTC m=+0.192250129 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=)
Dec 06 08:52:48 np0005548789.localdomain podman[89283]: 2025-12-06 08:52:48.100470343 +0000 UTC m=+0.250098798 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:52:48 np0005548789.localdomain podman[89283]: 2025-12-06 08:52:48.133030457 +0000 UTC m=+0.282658912 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, vcs-type=git)
Dec 06 08:52:48 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:52:48 np0005548789.localdomain podman[89282]: 2025-12-06 08:52:48.26032532 +0000 UTC m=+0.415377461 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:52:48 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:52:48 np0005548789.localdomain systemd[1]: tmp-crun.IzyFP2.mount: Deactivated successfully.
Dec 06 08:52:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:52:51 np0005548789.localdomain podman[89359]: 2025-12-06 08:52:51.922593667 +0000 UTC m=+0.081032139 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_compute, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible)
Dec 06 08:52:51 np0005548789.localdomain podman[89359]: 2025-12-06 08:52:51.955317748 +0000 UTC m=+0.113756240 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:52:51 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:53:05 np0005548789.localdomain sudo[89386]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:53:05 np0005548789.localdomain sudo[89386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:53:05 np0005548789.localdomain sudo[89386]: pam_unix(sudo:session): session closed for user root
Dec 06 08:53:05 np0005548789.localdomain sudo[89401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 08:53:05 np0005548789.localdomain sudo[89401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:53:06 np0005548789.localdomain podman[89489]: 2025-12-06 08:53:06.318215245 +0000 UTC m=+0.062394259 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, architecture=x86_64)
Dec 06 08:53:06 np0005548789.localdomain podman[89489]: 2025-12-06 08:53:06.410570069 +0000 UTC m=+0.154749113 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, name=rhceph, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-type=git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vendor=Red Hat, Inc.)
Dec 06 08:53:06 np0005548789.localdomain sudo[89401]: pam_unix(sudo:session): session closed for user root
Dec 06 08:53:06 np0005548789.localdomain sudo[89561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:53:06 np0005548789.localdomain sudo[89561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:53:06 np0005548789.localdomain sudo[89561]: pam_unix(sudo:session): session closed for user root
Dec 06 08:53:06 np0005548789.localdomain sudo[89576]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:53:06 np0005548789.localdomain sudo[89576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:53:07 np0005548789.localdomain sudo[89576]: pam_unix(sudo:session): session closed for user root
Dec 06 08:53:08 np0005548789.localdomain sudo[89623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:53:08 np0005548789.localdomain sudo[89623]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:53:08 np0005548789.localdomain sudo[89623]: pam_unix(sudo:session): session closed for user root
Dec 06 08:53:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:53:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:53:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:53:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:53:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:53:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:53:13 np0005548789.localdomain systemd[1]: tmp-crun.QNTgWa.mount: Deactivated successfully.
Dec 06 08:53:13 np0005548789.localdomain podman[89654]: 2025-12-06 08:53:13.958998298 +0000 UTC m=+0.098336917 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12)
Dec 06 08:53:13 np0005548789.localdomain systemd[1]: tmp-crun.rIKUrw.mount: Deactivated successfully.
Dec 06 08:53:13 np0005548789.localdomain podman[89638]: 2025-12-06 08:53:13.992108991 +0000 UTC m=+0.147706057 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, 
architecture=x86_64, vcs-type=git, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true, maintainer=OpenStack TripleO Team)
Dec 06 08:53:14 np0005548789.localdomain podman[89638]: 2025-12-06 08:53:14.004012916 +0000 UTC m=+0.159609972 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, container_name=logrotate_crond, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Dec 06 08:53:14 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:53:14 np0005548789.localdomain podman[89645]: 2025-12-06 08:53:14.044630217 +0000 UTC m=+0.186530975 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, vcs-type=git, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 08:53:14 np0005548789.localdomain podman[89654]: 2025-12-06 08:53:14.074426228 +0000 UTC m=+0.213764927 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 06 08:53:14 np0005548789.localdomain podman[89645]: 2025-12-06 08:53:14.081247626 +0000 UTC m=+0.223148434 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:53:14 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:53:14 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:53:14 np0005548789.localdomain podman[89640]: 2025-12-06 08:53:14.151996779 +0000 UTC m=+0.302436048 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64)
Dec 06 08:53:14 np0005548789.localdomain podman[89639]: 2025-12-06 08:53:14.213482029 +0000 UTC m=+0.368510658 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044)
Dec 06 08:53:14 np0005548789.localdomain podman[89639]: 2025-12-06 08:53:14.223650781 +0000 UTC m=+0.378679420 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container)
Dec 06 08:53:14 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:53:14 np0005548789.localdomain podman[89641]: 2025-12-06 08:53:14.297836669 +0000 UTC m=+0.441596713 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute)
Dec 06 08:53:14 np0005548789.localdomain podman[89641]: 2025-12-06 08:53:14.318069438 +0000 UTC m=+0.461829562 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible)
Dec 06 08:53:14 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:53:14 np0005548789.localdomain podman[89640]: 2025-12-06 08:53:14.597200272 +0000 UTC m=+0.747639481 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, container_name=nova_migration_target)
Dec 06 08:53:14 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:53:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:53:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:53:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:53:18 np0005548789.localdomain systemd[1]: tmp-crun.yTqNwE.mount: Deactivated successfully.
Dec 06 08:53:18 np0005548789.localdomain podman[89770]: 2025-12-06 08:53:18.939813553 +0000 UTC m=+0.097701637 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, 
vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:53:18 np0005548789.localdomain podman[89769]: 2025-12-06 08:53:18.896466238 +0000 UTC m=+0.059936663 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, container_name=ovn_controller, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 06 08:53:18 np0005548789.localdomain podman[89769]: 2025-12-06 08:53:18.975272218 +0000 UTC m=+0.138742623 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:53:18 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:53:18 np0005548789.localdomain podman[89771]: 2025-12-06 08:53:18.988261665 +0000 UTC m=+0.142110326 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc.)
Dec 06 08:53:19 np0005548789.localdomain podman[89771]: 2025-12-06 08:53:19.046088053 +0000 UTC m=+0.199936674 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:14:25Z, architecture=x86_64, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1)
Dec 06 08:53:19 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:53:19 np0005548789.localdomain podman[89770]: 2025-12-06 08:53:19.118229569 +0000 UTC m=+0.276117703 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:53:19 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:53:19 np0005548789.localdomain systemd[1]: tmp-crun.I8bopW.mount: Deactivated successfully.
Dec 06 08:53:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:53:22 np0005548789.localdomain systemd[1]: tmp-crun.Qet4nb.mount: Deactivated successfully.
Dec 06 08:53:22 np0005548789.localdomain podman[89846]: 2025-12-06 08:53:22.930975457 +0000 UTC m=+0.092321384 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1761123044, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:53:22 np0005548789.localdomain podman[89846]: 2025-12-06 08:53:22.987111383 +0000 UTC m=+0.148457260 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, release=1761123044, container_name=nova_compute, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:53:23 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:53:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:53:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:53:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:53:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:53:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:53:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:53:44 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:53:44 np0005548789.localdomain recover_tripleo_nova_virtqemud[89927]: 61814
Dec 06 08:53:44 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:53:44 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:53:44 np0005548789.localdomain systemd[1]: tmp-crun.9j0JNa.mount: Deactivated successfully.
Dec 06 08:53:44 np0005548789.localdomain podman[89895]: 2025-12-06 08:53:44.968722297 +0000 UTC m=+0.118890696 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step4, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-cron-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible)
Dec 06 08:53:44 np0005548789.localdomain podman[89904]: 2025-12-06 08:53:44.981294521 +0000 UTC m=+0.116152342 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, build-date=2025-11-18T23:44:13Z, architecture=x86_64)
Dec 06 08:53:45 np0005548789.localdomain podman[89896]: 2025-12-06 08:53:45.016271291 +0000 UTC m=+0.161547541 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, container_name=collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Dec 06 08:53:45 np0005548789.localdomain podman[89914]: 2025-12-06 08:53:45.023295025 +0000 UTC m=+0.152222275 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:53:45 np0005548789.localdomain podman[89904]: 2025-12-06 08:53:45.037124258 +0000 UTC m=+0.171982159 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:53:45 np0005548789.localdomain podman[89896]: 2025-12-06 08:53:45.048150145 +0000 UTC m=+0.193426335 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:53:45 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:53:45 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:53:45 np0005548789.localdomain podman[89914]: 2025-12-06 08:53:45.091681817 +0000 UTC m=+0.220609087 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1)
Dec 06 08:53:45 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:53:45 np0005548789.localdomain podman[89895]: 2025-12-06 08:53:45.112178033 +0000 UTC m=+0.262346422 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:53:45 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:53:45 np0005548789.localdomain podman[89903]: 2025-12-06 08:53:45.169185626 +0000 UTC m=+0.307278356 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:53:45 np0005548789.localdomain podman[89903]: 2025-12-06 08:53:45.193145799 +0000 UTC m=+0.331238539 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 08:53:45 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:53:45 np0005548789.localdomain podman[89897]: 2025-12-06 08:53:45.042807982 +0000 UTC m=+0.185972927 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:53:45 np0005548789.localdomain podman[89897]: 2025-12-06 08:53:45.385855182 +0000 UTC m=+0.529020157 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., container_name=nova_migration_target, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 08:53:45 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:53:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:53:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:53:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:53:49 np0005548789.localdomain podman[90029]: 2025-12-06 08:53:49.918640015 +0000 UTC m=+0.071299691 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, version=17.1.12, io.openshift.expose-services=)
Dec 06 08:53:49 np0005548789.localdomain podman[90027]: 2025-12-06 08:53:49.973578875 +0000 UTC m=+0.130940114 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Dec 06 08:53:49 np0005548789.localdomain podman[90029]: 2025-12-06 08:53:49.981125516 +0000 UTC m=+0.133785182 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, container_name=ovn_metadata_agent, architecture=x86_64, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:53:49 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:53:49 np0005548789.localdomain podman[90027]: 2025-12-06 08:53:49.997125915 +0000 UTC m=+0.154487204 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, release=1761123044, io.openshift.expose-services=, container_name=ovn_controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4)
Dec 06 08:53:50 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:53:50 np0005548789.localdomain podman[90028]: 2025-12-06 08:53:50.079618357 +0000 UTC m=+0.235608085 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z)
Dec 06 08:53:50 np0005548789.localdomain podman[90028]: 2025-12-06 08:53:50.278062095 +0000 UTC m=+0.434051823 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, vcs-type=git, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:53:50 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:53:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:53:53 np0005548789.localdomain podman[90102]: 2025-12-06 08:53:53.939088874 +0000 UTC m=+0.100112112 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, architecture=x86_64, config_id=tripleo_step5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Dec 06 08:53:53 np0005548789.localdomain podman[90102]: 2025-12-06 08:53:53.968237835 +0000 UTC m=+0.129261053 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Dec 06 08:53:53 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:54:08 np0005548789.localdomain sudo[90129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:54:08 np0005548789.localdomain sudo[90129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:54:08 np0005548789.localdomain sudo[90129]: pam_unix(sudo:session): session closed for user root
Dec 06 08:54:08 np0005548789.localdomain sudo[90144]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:54:08 np0005548789.localdomain sudo[90144]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:54:09 np0005548789.localdomain sudo[90144]: pam_unix(sudo:session): session closed for user root
Dec 06 08:54:09 np0005548789.localdomain sudo[90190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:54:09 np0005548789.localdomain sudo[90190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:54:09 np0005548789.localdomain sudo[90190]: pam_unix(sudo:session): session closed for user root
Dec 06 08:54:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:54:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:54:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:54:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:54:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:54:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:54:15 np0005548789.localdomain systemd[1]: tmp-crun.4n3QPH.mount: Deactivated successfully.
Dec 06 08:54:15 np0005548789.localdomain podman[90210]: 2025-12-06 08:54:15.930903713 +0000 UTC m=+0.087979830 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target)
Dec 06 08:54:15 np0005548789.localdomain podman[90205]: 2025-12-06 08:54:15.97982872 +0000 UTC m=+0.143401666 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-cron, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.)
Dec 06 08:54:15 np0005548789.localdomain podman[90205]: 2025-12-06 08:54:15.987971529 +0000 UTC m=+0.151544495 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 06 08:54:15 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:54:16 np0005548789.localdomain podman[90225]: 2025-12-06 08:54:16.025096324 +0000 UTC m=+0.167805162 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com)
Dec 06 08:54:16 np0005548789.localdomain podman[90222]: 2025-12-06 08:54:16.041880287 +0000 UTC m=+0.184264895 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, 
com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:54:16 np0005548789.localdomain podman[90222]: 2025-12-06 08:54:16.05308759 +0000 UTC m=+0.195472208 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public)
Dec 06 08:54:16 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:54:16 np0005548789.localdomain podman[90225]: 2025-12-06 08:54:16.077086593 +0000 UTC m=+0.219795431 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12)
Dec 06 08:54:16 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:54:16 np0005548789.localdomain podman[90213]: 2025-12-06 08:54:16.094042641 +0000 UTC m=+0.244365652 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:54:16 np0005548789.localdomain podman[90213]: 2025-12-06 08:54:16.114140736 +0000 UTC m=+0.264463828 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:54:16 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:54:16 np0005548789.localdomain podman[90206]: 2025-12-06 08:54:16.201131856 +0000 UTC m=+0.358153252 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=collectd, name=rhosp17/openstack-collectd, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z)
Dec 06 08:54:16 np0005548789.localdomain podman[90206]: 2025-12-06 08:54:16.211040389 +0000 UTC m=+0.368061735 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com)
Dec 06 08:54:16 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:54:16 np0005548789.localdomain podman[90210]: 2025-12-06 08:54:16.275155519 +0000 UTC m=+0.432231706 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, version=17.1.12, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 
nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64)
Dec 06 08:54:16 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:54:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:54:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:54:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:54:20 np0005548789.localdomain podman[90336]: 2025-12-06 08:54:20.915318457 +0000 UTC m=+0.081003678 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 
qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:54:20 np0005548789.localdomain systemd[1]: tmp-crun.EBOJs9.mount: Deactivated successfully.
Dec 06 08:54:20 np0005548789.localdomain podman[90335]: 2025-12-06 08:54:20.980746047 +0000 UTC m=+0.145228981 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, container_name=ovn_controller, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public)
Dec 06 08:54:21 np0005548789.localdomain podman[90335]: 2025-12-06 08:54:21.009077604 +0000 UTC m=+0.173560498 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:54:21 np0005548789.localdomain podman[90337]: 2025-12-06 08:54:21.022712361 +0000 UTC m=+0.180910323 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true)
Dec 06 08:54:21 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:54:21 np0005548789.localdomain podman[90337]: 2025-12-06 08:54:21.096334031 +0000 UTC m=+0.254532023 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack 
TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:54:21 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:54:21 np0005548789.localdomain podman[90336]: 2025-12-06 08:54:21.111235887 +0000 UTC m=+0.276921198 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:54:21 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:54:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:54:24 np0005548789.localdomain systemd[1]: tmp-crun.OiNcAd.mount: Deactivated successfully.
Dec 06 08:54:24 np0005548789.localdomain podman[90414]: 2025-12-06 08:54:24.936840017 +0000 UTC m=+0.096840802 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, container_name=nova_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 
'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step5)
Dec 06 08:54:24 np0005548789.localdomain podman[90414]: 2025-12-06 08:54:24.992236171 +0000 UTC m=+0.152236916 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, architecture=x86_64, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com)
Dec 06 08:54:25 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:54:33 np0005548789.localdomain sshd[90462]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:54:33 np0005548789.localdomain sshd[90462]: Invalid user eth from 92.118.39.95 port 56564
Dec 06 08:54:33 np0005548789.localdomain sshd[90462]: Connection closed by invalid user eth 92.118.39.95 port 56564 [preauth]
Dec 06 08:54:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:54:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:54:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:54:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:54:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:54:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:54:46 np0005548789.localdomain podman[90466]: 2025-12-06 08:54:46.947804529 +0000 UTC m=+0.087971140 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, 
vcs-type=git, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc.)
Dec 06 08:54:46 np0005548789.localdomain systemd[1]: tmp-crun.O6V8cV.mount: Deactivated successfully.
Dec 06 08:54:46 np0005548789.localdomain podman[90467]: 2025-12-06 08:54:46.965505241 +0000 UTC m=+0.094549222 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, build-date=2025-11-19T00:11:48Z, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:54:47 np0005548789.localdomain podman[90464]: 2025-12-06 08:54:47.010773245 +0000 UTC m=+0.152479244 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-cron-container, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, container_name=logrotate_crond, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:54:47 np0005548789.localdomain podman[90467]: 2025-12-06 08:54:47.013954972 +0000 UTC m=+0.142998883 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:54:47 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:54:47 np0005548789.localdomain podman[90464]: 2025-12-06 08:54:47.067453998 +0000 UTC m=+0.209159957 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, name=rhosp17/openstack-cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, tcib_managed=true, vcs-type=git)
Dec 06 08:54:47 np0005548789.localdomain podman[90465]: 2025-12-06 08:54:47.074441271 +0000 UTC m=+0.214021925 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack 
Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4)
Dec 06 08:54:47 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:54:47 np0005548789.localdomain podman[90483]: 2025-12-06 08:54:47.100633442 +0000 UTC m=+0.224251917 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:54:47 np0005548789.localdomain podman[90465]: 2025-12-06 08:54:47.107043678 +0000 UTC m=+0.246624332 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-collectd-container)
Dec 06 08:54:47 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:54:47 np0005548789.localdomain podman[90474]: 2025-12-06 08:54:47.16042654 +0000 UTC m=+0.290607017 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container)
Dec 06 08:54:47 np0005548789.localdomain podman[90474]: 2025-12-06 08:54:47.1904898 +0000 UTC m=+0.320670307 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid)
Dec 06 08:54:47 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:54:47 np0005548789.localdomain podman[90483]: 2025-12-06 08:54:47.246230994 +0000 UTC m=+0.369849549 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, managed_by=tripleo_ansible)
Dec 06 08:54:47 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:54:47 np0005548789.localdomain podman[90466]: 2025-12-06 08:54:47.329847971 +0000 UTC m=+0.470014652 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:54:47 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:54:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:54:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:54:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:54:51 np0005548789.localdomain podman[90595]: 2025-12-06 08:54:51.928600272 +0000 UTC m=+0.085967720 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:54:52 np0005548789.localdomain systemd[1]: tmp-crun.hSHBbG.mount: Deactivated successfully.
Dec 06 08:54:52 np0005548789.localdomain podman[90596]: 2025-12-06 08:54:52.01225621 +0000 UTC m=+0.164714117 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 08:54:52 np0005548789.localdomain podman[90594]: 2025-12-06 08:54:52.053074238 +0000 UTC m=+0.212134237 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ovn_controller, managed_by=tripleo_ansible)
Dec 06 08:54:52 np0005548789.localdomain podman[90596]: 2025-12-06 08:54:52.058744992 +0000 UTC m=+0.211202939 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:54:52 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:54:52 np0005548789.localdomain podman[90594]: 2025-12-06 08:54:52.081075625 +0000 UTC m=+0.240135604 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:54:52 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:54:52 np0005548789.localdomain podman[90595]: 2025-12-06 08:54:52.158033247 +0000 UTC m=+0.315400675 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:54:52 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:54:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:54:55 np0005548789.localdomain systemd[1]: tmp-crun.VjVYvR.mount: Deactivated successfully.
Dec 06 08:54:55 np0005548789.localdomain podman[90669]: 2025-12-06 08:54:55.92249661 +0000 UTC m=+0.085410623 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, name=rhosp17/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:54:55 np0005548789.localdomain podman[90669]: 2025-12-06 08:54:55.949136334 +0000 UTC m=+0.112050347 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute)
Dec 06 08:54:55 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:55:04 np0005548789.localdomain sshd[90693]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:55:05 np0005548789.localdomain sshd[90693]: Connection reset by authenticating user root 45.140.17.124 port 63964 [preauth]
Dec 06 08:55:06 np0005548789.localdomain sshd[90695]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:55:08 np0005548789.localdomain sshd[90695]: Connection reset by authenticating user root 45.140.17.124 port 63986 [preauth]
Dec 06 08:55:08 np0005548789.localdomain sshd[90697]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:55:09 np0005548789.localdomain sudo[90699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:55:09 np0005548789.localdomain sudo[90699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:55:09 np0005548789.localdomain sudo[90699]: pam_unix(sudo:session): session closed for user root
Dec 06 08:55:10 np0005548789.localdomain sshd[90697]: Connection reset by authenticating user root 45.140.17.124 port 64012 [preauth]
Dec 06 08:55:10 np0005548789.localdomain sudo[90714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:55:10 np0005548789.localdomain sudo[90714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:55:10 np0005548789.localdomain sshd[90729]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:55:10 np0005548789.localdomain sudo[90714]: pam_unix(sudo:session): session closed for user root
Dec 06 08:55:11 np0005548789.localdomain sudo[90764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:55:11 np0005548789.localdomain sudo[90764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:55:11 np0005548789.localdomain sudo[90764]: pam_unix(sudo:session): session closed for user root
Dec 06 08:55:12 np0005548789.localdomain sshd[90729]: Connection reset by authenticating user root 45.140.17.124 port 64028 [preauth]
Dec 06 08:55:12 np0005548789.localdomain sshd[90779]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:55:14 np0005548789.localdomain sshd[90779]: Connection reset by authenticating user root 45.140.17.124 port 36606 [preauth]
Dec 06 08:55:16 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:55:16 np0005548789.localdomain recover_tripleo_nova_virtqemud[90782]: 61814
Dec 06 08:55:16 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:55:16 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:55:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:55:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:55:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:55:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:55:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:55:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:55:17 np0005548789.localdomain systemd[1]: tmp-crun.N10JAa.mount: Deactivated successfully.
Dec 06 08:55:17 np0005548789.localdomain podman[90784]: 2025-12-06 08:55:17.943031436 +0000 UTC m=+0.096732938 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T22:51:28Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Dec 06 08:55:17 np0005548789.localdomain systemd[1]: tmp-crun.G9rcL4.mount: Deactivated successfully.
Dec 06 08:55:17 np0005548789.localdomain podman[90786]: 2025-12-06 08:55:17.988828357 +0000 UTC m=+0.136663900 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-type=git, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:55:17 np0005548789.localdomain podman[90783]: 2025-12-06 08:55:17.996168071 +0000 UTC m=+0.149847032 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, container_name=logrotate_crond, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.component=openstack-cron-container, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 06 08:55:18 np0005548789.localdomain podman[90784]: 2025-12-06 08:55:18.009104137 +0000 UTC m=+0.162805649 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:55:18 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:55:18 np0005548789.localdomain podman[90783]: 2025-12-06 08:55:18.033037719 +0000 UTC m=+0.186716690 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public)
Dec 06 08:55:18 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:55:18 np0005548789.localdomain podman[90785]: 2025-12-06 08:55:18.047887683 +0000 UTC m=+0.197245183 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, 
architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=nova_migration_target, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.component=openstack-nova-compute-container)
Dec 06 08:55:18 np0005548789.localdomain podman[90786]: 2025-12-06 08:55:18.065965685 +0000 UTC m=+0.213801218 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:11:48Z, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20251118.1)
Dec 06 08:55:18 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:55:18 np0005548789.localdomain podman[90805]: 2025-12-06 08:55:18.145236599 +0000 UTC m=+0.285572772 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1)
Dec 06 08:55:18 np0005548789.localdomain podman[90798]: 2025-12-06 08:55:18.191000298 +0000 UTC m=+0.332805587 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, release=1761123044, batch=17.1_20251118.1, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-18T23:44:13Z, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container)
Dec 06 08:55:18 np0005548789.localdomain podman[90798]: 2025-12-06 08:55:18.222559913 +0000 UTC m=+0.364365222 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, vcs-type=git, io.buildah.version=1.41.4, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 06 08:55:18 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:55:18 np0005548789.localdomain podman[90805]: 2025-12-06 08:55:18.273187641 +0000 UTC m=+0.413523784 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, release=1761123044, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, build-date=2025-11-19T00:12:45Z)
Dec 06 08:55:18 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:55:18 np0005548789.localdomain podman[90785]: 2025-12-06 08:55:18.41115654 +0000 UTC m=+0.560514130 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_migration_target, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Dec 06 08:55:18 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:55:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:55:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:55:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:55:22 np0005548789.localdomain podman[90911]: 2025-12-06 08:55:22.916113451 +0000 UTC m=+0.078765930 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team)
Dec 06 08:55:22 np0005548789.localdomain podman[90911]: 2025-12-06 08:55:22.968053308 +0000 UTC m=+0.130705817 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=ovn_controller, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:55:22 np0005548789.localdomain podman[90912]: 2025-12-06 08:55:22.979125757 +0000 UTC m=+0.139110475 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd)
Dec 06 08:55:22 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:55:23 np0005548789.localdomain podman[90913]: 2025-12-06 08:55:23.023786072 +0000 UTC m=+0.180517150 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com)
Dec 06 08:55:23 np0005548789.localdomain podman[90913]: 2025-12-06 08:55:23.068211061 +0000 UTC m=+0.224942149 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:55:23 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:55:23 np0005548789.localdomain podman[90912]: 2025-12-06 08:55:23.136321873 +0000 UTC m=+0.296306651 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr)
Dec 06 08:55:23 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:55:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:55:26 np0005548789.localdomain systemd[1]: tmp-crun.SfimPh.mount: Deactivated successfully.
Dec 06 08:55:26 np0005548789.localdomain podman[90985]: 2025-12-06 08:55:26.925124499 +0000 UTC m=+0.087481887 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, batch=17.1_20251118.1)
Dec 06 08:55:26 np0005548789.localdomain podman[90985]: 2025-12-06 08:55:26.982158222 +0000 UTC m=+0.144515550 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.4, container_name=nova_compute, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Dec 06 08:55:26 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:55:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:55:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:55:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:55:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:55:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:55:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:55:48 np0005548789.localdomain podman[91041]: 2025-12-06 08:55:48.946029073 +0000 UTC m=+0.094853290 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=nova_migration_target)
Dec 06 08:55:48 np0005548789.localdomain systemd[1]: tmp-crun.USg5WS.mount: Deactivated successfully.
Dec 06 08:55:48 np0005548789.localdomain podman[91035]: 2025-12-06 08:55:48.996587619 +0000 UTC m=+0.147526411 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, 
distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3)
Dec 06 08:55:49 np0005548789.localdomain podman[91035]: 2025-12-06 08:55:49.004573003 +0000 UTC m=+0.155511796 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:55:49 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:55:49 np0005548789.localdomain podman[91034]: 2025-12-06 08:55:49.055703178 +0000 UTC m=+0.215916914 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, 
maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, url=https://www.redhat.com)
Dec 06 08:55:49 np0005548789.localdomain podman[91034]: 2025-12-06 08:55:49.062954649 +0000 UTC m=+0.223168435 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, container_name=logrotate_crond, vcs-type=git, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, tcib_managed=true)
Dec 06 08:55:49 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:55:49 np0005548789.localdomain podman[91050]: 2025-12-06 08:55:49.13821688 +0000 UTC m=+0.278643760 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, tcib_managed=true, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:55:49 np0005548789.localdomain podman[91043]: 2025-12-06 08:55:49.196448711 +0000 UTC m=+0.338835091 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.buildah.version=1.41.4, 
managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid)
Dec 06 08:55:49 np0005548789.localdomain podman[91043]: 2025-12-06 08:55:49.235160105 +0000 UTC m=+0.377546435 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, io.openshift.expose-services=, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com)
Dec 06 08:55:49 np0005548789.localdomain podman[91042]: 2025-12-06 08:55:49.242191419 +0000 UTC m=+0.387260231 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Dec 06 08:55:49 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:55:49 np0005548789.localdomain podman[91050]: 2025-12-06 08:55:49.265595024 +0000 UTC m=+0.406022004 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 06 08:55:49 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:55:49 np0005548789.localdomain podman[91042]: 2025-12-06 08:55:49.320997439 +0000 UTC m=+0.466066291 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 08:55:49 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:55:49 np0005548789.localdomain podman[91041]: 2025-12-06 08:55:49.35930749 +0000 UTC m=+0.508131647 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, architecture=x86_64, container_name=nova_migration_target, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=)
Dec 06 08:55:49 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:55:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:55:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:55:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:55:53 np0005548789.localdomain systemd[1]: tmp-crun.WhmE2A.mount: Deactivated successfully.
Dec 06 08:55:53 np0005548789.localdomain podman[91172]: 2025-12-06 08:55:53.933935195 +0000 UTC m=+0.096646827 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., container_name=ovn_controller, version=17.1.12, vcs-type=git, build-date=2025-11-18T23:34:05Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team)
Dec 06 08:55:53 np0005548789.localdomain podman[91174]: 2025-12-06 08:55:53.979945991 +0000 UTC m=+0.137046671 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4)
Dec 06 08:55:54 np0005548789.localdomain podman[91174]: 2025-12-06 08:55:54.03125695 +0000 UTC m=+0.188357580 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 08:55:54 np0005548789.localdomain podman[91173]: 2025-12-06 08:55:54.034283213 +0000 UTC m=+0.193080745 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=metrics_qdr, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public)
Dec 06 08:55:54 np0005548789.localdomain podman[91172]: 2025-12-06 08:55:54.082839677 +0000 UTC m=+0.245551309 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:55:54 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:55:54 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:55:54 np0005548789.localdomain podman[91173]: 2025-12-06 08:55:54.219196787 +0000 UTC m=+0.377994369 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, tcib_managed=true)
Dec 06 08:55:54 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:55:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:55:57 np0005548789.localdomain podman[91248]: 2025-12-06 08:55:57.897741721 +0000 UTC m=+0.057812339 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true)
Dec 06 08:55:57 np0005548789.localdomain podman[91248]: 2025-12-06 08:55:57.928957436 +0000 UTC m=+0.089028074 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:55:57 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:56:11 np0005548789.localdomain sudo[91274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:56:11 np0005548789.localdomain sudo[91274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:56:11 np0005548789.localdomain sudo[91274]: pam_unix(sudo:session): session closed for user root
Dec 06 08:56:11 np0005548789.localdomain sudo[91289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:56:11 np0005548789.localdomain sudo[91289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:56:12 np0005548789.localdomain sudo[91289]: pam_unix(sudo:session): session closed for user root
Dec 06 08:56:12 np0005548789.localdomain sudo[91336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:56:12 np0005548789.localdomain sudo[91336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:56:12 np0005548789.localdomain sudo[91336]: pam_unix(sudo:session): session closed for user root
Dec 06 08:56:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:56:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:56:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:56:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:56:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:56:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:56:19 np0005548789.localdomain podman[91351]: 2025-12-06 08:56:19.94451985 +0000 UTC m=+0.106893439 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team)
Dec 06 08:56:19 np0005548789.localdomain podman[91351]: 2025-12-06 08:56:19.980195392 +0000 UTC m=+0.142568981 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-cron, version=17.1.12, 
url=https://www.redhat.com, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 08:56:19 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:56:20 np0005548789.localdomain podman[91352]: 2025-12-06 08:56:19.999821182 +0000 UTC m=+0.153798614 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Dec 06 08:56:20 np0005548789.localdomain podman[91352]: 2025-12-06 08:56:20.038126902 +0000 UTC m=+0.192104364 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, distribution-scope=public, name=rhosp17/openstack-collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, url=https://www.redhat.com)
Dec 06 08:56:20 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:56:20 np0005548789.localdomain podman[91363]: 2025-12-06 08:56:20.055969978 +0000 UTC m=+0.203672248 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 08:56:20 np0005548789.localdomain podman[91353]: 2025-12-06 08:56:20.088847403 +0000 UTC m=+0.240271827 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-nova-compute-container)
Dec 06 08:56:20 np0005548789.localdomain podman[91363]: 2025-12-06 08:56:20.109066542 +0000 UTC m=+0.256768782 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:56:20 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:56:20 np0005548789.localdomain podman[91359]: 2025-12-06 08:56:20.165899609 +0000 UTC m=+0.314439045 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:56:20 np0005548789.localdomain podman[91360]: 2025-12-06 08:56:20.207327626 +0000 UTC m=+0.354149690 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-iscsid, container_name=iscsid, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.expose-services=)
Dec 06 08:56:20 np0005548789.localdomain podman[91359]: 2025-12-06 08:56:20.231297089 +0000 UTC m=+0.379836535 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, distribution-scope=public)
Dec 06 08:56:20 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:56:20 np0005548789.localdomain podman[91360]: 2025-12-06 08:56:20.244458781 +0000 UTC m=+0.391280845 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:56:20 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:56:20 np0005548789.localdomain podman[91353]: 2025-12-06 08:56:20.461153287 +0000 UTC m=+0.612577751 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Dec 06 08:56:20 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:56:20 np0005548789.localdomain systemd[1]: tmp-crun.XhvPoy.mount: Deactivated successfully.
Dec 06 08:56:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:56:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:56:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:56:24 np0005548789.localdomain podman[91488]: 2025-12-06 08:56:24.912072017 +0000 UTC m=+0.079615405 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, vcs-type=git, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 08:56:24 np0005548789.localdomain systemd[1]: tmp-crun.qik24J.mount: Deactivated successfully.
Dec 06 08:56:24 np0005548789.localdomain podman[91487]: 2025-12-06 08:56:24.971235366 +0000 UTC m=+0.137139624 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, container_name=ovn_controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12)
Dec 06 08:56:25 np0005548789.localdomain podman[91489]: 2025-12-06 08:56:25.021467502 +0000 UTC m=+0.177300522 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, version=17.1.12, vcs-type=git, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Dec 06 08:56:25 np0005548789.localdomain podman[91487]: 2025-12-06 08:56:25.027252479 +0000 UTC m=+0.193156707 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.openshift.expose-services=, vcs-type=git, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:56:25 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:56:25 np0005548789.localdomain podman[91489]: 2025-12-06 08:56:25.071248344 +0000 UTC m=+0.227081354 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, 
Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=ovn_metadata_agent, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 08:56:25 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:56:25 np0005548789.localdomain podman[91488]: 2025-12-06 08:56:25.118246682 +0000 UTC m=+0.285790080 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, tcib_managed=true, io.openshift.expose-services=, 
com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-qdrouterd)
Dec 06 08:56:25 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:56:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:56:28 np0005548789.localdomain systemd[1]: tmp-crun.Z6fk8q.mount: Deactivated successfully.
Dec 06 08:56:28 np0005548789.localdomain podman[91563]: 2025-12-06 08:56:28.918032583 +0000 UTC m=+0.084414941 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step5, vcs-type=git, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, container_name=nova_compute)
Dec 06 08:56:28 np0005548789.localdomain podman[91563]: 2025-12-06 08:56:28.970526419 +0000 UTC m=+0.136908777 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:56:28 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:56:40 np0005548789.localdomain sshd[91589]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:56:40 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:56:40 np0005548789.localdomain recover_tripleo_nova_virtqemud[91592]: 61814
Dec 06 08:56:40 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:56:40 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:56:41 np0005548789.localdomain sshd[91589]: Invalid user jito from 92.118.39.95 port 43314
Dec 06 08:56:41 np0005548789.localdomain sshd[91589]: Connection closed by invalid user jito 92.118.39.95 port 43314 [preauth]
Dec 06 08:56:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:56:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:56:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:56:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:56:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:56:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:56:50 np0005548789.localdomain podman[91594]: 2025-12-06 08:56:50.961721125 +0000 UTC m=+0.109492199 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Dec 06 08:56:51 np0005548789.localdomain podman[91594]: 2025-12-06 08:56:51.000413188 +0000 UTC m=+0.148184262 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12)
Dec 06 08:56:51 np0005548789.localdomain systemd[1]: tmp-crun.teLjuu.mount: Deactivated successfully.
Dec 06 08:56:51 np0005548789.localdomain podman[91596]: 2025-12-06 08:56:51.021149102 +0000 UTC m=+0.160482708 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=ceilometer_agent_compute)
Dec 06 08:56:51 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:56:51 np0005548789.localdomain podman[91596]: 2025-12-06 08:56:51.052236282 +0000 UTC m=+0.191569928 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:56:51 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:56:51 np0005548789.localdomain podman[91595]: 2025-12-06 08:56:51.071844402 +0000 UTC m=+0.214702535 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:56:51 np0005548789.localdomain podman[91593]: 2025-12-06 08:56:51.119840259 +0000 UTC m=+0.269256943 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, container_name=logrotate_crond, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-18T22:49:32Z, release=1761123044)
Dec 06 08:56:51 np0005548789.localdomain podman[91601]: 2025-12-06 08:56:51.164933938 +0000 UTC m=+0.300866510 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, vcs-type=git, distribution-scope=public, container_name=iscsid, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible)
Dec 06 08:56:51 np0005548789.localdomain podman[91593]: 2025-12-06 08:56:51.186057594 +0000 UTC m=+0.335474278 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Dec 06 08:56:51 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:56:51 np0005548789.localdomain podman[91601]: 2025-12-06 08:56:51.20326412 +0000 UTC m=+0.339196692 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, container_name=iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 08:56:51 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:56:51 np0005548789.localdomain podman[91608]: 2025-12-06 08:56:51.26963765 +0000 UTC m=+0.400502407 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team)
Dec 06 08:56:51 np0005548789.localdomain podman[91608]: 2025-12-06 08:56:51.325355084 +0000 UTC m=+0.456219881 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64)
Dec 06 08:56:51 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:56:51 np0005548789.localdomain podman[91595]: 2025-12-06 08:56:51.51232928 +0000 UTC m=+0.655187443 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:56:51 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:56:51 np0005548789.localdomain systemd[1]: tmp-crun.aQgUDi.mount: Deactivated successfully.
Dec 06 08:56:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:56:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:56:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:56:55 np0005548789.localdomain podman[91728]: 2025-12-06 08:56:55.940442503 +0000 UTC m=+0.088890909 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 08:56:56 np0005548789.localdomain systemd[1]: tmp-crun.S7j7LO.mount: Deactivated successfully.
Dec 06 08:56:56 np0005548789.localdomain podman[91726]: 2025-12-06 08:56:56.011453164 +0000 UTC m=+0.165369688 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, 
tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible)
Dec 06 08:56:56 np0005548789.localdomain podman[91728]: 2025-12-06 08:56:56.019287973 +0000 UTC m=+0.167736339 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, version=17.1.12, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Dec 06 08:56:56 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:56:56 np0005548789.localdomain podman[91726]: 2025-12-06 08:56:56.042332528 +0000 UTC m=+0.196249052 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64)
Dec 06 08:56:56 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:56:56 np0005548789.localdomain podman[91727]: 2025-12-06 08:56:56.109082329 +0000 UTC m=+0.258252777 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:56:56 np0005548789.localdomain podman[91727]: 2025-12-06 08:56:56.310194688 +0000 UTC m=+0.459365086 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, container_name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:56:56 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:56:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:56:59 np0005548789.localdomain podman[91802]: 2025-12-06 08:56:59.944066306 +0000 UTC m=+0.100090371 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, version=17.1.12, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team)
Dec 06 08:56:59 np0005548789.localdomain podman[91802]: 2025-12-06 08:56:59.996669965 +0000 UTC m=+0.152694020 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, container_name=nova_compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12)
Dec 06 08:57:00 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:57:12 np0005548789.localdomain sudo[91828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:57:12 np0005548789.localdomain sudo[91828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:57:12 np0005548789.localdomain sudo[91828]: pam_unix(sudo:session): session closed for user root
Dec 06 08:57:13 np0005548789.localdomain sudo[91843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:57:13 np0005548789.localdomain sudo[91843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:57:13 np0005548789.localdomain sudo[91843]: pam_unix(sudo:session): session closed for user root
Dec 06 08:57:14 np0005548789.localdomain sudo[91890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:57:14 np0005548789.localdomain sudo[91890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:57:14 np0005548789.localdomain sudo[91890]: pam_unix(sudo:session): session closed for user root
Dec 06 08:57:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:57:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:57:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:57:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:57:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:57:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:57:21 np0005548789.localdomain podman[91908]: 2025-12-06 08:57:21.931450202 +0000 UTC m=+0.089796767 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, tcib_managed=true, vcs-type=git, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, 
build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container)
Dec 06 08:57:21 np0005548789.localdomain podman[91907]: 2025-12-06 08:57:21.987680061 +0000 UTC m=+0.147333586 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, release=1761123044, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target)
Dec 06 08:57:22 np0005548789.localdomain podman[91908]: 2025-12-06 08:57:22.015142741 +0000 UTC m=+0.173489276 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, 
url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044)
Dec 06 08:57:22 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:57:22 np0005548789.localdomain podman[91906]: 2025-12-06 08:57:22.030883302 +0000 UTC m=+0.187680510 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, architecture=x86_64, batch=17.1_20251118.1, container_name=collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:57:22 np0005548789.localdomain podman[91906]: 2025-12-06 08:57:22.044924841 +0000 UTC m=+0.201722019 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3)
Dec 06 08:57:22 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:57:22 np0005548789.localdomain podman[91915]: 2025-12-06 08:57:21.951221856 +0000 UTC m=+0.101381770 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, architecture=x86_64, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., distribution-scope=public)
Dec 06 08:57:22 np0005548789.localdomain podman[91905]: 2025-12-06 08:57:22.127201587 +0000 UTC m=+0.290945657 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red 
Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, version=17.1.12, com.redhat.component=openstack-cron-container, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Dec 06 08:57:22 np0005548789.localdomain podman[91905]: 2025-12-06 08:57:22.135159041 +0000 UTC m=+0.298903131 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, 
io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, distribution-scope=public, version=17.1.12, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com)
Dec 06 08:57:22 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:57:22 np0005548789.localdomain podman[91919]: 2025-12-06 08:57:22.193010349 +0000 UTC m=+0.341567505 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1)
Dec 06 08:57:22 np0005548789.localdomain podman[91915]: 2025-12-06 08:57:22.236285383 +0000 UTC m=+0.386445327 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 06 08:57:22 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:57:22 np0005548789.localdomain podman[91919]: 2025-12-06 08:57:22.250381164 +0000 UTC m=+0.398938320 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:57:22 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:57:22 np0005548789.localdomain podman[91907]: 2025-12-06 08:57:22.366312518 +0000 UTC m=+0.525965973 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, container_name=nova_migration_target, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc.)
Dec 06 08:57:22 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:57:22 np0005548789.localdomain systemd[1]: tmp-crun.hXVwMA.mount: Deactivated successfully.
Dec 06 08:57:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:57:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:57:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:57:26 np0005548789.localdomain systemd[1]: tmp-crun.uJ4Pla.mount: Deactivated successfully.
Dec 06 08:57:26 np0005548789.localdomain podman[92040]: 2025-12-06 08:57:26.93860221 +0000 UTC m=+0.098577235 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:57:26 np0005548789.localdomain podman[92040]: 2025-12-06 08:57:26.960876242 +0000 UTC m=+0.120851327 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, version=17.1.12, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:57:26 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:57:27 np0005548789.localdomain podman[92041]: 2025-12-06 08:57:27.047404807 +0000 UTC m=+0.200317805 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, config_id=tripleo_step1, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 08:57:27 np0005548789.localdomain podman[92042]: 2025-12-06 08:57:27.096456638 +0000 UTC m=+0.250540122 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:57:27 np0005548789.localdomain podman[92042]: 2025-12-06 08:57:27.14528103 +0000 UTC m=+0.299364514 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent)
Dec 06 08:57:27 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:57:27 np0005548789.localdomain podman[92041]: 2025-12-06 08:57:27.244129892 +0000 UTC m=+0.397042840 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, 
io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container)
Dec 06 08:57:27 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:57:27 np0005548789.localdomain systemd[1]: tmp-crun.MWUd7S.mount: Deactivated successfully.
Dec 06 08:57:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:57:30 np0005548789.localdomain podman[92116]: 2025-12-06 08:57:30.929816137 +0000 UTC m=+0.085516655 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Dec 06 08:57:30 np0005548789.localdomain podman[92116]: 2025-12-06 08:57:30.987290235 +0000 UTC m=+0.142990723 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:57:30 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:57:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:57:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:57:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:57:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:57:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:57:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:57:52 np0005548789.localdomain podman[92142]: 2025-12-06 08:57:52.933724019 +0000 UTC m=+0.097437260 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T22:51:28Z, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc.)
Dec 06 08:57:52 np0005548789.localdomain podman[92142]: 2025-12-06 08:57:52.942231929 +0000 UTC m=+0.105945250 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-collectd-container, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, release=1761123044, architecture=x86_64, container_name=collectd)
Dec 06 08:57:52 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:57:52 np0005548789.localdomain systemd[1]: tmp-crun.z0Gbkl.mount: Deactivated successfully.
Dec 06 08:57:52 np0005548789.localdomain podman[92141]: 2025-12-06 08:57:52.984687967 +0000 UTC m=+0.149376868 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, version=17.1.12, distribution-scope=public)
Dec 06 08:57:52 np0005548789.localdomain podman[92143]: 2025-12-06 08:57:52.991106293 +0000 UTC m=+0.151078620 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vcs-type=git, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:57:53 np0005548789.localdomain podman[92147]: 2025-12-06 08:57:53.034774799 +0000 UTC m=+0.189500896 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, 
com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:57:53 np0005548789.localdomain podman[92144]: 2025-12-06 08:57:53.09174557 +0000 UTC m=+0.249679015 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc.)
Dec 06 08:57:53 np0005548789.localdomain podman[92147]: 2025-12-06 08:57:53.112057931 +0000 UTC m=+0.266784048 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044)
Dec 06 08:57:53 np0005548789.localdomain podman[92141]: 2025-12-06 08:57:53.116823807 +0000 UTC m=+0.281512698 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, release=1761123044, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-cron, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:57:53 np0005548789.localdomain podman[92144]: 2025-12-06 08:57:53.15421367 +0000 UTC m=+0.312147135 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:57:53 np0005548789.localdomain podman[92145]: 2025-12-06 08:57:53.147295729 +0000 UTC m=+0.302319145 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc.)
Dec 06 08:57:53 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:57:53 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:57:53 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:57:53 np0005548789.localdomain podman[92145]: 2025-12-06 08:57:53.237022872 +0000 UTC m=+0.392046258 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 
iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc.)
Dec 06 08:57:53 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:57:53 np0005548789.localdomain podman[92143]: 2025-12-06 08:57:53.363299383 +0000 UTC m=+0.523271710 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Dec 06 08:57:53 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:57:55 np0005548789.localdomain sshd[92272]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:57:55 np0005548789.localdomain sshd[92272]: error: kex_exchange_identification: Connection closed by remote host
Dec 06 08:57:55 np0005548789.localdomain sshd[92272]: Connection closed by 161.248.200.221 port 56394
Dec 06 08:57:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:57:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:57:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:57:57 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:57:57 np0005548789.localdomain recover_tripleo_nova_virtqemud[92288]: 61814
Dec 06 08:57:57 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:57:57 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:57:57 np0005548789.localdomain systemd[1]: tmp-crun.VJAFCE.mount: Deactivated successfully.
Dec 06 08:57:57 np0005548789.localdomain podman[92275]: 2025-12-06 08:57:57.93214901 +0000 UTC m=+0.083428762 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, version=17.1.12, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Dec 06 08:57:57 np0005548789.localdomain podman[92275]: 2025-12-06 08:57:57.969048928 +0000 UTC m=+0.120328670 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 08:57:57 np0005548789.localdomain systemd[1]: tmp-crun.h7llIe.mount: Deactivated successfully.
Dec 06 08:57:57 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:57:57 np0005548789.localdomain podman[92274]: 2025-12-06 08:57:57.985098619 +0000 UTC m=+0.139110744 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible)
Dec 06 08:57:58 np0005548789.localdomain podman[92273]: 2025-12-06 08:57:58.037966776 +0000 UTC m=+0.194153388 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, architecture=x86_64, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 08:57:58 np0005548789.localdomain podman[92273]: 2025-12-06 08:57:58.087266773 +0000 UTC m=+0.243453365 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, url=https://www.redhat.com, 
name=rhosp17/openstack-ovn-controller, version=17.1.12, container_name=ovn_controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4)
Dec 06 08:57:58 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:57:58 np0005548789.localdomain podman[92274]: 2025-12-06 08:57:58.18921473 +0000 UTC m=+0.343226835 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T22:49:46Z, tcib_managed=true, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 08:57:58 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:58:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:58:01 np0005548789.localdomain podman[92354]: 2025-12-06 08:58:01.924592562 +0000 UTC m=+0.084947538 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, config_id=tripleo_step5, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64)
Dec 06 08:58:01 np0005548789.localdomain podman[92354]: 2025-12-06 08:58:01.952460304 +0000 UTC m=+0.112815270 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, config_id=tripleo_step5, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:58:01 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:58:07 np0005548789.localdomain sshd[92379]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:58:08 np0005548789.localdomain sshd[92379]: Received disconnect from 81.192.46.35 port 45226:11: Bye Bye [preauth]
Dec 06 08:58:08 np0005548789.localdomain sshd[92379]: Disconnected from authenticating user root 81.192.46.35 port 45226 [preauth]
Dec 06 08:58:14 np0005548789.localdomain sudo[92381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:58:14 np0005548789.localdomain sudo[92381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:58:14 np0005548789.localdomain sudo[92381]: pam_unix(sudo:session): session closed for user root
Dec 06 08:58:14 np0005548789.localdomain sudo[92396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:58:14 np0005548789.localdomain sudo[92396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:58:15 np0005548789.localdomain sudo[92396]: pam_unix(sudo:session): session closed for user root
Dec 06 08:58:16 np0005548789.localdomain sudo[92444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:58:16 np0005548789.localdomain sudo[92444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:58:16 np0005548789.localdomain sudo[92444]: pam_unix(sudo:session): session closed for user root
Dec 06 08:58:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:58:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:58:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:58:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:58:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:58:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:58:23 np0005548789.localdomain systemd[1]: tmp-crun.0OhMO0.mount: Deactivated successfully.
Dec 06 08:58:24 np0005548789.localdomain podman[92459]: 2025-12-06 08:58:23.999624425 +0000 UTC m=+0.148982886 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, container_name=logrotate_crond, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, 
release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team)
Dec 06 08:58:24 np0005548789.localdomain podman[92462]: 2025-12-06 08:58:23.959001844 +0000 UTC m=+0.107251961 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:58:24 np0005548789.localdomain podman[92460]: 2025-12-06 08:58:24.019211654 +0000 UTC m=+0.168523513 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, container_name=collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:58:24 np0005548789.localdomain podman[92462]: 2025-12-06 08:58:24.04524097 +0000 UTC m=+0.193491077 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git)
Dec 06 08:58:24 np0005548789.localdomain podman[92472]: 2025-12-06 08:58:24.0524122 +0000 UTC m=+0.189162006 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:58:24 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:58:24 np0005548789.localdomain podman[92472]: 2025-12-06 08:58:24.080058095 +0000 UTC m=+0.216807911 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4)
Dec 06 08:58:24 np0005548789.localdomain podman[92463]: 2025-12-06 08:58:24.090493034 +0000 UTC m=+0.233695557 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, url=https://www.redhat.com)
Dec 06 08:58:24 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:58:24 np0005548789.localdomain podman[92463]: 2025-12-06 08:58:24.125097992 +0000 UTC m=+0.268300535 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4)
Dec 06 08:58:24 np0005548789.localdomain podman[92459]: 2025-12-06 08:58:24.132573891 +0000 UTC m=+0.281932352 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, version=17.1.12, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc.)
Dec 06 08:58:24 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:58:24 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:58:24 np0005548789.localdomain podman[92460]: 2025-12-06 08:58:24.185729186 +0000 UTC m=+0.335041005 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, distribution-scope=public, config_id=tripleo_step3, container_name=collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 08:58:24 np0005548789.localdomain podman[92461]: 2025-12-06 08:58:24.031429458 +0000 UTC m=+0.181651605 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vcs-type=git, container_name=nova_migration_target, distribution-scope=public, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:58:24 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:58:24 np0005548789.localdomain podman[92461]: 2025-12-06 08:58:24.425782025 +0000 UTC m=+0.576004142 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:58:24 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:58:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:58:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:58:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:58:28 np0005548789.localdomain systemd[1]: tmp-crun.3T3mx0.mount: Deactivated successfully.
Dec 06 08:58:28 np0005548789.localdomain podman[92592]: 2025-12-06 08:58:28.936284829 +0000 UTC m=+0.096054198 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, container_name=metrics_qdr, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1)
Dec 06 08:58:28 np0005548789.localdomain systemd[1]: tmp-crun.IlDEci.mount: Deactivated successfully.
Dec 06 08:58:28 np0005548789.localdomain podman[92593]: 2025-12-06 08:58:28.991946021 +0000 UTC m=+0.146088638 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044)
Dec 06 08:58:29 np0005548789.localdomain podman[92591]: 2025-12-06 08:58:29.035377299 +0000 UTC m=+0.195883020 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, version=17.1.12, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, 
build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:58:29 np0005548789.localdomain podman[92593]: 2025-12-06 08:58:29.072862295 +0000 UTC m=+0.227004922 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, config_id=tripleo_step4)
Dec 06 08:58:29 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:58:29 np0005548789.localdomain podman[92591]: 2025-12-06 08:58:29.086661487 +0000 UTC m=+0.247167168 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:58:29 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:58:29 np0005548789.localdomain podman[92592]: 2025-12-06 08:58:29.139019058 +0000 UTC m=+0.298788477 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, release=1761123044, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:58:29 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:58:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:58:32 np0005548789.localdomain systemd[1]: tmp-crun.LYfBei.mount: Deactivated successfully.
Dec 06 08:58:32 np0005548789.localdomain podman[92665]: 2025-12-06 08:58:32.929903938 +0000 UTC m=+0.091666724 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step5, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:58:32 np0005548789.localdomain podman[92665]: 2025-12-06 08:58:32.957306866 +0000 UTC m=+0.119069652 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.buildah.version=1.41.4, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:58:32 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:58:43 np0005548789.localdomain sshd[92691]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 08:58:43 np0005548789.localdomain sshd[92691]: Invalid user jito from 92.118.39.95 port 58332
Dec 06 08:58:43 np0005548789.localdomain sshd[92691]: Connection closed by invalid user jito 92.118.39.95 port 58332 [preauth]
Dec 06 08:58:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:58:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:58:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:58:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:58:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:58:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:58:54 np0005548789.localdomain systemd[1]: tmp-crun.eKJFFB.mount: Deactivated successfully.
Dec 06 08:58:55 np0005548789.localdomain podman[92697]: 2025-12-06 08:58:55.000519076 +0000 UTC m=+0.146969454 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Dec 06 08:58:55 np0005548789.localdomain podman[92697]: 2025-12-06 08:58:55.007666685 +0000 UTC m=+0.154117033 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-iscsid-container, release=1761123044, config_id=tripleo_step3)
Dec 06 08:58:55 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:58:55 np0005548789.localdomain podman[92693]: 2025-12-06 08:58:55.052554207 +0000 UTC m=+0.206675930 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:32Z, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=logrotate_crond, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:58:55 np0005548789.localdomain podman[92694]: 2025-12-06 08:58:55.089084294 +0000 UTC m=+0.241703891 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64)
Dec 06 08:58:55 np0005548789.localdomain podman[92696]: 2025-12-06 08:58:55.108736145 +0000 UTC m=+0.257457413 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 08:58:55 np0005548789.localdomain podman[92704]: 2025-12-06 08:58:54.962723511 +0000 UTC m=+0.104640460 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 08:58:55 np0005548789.localdomain podman[92704]: 2025-12-06 08:58:55.143397855 +0000 UTC m=+0.285314774 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, release=1761123044, config_id=tripleo_step4)
Dec 06 08:58:55 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:58:55 np0005548789.localdomain podman[92695]: 2025-12-06 08:58:55.153738431 +0000 UTC m=+0.305483401 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public)
Dec 06 08:58:55 np0005548789.localdomain podman[92696]: 2025-12-06 08:58:55.163016794 +0000 UTC m=+0.311738022 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:58:55 np0005548789.localdomain podman[92694]: 2025-12-06 08:58:55.174872698 +0000 UTC m=+0.327492265 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, 
vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, release=1761123044, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 08:58:55 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:58:55 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:58:55 np0005548789.localdomain podman[92693]: 2025-12-06 08:58:55.218241973 +0000 UTC m=+0.372363726 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., distribution-scope=public, container_name=logrotate_crond, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, config_id=tripleo_step4, 
maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T22:49:32Z)
Dec 06 08:58:55 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:58:55 np0005548789.localdomain podman[92695]: 2025-12-06 08:58:55.536191045 +0000 UTC m=+0.687935975 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64)
Dec 06 08:58:55 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:58:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:58:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:58:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:58:59 np0005548789.localdomain podman[92825]: 2025-12-06 08:58:59.924326498 +0000 UTC m=+0.081192164 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, name=rhosp17/openstack-ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, 
com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:58:59 np0005548789.localdomain podman[92825]: 2025-12-06 08:58:59.974642956 +0000 UTC m=+0.131508662 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, version=17.1.12, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 06 08:58:59 np0005548789.localdomain podman[92826]: 2025-12-06 08:58:59.990356277 +0000 UTC m=+0.145349185 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=metrics_qdr, distribution-scope=public, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, version=17.1.12)
Dec 06 08:58:59 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Deactivated successfully.
Dec 06 08:59:00 np0005548789.localdomain podman[92827]: 2025-12-06 08:59:00.044156962 +0000 UTC m=+0.196448567 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 08:59:00 np0005548789.localdomain podman[92827]: 2025-12-06 08:59:00.091278663 +0000 UTC m=+0.243570298 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 08:59:00 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:59:00 np0005548789.localdomain podman[92826]: 2025-12-06 08:59:00.222746263 +0000 UTC m=+0.377739191 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, version=17.1.12, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 06 08:59:00 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:59:00 np0005548789.localdomain systemd[1]: tmp-crun.ECPrAw.mount: Deactivated successfully.
Dec 06 08:59:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:59:03 np0005548789.localdomain podman[92902]: 2025-12-06 08:59:03.925415035 +0000 UTC m=+0.084134554 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, release=1761123044)
Dec 06 08:59:03 np0005548789.localdomain podman[92902]: 2025-12-06 08:59:03.960434886 +0000 UTC m=+0.119154365 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 08:59:03 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:59:16 np0005548789.localdomain sudo[92927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:59:16 np0005548789.localdomain sudo[92927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:59:16 np0005548789.localdomain sudo[92927]: pam_unix(sudo:session): session closed for user root
Dec 06 08:59:16 np0005548789.localdomain sudo[92942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 08:59:16 np0005548789.localdomain sudo[92942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:59:17 np0005548789.localdomain sudo[92942]: pam_unix(sudo:session): session closed for user root
Dec 06 08:59:17 np0005548789.localdomain sudo[92988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 08:59:17 np0005548789.localdomain sudo[92988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:59:17 np0005548789.localdomain sudo[92988]: pam_unix(sudo:session): session closed for user root
Dec 06 08:59:17 np0005548789.localdomain sudo[93003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 08:59:17 np0005548789.localdomain sudo[93003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:59:18 np0005548789.localdomain sudo[93003]: pam_unix(sudo:session): session closed for user root
Dec 06 08:59:21 np0005548789.localdomain sudo[93038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 08:59:21 np0005548789.localdomain sudo[93038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 08:59:21 np0005548789.localdomain sudo[93038]: pam_unix(sudo:session): session closed for user root
Dec 06 08:59:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:59:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:59:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:59:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:59:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:59:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:59:25 np0005548789.localdomain systemd[1]: tmp-crun.MdDEcP.mount: Deactivated successfully.
Dec 06 08:59:25 np0005548789.localdomain podman[93056]: 2025-12-06 08:59:25.959001789 +0000 UTC m=+0.107127597 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64)
Dec 06 08:59:26 np0005548789.localdomain podman[93062]: 2025-12-06 08:59:26.019617462 +0000 UTC m=+0.162634473 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, version=17.1.12, vcs-type=git, release=1761123044, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, container_name=iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1)
Dec 06 08:59:26 np0005548789.localdomain podman[93056]: 2025-12-06 08:59:26.042254404 +0000 UTC m=+0.190380232 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 08:59:26 np0005548789.localdomain podman[93062]: 2025-12-06 08:59:26.05616843 +0000 UTC m=+0.199185421 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vcs-type=git)
Dec 06 08:59:26 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:59:26 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:59:26 np0005548789.localdomain podman[93055]: 2025-12-06 08:59:25.99958537 +0000 UTC m=+0.154131714 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Dec 06 08:59:26 np0005548789.localdomain podman[93074]: 2025-12-06 08:59:26.106696325 +0000 UTC m=+0.247219800 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, version=17.1.12, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 08:59:26 np0005548789.localdomain podman[93074]: 2025-12-06 08:59:26.137114445 +0000 UTC m=+0.277637960 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 08:59:26 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:59:26 np0005548789.localdomain podman[93053]: 2025-12-06 08:59:26.148333748 +0000 UTC m=+0.308050080 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, 
maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 08:59:26 np0005548789.localdomain podman[93053]: 2025-12-06 08:59:26.158282153 +0000 UTC m=+0.317998535 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 08:59:26 np0005548789.localdomain podman[93054]: 2025-12-06 08:59:26.060957817 +0000 UTC m=+0.217452831 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 08:59:26 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:59:26 np0005548789.localdomain podman[93054]: 2025-12-06 08:59:26.201161673 +0000 UTC m=+0.357656707 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd)
Dec 06 08:59:26 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:59:26 np0005548789.localdomain podman[93055]: 2025-12-06 08:59:26.365297981 +0000 UTC m=+0.519844325 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git)
Dec 06 08:59:26 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 08:59:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 08:59:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 08:59:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 08:59:30 np0005548789.localdomain systemd[1]: tmp-crun.IE8xSl.mount: Deactivated successfully.
Dec 06 08:59:30 np0005548789.localdomain podman[93186]: 2025-12-06 08:59:30.931156557 +0000 UTC m=+0.090050894 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step1, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible)
Dec 06 08:59:30 np0005548789.localdomain podman[93187]: 2025-12-06 08:59:30.977314659 +0000 UTC m=+0.131698158 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 08:59:31 np0005548789.localdomain podman[93185]: 2025-12-06 08:59:31.033187157 +0000 UTC m=+0.190872127 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, tcib_managed=true, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, container_name=ovn_controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 08:59:31 np0005548789.localdomain podman[93187]: 2025-12-06 08:59:31.054707505 +0000 UTC m=+0.209090974 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044)
Dec 06 08:59:31 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 08:59:31 np0005548789.localdomain podman[93185]: 2025-12-06 08:59:31.111686268 +0000 UTC m=+0.269371228 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vcs-type=git, release=1761123044, 
container_name=ovn_controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller)
Dec 06 08:59:31 np0005548789.localdomain podman[93185]: unhealthy
Dec 06 08:59:31 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 08:59:31 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 08:59:31 np0005548789.localdomain podman[93186]: 2025-12-06 08:59:31.15268375 +0000 UTC m=+0.311578077 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=metrics_qdr, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 
17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, distribution-scope=public)
Dec 06 08:59:31 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 08:59:31 np0005548789.localdomain systemd[1]: tmp-crun.vCPAmn.mount: Deactivated successfully.
Dec 06 08:59:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 08:59:34 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 08:59:34 np0005548789.localdomain recover_tripleo_nova_virtqemud[93272]: 61814
Dec 06 08:59:34 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 08:59:34 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 08:59:34 np0005548789.localdomain systemd[1]: tmp-crun.yh1WPA.mount: Deactivated successfully.
Dec 06 08:59:34 np0005548789.localdomain podman[93265]: 2025-12-06 08:59:34.918174344 +0000 UTC m=+0.079866243 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, url=https://www.redhat.com)
Dec 06 08:59:34 np0005548789.localdomain podman[93265]: 2025-12-06 08:59:34.950935105 +0000 UTC m=+0.112627074 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute)
Dec 06 08:59:34 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 08:59:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 08:59:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 08:59:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 08:59:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 08:59:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 08:59:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 08:59:56 np0005548789.localdomain podman[93306]: 2025-12-06 08:59:56.957958556 +0000 UTC m=+0.095990906 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, batch=17.1_20251118.1)
Dec 06 08:59:56 np0005548789.localdomain podman[93306]: 2025-12-06 08:59:56.980268328 +0000 UTC m=+0.118300638 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1)
Dec 06 08:59:56 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 08:59:57 np0005548789.localdomain podman[93291]: 2025-12-06 08:59:57.000251189 +0000 UTC m=+0.156793405 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z)
Dec 06 08:59:57 np0005548789.localdomain podman[93299]: 2025-12-06 08:59:56.93815868 +0000 UTC m=+0.085440134 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, 
managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 08:59:57 np0005548789.localdomain podman[93292]: 2025-12-06 08:59:57.049686661 +0000 UTC m=+0.203274977 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, 
version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Dec 06 08:59:57 np0005548789.localdomain podman[93291]: 2025-12-06 08:59:57.061513632 +0000 UTC m=+0.218055878 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, build-date=2025-11-18T22:49:32Z, 
com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 08:59:57 np0005548789.localdomain podman[93299]: 2025-12-06 08:59:57.076192951 +0000 UTC m=+0.223474455 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ceilometer_agent_compute, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public)
Dec 06 08:59:57 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 08:59:57 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 08:59:57 np0005548789.localdomain podman[93293]: 2025-12-06 08:59:57.096317776 +0000 UTC m=+0.244587089 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12)
Dec 06 08:59:57 np0005548789.localdomain podman[93292]: 2025-12-06 08:59:57.113894134 +0000 UTC m=+0.267482400 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, tcib_managed=true, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible)
Dec 06 08:59:57 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 08:59:57 np0005548789.localdomain podman[93305]: 2025-12-06 08:59:57.161030084 +0000 UTC m=+0.301174289 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.expose-services=, container_name=iscsid, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 06 08:59:57 np0005548789.localdomain podman[93305]: 2025-12-06 08:59:57.196146769 +0000 UTC m=+0.336290974 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step3, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044)
Dec 06 08:59:57 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 08:59:57 np0005548789.localdomain podman[93293]: 2025-12-06 08:59:57.424000765 +0000 UTC m=+0.572270088 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 08:59:57 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:00:01 np0005548789.localdomain CROND[93424]: (root) CMD (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Dec 06 09:00:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:00:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:00:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:00:01 np0005548789.localdomain podman[93427]: 2025-12-06 09:00:01.928799643 +0000 UTC m=+0.084480614 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., version=17.1.12, container_name=ovn_controller, url=https://www.redhat.com, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 06 09:00:01 np0005548789.localdomain podman[93427]: 2025-12-06 09:00:01.986947811 +0000 UTC m=+0.142628762 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, architecture=x86_64)
Dec 06 09:00:01 np0005548789.localdomain systemd[1]: tmp-crun.K0DruU.mount: Deactivated successfully.
Dec 06 09:00:01 np0005548789.localdomain podman[93427]: unhealthy
Dec 06 09:00:02 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:00:02 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:00:02 np0005548789.localdomain podman[93428]: 2025-12-06 09:00:01.990645634 +0000 UTC m=+0.146121389 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., container_name=metrics_qdr, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Dec 06 09:00:02 np0005548789.localdomain podman[93429]: 2025-12-06 09:00:02.052208656 +0000 UTC m=+0.201016867 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:00:02 np0005548789.localdomain podman[93429]: 2025-12-06 09:00:02.094278822 +0000 UTC m=+0.243087053 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, container_name=ovn_metadata_agent, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com)
Dec 06 09:00:02 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Deactivated successfully.
Dec 06 09:00:02 np0005548789.localdomain podman[93428]: 2025-12-06 09:00:02.187161123 +0000 UTC m=+0.342636918 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1)
Dec 06 09:00:02 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:00:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:00:05 np0005548789.localdomain systemd[1]: tmp-crun.Deqc42.mount: Deactivated successfully.
Dec 06 09:00:05 np0005548789.localdomain podman[93504]: 2025-12-06 09:00:05.929294552 +0000 UTC m=+0.093413197 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=nova_compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z)
Dec 06 09:00:05 np0005548789.localdomain podman[93504]: 2025-12-06 09:00:05.956809733 +0000 UTC m=+0.120928368 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, architecture=x86_64, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044)
Dec 06 09:00:05 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.1 total, 600.0 interval
                                                          Cumulative writes: 5761 writes, 25K keys, 5761 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5761 writes, 760 syncs, 7.58 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.2 total, 600.0 interval
                                                          Cumulative writes: 4879 writes, 21K keys, 4879 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4879 writes, 669 syncs, 7.29 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:00:21 np0005548789.localdomain sudo[93530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:00:21 np0005548789.localdomain sudo[93530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:00:21 np0005548789.localdomain sudo[93530]: pam_unix(sudo:session): session closed for user root
Dec 06 09:00:21 np0005548789.localdomain sudo[93545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:00:21 np0005548789.localdomain sudo[93545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:00:21 np0005548789.localdomain sudo[93545]: pam_unix(sudo:session): session closed for user root
Dec 06 09:00:23 np0005548789.localdomain sshd[93592]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:00:26 np0005548789.localdomain sudo[93593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:00:26 np0005548789.localdomain sudo[93593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:00:26 np0005548789.localdomain sudo[93593]: pam_unix(sudo:session): session closed for user root
Dec 06 09:00:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:00:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:00:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:00:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:00:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:00:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:00:27 np0005548789.localdomain systemd[1]: tmp-crun.N4G8Ks.mount: Deactivated successfully.
Dec 06 09:00:27 np0005548789.localdomain podman[93608]: 2025-12-06 09:00:27.960014232 +0000 UTC m=+0.107583801 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4)
Dec 06 09:00:27 np0005548789.localdomain podman[93611]: 2025-12-06 09:00:27.966012325 +0000 UTC m=+0.103488595 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z)
Dec 06 09:00:28 np0005548789.localdomain podman[93622]: 2025-12-06 09:00:28.00574956 +0000 UTC m=+0.141191918 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team)
Dec 06 09:00:28 np0005548789.localdomain podman[93622]: 2025-12-06 09:00:28.013208708 +0000 UTC m=+0.148651056 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:00:28 np0005548789.localdomain podman[93608]: 2025-12-06 09:00:28.018090547 +0000 UTC m=+0.165660076 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., description=Red Hat 
OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public)
Dec 06 09:00:28 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:00:28 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:00:28 np0005548789.localdomain podman[93609]: 2025-12-06 09:00:28.060712471 +0000 UTC m=+0.207713573 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Dec 06 09:00:28 np0005548789.localdomain podman[93623]: 2025-12-06 09:00:28.016492529 +0000 UTC m=+0.141201009 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1)
Dec 06 09:00:28 np0005548789.localdomain podman[93610]: 2025-12-06 09:00:28.113512584 +0000 UTC m=+0.252846331 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, architecture=x86_64, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:00:28 np0005548789.localdomain podman[93611]: 2025-12-06 09:00:28.118796546 +0000 UTC m=+0.256272816 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, release=1761123044, architecture=x86_64)
Dec 06 09:00:28 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:00:28 np0005548789.localdomain podman[93609]: 2025-12-06 09:00:28.143197572 +0000 UTC m=+0.290198654 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, container_name=collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, architecture=x86_64)
Dec 06 09:00:28 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:00:28 np0005548789.localdomain podman[93623]: 2025-12-06 09:00:28.195855733 +0000 UTC m=+0.320564273 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, 
name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64)
Dec 06 09:00:28 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:00:28 np0005548789.localdomain podman[93610]: 2025-12-06 09:00:28.517203578 +0000 UTC m=+0.656537325 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, container_name=nova_migration_target, vcs-type=git, build-date=2025-11-19T00:36:58Z, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com)
Dec 06 09:00:28 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:00:28 np0005548789.localdomain systemd[1]: tmp-crun.AYP1Ff.mount: Deactivated successfully.
Dec 06 09:00:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:00:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:00:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:00:32 np0005548789.localdomain podman[93737]: 2025-12-06 09:00:32.929673153 +0000 UTC m=+0.092075007 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, container_name=ovn_controller, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, architecture=x86_64)
Dec 06 09:00:32 np0005548789.localdomain podman[93737]: 2025-12-06 09:00:32.981471006 +0000 UTC m=+0.143872830 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=ovn_controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller)
Dec 06 09:00:32 np0005548789.localdomain podman[93737]: unhealthy
Dec 06 09:00:32 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:00:32 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:00:33 np0005548789.localdomain podman[93738]: 2025-12-06 09:00:33.062994858 +0000 UTC m=+0.225060672 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_id=tripleo_step1, container_name=metrics_qdr)
Dec 06 09:00:33 np0005548789.localdomain podman[93739]: 2025-12-06 09:00:32.989725288 +0000 UTC m=+0.151462832 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, release=1761123044, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1)
Dec 06 09:00:33 np0005548789.localdomain podman[93739]: 2025-12-06 09:00:33.12224758 +0000 UTC m=+0.283985114 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:00:33 np0005548789.localdomain podman[93739]: unhealthy
Dec 06 09:00:33 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:00:33 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:00:33 np0005548789.localdomain podman[93738]: 2025-12-06 09:00:33.254073611 +0000 UTC m=+0.416139365 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044)
Dec 06 09:00:33 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:00:34 np0005548789.localdomain sshd[93592]: error: kex_exchange_identification: read: Connection timed out
Dec 06 09:00:34 np0005548789.localdomain sshd[93592]: banner exchange: Connection from 115.190.14.84 port 33028: Connection timed out
Dec 06 09:00:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:00:36 np0005548789.localdomain podman[93807]: 2025-12-06 09:00:36.902861106 +0000 UTC m=+0.063186413 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Dec 06 09:00:36 np0005548789.localdomain podman[93807]: 2025-12-06 09:00:36.933114271 +0000 UTC m=+0.093439628 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:00:36 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:00:38 np0005548789.localdomain sshd[93834]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:00:40 np0005548789.localdomain sshd[93834]: Received disconnect from 179.33.210.213 port 51316:11: Bye Bye [preauth]
Dec 06 09:00:40 np0005548789.localdomain sshd[93834]: Disconnected from authenticating user root 179.33.210.213 port 51316 [preauth]
Dec 06 09:00:40 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:00:40 np0005548789.localdomain recover_tripleo_nova_virtqemud[93837]: 61814
Dec 06 09:00:40 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:00:40 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:00:50 np0005548789.localdomain sshd[93838]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:00:51 np0005548789.localdomain sshd[93838]: Connection closed by authenticating user root 92.118.39.95 port 45078 [preauth]
Dec 06 09:00:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:00:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:00:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:00:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:00:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:00:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:00:58 np0005548789.localdomain podman[93845]: 2025-12-06 09:00:58.941771726 +0000 UTC m=+0.095664006 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64)
Dec 06 09:00:58 np0005548789.localdomain podman[93845]: 2025-12-06 09:00:58.973117834 +0000 UTC m=+0.127010054 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc.)
Dec 06 09:00:58 np0005548789.localdomain systemd[1]: tmp-crun.pXTOOh.mount: Deactivated successfully.
Dec 06 09:00:58 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:00:58 np0005548789.localdomain podman[93842]: 2025-12-06 09:00:58.99422903 +0000 UTC m=+0.149879454 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, container_name=nova_migration_target, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=)
Dec 06 09:00:59 np0005548789.localdomain podman[93841]: 2025-12-06 09:00:59.05374834 +0000 UTC m=+0.209718894 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:00:59 np0005548789.localdomain podman[93841]: 2025-12-06 09:00:59.089153553 +0000 UTC m=+0.245124057 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack 
TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=)
Dec 06 09:00:59 np0005548789.localdomain podman[93844]: 2025-12-06 09:00:59.089104101 +0000 UTC m=+0.242352782 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid)
Dec 06 09:00:59 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:00:59 np0005548789.localdomain podman[93843]: 2025-12-06 09:00:59.147904469 +0000 UTC m=+0.297681154 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute)
Dec 06 09:00:59 np0005548789.localdomain podman[93840]: 2025-12-06 09:00:59.197716231 +0000 UTC m=+0.358171951 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=)
Dec 06 09:00:59 np0005548789.localdomain podman[93843]: 2025-12-06 09:00:59.201835038 +0000 UTC m=+0.351611633 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, release=1761123044, tcib_managed=true, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:00:59 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:00:59 np0005548789.localdomain podman[93844]: 2025-12-06 09:00:59.220371644 +0000 UTC m=+0.373620315 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat 
OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z)
Dec 06 09:00:59 np0005548789.localdomain podman[93840]: 2025-12-06 09:00:59.230462363 +0000 UTC m=+0.390918043 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:49:32Z)
Dec 06 09:00:59 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:00:59 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:00:59 np0005548789.localdomain podman[93842]: 2025-12-06 09:00:59.335541966 +0000 UTC m=+0.491192380 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_migration_target, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:00:59 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:01:01 np0005548789.localdomain CROND[93975]: (root) CMD (run-parts /etc/cron.hourly)
Dec 06 09:01:01 np0005548789.localdomain run-parts[93978]: (/etc/cron.hourly) starting 0anacron
Dec 06 09:01:01 np0005548789.localdomain run-parts[93984]: (/etc/cron.hourly) finished 0anacron
Dec 06 09:01:01 np0005548789.localdomain CROND[93974]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 06 09:01:01 np0005548789.localdomain CROND[93986]: (root) CMD (run-parts /etc/cron.hourly)
Dec 06 09:01:01 np0005548789.localdomain run-parts[93989]: (/etc/cron.hourly) starting 0anacron
Dec 06 09:01:01 np0005548789.localdomain anacron[93997]: Anacron started on 2025-12-06
Dec 06 09:01:01 np0005548789.localdomain anacron[93997]: Will run job `cron.daily' in 25 min.
Dec 06 09:01:01 np0005548789.localdomain anacron[93997]: Will run job `cron.weekly' in 45 min.
Dec 06 09:01:01 np0005548789.localdomain anacron[93997]: Will run job `cron.monthly' in 65 min.
Dec 06 09:01:01 np0005548789.localdomain anacron[93997]: Jobs will be executed sequentially
Dec 06 09:01:01 np0005548789.localdomain run-parts[93999]: (/etc/cron.hourly) finished 0anacron
Dec 06 09:01:01 np0005548789.localdomain CROND[93985]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 06 09:01:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:01:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:01:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:01:03 np0005548789.localdomain podman[94000]: 2025-12-06 09:01:03.931143899 +0000 UTC m=+0.072122636 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, 
build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Dec 06 09:01:03 np0005548789.localdomain podman[94001]: 2025-12-06 09:01:03.994037873 +0000 UTC m=+0.129192962 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, tcib_managed=true, container_name=metrics_qdr, config_id=tripleo_step1)
Dec 06 09:01:04 np0005548789.localdomain podman[94000]: 2025-12-06 09:01:04.017923683 +0000 UTC m=+0.158902480 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, url=https://www.redhat.com, 
release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:01:04 np0005548789.localdomain podman[94000]: unhealthy
Dec 06 09:01:04 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:01:04 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:01:04 np0005548789.localdomain podman[94005]: 2025-12-06 09:01:04.115643651 +0000 UTC m=+0.246209290 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4)
Dec 06 09:01:04 np0005548789.localdomain podman[94005]: 2025-12-06 09:01:04.132789165 +0000 UTC m=+0.263354884 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, container_name=ovn_metadata_agent, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, distribution-scope=public)
Dec 06 09:01:04 np0005548789.localdomain podman[94005]: unhealthy
Dec 06 09:01:04 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:01:04 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:01:04 np0005548789.localdomain podman[94001]: 2025-12-06 09:01:04.160589725 +0000 UTC m=+0.295744844 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr)
Dec 06 09:01:04 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:01:04 np0005548789.localdomain systemd[1]: tmp-crun.KTq1EJ.mount: Deactivated successfully.
Dec 06 09:01:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:01:07 np0005548789.localdomain systemd[1]: tmp-crun.5EzbAX.mount: Deactivated successfully.
Dec 06 09:01:07 np0005548789.localdomain podman[94069]: 2025-12-06 09:01:07.915844297 +0000 UTC m=+0.078111361 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-nova-compute, tcib_managed=true)
Dec 06 09:01:07 np0005548789.localdomain podman[94069]: 2025-12-06 09:01:07.970186838 +0000 UTC m=+0.132453902 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vcs-type=git, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:01:07 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:01:13 np0005548789.localdomain sshd[94097]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:01:15 np0005548789.localdomain CROND[93423]: (root) CMDEND (sleep `expr ${RANDOM} % 90`; /usr/sbin/logrotate -s /var/lib/logrotate/logrotate-crond.status /etc/logrotate-crond.conf 2>&1|logger -t logrotate-crond)
Dec 06 09:01:23 np0005548789.localdomain sshd[94101]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:01:24 np0005548789.localdomain sshd[94101]: Received disconnect from 81.192.46.35 port 38418:11: Bye Bye [preauth]
Dec 06 09:01:24 np0005548789.localdomain sshd[94101]: Disconnected from authenticating user root 81.192.46.35 port 38418 [preauth]
Dec 06 09:01:26 np0005548789.localdomain sudo[94103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:01:26 np0005548789.localdomain sudo[94103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:01:26 np0005548789.localdomain sudo[94103]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:26 np0005548789.localdomain sudo[94118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 09:01:26 np0005548789.localdomain sudo[94118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:01:27 np0005548789.localdomain sudo[94118]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:27 np0005548789.localdomain sudo[94153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:01:27 np0005548789.localdomain sudo[94153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:01:27 np0005548789.localdomain sudo[94153]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:27 np0005548789.localdomain sudo[94168]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:01:27 np0005548789.localdomain sudo[94168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:01:28 np0005548789.localdomain sudo[94168]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:28 np0005548789.localdomain sudo[94215]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:01:28 np0005548789.localdomain sudo[94215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:01:28 np0005548789.localdomain sudo[94215]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:28 np0005548789.localdomain sudo[94230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- inventory --format=json-pretty --filter-for-batch
Dec 06 09:01:28 np0005548789.localdomain sudo[94230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:01:29 np0005548789.localdomain podman[94285]: 
Dec 06 09:01:29 np0005548789.localdomain podman[94285]: 2025-12-06 09:01:29.19031188 +0000 UTC m=+0.074429259 container create 7f3dca6ac65f1d31830e218d7dcb68e508e052d0fdc564e048a996a8a3fa03bb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_bartik, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, RELEASE=main, vcs-type=git, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, version=7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: Started libpod-conmon-7f3dca6ac65f1d31830e218d7dcb68e508e052d0fdc564e048a996a8a3fa03bb.scope.
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:01:29 np0005548789.localdomain podman[94285]: 2025-12-06 09:01:29.161431892 +0000 UTC m=+0.045549301 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:01:29 np0005548789.localdomain podman[94285]: 2025-12-06 09:01:29.272191956 +0000 UTC m=+0.156309345 container init 7f3dca6ac65f1d31830e218d7dcb68e508e052d0fdc564e048a996a8a3fa03bb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_bartik, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, RELEASE=main, io.buildah.version=1.41.4, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True)
Dec 06 09:01:29 np0005548789.localdomain podman[94285]: 2025-12-06 09:01:29.286521807 +0000 UTC m=+0.170639186 container start 7f3dca6ac65f1d31830e218d7dcb68e508e052d0fdc564e048a996a8a3fa03bb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_bartik, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, name=rhceph, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, io.buildah.version=1.41.4, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 09:01:29 np0005548789.localdomain podman[94285]: 2025-12-06 09:01:29.286836956 +0000 UTC m=+0.170954405 container attach 7f3dca6ac65f1d31830e218d7dcb68e508e052d0fdc564e048a996a8a3fa03bb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_bartik, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, ceph=True, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 09:01:29 np0005548789.localdomain agitated_bartik[94308]: 167 167
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: libpod-7f3dca6ac65f1d31830e218d7dcb68e508e052d0fdc564e048a996a8a3fa03bb.scope: Deactivated successfully.
Dec 06 09:01:29 np0005548789.localdomain podman[94285]: 2025-12-06 09:01:29.293644766 +0000 UTC m=+0.177762165 container died 7f3dca6ac65f1d31830e218d7dcb68e508e052d0fdc564e048a996a8a3fa03bb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_bartik, release=1763362218, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, name=rhceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 09:01:29 np0005548789.localdomain podman[94325]: 2025-12-06 09:01:29.344929152 +0000 UTC m=+0.068364362 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, container_name=logrotate_crond, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-cron)
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:01:29 np0005548789.localdomain podman[94324]: 2025-12-06 09:01:29.377869575 +0000 UTC m=+0.112991505 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, release=1761123044)
Dec 06 09:01:29 np0005548789.localdomain podman[94325]: 2025-12-06 09:01:29.383944751 +0000 UTC m=+0.107379951 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, architecture=x86_64, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible)
Dec 06 09:01:29 np0005548789.localdomain podman[94301]: 2025-12-06 09:01:29.397164018 +0000 UTC m=+0.155950525 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1)
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:01:29 np0005548789.localdomain podman[94300]: 2025-12-06 09:01:29.381818996 +0000 UTC m=+0.149542077 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, tcib_managed=true)
Dec 06 09:01:29 np0005548789.localdomain podman[94300]: 2025-12-06 09:01:29.466014983 +0000 UTC m=+0.233738054 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, release=1761123044, managed_by=tripleo_ansible, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:01:29 np0005548789.localdomain podman[94348]: 2025-12-06 09:01:29.500852915 +0000 UTC m=+0.199469763 container remove 7f3dca6ac65f1d31830e218d7dcb68e508e052d0fdc564e048a996a8a3fa03bb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_bartik, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: libpod-conmon-7f3dca6ac65f1d31830e218d7dcb68e508e052d0fdc564e048a996a8a3fa03bb.scope: Deactivated successfully.
Dec 06 09:01:29 np0005548789.localdomain podman[94324]: 2025-12-06 09:01:29.514244045 +0000 UTC m=+0.249365935 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:01:29 np0005548789.localdomain podman[94307]: 2025-12-06 09:01:29.485169652 +0000 UTC m=+0.237929714 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, release=1761123044, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.)
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:01:29 np0005548789.localdomain podman[94307]: 2025-12-06 09:01:29.568972488 +0000 UTC m=+0.321732490 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:11:48Z, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible)
Dec 06 09:01:29 np0005548789.localdomain podman[94392]: 2025-12-06 09:01:29.584589147 +0000 UTC m=+0.198769009 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, container_name=nova_migration_target, distribution-scope=public)
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:01:29 np0005548789.localdomain podman[94301]: 2025-12-06 09:01:29.60127528 +0000 UTC m=+0.360061767 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64)
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:01:29 np0005548789.localdomain podman[94458]: 
Dec 06 09:01:29 np0005548789.localdomain podman[94458]: 2025-12-06 09:01:29.700184171 +0000 UTC m=+0.056225189 container create 7fce6dc53bca70caf5d6f4d45f30d1c9be5d53588a91b413dc475e2a7b2b867e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_meitner, io.openshift.expose-services=, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, ceph=True, build-date=2025-11-26T19:44:28Z, architecture=x86_64, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: Started libpod-conmon-7fce6dc53bca70caf5d6f4d45f30d1c9be5d53588a91b413dc475e2a7b2b867e.scope.
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:01:29 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1acac956fccca1a85ff31a1bd7a29aed0c43ba5133fb96f91825c244fd91b6a5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 09:01:29 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1acac956fccca1a85ff31a1bd7a29aed0c43ba5133fb96f91825c244fd91b6a5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:01:29 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1acac956fccca1a85ff31a1bd7a29aed0c43ba5133fb96f91825c244fd91b6a5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 09:01:29 np0005548789.localdomain podman[94458]: 2025-12-06 09:01:29.762526687 +0000 UTC m=+0.118567725 container init 7fce6dc53bca70caf5d6f4d45f30d1c9be5d53588a91b413dc475e2a7b2b867e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_meitner, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, architecture=x86_64, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, CEPH_POINT_RELEASE=, ceph=True, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 09:01:29 np0005548789.localdomain podman[94458]: 2025-12-06 09:01:29.675152162 +0000 UTC m=+0.031193210 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 09:01:29 np0005548789.localdomain podman[94458]: 2025-12-06 09:01:29.775677791 +0000 UTC m=+0.131718809 container start 7fce6dc53bca70caf5d6f4d45f30d1c9be5d53588a91b413dc475e2a7b2b867e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_meitner, architecture=x86_64, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, release=1763362218, vendor=Red Hat, Inc., ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_CLEAN=True)
Dec 06 09:01:29 np0005548789.localdomain podman[94458]: 2025-12-06 09:01:29.775832106 +0000 UTC m=+0.131873124 container attach 7fce6dc53bca70caf5d6f4d45f30d1c9be5d53588a91b413dc475e2a7b2b867e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_meitner, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, version=7, io.openshift.expose-services=, release=1763362218, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, GIT_CLEAN=True)
Dec 06 09:01:29 np0005548789.localdomain podman[94392]: 2025-12-06 09:01:29.944266783 +0000 UTC m=+0.558446615 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team)
Dec 06 09:01:29 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:01:30 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-aba8d85722a36c2d5d486b4e2905099ea4ad38b03e7cd3b4be6862ca6ac2936f-merged.mount: Deactivated successfully.
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]: [
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:     {
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:         "available": false,
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:         "ceph_device": false,
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:         "lsm_data": {},
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:         "lvs": [],
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:         "path": "/dev/sr0",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:         "rejected_reasons": [
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "Has a FileSystem",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "Insufficient space (<5GB)"
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:         ],
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:         "sys_api": {
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "actuators": null,
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "device_nodes": "sr0",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "human_readable_size": "482.00 KB",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "id_bus": "ata",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "model": "QEMU DVD-ROM",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "nr_requests": "2",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "partitions": {},
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "path": "/dev/sr0",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "removable": "1",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "rev": "2.5+",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "ro": "0",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "rotational": "1",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "sas_address": "",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "sas_device_handle": "",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "scheduler_mode": "mq-deadline",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "sectors": 0,
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "sectorsize": "2048",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "size": 493568.0,
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "support_discard": "0",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "type": "disk",
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:             "vendor": "QEMU"
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:         }
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]:     }
Dec 06 09:01:30 np0005548789.localdomain adoring_meitner[94473]: ]
Dec 06 09:01:30 np0005548789.localdomain systemd[1]: libpod-7fce6dc53bca70caf5d6f4d45f30d1c9be5d53588a91b413dc475e2a7b2b867e.scope: Deactivated successfully.
Dec 06 09:01:30 np0005548789.localdomain podman[94458]: 2025-12-06 09:01:30.748450059 +0000 UTC m=+1.104491087 container died 7fce6dc53bca70caf5d6f4d45f30d1c9be5d53588a91b413dc475e2a7b2b867e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_meitner, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, ceph=True, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, description=Red Hat Ceph Storage 7, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 09:01:30 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-1acac956fccca1a85ff31a1bd7a29aed0c43ba5133fb96f91825c244fd91b6a5-merged.mount: Deactivated successfully.
Dec 06 09:01:30 np0005548789.localdomain podman[96588]: 2025-12-06 09:01:30.833181943 +0000 UTC m=+0.076317616 container remove 7fce6dc53bca70caf5d6f4d45f30d1c9be5d53588a91b413dc475e2a7b2b867e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_meitner, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, version=7, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Dec 06 09:01:30 np0005548789.localdomain systemd[1]: libpod-conmon-7fce6dc53bca70caf5d6f4d45f30d1c9be5d53588a91b413dc475e2a7b2b867e.scope: Deactivated successfully.
Dec 06 09:01:30 np0005548789.localdomain sudo[94230]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:31 np0005548789.localdomain sudo[96602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:01:31 np0005548789.localdomain sudo[96602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:01:31 np0005548789.localdomain sudo[96602]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:01:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:01:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:01:34 np0005548789.localdomain podman[96737]: 2025-12-06 09:01:34.913918424 +0000 UTC m=+0.073466328 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, version=17.1.12)
Dec 06 09:01:34 np0005548789.localdomain podman[96736]: 2025-12-06 09:01:34.953285364 +0000 UTC m=+0.113228131 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:01:34 np0005548789.localdomain podman[96736]: 2025-12-06 09:01:34.969051899 +0000 UTC m=+0.128994646 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller)
Dec 06 09:01:34 np0005548789.localdomain podman[96736]: unhealthy
Dec 06 09:01:34 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:01:34 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:01:35 np0005548789.localdomain systemd[1]: tmp-crun.S4ORjR.mount: Deactivated successfully.
Dec 06 09:01:35 np0005548789.localdomain podman[96738]: 2025-12-06 09:01:35.022521953 +0000 UTC m=+0.179246731 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:01:35 np0005548789.localdomain podman[96738]: 2025-12-06 09:01:35.040126223 +0000 UTC m=+0.196850991 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4)
Dec 06 09:01:35 np0005548789.localdomain podman[96738]: unhealthy
Dec 06 09:01:35 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:01:35 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:01:35 np0005548789.localdomain podman[96737]: 2025-12-06 09:01:35.130380317 +0000 UTC m=+0.289928281 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, distribution-scope=public, architecture=x86_64, version=17.1.12, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 
17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:01:35 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:01:36 np0005548789.localdomain rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 09:01:37 np0005548789.localdomain rhsm-service[6617]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 06 09:01:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:01:38 np0005548789.localdomain podman[96862]: 2025-12-06 09:01:38.905058233 +0000 UTC m=+0.071987524 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, container_name=nova_compute, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container)
Dec 06 09:01:38 np0005548789.localdomain podman[96862]: 2025-12-06 09:01:38.932375942 +0000 UTC m=+0.099305263 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step5, release=1761123044, container_name=nova_compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:01:38 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:01:49 np0005548789.localdomain sshd[96888]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:01:51 np0005548789.localdomain sshd[96888]: Received disconnect from 64.227.156.63 port 50698:11: Bye Bye [preauth]
Dec 06 09:01:51 np0005548789.localdomain sshd[96888]: Disconnected from authenticating user root 64.227.156.63 port 50698 [preauth]
Dec 06 09:01:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:01:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:01:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:01:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:01:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:01:59 np0005548789.localdomain podman[96890]: 2025-12-06 09:01:59.914456595 +0000 UTC m=+0.073640084 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond)
Dec 06 09:01:59 np0005548789.localdomain podman[96890]: 2025-12-06 09:01:59.92601737 +0000 UTC m=+0.085200829 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, 
distribution-scope=public, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Dec 06 09:01:59 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:02:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:02:00 np0005548789.localdomain podman[96902]: 2025-12-06 09:02:00.007213335 +0000 UTC m=+0.148864476 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-19T00:12:45Z)
Dec 06 09:02:00 np0005548789.localdomain podman[96892]: 2025-12-06 09:02:00.036267749 +0000 UTC m=+0.185214763 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public)
Dec 06 09:02:00 np0005548789.localdomain podman[96898]: 2025-12-06 09:01:59.986456518 +0000 UTC m=+0.132896056 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, vcs-type=git)
Dec 06 09:02:00 np0005548789.localdomain podman[96964]: 2025-12-06 09:02:00.075978039 +0000 UTC m=+0.066087332 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:02:00 np0005548789.localdomain podman[96891]: 2025-12-06 09:02:00.084511292 +0000 UTC m=+0.237003916 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4)
Dec 06 09:02:00 np0005548789.localdomain podman[96891]: 2025-12-06 09:02:00.097016696 +0000 UTC m=+0.249509330 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, url=https://www.redhat.com)
Dec 06 09:02:00 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:02:00 np0005548789.localdomain podman[96898]: 2025-12-06 09:02:00.1202343 +0000 UTC m=+0.266673798 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, build-date=2025-11-18T23:44:13Z, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git)
Dec 06 09:02:00 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:02:00 np0005548789.localdomain podman[96902]: 2025-12-06 09:02:00.14922322 +0000 UTC m=+0.290874341 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:02:00 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:02:00 np0005548789.localdomain podman[96892]: 2025-12-06 09:02:00.169104641 +0000 UTC m=+0.318051645 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4)
Dec 06 09:02:00 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:02:00 np0005548789.localdomain podman[96964]: 2025-12-06 09:02:00.427016398 +0000 UTC m=+0.417125671 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12)
Dec 06 09:02:00 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:02:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:02:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:02:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:02:05 np0005548789.localdomain systemd[1]: tmp-crun.DcARIZ.mount: Deactivated successfully.
Dec 06 09:02:05 np0005548789.localdomain podman[97021]: 2025-12-06 09:02:05.937144033 +0000 UTC m=+0.096589369 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, config_id=tripleo_step4, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO 
Team, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 09:02:05 np0005548789.localdomain podman[97021]: 2025-12-06 09:02:05.956109566 +0000 UTC m=+0.115554882 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.openshift.expose-services=, 
url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Dec 06 09:02:05 np0005548789.localdomain podman[97023]: 2025-12-06 09:02:05.974667756 +0000 UTC m=+0.128117548 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:02:06 np0005548789.localdomain podman[97021]: unhealthy
Dec 06 09:02:06 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:02:06 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:02:06 np0005548789.localdomain podman[97022]: 2025-12-06 09:02:06.083423189 +0000 UTC m=+0.239810041 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, architecture=x86_64, 
container_name=metrics_qdr, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:02:06 np0005548789.localdomain podman[97023]: 2025-12-06 09:02:06.116920988 +0000 UTC m=+0.270370750 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:02:06 np0005548789.localdomain podman[97023]: unhealthy
Dec 06 09:02:06 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:02:06 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:02:06 np0005548789.localdomain podman[97022]: 2025-12-06 09:02:06.282125936 +0000 UTC m=+0.438512828 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:49:46Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd)
Dec 06 09:02:06 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:02:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:02:09 np0005548789.localdomain systemd[1]: tmp-crun.V6oP0B.mount: Deactivated successfully.
Dec 06 09:02:09 np0005548789.localdomain podman[97087]: 2025-12-06 09:02:09.928686034 +0000 UTC m=+0.088881193 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, container_name=nova_compute, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z)
Dec 06 09:02:09 np0005548789.localdomain podman[97087]: 2025-12-06 09:02:09.961326437 +0000 UTC m=+0.121521616 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, release=1761123044, version=17.1.12, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:02:09 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:02:21 np0005548789.localdomain sshd[97114]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:02:26 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:02:26 np0005548789.localdomain recover_tripleo_nova_virtqemud[97116]: 61814
Dec 06 09:02:26 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:02:26 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:02:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:02:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:02:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:02:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:02:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:02:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:02:30 np0005548789.localdomain systemd[1]: tmp-crun.q4nTia.mount: Deactivated successfully.
Dec 06 09:02:30 np0005548789.localdomain podman[97132]: 2025-12-06 09:02:30.965654785 +0000 UTC m=+0.110520587 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, container_name=ceilometer_agent_ipmi, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:02:31 np0005548789.localdomain podman[97119]: 2025-12-06 09:02:31.007856062 +0000 UTC m=+0.164136035 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, 
com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:02:31 np0005548789.localdomain podman[97132]: 2025-12-06 09:02:31.011126103 +0000 UTC m=+0.155991945 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1)
Dec 06 09:02:31 np0005548789.localdomain podman[97117]: 2025-12-06 09:02:31.020043857 +0000 UTC m=+0.181539811 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat 
OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-cron, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:02:31 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:02:31 np0005548789.localdomain podman[97117]: 2025-12-06 09:02:31.028977512 +0000 UTC m=+0.190473466 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, 
konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:02:31 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:02:31 np0005548789.localdomain podman[97118]: 2025-12-06 09:02:30.950059576 +0000 UTC m=+0.108879457 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, 
distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Dec 06 09:02:31 np0005548789.localdomain podman[97120]: 2025-12-06 09:02:31.074696247 +0000 UTC m=+0.227247036 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64)
Dec 06 09:02:31 np0005548789.localdomain podman[97118]: 2025-12-06 09:02:31.136024022 +0000 UTC m=+0.294843893 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, config_id=tripleo_step3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true)
Dec 06 09:02:31 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:02:31 np0005548789.localdomain podman[97126]: 2025-12-06 09:02:31.103626936 +0000 UTC m=+0.255981649 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, 
managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, container_name=iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Dec 06 09:02:31 np0005548789.localdomain podman[97120]: 2025-12-06 09:02:31.16298041 +0000 UTC m=+0.315531209 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:02:31 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:02:31 np0005548789.localdomain podman[97126]: 2025-12-06 09:02:31.186132852 +0000 UTC m=+0.338487565 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, architecture=x86_64, container_name=iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Dec 06 09:02:31 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:02:31 np0005548789.localdomain podman[97119]: 2025-12-06 09:02:31.379079612 +0000 UTC m=+0.535359575 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, distribution-scope=public)
Dec 06 09:02:31 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:02:31 np0005548789.localdomain sudo[97245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:02:31 np0005548789.localdomain sudo[97245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:02:31 np0005548789.localdomain sudo[97245]: pam_unix(sudo:session): session closed for user root
Dec 06 09:02:31 np0005548789.localdomain sudo[97260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:02:31 np0005548789.localdomain sudo[97260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:02:31 np0005548789.localdomain sshd[97114]: error: kex_exchange_identification: read: Connection timed out
Dec 06 09:02:31 np0005548789.localdomain sshd[97114]: banner exchange: Connection from 106.13.69.159 port 34126: Connection timed out
Dec 06 09:02:32 np0005548789.localdomain sudo[97260]: pam_unix(sudo:session): session closed for user root
Dec 06 09:02:33 np0005548789.localdomain sudo[97308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:02:33 np0005548789.localdomain sudo[97308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:02:33 np0005548789.localdomain sudo[97308]: pam_unix(sudo:session): session closed for user root
Dec 06 09:02:35 np0005548789.localdomain sshd[97323]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:02:36 np0005548789.localdomain sshd[97325]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:02:36 np0005548789.localdomain sshd[97323]: Received disconnect from 45.78.222.162 port 34738:11: Bye Bye [preauth]
Dec 06 09:02:36 np0005548789.localdomain sshd[97323]: Disconnected from authenticating user root 45.78.222.162 port 34738 [preauth]
Dec 06 09:02:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:02:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:02:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:02:36 np0005548789.localdomain podman[97329]: 2025-12-06 09:02:36.839965101 +0000 UTC m=+0.088977825 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 09:02:36 np0005548789.localdomain podman[97327]: 2025-12-06 09:02:36.879000171 +0000 UTC m=+0.130632645 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, tcib_managed=true, release=1761123044)
Dec 06 09:02:36 np0005548789.localdomain podman[97329]: 2025-12-06 09:02:36.909664084 +0000 UTC m=+0.158676818 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12)
Dec 06 09:02:36 np0005548789.localdomain podman[97327]: 2025-12-06 09:02:36.928171863 +0000 UTC m=+0.179804387 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com)
Dec 06 09:02:36 np0005548789.localdomain podman[97327]: unhealthy
Dec 06 09:02:36 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:02:36 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:02:36 np0005548789.localdomain podman[97329]: unhealthy
Dec 06 09:02:36 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:02:36 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:02:36 np0005548789.localdomain podman[97328]: 2025-12-06 09:02:36.935524328 +0000 UTC m=+0.187203734 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, container_name=metrics_qdr, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:02:37 np0005548789.localdomain podman[97328]: 2025-12-06 09:02:37.18902188 +0000 UTC m=+0.440701276 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vendor=Red 
Hat, Inc., build-date=2025-11-18T22:49:46Z, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 06 09:02:37 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:02:37 np0005548789.localdomain sshd[97325]: Received disconnect from 81.192.46.35 port 36734:11: Bye Bye [preauth]
Dec 06 09:02:37 np0005548789.localdomain sshd[97325]: Disconnected from authenticating user root 81.192.46.35 port 36734 [preauth]
Dec 06 09:02:37 np0005548789.localdomain systemd[1]: tmp-crun.ah9q9I.mount: Deactivated successfully.
Dec 06 09:02:38 np0005548789.localdomain sshd[97398]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:02:40 np0005548789.localdomain sshd[97398]: Received disconnect from 103.192.152.59 port 49040:11: Bye Bye [preauth]
Dec 06 09:02:40 np0005548789.localdomain sshd[97398]: Disconnected from authenticating user root 103.192.152.59 port 49040 [preauth]
Dec 06 09:02:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:02:40 np0005548789.localdomain podman[97400]: 2025-12-06 09:02:40.521157742 +0000 UTC m=+0.086920872 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, container_name=nova_compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, architecture=x86_64, config_id=tripleo_step5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:02:40 np0005548789.localdomain podman[97400]: 2025-12-06 09:02:40.548107651 +0000 UTC m=+0.113870781 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, build-date=2025-11-19T00:36:58Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1)
Dec 06 09:02:40 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:02:58 np0005548789.localdomain sshd[97426]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:02:59 np0005548789.localdomain sshd[97426]: Received disconnect from 103.234.151.178 port 57460:11: Bye Bye [preauth]
Dec 06 09:02:59 np0005548789.localdomain sshd[97426]: Disconnected from authenticating user root 103.234.151.178 port 57460 [preauth]
Dec 06 09:03:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:03:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:03:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:03:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:03:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:03:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:03:01 np0005548789.localdomain podman[97440]: 2025-12-06 09:03:01.942709306 +0000 UTC m=+0.091661018 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z)
Dec 06 09:03:01 np0005548789.localdomain podman[97429]: 2025-12-06 09:03:01.926035774 +0000 UTC m=+0.084531509 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, name=rhosp17/openstack-collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:03:01 np0005548789.localdomain podman[97440]: 2025-12-06 09:03:01.992032742 +0000 UTC m=+0.140984444 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12)
Dec 06 09:03:02 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:03:02 np0005548789.localdomain podman[97428]: 2025-12-06 09:03:01.977914969 +0000 UTC m=+0.135478436 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp17/openstack-cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, batch=17.1_20251118.1, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z)
Dec 06 09:03:02 np0005548789.localdomain podman[97431]: 2025-12-06 09:03:02.037918752 +0000 UTC m=+0.187179694 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container)
Dec 06 09:03:02 np0005548789.localdomain podman[97432]: 2025-12-06 09:03:02.084583107 +0000 UTC m=+0.237035227 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red 
Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:03:02 np0005548789.localdomain podman[97431]: 2025-12-06 09:03:02.08828276 +0000 UTC m=+0.237543692 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, version=17.1.12, distribution-scope=public, architecture=x86_64, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:03:02 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:03:02 np0005548789.localdomain podman[97428]: 2025-12-06 09:03:02.110475703 +0000 UTC m=+0.268039230 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, distribution-scope=public, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:03:02 np0005548789.localdomain podman[97432]: 2025-12-06 09:03:02.119118068 +0000 UTC m=+0.271570198 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 09:03:02 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:03:02 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:03:02 np0005548789.localdomain podman[97430]: 2025-12-06 09:03:02.189655726 +0000 UTC m=+0.343357484 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, release=1761123044, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Dec 06 09:03:02 np0005548789.localdomain podman[97429]: 2025-12-06 09:03:02.213704475 +0000 UTC m=+0.372200250 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, architecture=x86_64)
Dec 06 09:03:02 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:03:02 np0005548789.localdomain podman[97430]: 2025-12-06 09:03:02.592318932 +0000 UTC m=+0.746020730 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, 
distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container)
Dec 06 09:03:02 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:03:02 np0005548789.localdomain systemd[1]: tmp-crun.iPBtZC.mount: Deactivated successfully.
Dec 06 09:03:04 np0005548789.localdomain sshd[97561]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:03:05 np0005548789.localdomain sshd[97561]: Connection closed by authenticating user root 92.118.39.95 port 60078 [preauth]
Dec 06 09:03:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:03:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:03:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:03:07 np0005548789.localdomain systemd[1]: tmp-crun.ZpAJTr.mount: Deactivated successfully.
Dec 06 09:03:07 np0005548789.localdomain podman[97564]: 2025-12-06 09:03:07.929908532 +0000 UTC m=+0.094216406 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:03:07 np0005548789.localdomain systemd[1]: tmp-crun.O52ROa.mount: Deactivated successfully.
Dec 06 09:03:07 np0005548789.localdomain podman[97565]: 2025-12-06 09:03:07.979027062 +0000 UTC m=+0.141302014 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4)
Dec 06 09:03:08 np0005548789.localdomain podman[97565]: 2025-12-06 09:03:08.024459449 +0000 UTC m=+0.186734411 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:03:08 np0005548789.localdomain podman[97563]: 2025-12-06 09:03:08.02451742 +0000 UTC m=+0.188942097 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, 
distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:03:08 np0005548789.localdomain podman[97565]: unhealthy
Dec 06 09:03:08 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:03:08 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:03:08 np0005548789.localdomain podman[97563]: 2025-12-06 09:03:08.110040789 +0000 UTC m=+0.274465426 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, 
name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=)
Dec 06 09:03:08 np0005548789.localdomain podman[97563]: unhealthy
Dec 06 09:03:08 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:03:08 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:03:08 np0005548789.localdomain podman[97564]: 2025-12-06 09:03:08.167601998 +0000 UTC m=+0.331909912 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z)
Dec 06 09:03:08 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:03:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:03:10 np0005548789.localdomain podman[97629]: 2025-12-06 09:03:10.919017793 +0000 UTC m=+0.072764807 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, container_name=nova_compute, distribution-scope=public, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container)
Dec 06 09:03:10 np0005548789.localdomain podman[97629]: 2025-12-06 09:03:10.94490543 +0000 UTC m=+0.098652484 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:03:10 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:03:13 np0005548789.localdomain sshd[94097]: fatal: Timeout before authentication for 101.227.203.162 port 34642
Dec 06 09:03:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:03:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:03:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:03:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:03:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:03:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:03:32 np0005548789.localdomain systemd[1]: tmp-crun.K30iuB.mount: Deactivated successfully.
Dec 06 09:03:32 np0005548789.localdomain podman[97658]: 2025-12-06 09:03:32.947917312 +0000 UTC m=+0.094735193 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:03:32 np0005548789.localdomain podman[97660]: 2025-12-06 09:03:32.987630702 +0000 UTC m=+0.135614719 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 06 09:03:33 np0005548789.localdomain podman[97660]: 2025-12-06 09:03:33.009239447 +0000 UTC m=+0.157223464 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
container_name=ceilometer_agent_compute, release=1761123044, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=)
Dec 06 09:03:33 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:03:33 np0005548789.localdomain podman[97656]: 2025-12-06 09:03:33.056665354 +0000 UTC m=+0.214391521 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=logrotate_crond, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:03:33 np0005548789.localdomain podman[97656]: 2025-12-06 09:03:33.093200377 +0000 UTC m=+0.250926514 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, container_name=logrotate_crond, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4)
Dec 06 09:03:33 np0005548789.localdomain podman[97665]: 2025-12-06 09:03:33.106044901 +0000 UTC m=+0.249933932 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, 
vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step3)
Dec 06 09:03:33 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:03:33 np0005548789.localdomain podman[97667]: 2025-12-06 09:03:33.143983968 +0000 UTC m=+0.286009982 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=ceilometer_agent_ipmi, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., 
name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:03:33 np0005548789.localdomain podman[97665]: 2025-12-06 09:03:33.167217232 +0000 UTC m=+0.311106233 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:03:33 np0005548789.localdomain podman[97667]: 2025-12-06 09:03:33.172150324 +0000 UTC m=+0.314176368 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true)
Dec 06 09:03:33 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:03:33 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:03:33 np0005548789.localdomain podman[97657]: 2025-12-06 09:03:33.240972698 +0000 UTC m=+0.392241766 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.openshift.expose-services=, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.buildah.version=1.41.4, version=17.1.12)
Dec 06 09:03:33 np0005548789.localdomain podman[97657]: 2025-12-06 09:03:33.25404446 +0000 UTC m=+0.405313588 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, container_name=collectd, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=)
Dec 06 09:03:33 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:03:33 np0005548789.localdomain podman[97658]: 2025-12-06 09:03:33.303064347 +0000 UTC m=+0.449882198 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, 
architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, version=17.1.12)
Dec 06 09:03:33 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:03:33 np0005548789.localdomain sudo[97784]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:03:33 np0005548789.localdomain sudo[97784]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:03:33 np0005548789.localdomain sudo[97784]: pam_unix(sudo:session): session closed for user root
Dec 06 09:03:33 np0005548789.localdomain sudo[97799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 09:03:33 np0005548789.localdomain sudo[97799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:03:34 np0005548789.localdomain systemd[1]: tmp-crun.Ju61ui.mount: Deactivated successfully.
Dec 06 09:03:34 np0005548789.localdomain podman[97884]: 2025-12-06 09:03:34.421292316 +0000 UTC m=+0.090624707 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, release=1763362218)
Dec 06 09:03:34 np0005548789.localdomain podman[97884]: 2025-12-06 09:03:34.52687144 +0000 UTC m=+0.196203801 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, ceph=True, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=rhceph-container, RELEASE=main)
Dec 06 09:03:34 np0005548789.localdomain sudo[97799]: pam_unix(sudo:session): session closed for user root
Dec 06 09:03:34 np0005548789.localdomain sudo[97951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:03:34 np0005548789.localdomain sudo[97951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:03:34 np0005548789.localdomain sudo[97951]: pam_unix(sudo:session): session closed for user root
Dec 06 09:03:34 np0005548789.localdomain sudo[97966]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:03:34 np0005548789.localdomain sudo[97966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:03:35 np0005548789.localdomain sudo[97966]: pam_unix(sudo:session): session closed for user root
Dec 06 09:03:36 np0005548789.localdomain sudo[98013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:03:36 np0005548789.localdomain sudo[98013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:03:36 np0005548789.localdomain sudo[98013]: pam_unix(sudo:session): session closed for user root
Dec 06 09:03:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:03:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:03:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:03:38 np0005548789.localdomain systemd[1]: tmp-crun.RUo0Es.mount: Deactivated successfully.
Dec 06 09:03:38 np0005548789.localdomain podman[98028]: 2025-12-06 09:03:38.947840888 +0000 UTC m=+0.102997086 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible)
Dec 06 09:03:38 np0005548789.localdomain systemd[1]: tmp-crun.TITrZe.mount: Deactivated successfully.
Dec 06 09:03:38 np0005548789.localdomain podman[98030]: 2025-12-06 09:03:38.993251834 +0000 UTC m=+0.144503823 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4)
Dec 06 09:03:39 np0005548789.localdomain podman[98030]: 2025-12-06 09:03:39.00809894 +0000 UTC m=+0.159350919 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 09:03:39 np0005548789.localdomain podman[98030]: unhealthy
Dec 06 09:03:39 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:03:39 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:03:39 np0005548789.localdomain podman[98028]: 2025-12-06 09:03:39.032253603 +0000 UTC m=+0.187409801 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Dec 06 09:03:39 np0005548789.localdomain podman[98028]: unhealthy
Dec 06 09:03:39 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:03:39 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:03:39 np0005548789.localdomain podman[98029]: 2025-12-06 09:03:39.088500391 +0000 UTC m=+0.244223577 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:03:39 np0005548789.localdomain podman[98029]: 2025-12-06 09:03:39.286174886 +0000 UTC m=+0.441898102 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vcs-type=git, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team)
Dec 06 09:03:39 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:03:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:03:41 np0005548789.localdomain podman[98099]: 2025-12-06 09:03:41.94027335 +0000 UTC m=+0.096885099 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container)
Dec 06 09:03:41 np0005548789.localdomain podman[98099]: 2025-12-06 09:03:41.996648533 +0000 UTC m=+0.153260222 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, tcib_managed=true)
Dec 06 09:03:42 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:03:45 np0005548789.localdomain sshd[98125]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:03:46 np0005548789.localdomain sshd[98127]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:03:47 np0005548789.localdomain sshd[98125]: Received disconnect from 103.157.25.60 port 34514:11: Bye Bye [preauth]
Dec 06 09:03:47 np0005548789.localdomain sshd[98125]: Disconnected from authenticating user root 103.157.25.60 port 34514 [preauth]
Dec 06 09:03:47 np0005548789.localdomain sshd[98127]: Received disconnect from 81.192.46.35 port 35052:11: Bye Bye [preauth]
Dec 06 09:03:47 np0005548789.localdomain sshd[98127]: Disconnected from authenticating user root 81.192.46.35 port 35052 [preauth]
Dec 06 09:04:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:04:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:04:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:04:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:04:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:04:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:04:04 np0005548789.localdomain podman[98143]: 2025-12-06 09:04:03.968799596 +0000 UTC m=+0.107496155 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4)
Dec 06 09:04:04 np0005548789.localdomain podman[98131]: 2025-12-06 09:04:04.026956814 +0000 UTC m=+0.166794858 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, version=17.1.12, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=nova_migration_target, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com)
Dec 06 09:04:04 np0005548789.localdomain podman[98130]: 2025-12-06 09:04:03.938996741 +0000 UTC m=+0.085697106 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, container_name=collectd, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:04:04 np0005548789.localdomain podman[98143]: 2025-12-06 09:04:04.048254509 +0000 UTC m=+0.186951058 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044)
Dec 06 09:04:04 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:04:04 np0005548789.localdomain podman[98130]: 2025-12-06 09:04:04.129092292 +0000 UTC m=+0.275792697 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container)
Dec 06 09:04:04 np0005548789.localdomain podman[98144]: 2025-12-06 09:04:04.137283184 +0000 UTC m=+0.266307486 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:04:04 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:04:04 np0005548789.localdomain podman[98132]: 2025-12-06 09:04:04.002373428 +0000 UTC m=+0.140211160 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:04:04 np0005548789.localdomain podman[98129]: 2025-12-06 09:04:04.103945429 +0000 UTC m=+0.251106698 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=)
Dec 06 09:04:04 np0005548789.localdomain podman[98144]: 2025-12-06 09:04:04.162009974 +0000 UTC m=+0.291034316 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:04:04 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:04:04 np0005548789.localdomain podman[98132]: 2025-12-06 09:04:04.182870806 +0000 UTC m=+0.320708498 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Dec 06 09:04:04 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:04:04 np0005548789.localdomain podman[98129]: 2025-12-06 09:04:04.237126663 +0000 UTC m=+0.384287952 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-cron, version=17.1.12, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-cron-container, 
batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:04:04 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:04:04 np0005548789.localdomain podman[98131]: 2025-12-06 09:04:04.421598493 +0000 UTC m=+0.561436547 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_migration_target, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Dec 06 09:04:04 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:04:04 np0005548789.localdomain systemd[1]: tmp-crun.seJCPj.mount: Deactivated successfully.
Dec 06 09:04:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:04:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:04:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:04:09 np0005548789.localdomain sshd[98281]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:04:09 np0005548789.localdomain systemd[1]: tmp-crun.4qnkEq.mount: Deactivated successfully.
Dec 06 09:04:09 np0005548789.localdomain podman[98259]: 2025-12-06 09:04:09.915802957 +0000 UTC m=+0.078797312 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com)
Dec 06 09:04:09 np0005548789.localdomain podman[98259]: 2025-12-06 09:04:09.92729452 +0000 UTC m=+0.090288855 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, version=17.1.12)
Dec 06 09:04:09 np0005548789.localdomain podman[98259]: unhealthy
Dec 06 09:04:09 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:04:09 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:04:09 np0005548789.localdomain systemd[1]: tmp-crun.pib6Sr.mount: Deactivated successfully.
Dec 06 09:04:10 np0005548789.localdomain podman[98260]: 2025-12-06 09:04:10.020891228 +0000 UTC m=+0.179286302 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:04:10 np0005548789.localdomain podman[98261]: 2025-12-06 09:04:09.989133931 +0000 UTC m=+0.143813791 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team)
Dec 06 09:04:10 np0005548789.localdomain podman[98261]: 2025-12-06 09:04:10.072087261 +0000 UTC m=+0.226767141 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 
1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1)
Dec 06 09:04:10 np0005548789.localdomain podman[98261]: unhealthy
Dec 06 09:04:10 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:04:10 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:04:10 np0005548789.localdomain podman[98260]: 2025-12-06 09:04:10.232000516 +0000 UTC m=+0.390395590 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:04:10 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:04:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:04:12 np0005548789.localdomain systemd[1]: tmp-crun.YD4fZM.mount: Deactivated successfully.
Dec 06 09:04:12 np0005548789.localdomain podman[98331]: 2025-12-06 09:04:12.92450612 +0000 UTC m=+0.087527471 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=)
Dec 06 09:04:12 np0005548789.localdomain podman[98331]: 2025-12-06 09:04:12.954150971 +0000 UTC m=+0.117172322 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute)
Dec 06 09:04:12 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:04:14 np0005548789.localdomain sshd[98281]: Received disconnect from 179.33.210.213 port 45856:11: Bye Bye [preauth]
Dec 06 09:04:14 np0005548789.localdomain sshd[98281]: Disconnected from authenticating user root 179.33.210.213 port 45856 [preauth]
Dec 06 09:04:14 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:04:14 np0005548789.localdomain recover_tripleo_nova_virtqemud[98359]: 61814
Dec 06 09:04:14 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:04:14 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:04:24 np0005548789.localdomain sshd[98360]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:04:24 np0005548789.localdomain sshd[98360]: Received disconnect from 12.156.67.18 port 33204:11: Bye Bye [preauth]
Dec 06 09:04:24 np0005548789.localdomain sshd[98360]: Disconnected from authenticating user root 12.156.67.18 port 33204 [preauth]
Dec 06 09:04:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:04:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:04:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:04:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:04:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:04:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:04:34 np0005548789.localdomain systemd[1]: tmp-crun.y8WNVu.mount: Deactivated successfully.
Dec 06 09:04:34 np0005548789.localdomain podman[98364]: 2025-12-06 09:04:34.942538345 +0000 UTC m=+0.092866775 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, 
distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, maintainer=OpenStack TripleO Team)
Dec 06 09:04:34 np0005548789.localdomain systemd[1]: tmp-crun.TBBO4Q.mount: Deactivated successfully.
Dec 06 09:04:35 np0005548789.localdomain podman[98363]: 2025-12-06 09:04:35.002731474 +0000 UTC m=+0.157734338 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:04:35 np0005548789.localdomain podman[98363]: 2025-12-06 09:04:35.012151015 +0000 UTC m=+0.167153849 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, distribution-scope=public, architecture=x86_64, version=17.1.12, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:04:35 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:04:35 np0005548789.localdomain podman[98362]: 2025-12-06 09:04:35.090995488 +0000 UTC m=+0.245734034 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
name=rhosp17/openstack-cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, tcib_managed=true)
Dec 06 09:04:35 np0005548789.localdomain podman[98368]: 2025-12-06 09:04:35.15159939 +0000 UTC m=+0.296467582 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, release=1761123044, container_name=iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible)
Dec 06 09:04:35 np0005548789.localdomain podman[98368]: 2025-12-06 09:04:35.160087801 +0000 UTC m=+0.304956083 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-type=git, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 09:04:35 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:04:35 np0005548789.localdomain podman[98379]: 2025-12-06 09:04:35.193962853 +0000 UTC m=+0.336656679 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi)
Dec 06 09:04:35 np0005548789.localdomain podman[98365]: 2025-12-06 09:04:35.245566278 +0000 UTC m=+0.393273488 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, 
io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4)
Dec 06 09:04:35 np0005548789.localdomain podman[98379]: 2025-12-06 09:04:35.254066709 +0000 UTC m=+0.396760556 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:04:35 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:04:35 np0005548789.localdomain podman[98364]: 2025-12-06 09:04:35.267494682 +0000 UTC m=+0.417823072 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4)
Dec 06 09:04:35 np0005548789.localdomain podman[98362]: 2025-12-06 09:04:35.275168618 +0000 UTC m=+0.429907114 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step4, 
container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:04:35 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:04:35 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:04:35 np0005548789.localdomain podman[98365]: 2025-12-06 09:04:35.294041268 +0000 UTC m=+0.441748468 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:04:35 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:04:36 np0005548789.localdomain sudo[98497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:04:36 np0005548789.localdomain sudo[98497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:04:36 np0005548789.localdomain sudo[98497]: pam_unix(sudo:session): session closed for user root
Dec 06 09:04:36 np0005548789.localdomain sudo[98512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:04:36 np0005548789.localdomain sudo[98512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:04:37 np0005548789.localdomain sudo[98512]: pam_unix(sudo:session): session closed for user root
Dec 06 09:04:37 np0005548789.localdomain sudo[98559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:04:37 np0005548789.localdomain sudo[98559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:04:37 np0005548789.localdomain sudo[98559]: pam_unix(sudo:session): session closed for user root
Dec 06 09:04:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:04:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:04:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:04:40 np0005548789.localdomain systemd[1]: tmp-crun.bc2sfZ.mount: Deactivated successfully.
Dec 06 09:04:40 np0005548789.localdomain podman[98574]: 2025-12-06 09:04:40.940568894 +0000 UTC m=+0.102124260 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:04:40 np0005548789.localdomain podman[98574]: 2025-12-06 09:04:40.954045399 +0000 UTC m=+0.115600775 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, release=1761123044, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller)
Dec 06 09:04:40 np0005548789.localdomain podman[98574]: unhealthy
Dec 06 09:04:40 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:04:40 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:04:41 np0005548789.localdomain podman[98575]: 2025-12-06 09:04:41.038387071 +0000 UTC m=+0.197653397 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, 
config_id=tripleo_step1, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:04:41 np0005548789.localdomain podman[98576]: 2025-12-06 09:04:41.080900667 +0000 UTC m=+0.236453948 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, version=17.1.12, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent)
Dec 06 09:04:41 np0005548789.localdomain podman[98576]: 2025-12-06 09:04:41.096108474 +0000 UTC m=+0.251661775 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=ovn_metadata_agent, vcs-type=git, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 06 09:04:41 np0005548789.localdomain podman[98576]: unhealthy
Dec 06 09:04:41 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:04:41 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:04:41 np0005548789.localdomain podman[98575]: 2025-12-06 09:04:41.26221741 +0000 UTC m=+0.421483726 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Dec 06 09:04:41 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:04:41 np0005548789.localdomain systemd[1]: tmp-crun.72pldl.mount: Deactivated successfully.
Dec 06 09:04:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:04:43 np0005548789.localdomain podman[98643]: 2025-12-06 09:04:43.956742797 +0000 UTC m=+0.109814497 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 09:04:43 np0005548789.localdomain podman[98643]: 2025-12-06 09:04:43.98547899 +0000 UTC m=+0.138550710 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_compute, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 09:04:44 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:04:50 np0005548789.localdomain sshd[98669]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:04:51 np0005548789.localdomain sshd[98669]: Received disconnect from 64.227.156.63 port 43718:11: Bye Bye [preauth]
Dec 06 09:04:51 np0005548789.localdomain sshd[98669]: Disconnected from authenticating user root 64.227.156.63 port 43718 [preauth]
Dec 06 09:04:53 np0005548789.localdomain sshd[98671]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:04:54 np0005548789.localdomain sshd[98671]: Received disconnect from 81.192.46.35 port 33370:11: Bye Bye [preauth]
Dec 06 09:04:54 np0005548789.localdomain sshd[98671]: Disconnected from authenticating user root 81.192.46.35 port 33370 [preauth]
Dec 06 09:05:03 np0005548789.localdomain sshd[98673]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:05:04 np0005548789.localdomain sshd[98673]: Received disconnect from 118.193.38.207 port 40094:11: Bye Bye [preauth]
Dec 06 09:05:04 np0005548789.localdomain sshd[98673]: Disconnected from authenticating user root 118.193.38.207 port 40094 [preauth]
Dec 06 09:05:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:05:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:05:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:05:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:05:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:05:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:05:05 np0005548789.localdomain systemd[1]: tmp-crun.8ZQNse.mount: Deactivated successfully.
Dec 06 09:05:05 np0005548789.localdomain podman[98695]: 2025-12-06 09:05:05.962109471 +0000 UTC m=+0.097556800 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 06 09:05:05 np0005548789.localdomain podman[98678]: 2025-12-06 09:05:05.933181611 +0000 UTC m=+0.081024182 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:05:05 np0005548789.localdomain podman[98695]: 2025-12-06 09:05:05.991113372 +0000 UTC m=+0.126560721 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:05:06 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:05:06 np0005548789.localdomain podman[98689]: 2025-12-06 09:05:06.03367507 +0000 UTC m=+0.172925387 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:05:06 np0005548789.localdomain podman[98676]: 2025-12-06 09:05:05.987265154 +0000 UTC m=+0.139711115 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1)
Dec 06 09:05:06 np0005548789.localdomain podman[98689]: 2025-12-06 09:05:06.068030185 +0000 UTC m=+0.207280482 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, batch=17.1_20251118.1, container_name=iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, vendor=Red Hat, Inc.)
Dec 06 09:05:06 np0005548789.localdomain podman[98675]: 2025-12-06 09:05:06.087850465 +0000 UTC m=+0.240765371 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, 
konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-cron, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:05:06 np0005548789.localdomain podman[98676]: 2025-12-06 09:05:06.118942351 +0000 UTC m=+0.271388312 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vendor=Red Hat, Inc., container_name=collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:05:06 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:05:06 np0005548789.localdomain podman[98678]: 2025-12-06 09:05:06.170147584 +0000 UTC m=+0.317990225 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.openshift.expose-services=)
Dec 06 09:05:06 np0005548789.localdomain podman[98675]: 2025-12-06 09:05:06.170578837 +0000 UTC m=+0.323493793 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git)
Dec 06 09:05:06 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:05:06 np0005548789.localdomain podman[98677]: 2025-12-06 09:05:06.25689152 +0000 UTC m=+0.396578120 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:05:06 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:05:06 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:05:06 np0005548789.localdomain podman[98677]: 2025-12-06 09:05:06.644536484 +0000 UTC m=+0.784223054 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., vcs-type=git, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:05:06 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:05:06 np0005548789.localdomain systemd[1]: tmp-crun.UK9czI.mount: Deactivated successfully.
Dec 06 09:05:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:05:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:05:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:05:11 np0005548789.localdomain systemd[1]: tmp-crun.TwmfMp.mount: Deactivated successfully.
Dec 06 09:05:11 np0005548789.localdomain podman[98806]: 2025-12-06 09:05:11.954072574 +0000 UTC m=+0.084870380 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 09:05:11 np0005548789.localdomain systemd[1]: tmp-crun.9lUYNJ.mount: Deactivated successfully.
Dec 06 09:05:12 np0005548789.localdomain podman[98807]: 2025-12-06 09:05:11.999724667 +0000 UTC m=+0.127653165 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-19T00:14:25Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, container_name=ovn_metadata_agent, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:05:12 np0005548789.localdomain podman[98805]: 2025-12-06 09:05:11.929060105 +0000 UTC m=+0.065816224 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, release=1761123044, container_name=ovn_controller, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4)
Dec 06 09:05:12 np0005548789.localdomain podman[98807]: 2025-12-06 09:05:12.037434075 +0000 UTC m=+0.165362563 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Dec 06 09:05:12 np0005548789.localdomain podman[98807]: unhealthy
Dec 06 09:05:12 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:05:12 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:05:12 np0005548789.localdomain podman[98805]: 2025-12-06 09:05:12.062102374 +0000 UTC m=+0.198858453 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., distribution-scope=public, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 
ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4)
Dec 06 09:05:12 np0005548789.localdomain podman[98805]: unhealthy
Dec 06 09:05:12 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:05:12 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:05:12 np0005548789.localdomain podman[98806]: 2025-12-06 09:05:12.124085538 +0000 UTC m=+0.254883324 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, version=17.1.12, architecture=x86_64, release=1761123044, com.redhat.component=openstack-qdrouterd-container)
Dec 06 09:05:12 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:05:13 np0005548789.localdomain sshd[98871]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:05:13 np0005548789.localdomain sshd[98871]: Connection closed by authenticating user root 92.118.39.95 port 46822 [preauth]
Dec 06 09:05:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:05:14 np0005548789.localdomain podman[98873]: 2025-12-06 09:05:14.914293316 +0000 UTC m=+0.075383928 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, release=1761123044, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64)
Dec 06 09:05:14 np0005548789.localdomain podman[98873]: 2025-12-06 09:05:14.946235108 +0000 UTC m=+0.107325720 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com)
Dec 06 09:05:14 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:05:23 np0005548789.localdomain sshd[98901]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:05:26 np0005548789.localdomain sshd[98901]: Connection reset by authenticating user root 45.135.232.92 port 53220 [preauth]
Dec 06 09:05:27 np0005548789.localdomain sshd[98903]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:05:28 np0005548789.localdomain sshd[98903]: Invalid user admin from 45.135.232.92 port 46066
Dec 06 09:05:28 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:05:28 np0005548789.localdomain recover_tripleo_nova_virtqemud[98906]: 61814
Dec 06 09:05:28 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:05:28 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:05:28 np0005548789.localdomain sshd[98903]: Connection reset by invalid user admin 45.135.232.92 port 46066 [preauth]
Dec 06 09:05:29 np0005548789.localdomain sshd[98907]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:05:30 np0005548789.localdomain sshd[98907]: Invalid user admin from 45.135.232.92 port 46076
Dec 06 09:05:31 np0005548789.localdomain sshd[98907]: Connection reset by invalid user admin 45.135.232.92 port 46076 [preauth]
Dec 06 09:05:31 np0005548789.localdomain sshd[98909]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:05:33 np0005548789.localdomain sshd[98909]: Connection reset by authenticating user root 45.135.232.92 port 46104 [preauth]
Dec 06 09:05:33 np0005548789.localdomain sshd[98911]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:05:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:05:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:05:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:05:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:05:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:05:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:05:36 np0005548789.localdomain podman[98913]: 2025-12-06 09:05:36.952955625 +0000 UTC m=+0.108543766 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-cron-container, tcib_managed=true, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, container_name=logrotate_crond, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1)
Dec 06 09:05:36 np0005548789.localdomain systemd[1]: tmp-crun.ookG0d.mount: Deactivated successfully.
Dec 06 09:05:37 np0005548789.localdomain podman[98915]: 2025-12-06 09:05:37.001667373 +0000 UTC m=+0.148365281 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=)
Dec 06 09:05:37 np0005548789.localdomain podman[98914]: 2025-12-06 09:05:37.073739418 +0000 UTC m=+0.226865514 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red 
Hat OpenStack Platform 17.1 collectd, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4)
Dec 06 09:05:37 np0005548789.localdomain podman[98914]: 2025-12-06 09:05:37.084101526 +0000 UTC m=+0.237227622 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:05:37 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:05:37 np0005548789.localdomain podman[98928]: 2025-12-06 09:05:37.131289947 +0000 UTC m=+0.269325559 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=)
Dec 06 09:05:37 np0005548789.localdomain podman[98913]: 2025-12-06 09:05:37.201923768 +0000 UTC m=+0.357511939 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 
17.1 cron, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 09:05:37 np0005548789.localdomain podman[98928]: 2025-12-06 09:05:37.207140788 +0000 UTC m=+0.345176390 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:05:37 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:05:37 np0005548789.localdomain podman[98917]: 2025-12-06 09:05:37.284802395 +0000 UTC m=+0.429238013 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:05:37 np0005548789.localdomain podman[98927]: 2025-12-06 09:05:37.343943283 +0000 UTC m=+0.484414690 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid)
Dec 06 09:05:37 np0005548789.localdomain podman[98927]: 2025-12-06 09:05:37.351287108 +0000 UTC m=+0.491758505 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=iscsid, url=https://www.redhat.com)
Dec 06 09:05:37 np0005548789.localdomain podman[98917]: 2025-12-06 09:05:37.364633798 +0000 UTC m=+0.509069426 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vcs-type=git)
Dec 06 09:05:37 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:05:37 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:05:37 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:05:37 np0005548789.localdomain podman[98915]: 2025-12-06 09:05:37.433205236 +0000 UTC m=+0.579902964 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2025-11-19T00:36:58Z)
Dec 06 09:05:37 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:05:38 np0005548789.localdomain sudo[99047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:05:38 np0005548789.localdomain sudo[99047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:05:38 np0005548789.localdomain sudo[99047]: pam_unix(sudo:session): session closed for user root
Dec 06 09:05:38 np0005548789.localdomain sudo[99062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:05:38 np0005548789.localdomain sudo[99062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:05:38 np0005548789.localdomain sudo[99062]: pam_unix(sudo:session): session closed for user root
Dec 06 09:05:39 np0005548789.localdomain sshd[99110]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:05:39 np0005548789.localdomain sudo[99112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:05:39 np0005548789.localdomain sudo[99112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:05:39 np0005548789.localdomain sudo[99112]: pam_unix(sudo:session): session closed for user root
Dec 06 09:05:39 np0005548789.localdomain sshd[98911]: Invalid user admin from 45.135.232.92 port 46106
Dec 06 09:05:39 np0005548789.localdomain sshd[98911]: Connection reset by invalid user admin 45.135.232.92 port 46106 [preauth]
Dec 06 09:05:40 np0005548789.localdomain sshd[99110]: Received disconnect from 103.157.25.60 port 40916:11: Bye Bye [preauth]
Dec 06 09:05:40 np0005548789.localdomain sshd[99110]: Disconnected from authenticating user root 103.157.25.60 port 40916 [preauth]
Dec 06 09:05:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:05:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:05:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:05:42 np0005548789.localdomain systemd[1]: tmp-crun.NWhy5H.mount: Deactivated successfully.
Dec 06 09:05:42 np0005548789.localdomain podman[99129]: 2025-12-06 09:05:42.94133991 +0000 UTC m=+0.094128935 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=)
Dec 06 09:05:43 np0005548789.localdomain podman[99129]: 2025-12-06 09:05:43.031854531 +0000 UTC m=+0.184643546 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 06 09:05:43 np0005548789.localdomain podman[99129]: unhealthy
Dec 06 09:05:43 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:05:43 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:05:43 np0005548789.localdomain podman[99127]: 2025-12-06 09:05:43.1475968 +0000 UTC m=+0.303842960 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64)
Dec 06 09:05:43 np0005548789.localdomain podman[99127]: 2025-12-06 09:05:43.191171298 +0000 UTC m=+0.347417448 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-18T23:34:05Z, tcib_managed=true, 
release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1)
Dec 06 09:05:43 np0005548789.localdomain podman[99127]: unhealthy
Dec 06 09:05:43 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:05:43 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:05:43 np0005548789.localdomain podman[99128]: 2025-12-06 09:05:43.194218142 +0000 UTC m=+0.347316476 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, architecture=x86_64, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible)
Dec 06 09:05:43 np0005548789.localdomain podman[99128]: 2025-12-06 09:05:43.403181825 +0000 UTC m=+0.556280149 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, 
build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:05:43 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:05:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:05:45 np0005548789.localdomain podman[99197]: 2025-12-06 09:05:45.924194898 +0000 UTC m=+0.078302588 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Dec 06 09:05:45 np0005548789.localdomain podman[99197]: 2025-12-06 09:05:45.949203657 +0000 UTC m=+0.103311377 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 06 09:05:45 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:05:58 np0005548789.localdomain sshd[99221]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:05:59 np0005548789.localdomain sshd[99221]: Received disconnect from 81.192.46.35 port 59924:11: Bye Bye [preauth]
Dec 06 09:05:59 np0005548789.localdomain sshd[99221]: Disconnected from authenticating user root 81.192.46.35 port 59924 [preauth]
Dec 06 09:06:02 np0005548789.localdomain sshd[99223]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:06:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:06:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:06:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:06:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:06:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:06:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:06:07 np0005548789.localdomain systemd[1]: tmp-crun.dinvyc.mount: Deactivated successfully.
Dec 06 09:06:07 np0005548789.localdomain podman[99242]: 2025-12-06 09:06:07.969967464 +0000 UTC m=+0.105731380 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z)
Dec 06 09:06:07 np0005548789.localdomain podman[99224]: 2025-12-06 09:06:07.925828318 +0000 UTC m=+0.084344484 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-cron)
Dec 06 09:06:08 np0005548789.localdomain podman[99224]: 2025-12-06 09:06:08.010414498 +0000 UTC m=+0.168930624 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z)
Dec 06 09:06:08 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:06:08 np0005548789.localdomain podman[99242]: 2025-12-06 09:06:08.020444896 +0000 UTC m=+0.156208822 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team)
Dec 06 09:06:08 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:06:08 np0005548789.localdomain podman[99233]: 2025-12-06 09:06:07.956058637 +0000 UTC m=+0.100610354 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, version=17.1.12, config_id=tripleo_step3, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:06:08 np0005548789.localdomain podman[99225]: 2025-12-06 09:06:08.082318947 +0000 UTC m=+0.237074026 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:06:08 np0005548789.localdomain podman[99225]: 2025-12-06 09:06:08.092969255 +0000 UTC m=+0.247724344 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, architecture=x86_64, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 06 09:06:08 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:06:08 np0005548789.localdomain podman[99226]: 2025-12-06 09:06:08.135304576 +0000 UTC m=+0.287319181 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 09:06:08 np0005548789.localdomain podman[99233]: 2025-12-06 09:06:08.142155687 +0000 UTC m=+0.286707374 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, distribution-scope=public, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:06:08 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:06:08 np0005548789.localdomain podman[99231]: 2025-12-06 09:06:08.189370868 +0000 UTC m=+0.338466414 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.)
Dec 06 09:06:08 np0005548789.localdomain podman[99231]: 2025-12-06 09:06:08.212247961 +0000 UTC m=+0.361343527 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z)
Dec 06 09:06:08 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:06:08 np0005548789.localdomain podman[99226]: 2025-12-06 09:06:08.515425939 +0000 UTC m=+0.667440574 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_migration_target, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:06:08 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:06:12 np0005548789.localdomain sshd[99353]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:06:12 np0005548789.localdomain sshd[99353]: Received disconnect from 12.156.67.18 port 34832:11: Bye Bye [preauth]
Dec 06 09:06:12 np0005548789.localdomain sshd[99353]: Disconnected from authenticating user root 12.156.67.18 port 34832 [preauth]
Dec 06 09:06:12 np0005548789.localdomain sshd[99223]: error: kex_exchange_identification: read: Connection timed out
Dec 06 09:06:12 np0005548789.localdomain sshd[99223]: banner exchange: Connection from 14.103.138.132 port 49592: Connection timed out
Dec 06 09:06:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:06:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:06:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:06:13 np0005548789.localdomain systemd[1]: tmp-crun.pYxPR9.mount: Deactivated successfully.
Dec 06 09:06:13 np0005548789.localdomain podman[99356]: 2025-12-06 09:06:13.935411843 +0000 UTC m=+0.095489906 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:06:13 np0005548789.localdomain systemd[1]: tmp-crun.lE5FC5.mount: Deactivated successfully.
Dec 06 09:06:13 np0005548789.localdomain podman[99355]: 2025-12-06 09:06:13.990165405 +0000 UTC m=+0.150057672 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z)
Dec 06 09:06:14 np0005548789.localdomain podman[99357]: 2025-12-06 09:06:14.050133619 +0000 UTC m=+0.202612739 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:06:14 np0005548789.localdomain podman[99355]: 2025-12-06 09:06:14.059486976 +0000 UTC m=+0.219379283 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, container_name=ovn_controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, 
name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 09:06:14 np0005548789.localdomain podman[99355]: unhealthy
Dec 06 09:06:14 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:06:14 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:06:14 np0005548789.localdomain podman[99357]: 2025-12-06 09:06:14.073349022 +0000 UTC m=+0.225828172 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, 
managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 06 09:06:14 np0005548789.localdomain podman[99357]: unhealthy
Dec 06 09:06:14 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:06:14 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:06:14 np0005548789.localdomain podman[99356]: 2025-12-06 09:06:14.131149019 +0000 UTC m=+0.291227122 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, architecture=x86_64)
Dec 06 09:06:14 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:06:14 np0005548789.localdomain systemd[1]: tmp-crun.4V3ikd.mount: Deactivated successfully.
Dec 06 09:06:16 np0005548789.localdomain sshd[99424]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:06:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:06:16 np0005548789.localdomain podman[99426]: 2025-12-06 09:06:16.92503008 +0000 UTC m=+0.086389836 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044)
Dec 06 09:06:16 np0005548789.localdomain podman[99426]: 2025-12-06 09:06:16.974930134 +0000 UTC m=+0.136289870 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Dec 06 09:06:16 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:06:17 np0005548789.localdomain sshd[99424]: Received disconnect from 64.227.156.63 port 57258:11: Bye Bye [preauth]
Dec 06 09:06:17 np0005548789.localdomain sshd[99424]: Disconnected from authenticating user root 64.227.156.63 port 57258 [preauth]
Dec 06 09:06:37 np0005548789.localdomain sshd[99453]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:06:38 np0005548789.localdomain sshd[99455]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:06:38 np0005548789.localdomain sshd[99457]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:06:38 np0005548789.localdomain sshd[99453]: Received disconnect from 118.193.38.207 port 54076:11: Bye Bye [preauth]
Dec 06 09:06:38 np0005548789.localdomain sshd[99453]: Disconnected from authenticating user root 118.193.38.207 port 54076 [preauth]
Dec 06 09:06:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:06:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:06:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:06:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:06:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:06:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:06:38 np0005548789.localdomain podman[99468]: 2025-12-06 09:06:38.888161916 +0000 UTC m=+0.088697948 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, container_name=iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, com.redhat.component=openstack-iscsid-container, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 
iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, vcs-type=git, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step3)
Dec 06 09:06:38 np0005548789.localdomain podman[99468]: 2025-12-06 09:06:38.904680624 +0000 UTC m=+0.105216676 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044)
Dec 06 09:06:38 np0005548789.localdomain systemd[1]: tmp-crun.Uoykr4.mount: Deactivated successfully.
Dec 06 09:06:38 np0005548789.localdomain podman[99475]: 2025-12-06 09:06:38.914279678 +0000 UTC m=+0.106106182 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, 
version=17.1.12, build-date=2025-11-19T00:12:45Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64)
Dec 06 09:06:38 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:06:38 np0005548789.localdomain podman[99460]: 2025-12-06 09:06:38.989598943 +0000 UTC m=+0.197284954 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git)
Dec 06 09:06:39 np0005548789.localdomain podman[99460]: 2025-12-06 09:06:39.011106545 +0000 UTC m=+0.218792556 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 
collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, version=17.1.12, config_id=tripleo_step3, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Dec 06 09:06:39 np0005548789.localdomain podman[99475]: 2025-12-06 09:06:39.018213392 +0000 UTC m=+0.210039886 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 06 09:06:39 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:06:39 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:06:39 np0005548789.localdomain podman[99459]: 2025-12-06 09:06:39.087403829 +0000 UTC m=+0.299462975 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, 
version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container)
Dec 06 09:06:39 np0005548789.localdomain podman[99459]: 2025-12-06 09:06:39.0913181 +0000 UTC m=+0.303377256 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-cron-container, version=17.1.12, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:06:39 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:06:39 np0005548789.localdomain podman[99467]: 2025-12-06 09:06:39.136689085 +0000 UTC m=+0.335573796 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.12, vcs-type=git, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4)
Dec 06 09:06:39 np0005548789.localdomain podman[99467]: 2025-12-06 09:06:39.159051182 +0000 UTC m=+0.357935923 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Dec 06 09:06:39 np0005548789.localdomain podman[99461]: 2025-12-06 09:06:38.959820679 +0000 UTC m=+0.164139427 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, release=1761123044, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target)
Dec 06 09:06:39 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:06:39 np0005548789.localdomain podman[99461]: 2025-12-06 09:06:39.365174856 +0000 UTC m=+0.569493604 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git)
Dec 06 09:06:39 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:06:39 np0005548789.localdomain sudo[99592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:06:39 np0005548789.localdomain sudo[99592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:06:39 np0005548789.localdomain sudo[99592]: pam_unix(sudo:session): session closed for user root
Dec 06 09:06:39 np0005548789.localdomain sudo[99607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:06:39 np0005548789.localdomain sudo[99607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:06:39 np0005548789.localdomain systemd[1]: tmp-crun.IGpqc3.mount: Deactivated successfully.
Dec 06 09:06:39 np0005548789.localdomain sshd[99457]: Received disconnect from 103.234.151.178 port 41538:11: Bye Bye [preauth]
Dec 06 09:06:39 np0005548789.localdomain sshd[99457]: Disconnected from authenticating user root 103.234.151.178 port 41538 [preauth]
Dec 06 09:06:40 np0005548789.localdomain sudo[99607]: pam_unix(sudo:session): session closed for user root
Dec 06 09:06:41 np0005548789.localdomain sshd[99455]: Received disconnect from 45.78.222.162 port 53476:11: Bye Bye [preauth]
Dec 06 09:06:41 np0005548789.localdomain sshd[99455]: Disconnected from authenticating user root 45.78.222.162 port 53476 [preauth]
Dec 06 09:06:41 np0005548789.localdomain sudo[99654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:06:41 np0005548789.localdomain sudo[99654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:06:41 np0005548789.localdomain sudo[99654]: pam_unix(sudo:session): session closed for user root
Dec 06 09:06:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:06:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:06:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:06:44 np0005548789.localdomain podman[99671]: 2025-12-06 09:06:44.941633639 +0000 UTC m=+0.089779390 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:06:45 np0005548789.localdomain systemd[1]: tmp-crun.fVUDJm.mount: Deactivated successfully.
Dec 06 09:06:45 np0005548789.localdomain podman[99671]: 2025-12-06 09:06:45.01224781 +0000 UTC m=+0.160393591 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z)
Dec 06 09:06:45 np0005548789.localdomain podman[99670]: 2025-12-06 09:06:45.055005183 +0000 UTC m=+0.203553316 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, distribution-scope=public, container_name=metrics_qdr, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Dec 06 09:06:45 np0005548789.localdomain podman[99669]: 2025-12-06 09:06:45.013416376 +0000 UTC m=+0.162002490 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Dec 06 09:06:45 np0005548789.localdomain podman[99671]: unhealthy
Dec 06 09:06:45 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:06:45 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:06:45 np0005548789.localdomain podman[99669]: 2025-12-06 09:06:45.148268671 +0000 UTC m=+0.296854775 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z)
Dec 06 09:06:45 np0005548789.localdomain podman[99669]: unhealthy
Dec 06 09:06:45 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:06:45 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:06:45 np0005548789.localdomain podman[99670]: 2025-12-06 09:06:45.244501078 +0000 UTC m=+0.393049241 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, architecture=x86_64, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:06:45 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:06:45 np0005548789.localdomain sshd[99739]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:06:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:06:47 np0005548789.localdomain systemd[1]: tmp-crun.DxoMrc.mount: Deactivated successfully.
Dec 06 09:06:47 np0005548789.localdomain podman[99741]: 2025-12-06 09:06:47.922055172 +0000 UTC m=+0.083859458 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step5, container_name=nova_compute, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:06:47 np0005548789.localdomain podman[99741]: 2025-12-06 09:06:47.957309395 +0000 UTC m=+0.119113671 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, release=1761123044, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 09:06:47 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:06:59 np0005548789.localdomain sshd[99768]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:07:03 np0005548789.localdomain sshd[99768]: Received disconnect from 179.33.210.213 port 48100:11: Bye Bye [preauth]
Dec 06 09:07:03 np0005548789.localdomain sshd[99768]: Disconnected from authenticating user root 179.33.210.213 port 48100 [preauth]
Dec 06 09:07:03 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:07:03 np0005548789.localdomain recover_tripleo_nova_virtqemud[99771]: 61814
Dec 06 09:07:03 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:07:03 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:07:05 np0005548789.localdomain sshd[99772]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:07:06 np0005548789.localdomain sshd[99772]: Received disconnect from 81.192.46.35 port 58242:11: Bye Bye [preauth]
Dec 06 09:07:06 np0005548789.localdomain sshd[99772]: Disconnected from authenticating user root 81.192.46.35 port 58242 [preauth]
Dec 06 09:07:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:07:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:07:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:07:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:07:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:07:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:07:09 np0005548789.localdomain systemd[1]: tmp-crun.RO2BP8.mount: Deactivated successfully.
Dec 06 09:07:09 np0005548789.localdomain systemd[1]: tmp-crun.mJglJB.mount: Deactivated successfully.
Dec 06 09:07:09 np0005548789.localdomain podman[99776]: 2025-12-06 09:07:09.965241376 +0000 UTC m=+0.116775779 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, 
io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1)
Dec 06 09:07:09 np0005548789.localdomain podman[99782]: 2025-12-06 09:07:09.999378556 +0000 UTC m=+0.144287066 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 09:07:10 np0005548789.localdomain podman[99788]: 2025-12-06 09:07:10.00634861 +0000 UTC m=+0.149675431 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:07:10 np0005548789.localdomain podman[99782]: 2025-12-06 09:07:10.025637543 +0000 UTC m=+0.170546063 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:07:10 np0005548789.localdomain podman[99774]: 2025-12-06 09:07:09.980404853 +0000 UTC m=+0.141741778 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, com.redhat.component=openstack-cron-container, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.expose-services=)
Dec 06 09:07:10 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:07:10 np0005548789.localdomain podman[99788]: 2025-12-06 09:07:10.045266286 +0000 UTC m=+0.188593117 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, release=1761123044, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team)
Dec 06 09:07:10 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:07:10 np0005548789.localdomain podman[99774]: 2025-12-06 09:07:10.059444352 +0000 UTC m=+0.220781307 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team)
Dec 06 09:07:10 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:07:10 np0005548789.localdomain podman[99775]: 2025-12-06 09:07:09.929010033 +0000 UTC m=+0.088861522 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:07:10 np0005548789.localdomain podman[99793]: 2025-12-06 09:07:10.111988357 +0000 UTC m=+0.249612193 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:07:10 np0005548789.localdomain podman[99793]: 2025-12-06 09:07:10.162359705 +0000 UTC m=+0.299983531 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64)
Dec 06 09:07:10 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:07:10 np0005548789.localdomain podman[99775]: 2025-12-06 09:07:10.213813026 +0000 UTC m=+0.373664545 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:07:10 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:07:10 np0005548789.localdomain podman[99776]: 2025-12-06 09:07:10.334354301 +0000 UTC m=+0.485888724 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, tcib_managed=true)
Dec 06 09:07:10 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:07:10 np0005548789.localdomain sshd[99906]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:07:12 np0005548789.localdomain sshd[99906]: Received disconnect from 103.157.25.60 port 42590:11: Bye Bye [preauth]
Dec 06 09:07:12 np0005548789.localdomain sshd[99906]: Disconnected from authenticating user root 103.157.25.60 port 42590 [preauth]
Dec 06 09:07:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:07:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:07:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:07:15 np0005548789.localdomain systemd[1]: tmp-crun.1ZjeTy.mount: Deactivated successfully.
Dec 06 09:07:15 np0005548789.localdomain podman[99909]: 2025-12-06 09:07:15.935700769 +0000 UTC m=+0.096428286 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Dec 06 09:07:16 np0005548789.localdomain podman[99910]: 2025-12-06 09:07:16.024471296 +0000 UTC m=+0.182074487 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 09:07:16 np0005548789.localdomain podman[99910]: 2025-12-06 09:07:16.040266732 +0000 UTC m=+0.197869943 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4)
Dec 06 09:07:16 np0005548789.localdomain podman[99910]: unhealthy
Dec 06 09:07:16 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:07:16 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:07:16 np0005548789.localdomain podman[99909]: 2025-12-06 09:07:16.129083982 +0000 UTC m=+0.289811499 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, 
com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:07:16 np0005548789.localdomain podman[99908]: 2025-12-06 09:07:16.137828941 +0000 UTC m=+0.300693574 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, release=1761123044, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:07:16 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:07:16 np0005548789.localdomain podman[99908]: 2025-12-06 09:07:16.17295839 +0000 UTC m=+0.335823043 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T23:34:05Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ovn_controller, release=1761123044)
Dec 06 09:07:16 np0005548789.localdomain podman[99908]: unhealthy
Dec 06 09:07:16 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:07:16 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:07:16 np0005548789.localdomain systemd[1]: tmp-crun.15w74S.mount: Deactivated successfully.
Dec 06 09:07:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:07:18 np0005548789.localdomain systemd[1]: tmp-crun.MsJQtC.mount: Deactivated successfully.
Dec 06 09:07:18 np0005548789.localdomain podman[99978]: 2025-12-06 09:07:18.920981611 +0000 UTC m=+0.083909700 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=nova_compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4)
Dec 06 09:07:18 np0005548789.localdomain podman[99978]: 2025-12-06 09:07:18.952070486 +0000 UTC m=+0.114998616 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, release=1761123044, container_name=nova_compute)
Dec 06 09:07:18 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:07:23 np0005548789.localdomain sshd[100005]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:07:23 np0005548789.localdomain sshd[100005]: Received disconnect from 12.156.67.18 port 45874:11: Bye Bye [preauth]
Dec 06 09:07:23 np0005548789.localdomain sshd[100005]: Disconnected from authenticating user root 12.156.67.18 port 45874 [preauth]
Dec 06 09:07:24 np0005548789.localdomain sshd[100007]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:07:24 np0005548789.localdomain sshd[100007]: Connection closed by authenticating user root 92.118.39.95 port 33562 [preauth]
Dec 06 09:07:26 np0005548789.localdomain sshd[100009]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:07:28 np0005548789.localdomain sshd[100009]: Received disconnect from 103.192.152.59 port 49680:11: Bye Bye [preauth]
Dec 06 09:07:28 np0005548789.localdomain sshd[100009]: Disconnected from authenticating user root 103.192.152.59 port 49680 [preauth]
Dec 06 09:07:37 np0005548789.localdomain sshd[35921]: Received disconnect from 192.168.122.100 port 58206:11: disconnected by user
Dec 06 09:07:37 np0005548789.localdomain sshd[35921]: Disconnected from user tripleo-admin 192.168.122.100 port 58206
Dec 06 09:07:37 np0005548789.localdomain sshd[35900]: pam_unix(sshd:session): session closed for user tripleo-admin
Dec 06 09:07:37 np0005548789.localdomain systemd[1]: session-28.scope: Deactivated successfully.
Dec 06 09:07:37 np0005548789.localdomain systemd[1]: session-28.scope: Consumed 7min 1.517s CPU time.
Dec 06 09:07:37 np0005548789.localdomain systemd-logind[766]: Session 28 logged out. Waiting for processes to exit.
Dec 06 09:07:37 np0005548789.localdomain systemd-logind[766]: Removed session 28.
Dec 06 09:07:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:07:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:07:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:07:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:07:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:07:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:07:40 np0005548789.localdomain systemd[1]: tmp-crun.mLHnxh.mount: Deactivated successfully.
Dec 06 09:07:41 np0005548789.localdomain podman[100011]: 2025-12-06 09:07:40.942316387 +0000 UTC m=+0.096657712 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64)
Dec 06 09:07:41 np0005548789.localdomain podman[100013]: 2025-12-06 09:07:41.052304837 +0000 UTC m=+0.199806332 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public)
Dec 06 09:07:41 np0005548789.localdomain podman[100012]: 2025-12-06 09:07:40.995739398 +0000 UTC m=+0.147592506 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, tcib_managed=true, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat 
OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z)
Dec 06 09:07:41 np0005548789.localdomain podman[100035]: 2025-12-06 09:07:41.107313478 +0000 UTC m=+0.232818747 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, release=1761123044, architecture=x86_64, com.redhat.component=openstack-iscsid-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com)
Dec 06 09:07:41 np0005548789.localdomain podman[100035]: 2025-12-06 09:07:41.114881161 +0000 UTC m=+0.240386350 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, io.openshift.expose-services=, container_name=iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 06 09:07:41 np0005548789.localdomain podman[100036]: 2025-12-06 09:07:40.971016259 +0000 UTC m=+0.092519325 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12)
Dec 06 09:07:41 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:07:41 np0005548789.localdomain podman[100011]: 2025-12-06 09:07:41.129231952 +0000 UTC m=+0.283573267 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, container_name=logrotate_crond, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, 
release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4)
Dec 06 09:07:41 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:07:41 np0005548789.localdomain podman[100036]: 2025-12-06 09:07:41.157076188 +0000 UTC m=+0.278579224 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=)
Dec 06 09:07:41 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:07:41 np0005548789.localdomain podman[100029]: 2025-12-06 09:07:41.024811893 +0000 UTC m=+0.155379707 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vcs-type=git, release=1761123044, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute)
Dec 06 09:07:41 np0005548789.localdomain podman[100012]: 2025-12-06 09:07:41.181455297 +0000 UTC m=+0.333308445 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
build-date=2025-11-18T22:51:28Z, architecture=x86_64, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Dec 06 09:07:41 np0005548789.localdomain sudo[100140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:07:41 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:07:41 np0005548789.localdomain sudo[100140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:07:41 np0005548789.localdomain sudo[100140]: pam_unix(sudo:session): session closed for user root
Dec 06 09:07:41 np0005548789.localdomain podman[100029]: 2025-12-06 09:07:41.209195479 +0000 UTC m=+0.339763283 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:07:41 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:07:41 np0005548789.localdomain sudo[100162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:07:41 np0005548789.localdomain sudo[100162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:07:41 np0005548789.localdomain podman[100013]: 2025-12-06 09:07:41.433058809 +0000 UTC m=+0.580560254 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, 
name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12)
Dec 06 09:07:41 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:07:41 np0005548789.localdomain sudo[100162]: pam_unix(sudo:session): session closed for user root
Dec 06 09:07:42 np0005548789.localdomain sudo[100209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:07:42 np0005548789.localdomain sudo[100209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:07:42 np0005548789.localdomain sudo[100209]: pam_unix(sudo:session): session closed for user root
Dec 06 09:07:45 np0005548789.localdomain sshd[100224]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:07:46 np0005548789.localdomain sshd[100224]: Received disconnect from 64.227.156.63 port 58184:11: Bye Bye [preauth]
Dec 06 09:07:46 np0005548789.localdomain sshd[100224]: Disconnected from authenticating user root 64.227.156.63 port 58184 [preauth]
Dec 06 09:07:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:07:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:07:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:07:46 np0005548789.localdomain systemd[1]: tmp-crun.G8FLAy.mount: Deactivated successfully.
Dec 06 09:07:46 np0005548789.localdomain podman[100227]: 2025-12-06 09:07:46.784415774 +0000 UTC m=+0.094299509 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, config_id=tripleo_step1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:07:46 np0005548789.localdomain systemd[1]: tmp-crun.MwSbCq.mount: Deactivated successfully.
Dec 06 09:07:46 np0005548789.localdomain podman[100226]: 2025-12-06 09:07:46.889969858 +0000 UTC m=+0.201042189 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, 
io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64)
Dec 06 09:07:46 np0005548789.localdomain podman[100228]: 2025-12-06 09:07:46.922046484 +0000 UTC m=+0.228909857 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 09:07:46 np0005548789.localdomain podman[100226]: 2025-12-06 09:07:46.932909108 +0000 UTC m=+0.243981469 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, 
build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:07:46 np0005548789.localdomain podman[100226]: unhealthy
Dec 06 09:07:46 np0005548789.localdomain podman[100228]: 2025-12-06 09:07:46.941240504 +0000 UTC m=+0.248103847 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 09:07:46 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:07:46 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:07:46 np0005548789.localdomain podman[100228]: unhealthy
Dec 06 09:07:46 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:07:46 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:07:46 np0005548789.localdomain podman[100227]: 2025-12-06 09:07:46.969040938 +0000 UTC m=+0.278924583 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Dec 06 09:07:46 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:07:47 np0005548789.localdomain systemd[1]: Stopping User Manager for UID 1003...
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Activating special unit Exit the Session...
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Removed slice User Background Tasks Slice.
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Stopped target Main User Target.
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Stopped target Basic System.
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Stopped target Paths.
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Stopped target Sockets.
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Stopped target Timers.
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Closed D-Bus User Message Bus Socket.
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Stopped Create User's Volatile Files and Directories.
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Removed slice User Application Slice.
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Reached target Shutdown.
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Finished Exit the Session.
Dec 06 09:07:47 np0005548789.localdomain systemd[35904]: Reached target Exit the Session.
Dec 06 09:07:47 np0005548789.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Dec 06 09:07:47 np0005548789.localdomain systemd[1]: Stopped User Manager for UID 1003.
Dec 06 09:07:47 np0005548789.localdomain systemd[1]: user@1003.service: Consumed 4.248s CPU time, read 0B from disk, written 7.0K to disk.
Dec 06 09:07:47 np0005548789.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Dec 06 09:07:47 np0005548789.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Dec 06 09:07:47 np0005548789.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Dec 06 09:07:47 np0005548789.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Dec 06 09:07:47 np0005548789.localdomain systemd[1]: user-1003.slice: Consumed 7min 5.795s CPU time.
Dec 06 09:07:47 np0005548789.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Dec 06 09:07:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:07:49 np0005548789.localdomain podman[100295]: 2025-12-06 09:07:49.929319872 +0000 UTC m=+0.081868007 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, release=1761123044)
Dec 06 09:07:49 np0005548789.localdomain podman[100295]: 2025-12-06 09:07:49.981785015 +0000 UTC m=+0.134333100 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step5, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, container_name=nova_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:07:49 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:07:57 np0005548789.localdomain sshd[100322]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:07:58 np0005548789.localdomain sshd[100322]: Received disconnect from 118.193.38.207 port 55072:11: Bye Bye [preauth]
Dec 06 09:07:58 np0005548789.localdomain sshd[100322]: Disconnected from authenticating user root 118.193.38.207 port 55072 [preauth]
Dec 06 09:07:59 np0005548789.localdomain sshd[100324]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:08:10 np0005548789.localdomain sshd[100324]: error: kex_exchange_identification: read: Connection timed out
Dec 06 09:08:10 np0005548789.localdomain sshd[100324]: banner exchange: Connection from 14.103.115.124 port 41850: Connection timed out
Dec 06 09:08:11 np0005548789.localdomain sshd[100325]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:08:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:08:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:08:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:08:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:08:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:08:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:08:11 np0005548789.localdomain podman[100328]: 2025-12-06 09:08:11.957596493 +0000 UTC m=+0.107031971 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd)
Dec 06 09:08:11 np0005548789.localdomain podman[100328]: 2025-12-06 09:08:11.967072474 +0000 UTC m=+0.116508002 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044)
Dec 06 09:08:11 np0005548789.localdomain systemd[1]: tmp-crun.bgvr0g.mount: Deactivated successfully.
Dec 06 09:08:12 np0005548789.localdomain podman[100335]: 2025-12-06 09:08:12.011930973 +0000 UTC m=+0.150711993 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, tcib_managed=true, build-date=2025-11-19T00:11:48Z, architecture=x86_64, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team)
Dec 06 09:08:12 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:08:12 np0005548789.localdomain podman[100327]: 2025-12-06 09:08:12.067889603 +0000 UTC m=+0.223645455 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, distribution-scope=public, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git)
Dec 06 09:08:12 np0005548789.localdomain podman[100335]: 2025-12-06 09:08:12.072510925 +0000 UTC m=+0.211291975 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4)
Dec 06 09:08:12 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:08:12 np0005548789.localdomain podman[100347]: 2025-12-06 09:08:11.984308754 +0000 UTC m=+0.115629485 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:08:12 np0005548789.localdomain podman[100327]: 2025-12-06 09:08:12.10422112 +0000 UTC m=+0.259976912 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1)
Dec 06 09:08:12 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:08:12 np0005548789.localdomain podman[100347]: 2025-12-06 09:08:12.118116007 +0000 UTC m=+0.249436748 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4)
Dec 06 09:08:12 np0005548789.localdomain podman[100341]: 2025-12-06 09:08:12.118730835 +0000 UTC m=+0.247603190 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3)
Dec 06 09:08:12 np0005548789.localdomain podman[100341]: 2025-12-06 09:08:12.127469414 +0000 UTC m=+0.256341779 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., container_name=iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-iscsid-container)
Dec 06 09:08:12 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:08:12 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:08:12 np0005548789.localdomain podman[100333]: 2025-12-06 09:08:12.271042586 +0000 UTC m=+0.415002556 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 09:08:12 np0005548789.localdomain podman[100333]: 2025-12-06 09:08:12.636371025 +0000 UTC m=+0.780331035 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Dec 06 09:08:12 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:08:12 np0005548789.localdomain sshd[100325]: Received disconnect from 103.234.151.178 port 4146:11: Bye Bye [preauth]
Dec 06 09:08:12 np0005548789.localdomain sshd[100325]: Disconnected from authenticating user root 103.234.151.178 port 4146 [preauth]
Dec 06 09:08:13 np0005548789.localdomain sshd[100456]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:08:13 np0005548789.localdomain sshd[100456]: Received disconnect from 81.192.46.35 port 56558:11: Bye Bye [preauth]
Dec 06 09:08:13 np0005548789.localdomain sshd[100456]: Disconnected from authenticating user root 81.192.46.35 port 56558 [preauth]
Dec 06 09:08:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:08:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:08:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:08:17 np0005548789.localdomain podman[100458]: 2025-12-06 09:08:17.923024612 +0000 UTC m=+0.079003049 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, tcib_managed=true)
Dec 06 09:08:17 np0005548789.localdomain podman[100458]: 2025-12-06 09:08:17.940106267 +0000 UTC m=+0.096084744 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, 
vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:08:17 np0005548789.localdomain podman[100458]: unhealthy
Dec 06 09:08:17 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:08:17 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:08:17 np0005548789.localdomain podman[100460]: 2025-12-06 09:08:17.988108122 +0000 UTC m=+0.137679173 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 09:08:18 np0005548789.localdomain podman[100460]: 2025-12-06 09:08:18.031279369 +0000 UTC m=+0.180850430 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:08:18 np0005548789.localdomain podman[100460]: unhealthy
Dec 06 09:08:18 np0005548789.localdomain systemd[1]: tmp-crun.jzlcgT.mount: Deactivated successfully.
Dec 06 09:08:18 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:08:18 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:08:18 np0005548789.localdomain podman[100459]: 2025-12-06 09:08:18.052614864 +0000 UTC m=+0.205519147 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:08:18 np0005548789.localdomain podman[100459]: 2025-12-06 09:08:18.255348586 +0000 UTC m=+0.408252849 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, release=1761123044, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, batch=17.1_20251118.1)
Dec 06 09:08:18 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:08:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:08:20 np0005548789.localdomain podman[100526]: 2025-12-06 09:08:20.916610441 +0000 UTC m=+0.076438801 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 09:08:20 np0005548789.localdomain podman[100526]: 2025-12-06 09:08:20.948138969 +0000 UTC m=+0.107967339 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team)
Dec 06 09:08:20 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:08:36 np0005548789.localdomain sshd[100550]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:08:36 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:08:36 np0005548789.localdomain recover_tripleo_nova_virtqemud[100553]: 61814
Dec 06 09:08:36 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:08:36 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:08:36 np0005548789.localdomain sshd[100550]: Received disconnect from 12.156.67.18 port 52788:11: Bye Bye [preauth]
Dec 06 09:08:36 np0005548789.localdomain sshd[100550]: Disconnected from authenticating user root 12.156.67.18 port 52788 [preauth]
Dec 06 09:08:41 np0005548789.localdomain sshd[100554]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:08:42 np0005548789.localdomain sshd[100554]: Received disconnect from 103.157.25.60 port 44268:11: Bye Bye [preauth]
Dec 06 09:08:42 np0005548789.localdomain sshd[100554]: Disconnected from authenticating user root 103.157.25.60 port 44268 [preauth]
Dec 06 09:08:42 np0005548789.localdomain sudo[100556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:08:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:08:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:08:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:08:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:08:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:08:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:08:42 np0005548789.localdomain sudo[100556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:08:42 np0005548789.localdomain sudo[100556]: pam_unix(sudo:session): session closed for user root
Dec 06 09:08:42 np0005548789.localdomain sudo[100614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:08:42 np0005548789.localdomain sudo[100614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:08:42 np0005548789.localdomain podman[100570]: 2025-12-06 09:08:42.882202873 +0000 UTC m=+0.099488949 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:08:42 np0005548789.localdomain podman[100572]: 2025-12-06 09:08:42.938130601 +0000 UTC m=+0.147926827 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:08:42 np0005548789.localdomain systemd[1]: tmp-crun.3wfapI.mount: Deactivated successfully.
Dec 06 09:08:42 np0005548789.localdomain podman[100584]: 2025-12-06 09:08:42.987983393 +0000 UTC m=+0.182065366 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., 
managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12)
Dec 06 09:08:43 np0005548789.localdomain podman[100571]: 2025-12-06 09:08:43.045618245 +0000 UTC m=+0.260399874 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, version=17.1.12, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, tcib_managed=true)
Dec 06 09:08:43 np0005548789.localdomain podman[100571]: 2025-12-06 09:08:43.079581039 +0000 UTC m=+0.294362668 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd)
Dec 06 09:08:43 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:08:43 np0005548789.localdomain podman[100578]: 2025-12-06 09:08:43.101895845 +0000 UTC m=+0.307537334 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, release=1761123044, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, vendor=Red 
Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team)
Dec 06 09:08:43 np0005548789.localdomain podman[100578]: 2025-12-06 09:08:43.155977057 +0000 UTC m=+0.361618516 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-type=git, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:08:43 np0005548789.localdomain podman[100570]: 2025-12-06 09:08:43.169163662 +0000 UTC m=+0.386449808 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, release=1761123044, description=Red Hat OpenStack 
Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-cron)
Dec 06 09:08:43 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:08:43 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:08:43 np0005548789.localdomain podman[100594]: 2025-12-06 09:08:43.157881846 +0000 UTC m=+0.347056068 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi)
Dec 06 09:08:43 np0005548789.localdomain podman[100584]: 2025-12-06 09:08:43.221574393 +0000 UTC m=+0.415656376 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-iscsid-container, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, distribution-scope=public)
Dec 06 09:08:43 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:08:43 np0005548789.localdomain podman[100594]: 2025-12-06 09:08:43.238309268 +0000 UTC m=+0.427483460 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, 
distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1761123044, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:08:43 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:08:43 np0005548789.localdomain podman[100572]: 2025-12-06 09:08:43.274905243 +0000 UTC m=+0.484701419 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, distribution-scope=public, 
konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1)
Dec 06 09:08:43 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:08:43 np0005548789.localdomain sudo[100614]: pam_unix(sudo:session): session closed for user root
Dec 06 09:08:44 np0005548789.localdomain sudo[100748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:08:44 np0005548789.localdomain sudo[100748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:08:44 np0005548789.localdomain sudo[100748]: pam_unix(sudo:session): session closed for user root
Dec 06 09:08:45 np0005548789.localdomain sshd[99739]: fatal: Timeout before authentication for 121.229.5.171 port 39926
Dec 06 09:08:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:08:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:08:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:08:48 np0005548789.localdomain systemd[1]: tmp-crun.IVyE09.mount: Deactivated successfully.
Dec 06 09:08:48 np0005548789.localdomain podman[100763]: 2025-12-06 09:08:48.922432219 +0000 UTC m=+0.088642135 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:08:48 np0005548789.localdomain podman[100763]: 2025-12-06 09:08:48.961749568 +0000 UTC m=+0.127959454 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public)
Dec 06 09:08:48 np0005548789.localdomain podman[100763]: unhealthy
Dec 06 09:08:48 np0005548789.localdomain podman[100764]: 2025-12-06 09:08:48.972488488 +0000 UTC m=+0.132139523 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 09:08:48 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:08:48 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:08:49 np0005548789.localdomain podman[100765]: 2025-12-06 09:08:48.966936457 +0000 UTC m=+0.123734524 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 09:08:49 np0005548789.localdomain podman[100765]: 2025-12-06 09:08:49.046566404 +0000 UTC m=+0.203364521 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team)
Dec 06 09:08:49 np0005548789.localdomain podman[100765]: unhealthy
Dec 06 09:08:49 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:08:49 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:08:49 np0005548789.localdomain podman[100764]: 2025-12-06 09:08:49.150018164 +0000 UTC m=+0.309669179 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, 
name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 09:08:49 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:08:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:08:51 np0005548789.localdomain podman[100830]: 2025-12-06 09:08:51.916179582 +0000 UTC m=+0.080280718 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_id=tripleo_step5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:08:51 np0005548789.localdomain podman[100830]: 2025-12-06 09:08:51.969217792 +0000 UTC m=+0.133318888 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:08:51 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:08:58 np0005548789.localdomain sshd[100856]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:09:00 np0005548789.localdomain sshd[100856]: Received disconnect from 103.192.152.59 port 46536:11: Bye Bye [preauth]
Dec 06 09:09:00 np0005548789.localdomain sshd[100856]: Disconnected from authenticating user root 103.192.152.59 port 46536 [preauth]
Dec 06 09:09:11 np0005548789.localdomain sshd[100858]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:09:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:09:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:09:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:09:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:09:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:09:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:09:13 np0005548789.localdomain systemd[1]: tmp-crun.mkkIxW.mount: Deactivated successfully.
Dec 06 09:09:13 np0005548789.localdomain podman[100860]: 2025-12-06 09:09:13.936032102 +0000 UTC m=+0.092284307 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-collectd-container, version=17.1.12, container_name=collectd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:09:13 np0005548789.localdomain podman[100859]: 2025-12-06 09:09:13.978401774 +0000 UTC m=+0.136390643 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., release=1761123044, container_name=logrotate_crond, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Dec 06 09:09:13 np0005548789.localdomain podman[100874]: 2025-12-06 09:09:13.998486231 +0000 UTC m=+0.140496089 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, 
io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 06 09:09:14 np0005548789.localdomain podman[100860]: 2025-12-06 09:09:14.026886474 +0000 UTC m=+0.183138699 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, distribution-scope=public, io.buildah.version=1.41.4)
Dec 06 09:09:14 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:09:14 np0005548789.localdomain podman[100861]: 2025-12-06 09:09:14.045090394 +0000 UTC m=+0.197764749 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true)
Dec 06 09:09:14 np0005548789.localdomain podman[100862]: 2025-12-06 09:09:13.950003442 +0000 UTC m=+0.098441797 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044)
Dec 06 09:09:14 np0005548789.localdomain podman[100874]: 2025-12-06 09:09:14.056218906 +0000 UTC m=+0.198228794 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git)
Dec 06 09:09:14 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:09:14 np0005548789.localdomain podman[100862]: 2025-12-06 09:09:14.083108092 +0000 UTC m=+0.231546507 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, batch=17.1_20251118.1, 
release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:09:14 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:09:14 np0005548789.localdomain podman[100859]: 2025-12-06 09:09:14.116385695 +0000 UTC m=+0.274374624 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, vcs-type=git, container_name=logrotate_crond, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Dec 06 09:09:14 np0005548789.localdomain podman[100869]: 2025-12-06 09:09:14.126384233 +0000 UTC m=+0.273637352 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat 
OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, release=1761123044, batch=17.1_20251118.1)
Dec 06 09:09:14 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:09:14 np0005548789.localdomain podman[100869]: 2025-12-06 09:09:14.139045262 +0000 UTC m=+0.286298381 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step3, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:09:14 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:09:14 np0005548789.localdomain podman[100861]: 2025-12-06 09:09:14.445084777 +0000 UTC m=+0.597759142 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:09:14 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:09:14 np0005548789.localdomain sshd[100993]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:09:16 np0005548789.localdomain sshd[100993]: Received disconnect from 64.227.156.63 port 50356:11: Bye Bye [preauth]
Dec 06 09:09:16 np0005548789.localdomain sshd[100993]: Disconnected from authenticating user root 64.227.156.63 port 50356 [preauth]
Dec 06 09:09:16 np0005548789.localdomain sshd[100858]: Connection closed by 45.78.222.162 port 35518 [preauth]
Dec 06 09:09:17 np0005548789.localdomain sshd[100996]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:09:18 np0005548789.localdomain sshd[100998]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:09:18 np0005548789.localdomain sshd[100996]: Received disconnect from 118.193.38.207 port 50564:11: Bye Bye [preauth]
Dec 06 09:09:18 np0005548789.localdomain sshd[100996]: Disconnected from authenticating user root 118.193.38.207 port 50564 [preauth]
Dec 06 09:09:18 np0005548789.localdomain sshd[100998]: Received disconnect from 81.192.46.35 port 54874:11: Bye Bye [preauth]
Dec 06 09:09:18 np0005548789.localdomain sshd[100998]: Disconnected from authenticating user root 81.192.46.35 port 54874 [preauth]
Dec 06 09:09:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:09:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:09:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:09:19 np0005548789.localdomain podman[101001]: 2025-12-06 09:09:19.923736474 +0000 UTC m=+0.084667233 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, container_name=metrics_qdr, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=)
Dec 06 09:09:19 np0005548789.localdomain podman[101002]: 2025-12-06 09:09:19.975203816 +0000 UTC m=+0.133346169 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64)
Dec 06 09:09:20 np0005548789.localdomain podman[101002]: 2025-12-06 09:09:20.027139872 +0000 UTC m=+0.185282265 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z)
Dec 06 09:09:20 np0005548789.localdomain podman[101002]: unhealthy
Dec 06 09:09:20 np0005548789.localdomain podman[101000]: 2025-12-06 09:09:20.035689425 +0000 UTC m=+0.199137341 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO 
Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.)
Dec 06 09:09:20 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:09:20 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:09:20 np0005548789.localdomain podman[101000]: 2025-12-06 09:09:20.054273947 +0000 UTC m=+0.217721833 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., 
io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, container_name=ovn_controller, batch=17.1_20251118.1, distribution-scope=public)
Dec 06 09:09:20 np0005548789.localdomain podman[101000]: unhealthy
Dec 06 09:09:20 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:09:20 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:09:20 np0005548789.localdomain podman[101001]: 2025-12-06 09:09:20.118386867 +0000 UTC m=+0.279317636 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container)
Dec 06 09:09:20 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:09:20 np0005548789.localdomain systemd[1]: tmp-crun.KGT7Mg.mount: Deactivated successfully.
Dec 06 09:09:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:09:22 np0005548789.localdomain podman[101069]: 2025-12-06 09:09:22.917933261 +0000 UTC m=+0.078748531 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, tcib_managed=true)
Dec 06 09:09:22 np0005548789.localdomain podman[101069]: 2025-12-06 09:09:22.975231112 +0000 UTC m=+0.136046382 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, io.openshift.expose-services=, config_id=tripleo_step5)
Dec 06 09:09:22 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:09:38 np0005548789.localdomain sshd[101095]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:09:38 np0005548789.localdomain sshd[101097]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:09:39 np0005548789.localdomain sshd[101097]: Connection closed by authenticating user root 92.118.39.95 port 48538 [preauth]
Dec 06 09:09:39 np0005548789.localdomain sshd[101095]: Received disconnect from 103.234.151.178 port 30274:11: Bye Bye [preauth]
Dec 06 09:09:39 np0005548789.localdomain sshd[101095]: Disconnected from authenticating user root 103.234.151.178 port 30274 [preauth]
Dec 06 09:09:43 np0005548789.localdomain sshd[101099]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:09:44 np0005548789.localdomain sudo[101101]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:09:44 np0005548789.localdomain sudo[101101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:09:44 np0005548789.localdomain sudo[101101]: pam_unix(sudo:session): session closed for user root
Dec 06 09:09:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:09:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:09:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:09:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:09:44 np0005548789.localdomain sshd[101099]: Received disconnect from 12.156.67.18 port 46918:11: Bye Bye [preauth]
Dec 06 09:09:44 np0005548789.localdomain sshd[101099]: Disconnected from authenticating user root 12.156.67.18 port 46918 [preauth]
Dec 06 09:09:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:09:44 np0005548789.localdomain sudo[101136]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:09:44 np0005548789.localdomain sudo[101136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:09:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:09:44 np0005548789.localdomain podman[101116]: 2025-12-06 09:09:44.514824738 +0000 UTC m=+0.095414906 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, 
config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z)
Dec 06 09:09:44 np0005548789.localdomain systemd[1]: tmp-crun.jbKgNZ.mount: Deactivated successfully.
Dec 06 09:09:44 np0005548789.localdomain podman[101180]: 2025-12-06 09:09:44.578158909 +0000 UTC m=+0.070410969 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, tcib_managed=true)
Dec 06 09:09:44 np0005548789.localdomain podman[101131]: 2025-12-06 09:09:44.554897367 +0000 UTC m=+0.111044836 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 06 09:09:44 np0005548789.localdomain podman[101117]: 2025-12-06 09:09:44.62450528 +0000 UTC m=+0.200946831 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:09:44 np0005548789.localdomain podman[101117]: 2025-12-06 09:09:44.634004772 +0000 UTC m=+0.210446343 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, release=1761123044, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Dec 06 09:09:44 np0005548789.localdomain podman[101131]: 2025-12-06 09:09:44.640027516 +0000 UTC m=+0.196174985 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Dec 06 09:09:44 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:09:44 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:09:44 np0005548789.localdomain podman[101122]: 2025-12-06 09:09:44.680896639 +0000 UTC m=+0.250003586 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Dec 06 09:09:44 np0005548789.localdomain podman[101116]: 2025-12-06 09:09:44.699500189 +0000 UTC m=+0.280090367 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:09:44 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:09:44 np0005548789.localdomain podman[101129]: 2025-12-06 09:09:44.711141966 +0000 UTC m=+0.276276510 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid)
Dec 06 09:09:44 np0005548789.localdomain podman[101129]: 2025-12-06 09:09:44.718915674 +0000 UTC m=+0.284050238 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, vcs-type=git, release=1761123044)
Dec 06 09:09:44 np0005548789.localdomain podman[101122]: 2025-12-06 09:09:44.726047593 +0000 UTC m=+0.295154520 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container)
Dec 06 09:09:44 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:09:44 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:09:44 np0005548789.localdomain podman[101180]: 2025-12-06 09:09:44.925341163 +0000 UTC m=+0.417593283 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, container_name=nova_migration_target)
Dec 06 09:09:44 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:09:45 np0005548789.localdomain sudo[101136]: pam_unix(sudo:session): session closed for user root
Dec 06 09:09:45 np0005548789.localdomain sudo[101294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:09:45 np0005548789.localdomain sudo[101294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:09:45 np0005548789.localdomain sudo[101294]: pam_unix(sudo:session): session closed for user root
Dec 06 09:09:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:09:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:09:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:09:50 np0005548789.localdomain systemd[1]: tmp-crun.YokzOp.mount: Deactivated successfully.
Dec 06 09:09:50 np0005548789.localdomain podman[101310]: 2025-12-06 09:09:50.920542784 +0000 UTC m=+0.076124835 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4)
Dec 06 09:09:50 np0005548789.localdomain podman[101309]: 2025-12-06 09:09:50.978391637 +0000 UTC m=+0.133545215 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ovn_controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, version=17.1.12, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git)
Dec 06 09:09:50 np0005548789.localdomain podman[101311]: 2025-12-06 09:09:50.947146169 +0000 UTC m=+0.094995573 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:09:51 np0005548789.localdomain podman[101309]: 2025-12-06 09:09:51.012936096 +0000 UTC m=+0.168089674 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, maintainer=OpenStack TripleO Team, container_name=ovn_controller)
Dec 06 09:09:51 np0005548789.localdomain podman[101309]: unhealthy
Dec 06 09:09:51 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:09:51 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:09:51 np0005548789.localdomain podman[101311]: 2025-12-06 09:09:51.033128425 +0000 UTC m=+0.180977829 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, tcib_managed=true, container_name=ovn_metadata_agent, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 09:09:51 np0005548789.localdomain podman[101311]: unhealthy
Dec 06 09:09:51 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:09:51 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:09:51 np0005548789.localdomain podman[101310]: 2025-12-06 09:09:51.134222714 +0000 UTC m=+0.289804775 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-type=git)
Dec 06 09:09:51 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:09:52 np0005548789.localdomain sshd[101374]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:09:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:09:53 np0005548789.localdomain systemd[1]: tmp-crun.uppUns.mount: Deactivated successfully.
Dec 06 09:09:53 np0005548789.localdomain podman[101376]: 2025-12-06 09:09:53.923263176 +0000 UTC m=+0.084334246 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, container_name=nova_compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4)
Dec 06 09:09:53 np0005548789.localdomain podman[101376]: 2025-12-06 09:09:53.952600585 +0000 UTC m=+0.113671675 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true)
Dec 06 09:09:53 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:09:56 np0005548789.localdomain sshd[101374]: Received disconnect from 179.33.210.213 port 52190:11: Bye Bye [preauth]
Dec 06 09:09:56 np0005548789.localdomain sshd[101374]: Disconnected from authenticating user root 179.33.210.213 port 52190 [preauth]
Dec 06 09:10:07 np0005548789.localdomain sshd[101401]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:10:09 np0005548789.localdomain sshd[101401]: Received disconnect from 103.157.25.60 port 45950:11: Bye Bye [preauth]
Dec 06 09:10:09 np0005548789.localdomain sshd[101401]: Disconnected from authenticating user root 103.157.25.60 port 45950 [preauth]
Dec 06 09:10:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:10:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.1 total, 600.0 interval
                                                          Cumulative writes: 5761 writes, 25K keys, 5761 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5761 writes, 760 syncs, 7.58 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:10:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:10:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.2 total, 600.0 interval
                                                          Cumulative writes: 4879 writes, 21K keys, 4879 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4879 writes, 669 syncs, 7.29 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:10:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:10:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:10:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:10:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:10:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:10:14 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:10:14 np0005548789.localdomain recover_tripleo_nova_virtqemud[101431]: 61814
Dec 06 09:10:14 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:10:14 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:10:14 np0005548789.localdomain podman[101410]: 2025-12-06 09:10:14.935367927 +0000 UTC m=+0.082190790 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, architecture=x86_64)
Dec 06 09:10:14 np0005548789.localdomain podman[101410]: 2025-12-06 09:10:14.94395289 +0000 UTC m=+0.090775763 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible)
Dec 06 09:10:14 np0005548789.localdomain systemd[1]: tmp-crun.H2WkKn.mount: Deactivated successfully.
Dec 06 09:10:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:10:14 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:10:14 np0005548789.localdomain podman[101403]: 2025-12-06 09:10:14.98310515 +0000 UTC m=+0.140728625 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z)
Dec 06 09:10:14 np0005548789.localdomain podman[101404]: 2025-12-06 09:10:14.992934692 +0000 UTC m=+0.145053768 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, config_id=tripleo_step3, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1)
Dec 06 09:10:15 np0005548789.localdomain podman[101404]: 2025-12-06 09:10:15.02811478 +0000 UTC m=+0.180233856 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., container_name=collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:10:15 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:10:15 np0005548789.localdomain podman[101412]: 2025-12-06 09:10:14.956542217 +0000 UTC m=+0.099241744 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4)
Dec 06 09:10:15 np0005548789.localdomain podman[101477]: 2025-12-06 09:10:15.031211806 +0000 UTC m=+0.058269177 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, release=1761123044, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 09:10:15 np0005548789.localdomain podman[101405]: 2025-12-06 09:10:15.085422928 +0000 UTC m=+0.234280424 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, distribution-scope=public, 
release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:10:15 np0005548789.localdomain podman[101412]: 2025-12-06 09:10:15.088609395 +0000 UTC m=+0.231308862 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ceilometer_agent_ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:10:15 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:10:15 np0005548789.localdomain podman[101405]: 2025-12-06 09:10:15.101930274 +0000 UTC m=+0.250787780 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:10:15 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:10:15 np0005548789.localdomain podman[101403]: 2025-12-06 09:10:15.118312806 +0000 UTC m=+0.275936261 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack 
Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.expose-services=)
Dec 06 09:10:15 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:10:15 np0005548789.localdomain podman[101477]: 2025-12-06 09:10:15.371109536 +0000 UTC m=+0.398166967 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:36:58Z, distribution-scope=public, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:10:15 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:10:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:10:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:10:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:10:21 np0005548789.localdomain podman[101536]: 2025-12-06 09:10:21.93310214 +0000 UTC m=+0.086114441 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ovn_metadata_agent, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Dec 06 09:10:21 np0005548789.localdomain systemd[1]: tmp-crun.P0TMfw.mount: Deactivated successfully.
Dec 06 09:10:21 np0005548789.localdomain podman[101535]: 2025-12-06 09:10:21.99603967 +0000 UTC m=+0.152574269 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Dec 06 09:10:22 np0005548789.localdomain podman[101536]: 2025-12-06 09:10:22.001143566 +0000 UTC m=+0.154155847 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4)
Dec 06 09:10:22 np0005548789.localdomain podman[101536]: unhealthy
Dec 06 09:10:22 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:10:22 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:10:22 np0005548789.localdomain podman[101534]: 2025-12-06 09:10:22.085339198 +0000 UTC m=+0.245221100 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, vcs-type=git, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1761123044)
Dec 06 09:10:22 np0005548789.localdomain podman[101534]: 2025-12-06 09:10:22.107113054 +0000 UTC m=+0.266994946 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:10:22 np0005548789.localdomain podman[101534]: unhealthy
Dec 06 09:10:22 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:10:22 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:10:22 np0005548789.localdomain podman[101535]: 2025-12-06 09:10:22.216975333 +0000 UTC m=+0.373509882 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, tcib_managed=true, vcs-type=git, version=17.1.12, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:10:22 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:10:23 np0005548789.localdomain sshd[101604]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:10:24 np0005548789.localdomain sshd[101604]: Received disconnect from 81.192.46.35 port 53190:11: Bye Bye [preauth]
Dec 06 09:10:24 np0005548789.localdomain sshd[101604]: Disconnected from authenticating user root 81.192.46.35 port 53190 [preauth]
Dec 06 09:10:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:10:24 np0005548789.localdomain systemd[1]: tmp-crun.CE9OrP.mount: Deactivated successfully.
Dec 06 09:10:24 np0005548789.localdomain podman[101606]: 2025-12-06 09:10:24.143943535 +0000 UTC m=+0.092086233 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, release=1761123044, name=rhosp17/openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 
'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 09:10:24 np0005548789.localdomain podman[101606]: 2025-12-06 09:10:24.197170967 +0000 UTC m=+0.145313625 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1)
Dec 06 09:10:24 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:10:32 np0005548789.localdomain sshd[101631]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:10:34 np0005548789.localdomain sshd[101631]: Received disconnect from 103.192.152.59 port 47408:11: Bye Bye [preauth]
Dec 06 09:10:34 np0005548789.localdomain sshd[101631]: Disconnected from authenticating user root 103.192.152.59 port 47408 [preauth]
Dec 06 09:10:38 np0005548789.localdomain sshd[101633]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:10:39 np0005548789.localdomain sshd[101633]: Received disconnect from 64.227.156.63 port 55430:11: Bye Bye [preauth]
Dec 06 09:10:39 np0005548789.localdomain sshd[101633]: Disconnected from authenticating user root 64.227.156.63 port 55430 [preauth]
Dec 06 09:10:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:10:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:10:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:10:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:10:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:10:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:10:45 np0005548789.localdomain systemd[1]: tmp-crun.WNAUQQ.mount: Deactivated successfully.
Dec 06 09:10:45 np0005548789.localdomain sudo[101686]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:10:45 np0005548789.localdomain sudo[101686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:10:45 np0005548789.localdomain systemd[1]: tmp-crun.lunY6X.mount: Deactivated successfully.
Dec 06 09:10:45 np0005548789.localdomain sudo[101686]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:45 np0005548789.localdomain podman[101636]: 2025-12-06 09:10:45.954169442 +0000 UTC m=+0.115600284 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container)
Dec 06 09:10:45 np0005548789.localdomain podman[101638]: 2025-12-06 09:10:45.998921334 +0000 UTC m=+0.155975863 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:10:46 np0005548789.localdomain sudo[101725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:10:46 np0005548789.localdomain sudo[101725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:10:46 np0005548789.localdomain podman[101636]: 2025-12-06 09:10:46.042071356 +0000 UTC m=+0.203502208 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64)
Dec 06 09:10:46 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:10:46 np0005548789.localdomain podman[101638]: 2025-12-06 09:10:46.052981291 +0000 UTC m=+0.210035810 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=ceilometer_agent_compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64)
Dec 06 09:10:46 np0005548789.localdomain podman[101637]: 2025-12-06 09:10:46.060022927 +0000 UTC m=+0.217282182 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z)
Dec 06 09:10:46 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:10:46 np0005548789.localdomain podman[101639]: 2025-12-06 09:10:45.922061487 +0000 UTC m=+0.080468537 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public)
Dec 06 09:10:46 np0005548789.localdomain podman[101639]: 2025-12-06 09:10:46.110071862 +0000 UTC m=+0.268478922 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step3, distribution-scope=public, tcib_managed=true, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:10:46 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:10:46 np0005548789.localdomain podman[101635]: 2025-12-06 09:10:45.976829177 +0000 UTC m=+0.140282912 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., container_name=logrotate_crond, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:10:46 np0005548789.localdomain podman[101635]: 2025-12-06 09:10:46.161063285 +0000 UTC m=+0.324517030 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_id=tripleo_step4, container_name=logrotate_crond, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team)
Dec 06 09:10:46 np0005548789.localdomain podman[101643]: 2025-12-06 09:10:46.11556926 +0000 UTC m=+0.264769658 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1761123044)
Dec 06 09:10:46 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:10:46 np0005548789.localdomain podman[101643]: 2025-12-06 09:10:46.200251826 +0000 UTC m=+0.349452284 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, build-date=2025-11-19T00:12:45Z, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64)
Dec 06 09:10:46 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:10:46 np0005548789.localdomain podman[101637]: 2025-12-06 09:10:46.452273492 +0000 UTC m=+0.609532787 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:10:46 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:10:46 np0005548789.localdomain sudo[101725]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:46 np0005548789.localdomain sshd[101827]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:10:47 np0005548789.localdomain sshd[101827]: Received disconnect from 12.156.67.18 port 41068:11: Bye Bye [preauth]
Dec 06 09:10:47 np0005548789.localdomain sshd[101827]: Disconnected from authenticating user root 12.156.67.18 port 41068 [preauth]
Dec 06 09:10:49 np0005548789.localdomain sudo[101829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:10:49 np0005548789.localdomain sudo[101829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:10:49 np0005548789.localdomain sudo[101829]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:10:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:10:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:10:52 np0005548789.localdomain podman[101844]: 2025-12-06 09:10:52.92504181 +0000 UTC m=+0.087428901 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.)
Dec 06 09:10:52 np0005548789.localdomain podman[101846]: 2025-12-06 09:10:52.9709949 +0000 UTC m=+0.128794930 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Dec 06 09:10:53 np0005548789.localdomain podman[101844]: 2025-12-06 09:10:53.022324693 +0000 UTC m=+0.184711784 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, managed_by=tripleo_ansible)
Dec 06 09:10:53 np0005548789.localdomain podman[101844]: unhealthy
Dec 06 09:10:53 np0005548789.localdomain podman[101845]: 2025-12-06 09:10:53.032254948 +0000 UTC m=+0.192442191 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, distribution-scope=public, container_name=metrics_qdr, vendor=Red Hat, Inc., 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:10:53 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:10:53 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:10:53 np0005548789.localdomain podman[101846]: 2025-12-06 09:10:53.060551255 +0000 UTC m=+0.218351255 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 09:10:53 np0005548789.localdomain podman[101846]: unhealthy
Dec 06 09:10:53 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:10:53 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:10:53 np0005548789.localdomain podman[101845]: 2025-12-06 09:10:53.239199531 +0000 UTC m=+0.399386794 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, container_name=metrics_qdr, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step1, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible)
Dec 06 09:10:53 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:10:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:10:54 np0005548789.localdomain podman[101910]: 2025-12-06 09:10:54.932857243 +0000 UTC m=+0.084469851 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Dec 06 09:10:54 np0005548789.localdomain podman[101910]: 2025-12-06 09:10:54.966785483 +0000 UTC m=+0.118398091 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4)
Dec 06 09:10:54 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:10:58 np0005548789.localdomain sshd[101935]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:11:00 np0005548789.localdomain sshd[101935]: Received disconnect from 118.193.38.207 port 40886:11: Bye Bye [preauth]
Dec 06 09:11:00 np0005548789.localdomain sshd[101935]: Disconnected from authenticating user root 118.193.38.207 port 40886 [preauth]
Dec 06 09:11:02 np0005548789.localdomain sshd[101937]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:11:04 np0005548789.localdomain sshd[101937]: Received disconnect from 103.234.151.178 port 56410:11: Bye Bye [preauth]
Dec 06 09:11:04 np0005548789.localdomain sshd[101937]: Disconnected from authenticating user root 103.234.151.178 port 56410 [preauth]
Dec 06 09:11:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:11:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:11:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:11:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:11:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:11:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:11:16 np0005548789.localdomain systemd[1]: tmp-crun.OuG76Y.mount: Deactivated successfully.
Dec 06 09:11:16 np0005548789.localdomain podman[101941]: 2025-12-06 09:11:16.929097014 +0000 UTC m=+0.081426897 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:11:17 np0005548789.localdomain podman[101942]: 2025-12-06 09:11:17.002192365 +0000 UTC m=+0.150440983 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, tcib_managed=true, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:11:17 np0005548789.localdomain podman[101939]: 2025-12-06 09:11:17.049121844 +0000 UTC m=+0.205992766 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4)
Dec 06 09:11:17 np0005548789.localdomain podman[101942]: 2025-12-06 09:11:17.050072723 +0000 UTC m=+0.198321271 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:11:17 np0005548789.localdomain podman[101940]: 2025-12-06 09:11:16.955996758 +0000 UTC m=+0.109028292 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp17/openstack-collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack 
TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:11:17 np0005548789.localdomain podman[101948]: 2025-12-06 09:11:17.06468533 +0000 UTC m=+0.206935674 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, build-date=2025-11-18T23:44:13Z)
Dec 06 09:11:17 np0005548789.localdomain podman[101960]: 2025-12-06 09:11:17.018070032 +0000 UTC m=+0.156255211 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:11:17 np0005548789.localdomain podman[101940]: 2025-12-06 09:11:17.09205895 +0000 UTC m=+0.245090484 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, container_name=collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Dec 06 09:11:17 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:11:17 np0005548789.localdomain podman[101939]: 2025-12-06 09:11:17.109585937 +0000 UTC m=+0.266456909 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z)
Dec 06 09:11:17 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:11:17 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:11:17 np0005548789.localdomain podman[101948]: 2025-12-06 09:11:17.148545601 +0000 UTC m=+0.290796005 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Dec 06 09:11:17 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:11:17 np0005548789.localdomain podman[101960]: 2025-12-06 09:11:17.204521117 +0000 UTC m=+0.342706266 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:11:17 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:11:17 np0005548789.localdomain podman[101941]: 2025-12-06 09:11:17.261176334 +0000 UTC m=+0.413506307 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Dec 06 09:11:17 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:11:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:11:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:11:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:11:23 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:11:23 np0005548789.localdomain recover_tripleo_nova_virtqemud[102086]: 61814
Dec 06 09:11:23 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:11:23 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:11:23 np0005548789.localdomain systemd[1]: tmp-crun.O0Tccd.mount: Deactivated successfully.
Dec 06 09:11:23 np0005548789.localdomain podman[102068]: 2025-12-06 09:11:23.943789128 +0000 UTC m=+0.098829911 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:11:23 np0005548789.localdomain systemd[1]: tmp-crun.wb1wxm.mount: Deactivated successfully.
Dec 06 09:11:23 np0005548789.localdomain podman[102069]: 2025-12-06 09:11:23.986068614 +0000 UTC m=+0.138593231 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4)
Dec 06 09:11:24 np0005548789.localdomain podman[102067]: 2025-12-06 09:11:24.023194441 +0000 UTC m=+0.182591378 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, version=17.1.12)
Dec 06 09:11:24 np0005548789.localdomain podman[102069]: 2025-12-06 09:11:24.027154193 +0000 UTC m=+0.179678760 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:11:24 np0005548789.localdomain podman[102069]: unhealthy
Dec 06 09:11:24 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:11:24 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:11:24 np0005548789.localdomain podman[102067]: 2025-12-06 09:11:24.089426862 +0000 UTC m=+0.248823809 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, container_name=ovn_controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, tcib_managed=true)
Dec 06 09:11:24 np0005548789.localdomain podman[102067]: unhealthy
Dec 06 09:11:24 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:11:24 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:11:24 np0005548789.localdomain podman[102068]: 2025-12-06 09:11:24.165163154 +0000 UTC m=+0.320203867 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 09:11:24 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:11:25 np0005548789.localdomain sshd[102137]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:11:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:11:25 np0005548789.localdomain podman[102139]: 2025-12-06 09:11:25.902648439 +0000 UTC m=+0.069086959 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, config_id=tripleo_step5, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:11:25 np0005548789.localdomain podman[102139]: 2025-12-06 09:11:25.963200705 +0000 UTC m=+0.129639225 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, config_id=tripleo_step5, container_name=nova_compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:36:58Z)
Dec 06 09:11:25 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:11:26 np0005548789.localdomain sshd[102137]: Received disconnect from 81.192.46.35 port 51508:11: Bye Bye [preauth]
Dec 06 09:11:26 np0005548789.localdomain sshd[102137]: Disconnected from authenticating user root 81.192.46.35 port 51508 [preauth]
Dec 06 09:11:29 np0005548789.localdomain sshd[102165]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:11:30 np0005548789.localdomain sshd[102165]: Received disconnect from 103.157.25.60 port 47620:11: Bye Bye [preauth]
Dec 06 09:11:30 np0005548789.localdomain sshd[102165]: Disconnected from authenticating user root 103.157.25.60 port 47620 [preauth]
Dec 06 09:11:35 np0005548789.localdomain sshd[102167]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:11:42 np0005548789.localdomain sshd[102167]: Received disconnect from 45.78.222.162 port 34130:11: Bye Bye [preauth]
Dec 06 09:11:42 np0005548789.localdomain sshd[102167]: Disconnected from authenticating user root 45.78.222.162 port 34130 [preauth]
Dec 06 09:11:47 np0005548789.localdomain sshd[102169]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:11:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:11:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:11:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:11:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:11:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:11:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:11:47 np0005548789.localdomain podman[102173]: 2025-12-06 09:11:47.929489751 +0000 UTC m=+0.073014590 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=)
Dec 06 09:11:47 np0005548789.localdomain podman[102172]: 2025-12-06 09:11:47.989127079 +0000 UTC m=+0.134732552 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:11:48 np0005548789.localdomain podman[102172]: 2025-12-06 09:11:48.001974053 +0000 UTC m=+0.147579556 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, container_name=collectd, name=rhosp17/openstack-collectd, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:11:48 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:11:48 np0005548789.localdomain podman[102175]: 2025-12-06 09:11:48.046735475 +0000 UTC m=+0.187586022 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Dec 06 09:11:48 np0005548789.localdomain sshd[102169]: Connection closed by authenticating user root 92.118.39.95 port 35294 [preauth]
Dec 06 09:11:48 np0005548789.localdomain podman[102171]: 2025-12-06 09:11:48.105261559 +0000 UTC m=+0.254088380 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, vcs-type=git, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=)
Dec 06 09:11:48 np0005548789.localdomain podman[102171]: 2025-12-06 09:11:48.136560929 +0000 UTC m=+0.285387750 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, distribution-scope=public, com.redhat.component=openstack-cron-container, architecture=x86_64, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, container_name=logrotate_crond, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 09:11:48 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:11:48 np0005548789.localdomain podman[102174]: 2025-12-06 09:11:48.159071719 +0000 UTC m=+0.300743440 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git)
Dec 06 09:11:48 np0005548789.localdomain podman[102175]: 2025-12-06 09:11:48.183073534 +0000 UTC m=+0.323924071 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com)
Dec 06 09:11:48 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:11:48 np0005548789.localdomain podman[102174]: 2025-12-06 09:11:48.215046555 +0000 UTC m=+0.356718266 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, version=17.1.12, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:11:48 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:11:48 np0005548789.localdomain podman[102181]: 2025-12-06 09:11:48.253120861 +0000 UTC m=+0.390561963 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:11:48 np0005548789.localdomain podman[102181]: 2025-12-06 09:11:48.280891863 +0000 UTC m=+0.418332955 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, architecture=x86_64, vcs-type=git)
Dec 06 09:11:48 np0005548789.localdomain sshd[102300]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:11:48 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:11:48 np0005548789.localdomain podman[102173]: 2025-12-06 09:11:48.300813233 +0000 UTC m=+0.444338152 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, container_name=nova_migration_target, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:11:48 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:11:48 np0005548789.localdomain sshd[102300]: Received disconnect from 12.156.67.18 port 50870:11: Bye Bye [preauth]
Dec 06 09:11:48 np0005548789.localdomain sshd[102300]: Disconnected from authenticating user root 12.156.67.18 port 50870 [preauth]
Dec 06 09:11:50 np0005548789.localdomain sudo[102304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:11:50 np0005548789.localdomain sudo[102304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:11:50 np0005548789.localdomain sudo[102304]: pam_unix(sudo:session): session closed for user root
Dec 06 09:11:50 np0005548789.localdomain sudo[102319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 09:11:50 np0005548789.localdomain sudo[102319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:11:50 np0005548789.localdomain sudo[102319]: pam_unix(sudo:session): session closed for user root
Dec 06 09:11:50 np0005548789.localdomain sudo[102354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:11:50 np0005548789.localdomain sudo[102354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:11:50 np0005548789.localdomain sudo[102354]: pam_unix(sudo:session): session closed for user root
Dec 06 09:11:50 np0005548789.localdomain sudo[102369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:11:50 np0005548789.localdomain sudo[102369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:11:51 np0005548789.localdomain sudo[102369]: pam_unix(sudo:session): session closed for user root
Dec 06 09:11:52 np0005548789.localdomain sudo[102417]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:11:52 np0005548789.localdomain sudo[102417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:11:52 np0005548789.localdomain sudo[102417]: pam_unix(sudo:session): session closed for user root
Dec 06 09:11:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:11:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:11:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:11:54 np0005548789.localdomain systemd[1]: tmp-crun.kK5qeh.mount: Deactivated successfully.
Dec 06 09:11:55 np0005548789.localdomain podman[102432]: 2025-12-06 09:11:55.013301634 +0000 UTC m=+0.173026285 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Dec 06 09:11:55 np0005548789.localdomain podman[102433]: 2025-12-06 09:11:54.99980353 +0000 UTC m=+0.159063477 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step1, release=1761123044, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:11:55 np0005548789.localdomain podman[102432]: 2025-12-06 09:11:55.055064094 +0000 UTC m=+0.214788655 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, distribution-scope=public, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:11:55 np0005548789.localdomain podman[102432]: unhealthy
Dec 06 09:11:55 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:11:55 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:11:55 np0005548789.localdomain podman[102434]: 2025-12-06 09:11:55.017846653 +0000 UTC m=+0.175822052 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, vcs-type=git, release=1761123044, tcib_managed=true, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:11:55 np0005548789.localdomain podman[102434]: 2025-12-06 09:11:55.100219289 +0000 UTC m=+0.258194698 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:11:55 np0005548789.localdomain podman[102434]: unhealthy
Dec 06 09:11:55 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:11:55 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:11:55 np0005548789.localdomain podman[102433]: 2025-12-06 09:11:55.188453393 +0000 UTC m=+0.347713340 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z)
Dec 06 09:11:55 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:11:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:11:56 np0005548789.localdomain podman[102502]: 2025-12-06 09:11:56.916412306 +0000 UTC m=+0.079166828 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, tcib_managed=true, url=https://www.redhat.com, vcs-type=git)
Dec 06 09:11:56 np0005548789.localdomain podman[102502]: 2025-12-06 09:11:56.945059714 +0000 UTC m=+0.107814176 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, config_id=tripleo_step5, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team)
Dec 06 09:11:56 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:11:59 np0005548789.localdomain sshd[102529]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:12:00 np0005548789.localdomain sshd[102529]: Received disconnect from 64.227.156.63 port 47706:11: Bye Bye [preauth]
Dec 06 09:12:00 np0005548789.localdomain sshd[102529]: Disconnected from authenticating user root 64.227.156.63 port 47706 [preauth]
Dec 06 09:12:05 np0005548789.localdomain sshd[102531]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:12:06 np0005548789.localdomain sshd[102531]: Received disconnect from 103.192.152.59 port 39908:11: Bye Bye [preauth]
Dec 06 09:12:06 np0005548789.localdomain sshd[102531]: Disconnected from authenticating user root 103.192.152.59 port 39908 [preauth]
Dec 06 09:12:11 np0005548789.localdomain sshd[102533]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:12:12 np0005548789.localdomain sshd[102533]: Received disconnect from 118.193.38.207 port 34588:11: Bye Bye [preauth]
Dec 06 09:12:12 np0005548789.localdomain sshd[102533]: Disconnected from authenticating user root 118.193.38.207 port 34588 [preauth]
Dec 06 09:12:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:12:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:12:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:12:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:12:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:12:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:12:18 np0005548789.localdomain podman[102538]: 2025-12-06 09:12:18.964298879 +0000 UTC m=+0.112407687 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_migration_target, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.41.4)
Dec 06 09:12:19 np0005548789.localdomain podman[102537]: 2025-12-06 09:12:19.002390347 +0000 UTC m=+0.157130068 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z)
Dec 06 09:12:19 np0005548789.localdomain podman[102537]: 2025-12-06 09:12:19.012970852 +0000 UTC m=+0.167710573 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, release=1761123044, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:12:19 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:12:19 np0005548789.localdomain podman[102536]: 2025-12-06 09:12:19.054539195 +0000 UTC m=+0.210739681 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, 
name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:12:19 np0005548789.localdomain podman[102561]: 2025-12-06 09:12:19.115207536 +0000 UTC m=+0.245366623 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 06 09:12:19 np0005548789.localdomain podman[102536]: 2025-12-06 09:12:19.138287123 +0000 UTC m=+0.294487659 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, vcs-type=git, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container)
Dec 06 09:12:19 np0005548789.localdomain podman[102561]: 2025-12-06 09:12:19.150128797 +0000 UTC m=+0.280287934 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, 
container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com)
Dec 06 09:12:19 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:12:19 np0005548789.localdomain podman[102544]: 2025-12-06 09:12:19.163396653 +0000 UTC m=+0.308505429 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4)
Dec 06 09:12:19 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:12:19 np0005548789.localdomain podman[102544]: 2025-12-06 09:12:19.200118038 +0000 UTC m=+0.345226844 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.expose-services=)
Dec 06 09:12:19 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:12:19 np0005548789.localdomain podman[102550]: 2025-12-06 09:12:19.217537753 +0000 UTC m=+0.355325864 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, 
summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:12:19 np0005548789.localdomain podman[102550]: 2025-12-06 09:12:19.232141141 +0000 UTC m=+0.369929262 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 06 09:12:19 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:12:19 np0005548789.localdomain podman[102538]: 2025-12-06 09:12:19.334044604 +0000 UTC m=+0.482153382 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:12:19 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:12:25 np0005548789.localdomain sshd[102673]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:12:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:12:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:12:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:12:25 np0005548789.localdomain podman[102676]: 2025-12-06 09:12:25.92279456 +0000 UTC m=+0.076227757 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:12:25 np0005548789.localdomain systemd[1]: tmp-crun.HFJrYT.mount: Deactivated successfully.
Dec 06 09:12:25 np0005548789.localdomain podman[102675]: 2025-12-06 09:12:25.992995052 +0000 UTC m=+0.145911474 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible)
Dec 06 09:12:26 np0005548789.localdomain podman[102677]: 2025-12-06 09:12:26.032790582 +0000 UTC m=+0.179361050 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:12:26 np0005548789.localdomain podman[102675]: 2025-12-06 09:12:26.044279095 +0000 UTC m=+0.197195507 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, release=1761123044, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git)
Dec 06 09:12:26 np0005548789.localdomain podman[102675]: unhealthy
Dec 06 09:12:26 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:12:26 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:12:26 np0005548789.localdomain podman[102677]: 2025-12-06 09:12:26.099536148 +0000 UTC m=+0.246106606 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z)
Dec 06 09:12:26 np0005548789.localdomain podman[102677]: unhealthy
Dec 06 09:12:26 np0005548789.localdomain podman[102676]: 2025-12-06 09:12:26.111187445 +0000 UTC m=+0.264620672 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, managed_by=tripleo_ansible, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git)
Dec 06 09:12:26 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:12:26 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:12:26 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:12:26 np0005548789.localdomain sshd[102673]: Received disconnect from 103.234.151.178 port 19004:11: Bye Bye [preauth]
Dec 06 09:12:26 np0005548789.localdomain sshd[102673]: Disconnected from authenticating user root 103.234.151.178 port 19004 [preauth]
Dec 06 09:12:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:12:27 np0005548789.localdomain podman[102744]: 2025-12-06 09:12:27.075056915 +0000 UTC m=+0.067672996 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, container_name=nova_compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4)
Dec 06 09:12:27 np0005548789.localdomain podman[102744]: 2025-12-06 09:12:27.096953875 +0000 UTC m=+0.089569926 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:12:27 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:12:29 np0005548789.localdomain sshd[102768]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:12:30 np0005548789.localdomain sshd[102768]: Received disconnect from 81.192.46.35 port 49826:11: Bye Bye [preauth]
Dec 06 09:12:30 np0005548789.localdomain sshd[102768]: Disconnected from authenticating user root 81.192.46.35 port 49826 [preauth]
Dec 06 09:12:30 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:12:30 np0005548789.localdomain recover_tripleo_nova_virtqemud[102771]: 61814
Dec 06 09:12:30 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:12:30 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:12:44 np0005548789.localdomain sshd[102772]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:12:45 np0005548789.localdomain sshd[102772]: Received disconnect from 179.33.210.213 port 56546:11: Bye Bye [preauth]
Dec 06 09:12:45 np0005548789.localdomain sshd[102772]: Disconnected from authenticating user root 179.33.210.213 port 56546 [preauth]
Dec 06 09:12:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:12:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:12:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:12:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:12:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:12:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:12:49 np0005548789.localdomain systemd[1]: tmp-crun.QU8GNj.mount: Deactivated successfully.
Dec 06 09:12:49 np0005548789.localdomain podman[102776]: 2025-12-06 09:12:49.948443517 +0000 UTC m=+0.101421091 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:12:49 np0005548789.localdomain systemd[1]: tmp-crun.3mQvPz.mount: Deactivated successfully.
Dec 06 09:12:49 np0005548789.localdomain podman[102775]: 2025-12-06 09:12:49.940974588 +0000 UTC m=+0.098790580 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, config_id=tripleo_step3)
Dec 06 09:12:49 np0005548789.localdomain podman[102790]: 2025-12-06 09:12:49.996194291 +0000 UTC m=+0.139853679 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4)
Dec 06 09:12:50 np0005548789.localdomain podman[102790]: 2025-12-06 09:12:50.019037511 +0000 UTC m=+0.162696909 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., 
config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Dec 06 09:12:50 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:12:50 np0005548789.localdomain podman[102777]: 2025-12-06 09:12:50.042261883 +0000 UTC m=+0.191445200 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:12:50 np0005548789.localdomain podman[102777]: 2025-12-06 09:12:50.062001578 +0000 UTC m=+0.211184875 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:12:50 np0005548789.localdomain podman[102774]: 2025-12-06 09:12:49.968855632 +0000 UTC m=+0.126578201 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vendor=Red Hat, Inc., name=rhosp17/openstack-cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, maintainer=OpenStack TripleO Team)
Dec 06 09:12:50 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:12:50 np0005548789.localdomain podman[102774]: 2025-12-06 09:12:50.103032936 +0000 UTC m=+0.260755515 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-cron)
Dec 06 09:12:50 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:12:50 np0005548789.localdomain podman[102775]: 2025-12-06 09:12:50.123908715 +0000 UTC m=+0.281724727 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step3, io.openshift.expose-services=, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:12:50 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:12:50 np0005548789.localdomain podman[102778]: 2025-12-06 09:12:50.107951136 +0000 UTC m=+0.250307274 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, version=17.1.12, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z)
Dec 06 09:12:50 np0005548789.localdomain podman[102778]: 2025-12-06 09:12:50.190020753 +0000 UTC m=+0.332376841 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:12:50 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:12:50 np0005548789.localdomain podman[102776]: 2025-12-06 09:12:50.297633761 +0000 UTC m=+0.450611305 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4)
Dec 06 09:12:50 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:12:52 np0005548789.localdomain sshd[102900]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:12:52 np0005548789.localdomain sshd[102902]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:12:52 np0005548789.localdomain sudo[102903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:12:52 np0005548789.localdomain sudo[102903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:12:52 np0005548789.localdomain sudo[102903]: pam_unix(sudo:session): session closed for user root
Dec 06 09:12:52 np0005548789.localdomain sudo[102919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:12:52 np0005548789.localdomain sudo[102919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:12:52 np0005548789.localdomain sshd[102900]: Received disconnect from 12.156.67.18 port 57882:11: Bye Bye [preauth]
Dec 06 09:12:52 np0005548789.localdomain sshd[102900]: Disconnected from authenticating user root 12.156.67.18 port 57882 [preauth]
Dec 06 09:12:53 np0005548789.localdomain sudo[102919]: pam_unix(sudo:session): session closed for user root
Dec 06 09:12:53 np0005548789.localdomain sudo[102965]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:12:53 np0005548789.localdomain sudo[102965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:12:53 np0005548789.localdomain sudo[102965]: pam_unix(sudo:session): session closed for user root
Dec 06 09:12:53 np0005548789.localdomain sshd[102902]: Received disconnect from 103.157.25.60 port 49290:11: Bye Bye [preauth]
Dec 06 09:12:53 np0005548789.localdomain sshd[102902]: Disconnected from authenticating user root 103.157.25.60 port 49290 [preauth]
Dec 06 09:12:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:12:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:12:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:12:56 np0005548789.localdomain podman[102980]: 2025-12-06 09:12:56.922561455 +0000 UTC m=+0.081084597 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, distribution-scope=public, release=1761123044, 
version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4)
Dec 06 09:12:56 np0005548789.localdomain podman[102980]: 2025-12-06 09:12:56.935045878 +0000 UTC m=+0.093569020 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, 
architecture=x86_64, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Dec 06 09:12:56 np0005548789.localdomain podman[102980]: unhealthy
Dec 06 09:12:56 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:12:56 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:12:56 np0005548789.localdomain podman[102986]: 2025-12-06 09:12:56.992112327 +0000 UTC m=+0.138380984 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, managed_by=tripleo_ansible)
Dec 06 09:12:57 np0005548789.localdomain podman[102981]: 2025-12-06 09:12:57.029633827 +0000 UTC m=+0.181510106 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, config_id=tripleo_step1, release=1761123044, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd)
Dec 06 09:12:57 np0005548789.localdomain podman[102986]: 2025-12-06 09:12:57.035162316 +0000 UTC m=+0.181430923 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 09:12:57 np0005548789.localdomain podman[102986]: unhealthy
Dec 06 09:12:57 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:12:57 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:12:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:12:57 np0005548789.localdomain podman[102981]: 2025-12-06 09:12:57.250135077 +0000 UTC m=+0.402011336 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z)
Dec 06 09:12:57 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:12:57 np0005548789.localdomain podman[103049]: 2025-12-06 09:12:57.313960364 +0000 UTC m=+0.070347528 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:12:57 np0005548789.localdomain podman[103049]: 2025-12-06 09:12:57.344367966 +0000 UTC m=+0.100755150 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, version=17.1.12)
Dec 06 09:12:57 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:13:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:13:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:13:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:13:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:13:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:13:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:13:20 np0005548789.localdomain podman[103085]: 2025-12-06 09:13:20.945161478 +0000 UTC m=+0.087942817 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-iscsid-container, vcs-type=git)
Dec 06 09:13:20 np0005548789.localdomain podman[103085]: 2025-12-06 09:13:20.956106904 +0000 UTC m=+0.098888193 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, 
tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com)
Dec 06 09:13:20 np0005548789.localdomain podman[103075]: 2025-12-06 09:13:20.980728619 +0000 UTC m=+0.136672681 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team)
Dec 06 09:13:20 np0005548789.localdomain podman[103075]: 2025-12-06 09:13:20.989934241 +0000 UTC m=+0.145878313 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-cron)
Dec 06 09:13:21 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:13:21 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:13:21 np0005548789.localdomain systemd[1]: tmp-crun.zyYXNg.mount: Deactivated successfully.
Dec 06 09:13:21 np0005548789.localdomain podman[103076]: 2025-12-06 09:13:21.050921401 +0000 UTC m=+0.201759507 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1)
Dec 06 09:13:21 np0005548789.localdomain podman[103077]: 2025-12-06 09:13:21.092733562 +0000 UTC m=+0.238671087 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, url=https://www.redhat.com, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, 
architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Dec 06 09:13:21 np0005548789.localdomain podman[103091]: 2025-12-06 09:13:21.146074148 +0000 UTC m=+0.286152164 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:13:21 np0005548789.localdomain podman[103076]: 2025-12-06 09:13:21.166642438 +0000 UTC m=+0.317480604 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step3, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd)
Dec 06 09:13:21 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:13:21 np0005548789.localdomain podman[103091]: 2025-12-06 09:13:21.198063552 +0000 UTC m=+0.338141538 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, tcib_managed=true)
Dec 06 09:13:21 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:13:21 np0005548789.localdomain podman[103078]: 2025-12-06 09:13:21.257599397 +0000 UTC m=+0.404001746 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 09:13:21 np0005548789.localdomain podman[103078]: 2025-12-06 09:13:21.307370903 +0000 UTC m=+0.453773252 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git)
Dec 06 09:13:21 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:13:21 np0005548789.localdomain podman[103077]: 2025-12-06 09:13:21.440306328 +0000 UTC m=+0.586243853 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, vcs-type=git)
Dec 06 09:13:21 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:13:21 np0005548789.localdomain sshd[103210]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:13:23 np0005548789.localdomain sshd[103210]: Received disconnect from 64.227.156.63 port 43526:11: Bye Bye [preauth]
Dec 06 09:13:23 np0005548789.localdomain sshd[103210]: Disconnected from authenticating user root 64.227.156.63 port 43526 [preauth]
Dec 06 09:13:24 np0005548789.localdomain sshd[103212]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:13:26 np0005548789.localdomain sshd[103212]: Received disconnect from 118.193.38.207 port 56114:11: Bye Bye [preauth]
Dec 06 09:13:26 np0005548789.localdomain sshd[103212]: Disconnected from authenticating user root 118.193.38.207 port 56114 [preauth]
Dec 06 09:13:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:13:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:13:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:13:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:13:27 np0005548789.localdomain podman[103214]: 2025-12-06 09:13:27.982816526 +0000 UTC m=+0.142909192 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ovn_controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1)
Dec 06 09:13:27 np0005548789.localdomain podman[103215]: 2025-12-06 09:13:27.948874956 +0000 UTC m=+0.108442245 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, distribution-scope=public, managed_by=tripleo_ansible)
Dec 06 09:13:28 np0005548789.localdomain podman[103214]: 2025-12-06 09:13:28.026212317 +0000 UTC m=+0.186304983 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com)
Dec 06 09:13:28 np0005548789.localdomain podman[103214]: unhealthy
Dec 06 09:13:28 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:13:28 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:13:28 np0005548789.localdomain podman[103217]: 2025-12-06 09:13:28.041166935 +0000 UTC m=+0.193552065 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1)
Dec 06 09:13:28 np0005548789.localdomain podman[103217]: 2025-12-06 09:13:28.081256915 +0000 UTC m=+0.233642005 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, version=17.1.12, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4)
Dec 06 09:13:28 np0005548789.localdomain podman[103217]: unhealthy
Dec 06 09:13:28 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:13:28 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:13:28 np0005548789.localdomain podman[103216]: 2025-12-06 09:13:28.094886351 +0000 UTC m=+0.250714426 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, distribution-scope=public, release=1761123044, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=)
Dec 06 09:13:28 np0005548789.localdomain podman[103215]: 2025-12-06 09:13:28.142558753 +0000 UTC m=+0.302126022 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, architecture=x86_64)
Dec 06 09:13:28 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:13:28 np0005548789.localdomain podman[103216]: 2025-12-06 09:13:28.15910449 +0000 UTC m=+0.314932555 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=nova_compute, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-nova-compute)
Dec 06 09:13:28 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:13:35 np0005548789.localdomain sshd[103308]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:13:36 np0005548789.localdomain sshd[103308]: Received disconnect from 81.192.46.35 port 48156:11: Bye Bye [preauth]
Dec 06 09:13:36 np0005548789.localdomain sshd[103308]: Disconnected from authenticating user root 81.192.46.35 port 48156 [preauth]
Dec 06 09:13:41 np0005548789.localdomain sshd[103310]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:13:43 np0005548789.localdomain sshd[103310]: Received disconnect from 103.192.152.59 port 55060:11: Bye Bye [preauth]
Dec 06 09:13:43 np0005548789.localdomain sshd[103310]: Disconnected from authenticating user root 103.192.152.59 port 55060 [preauth]
Dec 06 09:13:48 np0005548789.localdomain sshd[103312]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:13:50 np0005548789.localdomain sshd[103312]: Received disconnect from 103.234.151.178 port 45144:11: Bye Bye [preauth]
Dec 06 09:13:50 np0005548789.localdomain sshd[103312]: Disconnected from authenticating user root 103.234.151.178 port 45144 [preauth]
Dec 06 09:13:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:13:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:13:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:13:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:13:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:13:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:13:51 np0005548789.localdomain podman[103314]: 2025-12-06 09:13:51.93819714 +0000 UTC m=+0.094956122 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, 
build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4)
Dec 06 09:13:51 np0005548789.localdomain podman[103314]: 2025-12-06 09:13:51.946486524 +0000 UTC m=+0.103245486 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, container_name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.openshift.expose-services=)
Dec 06 09:13:51 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:13:51 np0005548789.localdomain podman[103321]: 2025-12-06 09:13:51.991528034 +0000 UTC m=+0.136090533 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=nova_migration_target)
Dec 06 09:13:52 np0005548789.localdomain podman[103322]: 2025-12-06 09:13:52.047098408 +0000 UTC m=+0.190388157 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:13:52 np0005548789.localdomain podman[103323]: 2025-12-06 09:13:52.097818463 +0000 UTC m=+0.236872163 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, config_id=tripleo_step3, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044)
Dec 06 09:13:52 np0005548789.localdomain podman[103323]: 2025-12-06 09:13:52.108627134 +0000 UTC m=+0.247680874 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:13:52 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:13:52 np0005548789.localdomain podman[103322]: 2025-12-06 09:13:52.159154103 +0000 UTC m=+0.302443882 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Dec 06 09:13:52 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:13:52 np0005548789.localdomain podman[103315]: 2025-12-06 09:13:52.204328418 +0000 UTC m=+0.348620419 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd)
Dec 06 09:13:52 np0005548789.localdomain podman[103337]: 2025-12-06 09:13:52.162095714 +0000 UTC m=+0.298539624 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, 
name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc.)
Dec 06 09:13:52 np0005548789.localdomain podman[103315]: 2025-12-06 09:13:52.219168953 +0000 UTC m=+0.363460984 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:13:52 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:13:52 np0005548789.localdomain podman[103337]: 2025-12-06 09:13:52.245342985 +0000 UTC m=+0.381786915 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:13:52 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:13:52 np0005548789.localdomain podman[103321]: 2025-12-06 09:13:52.402029928 +0000 UTC m=+0.546592387 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4)
Dec 06 09:13:52 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:13:54 np0005548789.localdomain sudo[103447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:13:54 np0005548789.localdomain sudo[103447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:13:54 np0005548789.localdomain sudo[103447]: pam_unix(sudo:session): session closed for user root
Dec 06 09:13:54 np0005548789.localdomain sudo[103462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 09:13:54 np0005548789.localdomain sudo[103462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:13:54 np0005548789.localdomain podman[103547]: 2025-12-06 09:13:54.896407646 +0000 UTC m=+0.094102286 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Dec 06 09:13:54 np0005548789.localdomain podman[103547]: 2025-12-06 09:13:54.999415703 +0000 UTC m=+0.197110313 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, name=rhceph, release=1763362218, version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.buildah.version=1.41.4, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 09:13:55 np0005548789.localdomain sudo[103462]: pam_unix(sudo:session): session closed for user root
Dec 06 09:13:55 np0005548789.localdomain sudo[103613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:13:55 np0005548789.localdomain sudo[103613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:13:55 np0005548789.localdomain sudo[103613]: pam_unix(sudo:session): session closed for user root
Dec 06 09:13:55 np0005548789.localdomain sudo[103628]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:13:55 np0005548789.localdomain sudo[103628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:13:56 np0005548789.localdomain sudo[103628]: pam_unix(sudo:session): session closed for user root
Dec 06 09:13:56 np0005548789.localdomain sudo[103675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:13:56 np0005548789.localdomain sudo[103675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:13:56 np0005548789.localdomain sudo[103675]: pam_unix(sudo:session): session closed for user root
Dec 06 09:13:57 np0005548789.localdomain sshd[103690]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:13:58 np0005548789.localdomain sshd[103690]: Received disconnect from 12.156.67.18 port 38092:11: Bye Bye [preauth]
Dec 06 09:13:58 np0005548789.localdomain sshd[103690]: Disconnected from authenticating user root 12.156.67.18 port 38092 [preauth]
Dec 06 09:13:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:13:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:13:58 np0005548789.localdomain sshd[103710]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:13:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:13:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:13:58 np0005548789.localdomain systemd[1]: tmp-crun.1lmW42.mount: Deactivated successfully.
Dec 06 09:13:58 np0005548789.localdomain podman[103693]: 2025-12-06 09:13:58.269217563 +0000 UTC m=+0.103081081 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team)
Dec 06 09:13:58 np0005548789.localdomain podman[103719]: 2025-12-06 09:13:58.347022768 +0000 UTC m=+0.073875436 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 09:13:58 np0005548789.localdomain podman[103693]: 2025-12-06 09:13:58.363536244 +0000 UTC m=+0.197399822 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, container_name=ovn_metadata_agent, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64)
Dec 06 09:13:58 np0005548789.localdomain podman[103693]: unhealthy
Dec 06 09:13:58 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:13:58 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:13:58 np0005548789.localdomain podman[103692]: 2025-12-06 09:13:58.413587749 +0000 UTC m=+0.247520339 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, release=1761123044, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:13:58 np0005548789.localdomain podman[103692]: 2025-12-06 09:13:58.464314754 +0000 UTC m=+0.298247364 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, 
io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git)
Dec 06 09:13:58 np0005548789.localdomain podman[103692]: unhealthy
Dec 06 09:13:58 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:13:58 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:13:58 np0005548789.localdomain podman[103720]: 2025-12-06 09:13:58.50691583 +0000 UTC m=+0.233640844 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 09:13:58 np0005548789.localdomain podman[103720]: 2025-12-06 09:13:58.531676009 +0000 UTC m=+0.258401013 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:13:58 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:13:58 np0005548789.localdomain podman[103719]: 2025-12-06 09:13:58.553194189 +0000 UTC m=+0.280046867 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Dec 06 09:13:58 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:13:58 np0005548789.localdomain sshd[103710]: Invalid user ubuntu from 92.118.39.95 port 50274
Dec 06 09:13:58 np0005548789.localdomain sshd[103710]: Connection closed by invalid user ubuntu 92.118.39.95 port 50274 [preauth]
Dec 06 09:13:59 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:13:59 np0005548789.localdomain recover_tripleo_nova_virtqemud[103788]: 61814
Dec 06 09:13:59 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:13:59 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:13:59 np0005548789.localdomain sshd[103789]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:14:02 np0005548789.localdomain sshd[103789]: Received disconnect from 45.78.222.162 port 53728:11: Bye Bye [preauth]
Dec 06 09:14:02 np0005548789.localdomain sshd[103789]: Disconnected from authenticating user root 45.78.222.162 port 53728 [preauth]
Dec 06 09:14:16 np0005548789.localdomain sshd[103791]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:14:17 np0005548789.localdomain sshd[103791]: Received disconnect from 103.157.25.60 port 50958:11: Bye Bye [preauth]
Dec 06 09:14:17 np0005548789.localdomain sshd[103791]: Disconnected from authenticating user root 103.157.25.60 port 50958 [preauth]
Dec 06 09:14:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:14:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:14:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:14:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:14:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:14:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:14:22 np0005548789.localdomain podman[103794]: 2025-12-06 09:14:22.952777819 +0000 UTC m=+0.104950679 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:14:22 np0005548789.localdomain podman[103795]: 2025-12-06 09:14:22.993241379 +0000 UTC m=+0.142522850 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1)
Dec 06 09:14:23 np0005548789.localdomain podman[103794]: 2025-12-06 09:14:23.01220755 +0000 UTC m=+0.164380380 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container)
Dec 06 09:14:23 np0005548789.localdomain podman[103796]: 2025-12-06 09:14:23.05787025 +0000 UTC m=+0.200319001 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64)
Dec 06 09:14:23 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:14:23 np0005548789.localdomain podman[103796]: 2025-12-06 09:14:23.10972852 +0000 UTC m=+0.252177291 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, release=1761123044, url=https://www.redhat.com, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Dec 06 09:14:23 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:14:23 np0005548789.localdomain podman[103793]: 2025-12-06 09:14:23.15572575 +0000 UTC m=+0.309007584 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=)
Dec 06 09:14:23 np0005548789.localdomain podman[103793]: 2025-12-06 09:14:23.192189798 +0000 UTC m=+0.345471592 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:32Z, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:14:23 np0005548789.localdomain podman[103813]: 2025-12-06 09:14:23.201029869 +0000 UTC m=+0.340505319 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:14:23 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:14:23 np0005548789.localdomain podman[103813]: 2025-12-06 09:14:23.227555872 +0000 UTC m=+0.367031352 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:14:23 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:14:23 np0005548789.localdomain podman[103801]: 2025-12-06 09:14:23.304056347 +0000 UTC m=+0.445985363 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=iscsid, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d)
Dec 06 09:14:23 np0005548789.localdomain podman[103801]: 2025-12-06 09:14:23.315153558 +0000 UTC m=+0.457082574 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, managed_by=tripleo_ansible, 
summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:14:23 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:14:23 np0005548789.localdomain podman[103795]: 2025-12-06 09:14:23.38111854 +0000 UTC m=+0.530400011 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-type=git)
Dec 06 09:14:23 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:14:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:14:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:14:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:14:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:14:28 np0005548789.localdomain podman[103928]: 2025-12-06 09:14:28.921945601 +0000 UTC m=+0.080033065 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Dec 06 09:14:28 np0005548789.localdomain podman[103927]: 2025-12-06 09:14:28.982539558 +0000 UTC m=+0.140623182 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., container_name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:14:28 np0005548789.localdomain podman[103928]: 2025-12-06 09:14:28.987187441 +0000 UTC m=+0.145274905 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Dec 06 09:14:28 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:14:29 np0005548789.localdomain podman[103932]: 2025-12-06 09:14:29.080701517 +0000 UTC m=+0.231840267 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, tcib_managed=true, 
managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12)
Dec 06 09:14:29 np0005548789.localdomain podman[103926]: 2025-12-06 09:14:29.051815512 +0000 UTC m=+0.213102044 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, version=17.1.12, release=1761123044, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc.)
Dec 06 09:14:29 np0005548789.localdomain podman[103932]: 2025-12-06 09:14:29.121160638 +0000 UTC m=+0.272299418 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, version=17.1.12, tcib_managed=true)
Dec 06 09:14:29 np0005548789.localdomain podman[103932]: unhealthy
Dec 06 09:14:29 np0005548789.localdomain podman[103926]: 2025-12-06 09:14:29.135175668 +0000 UTC m=+0.296462230 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_id=tripleo_step4, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z)
Dec 06 09:14:29 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:14:29 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:14:29 np0005548789.localdomain podman[103926]: unhealthy
Dec 06 09:14:29 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:14:29 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:14:29 np0005548789.localdomain podman[103927]: 2025-12-06 09:14:29.183415727 +0000 UTC m=+0.341499361 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, distribution-scope=public)
Dec 06 09:14:29 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:14:29 np0005548789.localdomain systemd[1]: tmp-crun.vrLii0.mount: Deactivated successfully.
Dec 06 09:14:39 np0005548789.localdomain sshd[104022]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:14:41 np0005548789.localdomain sshd[104022]: Received disconnect from 118.193.38.207 port 48636:11: Bye Bye [preauth]
Dec 06 09:14:41 np0005548789.localdomain sshd[104022]: Disconnected from authenticating user root 118.193.38.207 port 48636 [preauth]
Dec 06 09:14:44 np0005548789.localdomain sshd[104024]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:14:45 np0005548789.localdomain sshd[104024]: Received disconnect from 81.192.46.35 port 46472:11: Bye Bye [preauth]
Dec 06 09:14:45 np0005548789.localdomain sshd[104024]: Disconnected from authenticating user root 81.192.46.35 port 46472 [preauth]
Dec 06 09:14:51 np0005548789.localdomain sshd[104026]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:14:52 np0005548789.localdomain sshd[104026]: Received disconnect from 64.227.156.63 port 38792:11: Bye Bye [preauth]
Dec 06 09:14:52 np0005548789.localdomain sshd[104026]: Disconnected from authenticating user root 64.227.156.63 port 38792 [preauth]
Dec 06 09:14:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:14:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:14:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:14:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:14:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:14:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:14:53 np0005548789.localdomain systemd[1]: tmp-crun.QmywUE.mount: Deactivated successfully.
Dec 06 09:14:53 np0005548789.localdomain podman[104030]: 2025-12-06 09:14:53.948887161 +0000 UTC m=+0.102155972 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, vcs-type=git, build-date=2025-11-19T00:36:58Z)
Dec 06 09:14:53 np0005548789.localdomain podman[104042]: 2025-12-06 09:14:53.958530477 +0000 UTC m=+0.094620801 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=)
Dec 06 09:14:53 np0005548789.localdomain podman[104042]: 2025-12-06 09:14:53.994272793 +0000 UTC m=+0.130363077 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container)
Dec 06 09:14:54 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:14:54 np0005548789.localdomain podman[104029]: 2025-12-06 09:14:54.013034798 +0000 UTC m=+0.169099015 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, vcs-type=git, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.)
Dec 06 09:14:54 np0005548789.localdomain podman[104029]: 2025-12-06 09:14:54.023241171 +0000 UTC m=+0.179305398 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, container_name=collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com)
Dec 06 09:14:54 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:14:54 np0005548789.localdomain podman[104028]: 2025-12-06 09:14:54.094417423 +0000 UTC m=+0.252401709 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-type=git, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:14:54 np0005548789.localdomain podman[104048]: 2025-12-06 09:14:54.072227882 +0000 UTC m=+0.210011739 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi)
Dec 06 09:14:54 np0005548789.localdomain podman[104028]: 2025-12-06 09:14:54.129094736 +0000 UTC m=+0.287078952 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, 
batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:14:54 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:14:54 np0005548789.localdomain podman[104048]: 2025-12-06 09:14:54.153079341 +0000 UTC m=+0.290863198 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Dec 06 09:14:54 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:14:54 np0005548789.localdomain podman[104031]: 2025-12-06 09:14:54.171664271 +0000 UTC m=+0.305694502 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true)
Dec 06 09:14:54 np0005548789.localdomain podman[104031]: 2025-12-06 09:14:54.194109179 +0000 UTC m=+0.328139410 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:14:54 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:14:54 np0005548789.localdomain podman[104030]: 2025-12-06 09:14:54.288161842 +0000 UTC m=+0.441430713 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, container_name=nova_migration_target, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:14:54 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:14:57 np0005548789.localdomain sudo[104161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:14:57 np0005548789.localdomain sudo[104161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:14:57 np0005548789.localdomain sudo[104161]: pam_unix(sudo:session): session closed for user root
Dec 06 09:14:57 np0005548789.localdomain sudo[104176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:14:57 np0005548789.localdomain sudo[104176]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:14:57 np0005548789.localdomain sudo[104176]: pam_unix(sudo:session): session closed for user root
Dec 06 09:14:58 np0005548789.localdomain sudo[104222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:14:58 np0005548789.localdomain sudo[104222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:14:58 np0005548789.localdomain sudo[104222]: pam_unix(sudo:session): session closed for user root
Dec 06 09:14:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:14:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:14:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:14:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:14:59 np0005548789.localdomain systemd[1]: tmp-crun.Uyyz8A.mount: Deactivated successfully.
Dec 06 09:14:59 np0005548789.localdomain podman[104238]: 2025-12-06 09:14:59.939263294 +0000 UTC m=+0.097900032 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64)
Dec 06 09:14:59 np0005548789.localdomain podman[104239]: 2025-12-06 09:14:59.982044005 +0000 UTC m=+0.138362472 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5)
Dec 06 09:15:00 np0005548789.localdomain podman[104239]: 2025-12-06 09:15:00.014686236 +0000 UTC m=+0.171004703 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:36:58Z, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, container_name=nova_compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:15:00 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:15:00 np0005548789.localdomain podman[104237]: 2025-12-06 09:15:00.021154264 +0000 UTC m=+0.182705882 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, version=17.1.12, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4)
Dec 06 09:15:00 np0005548789.localdomain podman[104240]: 2025-12-06 09:15:00.087566671 +0000 UTC m=+0.240225026 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent)
Dec 06 09:15:00 np0005548789.localdomain podman[104240]: 2025-12-06 09:15:00.104263762 +0000 UTC m=+0.256922137 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:15:00 np0005548789.localdomain podman[104240]: unhealthy
Dec 06 09:15:00 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:15:00 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:15:00 np0005548789.localdomain podman[104238]: 2025-12-06 09:15:00.135194131 +0000 UTC m=+0.293830909 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, release=1761123044, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:15:00 np0005548789.localdomain podman[104237]: 2025-12-06 09:15:00.15440999 +0000 UTC m=+0.315961598 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z)
Dec 06 09:15:00 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:15:00 np0005548789.localdomain podman[104237]: unhealthy
Dec 06 09:15:00 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:15:00 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:15:03 np0005548789.localdomain sshd[104327]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:15:04 np0005548789.localdomain sshd[104327]: Received disconnect from 12.156.67.18 port 35402:11: Bye Bye [preauth]
Dec 06 09:15:04 np0005548789.localdomain sshd[104327]: Disconnected from authenticating user root 12.156.67.18 port 35402 [preauth]
Dec 06 09:15:11 np0005548789.localdomain sshd[104329]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:15:11 np0005548789.localdomain sshd[104331]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:15:13 np0005548789.localdomain sshd[104329]: Received disconnect from 103.234.151.178 port 7750:11: Bye Bye [preauth]
Dec 06 09:15:13 np0005548789.localdomain sshd[104329]: Disconnected from authenticating user root 103.234.151.178 port 7750 [preauth]
Dec 06 09:15:13 np0005548789.localdomain sshd[104331]: Received disconnect from 103.192.152.59 port 50880:11: Bye Bye [preauth]
Dec 06 09:15:13 np0005548789.localdomain sshd[104331]: Disconnected from authenticating user root 103.192.152.59 port 50880 [preauth]
Dec 06 09:15:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:15:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:15:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:15:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:15:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:15:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:15:24 np0005548789.localdomain podman[104333]: 2025-12-06 09:15:24.941533759 +0000 UTC m=+0.086438371 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:15:24 np0005548789.localdomain podman[104353]: 2025-12-06 09:15:24.952412542 +0000 UTC m=+0.077121745 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible)
Dec 06 09:15:24 np0005548789.localdomain systemd[1]: tmp-crun.uY4Wo6.mount: Deactivated successfully.
Dec 06 09:15:25 np0005548789.localdomain podman[104335]: 2025-12-06 09:15:25.003045604 +0000 UTC m=+0.142174629 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:15:25 np0005548789.localdomain podman[104353]: 2025-12-06 09:15:25.007771009 +0000 UTC m=+0.132480232 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:15:25 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:15:25 np0005548789.localdomain podman[104333]: 2025-12-06 09:15:25.026928756 +0000 UTC m=+0.171833388 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, container_name=logrotate_crond, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, 
release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.)
Dec 06 09:15:25 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:15:25 np0005548789.localdomain podman[104336]: 2025-12-06 09:15:25.098716387 +0000 UTC m=+0.234251332 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12)
Dec 06 09:15:25 np0005548789.localdomain podman[104347]: 2025-12-06 09:15:25.02477821 +0000 UTC m=+0.155093075 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
distribution-scope=public, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:15:25 np0005548789.localdomain podman[104334]: 2025-12-06 09:15:25.148636938 +0000 UTC m=+0.291019553 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, release=1761123044, config_id=tripleo_step3, name=rhosp17/openstack-collectd, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team)
Dec 06 09:15:25 np0005548789.localdomain podman[104347]: 2025-12-06 09:15:25.158159969 +0000 UTC m=+0.288474824 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, container_name=iscsid, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team)
Dec 06 09:15:25 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:15:25 np0005548789.localdomain podman[104334]: 2025-12-06 09:15:25.181317749 +0000 UTC m=+0.323700374 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, distribution-scope=public)
Dec 06 09:15:25 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:15:25 np0005548789.localdomain podman[104336]: 2025-12-06 09:15:25.201061844 +0000 UTC m=+0.336596779 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.)
Dec 06 09:15:25 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:15:25 np0005548789.localdomain podman[104335]: 2025-12-06 09:15:25.435221713 +0000 UTC m=+0.574350718 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:15:25 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:15:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:15:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:15:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:15:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:15:30 np0005548789.localdomain podman[104468]: 2025-12-06 09:15:30.925536835 +0000 UTC m=+0.080461648 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5)
Dec 06 09:15:30 np0005548789.localdomain podman[104468]: 2025-12-06 09:15:30.956105302 +0000 UTC m=+0.111030105 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1)
Dec 06 09:15:30 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Deactivated successfully.
Dec 06 09:15:30 np0005548789.localdomain podman[104467]: 2025-12-06 09:15:30.977172287 +0000 UTC m=+0.133221584 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=metrics_qdr, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:15:31 np0005548789.localdomain podman[104469]: 2025-12-06 09:15:31.033626898 +0000 UTC m=+0.183420654 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 09:15:31 np0005548789.localdomain podman[104469]: 2025-12-06 09:15:31.051155926 +0000 UTC m=+0.200949682 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO 
Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.4)
Dec 06 09:15:31 np0005548789.localdomain podman[104469]: unhealthy
Dec 06 09:15:31 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:15:31 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:15:31 np0005548789.localdomain podman[104466]: 2025-12-06 09:15:31.130720685 +0000 UTC m=+0.288060092 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z)
Dec 06 09:15:31 np0005548789.localdomain podman[104467]: 2025-12-06 09:15:31.156607668 +0000 UTC m=+0.312656985 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step1)
Dec 06 09:15:31 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:15:31 np0005548789.localdomain podman[104466]: 2025-12-06 09:15:31.173423944 +0000 UTC m=+0.330763391 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, release=1761123044, vcs-type=git)
Dec 06 09:15:31 np0005548789.localdomain podman[104466]: unhealthy
Dec 06 09:15:31 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:15:31 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:15:40 np0005548789.localdomain sshd[104556]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:15:42 np0005548789.localdomain sshd[104558]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:15:43 np0005548789.localdomain sshd[104558]: Received disconnect from 103.157.25.60 port 52752:11: Bye Bye [preauth]
Dec 06 09:15:43 np0005548789.localdomain sshd[104558]: Disconnected from authenticating user root 103.157.25.60 port 52752 [preauth]
Dec 06 09:15:43 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:15:44 np0005548789.localdomain recover_tripleo_nova_virtqemud[104561]: 61814
Dec 06 09:15:44 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:15:44 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:15:44 np0005548789.localdomain sshd[104556]: Received disconnect from 179.33.210.213 port 34402:11: Bye Bye [preauth]
Dec 06 09:15:44 np0005548789.localdomain sshd[104556]: Disconnected from authenticating user root 179.33.210.213 port 34402 [preauth]
Dec 06 09:15:52 np0005548789.localdomain sshd[104562]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:15:53 np0005548789.localdomain sshd[104562]: Received disconnect from 81.192.46.35 port 44792:11: Bye Bye [preauth]
Dec 06 09:15:53 np0005548789.localdomain sshd[104562]: Disconnected from authenticating user root 81.192.46.35 port 44792 [preauth]
Dec 06 09:15:53 np0005548789.localdomain sshd[104564]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:15:55 np0005548789.localdomain sshd[104564]: Received disconnect from 118.193.38.207 port 38716:11: Bye Bye [preauth]
Dec 06 09:15:55 np0005548789.localdomain sshd[104564]: Disconnected from authenticating user root 118.193.38.207 port 38716 [preauth]
Dec 06 09:15:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:15:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:15:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:15:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:15:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:15:55 np0005548789.localdomain systemd[1]: tmp-crun.to3dTV.mount: Deactivated successfully.
Dec 06 09:15:55 np0005548789.localdomain podman[104567]: 2025-12-06 09:15:55.549621979 +0000 UTC m=+0.116692940 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Dec 06 09:15:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:15:55 np0005548789.localdomain podman[104575]: 2025-12-06 09:15:55.58883586 +0000 UTC m=+0.148296928 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, version=17.1.12, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:15:55 np0005548789.localdomain podman[104567]: 2025-12-06 09:15:55.59603591 +0000 UTC m=+0.163106841 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, version=17.1.12, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Dec 06 09:15:55 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:15:55 np0005548789.localdomain podman[104575]: 2025-12-06 09:15:55.634040075 +0000 UTC m=+0.193501113 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, 
container_name=ceilometer_agent_ipmi, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:15:55 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:15:55 np0005548789.localdomain podman[104569]: 2025-12-06 09:15:55.646836717 +0000 UTC m=+0.206945995 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, architecture=x86_64, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, 
konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1)
Dec 06 09:15:55 np0005548789.localdomain podman[104568]: 2025-12-06 09:15:55.698187562 +0000 UTC m=+0.262396555 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, distribution-scope=public, 
managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12)
Dec 06 09:15:55 np0005548789.localdomain podman[104568]: 2025-12-06 09:15:55.741096777 +0000 UTC m=+0.305305750 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:15:55 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:15:55 np0005548789.localdomain podman[104569]: 2025-12-06 09:15:55.77641358 +0000 UTC m=+0.336522868 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, distribution-scope=public, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Dec 06 09:15:55 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:15:55 np0005548789.localdomain podman[104566]: 2025-12-06 09:15:55.745865844 +0000 UTC m=+0.315560556 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, name=rhosp17/openstack-cron, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc.)
Dec 06 09:15:55 np0005548789.localdomain podman[104566]: 2025-12-06 09:15:55.828150066 +0000 UTC m=+0.397844778 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-cron, release=1761123044, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, architecture=x86_64, config_id=tripleo_step4, container_name=logrotate_crond)
Dec 06 09:15:55 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:15:55 np0005548789.localdomain podman[104622]: 2025-12-06 09:15:55.679313413 +0000 UTC m=+0.106267408 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red 
Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:15:55 np0005548789.localdomain podman[104622]: 2025-12-06 09:15:55.995146106 +0000 UTC m=+0.422100131 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:15:56 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:15:58 np0005548789.localdomain sudo[104699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:15:58 np0005548789.localdomain sudo[104699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:15:58 np0005548789.localdomain sudo[104699]: pam_unix(sudo:session): session closed for user root
Dec 06 09:15:58 np0005548789.localdomain sudo[104714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:15:58 np0005548789.localdomain sudo[104714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:15:59 np0005548789.localdomain sudo[104714]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:00 np0005548789.localdomain sudo[104762]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:16:00 np0005548789.localdomain sudo[104762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:16:00 np0005548789.localdomain sudo[104762]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:00 np0005548789.localdomain sshd[104777]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:16:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:16:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:16:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:16:01 np0005548789.localdomain podman[104781]: 2025-12-06 09:16:01.930745627 +0000 UTC m=+0.084709278 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:16:01 np0005548789.localdomain podman[104782]: 2025-12-06 09:16:01.942811148 +0000 UTC m=+0.090549958 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, release=1761123044, vcs-type=git, build-date=2025-11-19T00:14:25Z, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 06 09:16:01 np0005548789.localdomain podman[104781]: 2025-12-06 09:16:01.951965598 +0000 UTC m=+0.105929269 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, architecture=x86_64, release=1761123044, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute)
Dec 06 09:16:01 np0005548789.localdomain podman[104781]: unhealthy
Dec 06 09:16:01 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:16:01 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed with result 'exit-code'.
Dec 06 09:16:02 np0005548789.localdomain systemd[1]: tmp-crun.A7FGfZ.mount: Deactivated successfully.
Dec 06 09:16:02 np0005548789.localdomain podman[104780]: 2025-12-06 09:16:02.107634201 +0000 UTC m=+0.263844821 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 06 09:16:02 np0005548789.localdomain podman[104782]: 2025-12-06 09:16:02.123810996 +0000 UTC m=+0.271549836 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.openshift.expose-services=, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:16:02 np0005548789.localdomain podman[104782]: unhealthy
Dec 06 09:16:02 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:16:02 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:16:02 np0005548789.localdomain podman[104779]: 2025-12-06 09:16:02.078830167 +0000 UTC m=+0.235695856 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 06 09:16:02 np0005548789.localdomain podman[104779]: 2025-12-06 09:16:02.207803451 +0000 UTC m=+0.364669090 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
io.openshift.expose-services=, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vcs-type=git)
Dec 06 09:16:02 np0005548789.localdomain podman[104779]: unhealthy
Dec 06 09:16:02 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:16:02 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:16:02 np0005548789.localdomain podman[104780]: 2025-12-06 09:16:02.331311707 +0000 UTC m=+0.487522297 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, config_id=tripleo_step1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=metrics_qdr, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 06 09:16:02 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:16:02 np0005548789.localdomain sshd[104777]: Connection reset by authenticating user root 91.202.233.33 port 61724 [preauth]
Dec 06 09:16:03 np0005548789.localdomain sshd[104868]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:05 np0005548789.localdomain sshd[104868]: Connection reset by authenticating user root 91.202.233.33 port 36784 [preauth]
Dec 06 09:16:05 np0005548789.localdomain sshd[104870]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:07 np0005548789.localdomain sshd[104870]: Connection reset by authenticating user root 91.202.233.33 port 36802 [preauth]
Dec 06 09:16:07 np0005548789.localdomain sshd[104872]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:08 np0005548789.localdomain sshd[104874]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:08 np0005548789.localdomain sshd[104876]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:08 np0005548789.localdomain sshd[104874]: Received disconnect from 12.156.67.18 port 32842:11: Bye Bye [preauth]
Dec 06 09:16:08 np0005548789.localdomain sshd[104874]: Disconnected from authenticating user root 12.156.67.18 port 32842 [preauth]
Dec 06 09:16:09 np0005548789.localdomain sshd[104876]: Invalid user ubuntu from 92.118.39.95 port 37034
Dec 06 09:16:09 np0005548789.localdomain sshd[104876]: Connection closed by invalid user ubuntu 92.118.39.95 port 37034 [preauth]
Dec 06 09:16:09 np0005548789.localdomain sshd[104872]: Connection reset by authenticating user root 91.202.233.33 port 36810 [preauth]
Dec 06 09:16:09 np0005548789.localdomain sshd[104878]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:11 np0005548789.localdomain sshd[104878]: Connection reset by authenticating user root 91.202.233.33 port 36830 [preauth]
Dec 06 09:16:16 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49370 DF PROTO=TCP SPT=32840 DPT=9105 SEQ=376045614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF13330000000001030307) 
Dec 06 09:16:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19399 DF PROTO=TCP SPT=36498 DPT=9100 SEQ=3394050689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF17080000000001030307) 
Dec 06 09:16:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49371 DF PROTO=TCP SPT=32840 DPT=9105 SEQ=376045614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF172F0000000001030307) 
Dec 06 09:16:18 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19400 DF PROTO=TCP SPT=36498 DPT=9100 SEQ=3394050689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF1B300000000001030307) 
Dec 06 09:16:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49372 DF PROTO=TCP SPT=32840 DPT=9105 SEQ=376045614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF1F2F0000000001030307) 
Dec 06 09:16:20 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19401 DF PROTO=TCP SPT=36498 DPT=9100 SEQ=3394050689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF232F0000000001030307) 
Dec 06 09:16:23 np0005548789.localdomain sshd[104880]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49373 DF PROTO=TCP SPT=32840 DPT=9105 SEQ=376045614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF2EF00000000001030307) 
Dec 06 09:16:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51159 DF PROTO=TCP SPT=44704 DPT=9882 SEQ=50667696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF2FA80000000001030307) 
Dec 06 09:16:24 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19402 DF PROTO=TCP SPT=36498 DPT=9100 SEQ=3394050689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF32EF0000000001030307) 
Dec 06 09:16:24 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51160 DF PROTO=TCP SPT=44704 DPT=9882 SEQ=50667696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF33B00000000001030307) 
Dec 06 09:16:25 np0005548789.localdomain sshd[104880]: Received disconnect from 64.227.156.63 port 46010:11: Bye Bye [preauth]
Dec 06 09:16:25 np0005548789.localdomain sshd[104880]: Disconnected from authenticating user root 64.227.156.63 port 46010 [preauth]
Dec 06 09:16:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:16:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:16:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:16:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:16:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:16:25 np0005548789.localdomain systemd[1]: tmp-crun.MGd74U.mount: Deactivated successfully.
Dec 06 09:16:25 np0005548789.localdomain podman[104885]: 2025-12-06 09:16:25.956316351 +0000 UTC m=+0.106594768 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:16:25 np0005548789.localdomain podman[104884]: 2025-12-06 09:16:25.913879761 +0000 UTC m=+0.073197755 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:16:26 np0005548789.localdomain podman[104883]: 2025-12-06 09:16:25.967412312 +0000 UTC m=+0.127979965 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true)
Dec 06 09:16:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:16:26 np0005548789.localdomain podman[104884]: 2025-12-06 09:16:26.044124063 +0000 UTC m=+0.203442087 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, distribution-scope=public, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true)
Dec 06 09:16:26 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:16:26 np0005548789.localdomain podman[104882]: 2025-12-06 09:16:26.023677186 +0000 UTC m=+0.184156316 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, container_name=collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:16:26 np0005548789.localdomain podman[104900]: 2025-12-06 09:16:26.049837938 +0000 UTC m=+0.194896465 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, release=1761123044, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 
17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team)
Dec 06 09:16:26 np0005548789.localdomain podman[104882]: 2025-12-06 09:16:26.108052763 +0000 UTC m=+0.268531833 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, container_name=collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true)
Dec 06 09:16:26 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:16:26 np0005548789.localdomain podman[104900]: 2025-12-06 09:16:26.133198354 +0000 UTC m=+0.278256831 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., url=https://www.redhat.com)
Dec 06 09:16:26 np0005548789.localdomain podman[104885]: 2025-12-06 09:16:26.145630005 +0000 UTC m=+0.295908452 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Dec 06 09:16:26 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:16:26 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:16:26 np0005548789.localdomain podman[104986]: 2025-12-06 09:16:26.18462432 +0000 UTC m=+0.132013078 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team)
Dec 06 09:16:26 np0005548789.localdomain podman[104883]: 2025-12-06 09:16:26.198009841 +0000 UTC m=+0.358577484 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:16:26 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:16:26 np0005548789.localdomain sshd[105010]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:26 np0005548789.localdomain podman[104986]: 2025-12-06 09:16:26.56068887 +0000 UTC m=+0.508077588 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, 
config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container)
Dec 06 09:16:26 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:16:26 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51161 DF PROTO=TCP SPT=44704 DPT=9882 SEQ=50667696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF3BB00000000001030307) 
Dec 06 09:16:28 np0005548789.localdomain sshd[105010]: Received disconnect from 45.78.222.162 port 42156:11: Bye Bye [preauth]
Dec 06 09:16:28 np0005548789.localdomain sshd[105010]: Disconnected from authenticating user root 45.78.222.162 port 42156 [preauth]
Dec 06 09:16:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51162 DF PROTO=TCP SPT=44704 DPT=9882 SEQ=50667696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF4B700000000001030307) 
Dec 06 09:16:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49374 DF PROTO=TCP SPT=32840 DPT=9105 SEQ=376045614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF4FEF0000000001030307) 
Dec 06 09:16:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:16:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:16:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:16:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:16:32 np0005548789.localdomain systemd[1]: tmp-crun.NcldAR.mount: Deactivated successfully.
Dec 06 09:16:32 np0005548789.localdomain podman[105014]: 2025-12-06 09:16:32.948174527 +0000 UTC m=+0.105152965 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container)
Dec 06 09:16:33 np0005548789.localdomain podman[105013]: 2025-12-06 09:16:33.001481392 +0000 UTC m=+0.159440729 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true)
Dec 06 09:16:33 np0005548789.localdomain podman[105015]: 2025-12-06 09:16:33.041024264 +0000 UTC m=+0.194654049 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:16:33 np0005548789.localdomain podman[105013]: 2025-12-06 09:16:33.047650326 +0000 UTC m=+0.205609653 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 06 09:16:33 np0005548789.localdomain podman[105013]: unhealthy
Dec 06 09:16:33 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:16:33 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:16:33 np0005548789.localdomain podman[105016]: 2025-12-06 09:16:33.088701755 +0000 UTC m=+0.238983928 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Dec 06 09:16:33 np0005548789.localdomain podman[105016]: 2025-12-06 09:16:33.111069401 +0000 UTC m=+0.261351574 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, container_name=ovn_metadata_agent, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 09:16:33 np0005548789.localdomain podman[105016]: unhealthy
Dec 06 09:16:33 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:16:33 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:16:33 np0005548789.localdomain podman[105015]: 2025-12-06 09:16:33.163410225 +0000 UTC m=+0.317040060 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, config_id=tripleo_step5, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:16:33 np0005548789.localdomain podman[105015]: unhealthy
Dec 06 09:16:33 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:16:33 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed with result 'exit-code'.
Dec 06 09:16:33 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19403 DF PROTO=TCP SPT=36498 DPT=9100 SEQ=3394050689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF53EF0000000001030307) 
Dec 06 09:16:33 np0005548789.localdomain podman[105014]: 2025-12-06 09:16:33.214733429 +0000 UTC m=+0.371711867 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step1, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 06 09:16:33 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:16:33 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16951 DF PROTO=TCP SPT=47574 DPT=9102 SEQ=3862954808 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF55D60000000001030307) 
Dec 06 09:16:33 np0005548789.localdomain systemd[1]: tmp-crun.WwzWXr.mount: Deactivated successfully.
Dec 06 09:16:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16952 DF PROTO=TCP SPT=47574 DPT=9102 SEQ=3862954808 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF59EF0000000001030307) 
Dec 06 09:16:35 np0005548789.localdomain sshd[105102]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:36 np0005548789.localdomain sshd[105102]: Received disconnect from 103.234.151.178 port 33872:11: Bye Bye [preauth]
Dec 06 09:16:36 np0005548789.localdomain sshd[105102]: Disconnected from authenticating user root 103.234.151.178 port 33872 [preauth]
Dec 06 09:16:36 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16953 DF PROTO=TCP SPT=47574 DPT=9102 SEQ=3862954808 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF61EF0000000001030307) 
Dec 06 09:16:37 np0005548789.localdomain sshd[105104]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:37 np0005548789.localdomain sshd[105104]: Accepted publickey for zuul from 192.168.122.31 port 36920 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:16:37 np0005548789.localdomain systemd-logind[766]: New session 37 of user zuul.
Dec 06 09:16:37 np0005548789.localdomain systemd[1]: Started Session 37 of User zuul.
Dec 06 09:16:37 np0005548789.localdomain sshd[105104]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:16:38 np0005548789.localdomain sudo[105197]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plmjanxotvafevztzwtvgsdyvhrxyhbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012597.9452593-27-179806358758654/AnsiballZ_stat.py
Dec 06 09:16:38 np0005548789.localdomain sudo[105197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:38 np0005548789.localdomain python3.9[105199]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:16:38 np0005548789.localdomain sshd[105201]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:38 np0005548789.localdomain sudo[105197]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:39 np0005548789.localdomain sudo[105293]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfrzhdkchrwfuzefhxwkkmicvpjkjxzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012598.866414-63-84202019969889/AnsiballZ_command.py
Dec 06 09:16:39 np0005548789.localdomain sudo[105293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:39 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51163 DF PROTO=TCP SPT=44704 DPT=9882 SEQ=50667696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF6BEF0000000001030307) 
Dec 06 09:16:39 np0005548789.localdomain python3.9[105295]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:16:39 np0005548789.localdomain sudo[105293]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:40 np0005548789.localdomain sudo[105386]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbswfxwbfurcfbfhtvdpstmlhdlfgalo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012599.8196986-87-193405104887910/AnsiballZ_stat.py
Dec 06 09:16:40 np0005548789.localdomain sudo[105386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:40 np0005548789.localdomain python3.9[105388]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:16:40 np0005548789.localdomain sudo[105386]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:40 np0005548789.localdomain sshd[105201]: Received disconnect from 103.192.152.59 port 53576:11: Bye Bye [preauth]
Dec 06 09:16:40 np0005548789.localdomain sshd[105201]: Disconnected from authenticating user root 103.192.152.59 port 53576 [preauth]
Dec 06 09:16:40 np0005548789.localdomain sudo[105480]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekfzyswrluptnugedvehhjuzrgqldetq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012600.495084-111-189212766874246/AnsiballZ_command.py
Dec 06 09:16:40 np0005548789.localdomain sudo[105480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16954 DF PROTO=TCP SPT=47574 DPT=9102 SEQ=3862954808 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF71AF0000000001030307) 
Dec 06 09:16:40 np0005548789.localdomain python3.9[105482]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:16:41 np0005548789.localdomain sudo[105480]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:41 np0005548789.localdomain sudo[105573]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-showaoxncdibmbnnshkucvkzidrzvjma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012601.3410356-138-70470819660467/AnsiballZ_command.py
Dec 06 09:16:41 np0005548789.localdomain sudo[105573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:41 np0005548789.localdomain python3.9[105575]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:16:41 np0005548789.localdomain sudo[105573]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:42 np0005548789.localdomain python3.9[105666]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 06 09:16:44 np0005548789.localdomain python3.9[105756]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:16:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3987 DF PROTO=TCP SPT=48846 DPT=9101 SEQ=3545670120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF7F3F0000000001030307) 
Dec 06 09:16:44 np0005548789.localdomain python3.9[105848]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 06 09:16:45 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3988 DF PROTO=TCP SPT=48846 DPT=9101 SEQ=3545670120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF83300000000001030307) 
Dec 06 09:16:46 np0005548789.localdomain python3.9[105938]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:16:46 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39158 DF PROTO=TCP SPT=54560 DPT=9105 SEQ=2042264912 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF88640000000001030307) 
Dec 06 09:16:46 np0005548789.localdomain python3.9[105986]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:16:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3989 DF PROTO=TCP SPT=48846 DPT=9101 SEQ=3545670120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF8B2F0000000001030307) 
Dec 06 09:16:47 np0005548789.localdomain sshd[105104]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:16:47 np0005548789.localdomain systemd[1]: session-37.scope: Deactivated successfully.
Dec 06 09:16:47 np0005548789.localdomain systemd[1]: session-37.scope: Consumed 4.724s CPU time.
Dec 06 09:16:47 np0005548789.localdomain systemd-logind[766]: Session 37 logged out. Waiting for processes to exit.
Dec 06 09:16:47 np0005548789.localdomain systemd-logind[766]: Removed session 37.
Dec 06 09:16:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9154 DF PROTO=TCP SPT=56244 DPT=9100 SEQ=1924823422 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF8C380000000001030307) 
Dec 06 09:16:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39159 DF PROTO=TCP SPT=54560 DPT=9105 SEQ=2042264912 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF8C6F0000000001030307) 
Dec 06 09:16:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39160 DF PROTO=TCP SPT=54560 DPT=9105 SEQ=2042264912 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BF94700000000001030307) 
Dec 06 09:16:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39161 DF PROTO=TCP SPT=54560 DPT=9105 SEQ=2042264912 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFA42F0000000001030307) 
Dec 06 09:16:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51164 DF PROTO=TCP SPT=44704 DPT=9882 SEQ=50667696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFABF00000000001030307) 
Dec 06 09:16:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:16:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:16:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:16:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:16:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:16:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:16:56 np0005548789.localdomain podman[106011]: 2025-12-06 09:16:56.953515277 +0000 UTC m=+0.097709596 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-iscsid-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:16:56 np0005548789.localdomain podman[106011]: 2025-12-06 09:16:56.988094877 +0000 UTC m=+0.132289206 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step3)
Dec 06 09:16:57 np0005548789.localdomain systemd[1]: tmp-crun.euSGza.mount: Deactivated successfully.
Dec 06 09:16:57 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:16:57 np0005548789.localdomain podman[106017]: 2025-12-06 09:16:57.055140873 +0000 UTC m=+0.195439933 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1)
Dec 06 09:16:57 np0005548789.localdomain podman[106004]: 2025-12-06 09:16:57.102807744 +0000 UTC m=+0.247116257 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true)
Dec 06 09:16:57 np0005548789.localdomain podman[106017]: 2025-12-06 09:16:57.111882692 +0000 UTC m=+0.252181772 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64)
Dec 06 09:16:57 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:16:57 np0005548789.localdomain podman[106002]: 2025-12-06 09:16:57.164881647 +0000 UTC m=+0.320272120 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, 
build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:16:57 np0005548789.localdomain podman[106002]: 2025-12-06 09:16:57.172959754 +0000 UTC m=+0.328350257 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Dec 06 09:16:57 np0005548789.localdomain podman[106003]: 2025-12-06 09:16:57.026791584 +0000 UTC m=+0.179689820 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, container_name=collectd, architecture=x86_64)
Dec 06 09:16:57 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:16:57 np0005548789.localdomain podman[106003]: 2025-12-06 09:16:57.209520105 +0000 UTC m=+0.362418441 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd)
Dec 06 09:16:57 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:16:57 np0005548789.localdomain podman[106005]: 2025-12-06 09:16:57.266515922 +0000 UTC m=+0.410649039 container health_status a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:16:57 np0005548789.localdomain podman[106005]: 2025-12-06 09:16:57.316496044 +0000 UTC m=+0.460629181 container exec_died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container)
Dec 06 09:16:57 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Deactivated successfully.
Dec 06 09:16:57 np0005548789.localdomain podman[106004]: 2025-12-06 09:16:57.439802704 +0000 UTC m=+0.584111237 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, 
url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:16:57 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:16:57 np0005548789.localdomain sshd[106134]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:16:57 np0005548789.localdomain sshd[106134]: Accepted publickey for zuul from 192.168.122.31 port 50082 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:16:57 np0005548789.localdomain systemd-logind[766]: New session 38 of user zuul.
Dec 06 09:16:57 np0005548789.localdomain systemd[1]: Started Session 38 of User zuul.
Dec 06 09:16:57 np0005548789.localdomain sshd[106134]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:16:58 np0005548789.localdomain sudo[106227]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbusmtsuehmznkzawpfjegctwjtqvgli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012617.7882442-24-45337395536208/AnsiballZ_systemd_service.py
Dec 06 09:16:58 np0005548789.localdomain sudo[106227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:58 np0005548789.localdomain python3.9[106229]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:16:58 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:16:58 np0005548789.localdomain systemd-rc-local-generator[106253]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:16:58 np0005548789.localdomain systemd-sysv-generator[106256]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:16:58 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:16:59 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:16:59 np0005548789.localdomain systemd[1]: Starting dnf makecache...
Dec 06 09:16:59 np0005548789.localdomain sudo[106227]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:59 np0005548789.localdomain recover_tripleo_nova_virtqemud[106267]: 61814
Dec 06 09:16:59 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:16:59 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:16:59 np0005548789.localdomain dnf[106266]: Updating Subscription Management repositories.
Dec 06 09:16:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3991 DF PROTO=TCP SPT=48846 DPT=9101 SEQ=3545670120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFBBEF0000000001030307) 
Dec 06 09:17:00 np0005548789.localdomain python3.9[106357]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:17:00 np0005548789.localdomain network[106374]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:17:00 np0005548789.localdomain network[106375]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:17:00 np0005548789.localdomain network[106376]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:17:00 np0005548789.localdomain sudo[106381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:17:00 np0005548789.localdomain sudo[106381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:17:00 np0005548789.localdomain sudo[106381]: pam_unix(sudo:session): session closed for user root
Dec 06 09:17:00 np0005548789.localdomain sudo[106396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:17:00 np0005548789.localdomain sudo[106396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:17:01 np0005548789.localdomain sshd[106430]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:17:01 np0005548789.localdomain sudo[106396]: pam_unix(sudo:session): session closed for user root
Dec 06 09:17:01 np0005548789.localdomain dnf[106266]: Metadata cache refreshed recently.
Dec 06 09:17:01 np0005548789.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 06 09:17:01 np0005548789.localdomain systemd[1]: Finished dnf makecache.
Dec 06 09:17:01 np0005548789.localdomain systemd[1]: dnf-makecache.service: Consumed 2.332s CPU time.
Dec 06 09:17:01 np0005548789.localdomain sshd[106430]: Received disconnect from 81.192.46.35 port 43116:11: Bye Bye [preauth]
Dec 06 09:17:01 np0005548789.localdomain sshd[106430]: Disconnected from authenticating user root 81.192.46.35 port 43116 [preauth]
Dec 06 09:17:01 np0005548789.localdomain sudo[106453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:17:01 np0005548789.localdomain sudo[106453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:17:01 np0005548789.localdomain sudo[106453]: pam_unix(sudo:session): session closed for user root
Dec 06 09:17:01 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39162 DF PROTO=TCP SPT=54560 DPT=9105 SEQ=2042264912 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFC3EF0000000001030307) 
Dec 06 09:17:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:17:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:17:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:17:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:17:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:17:03 np0005548789.localdomain podman[106563]: 2025-12-06 09:17:03.284826872 +0000 UTC m=+0.079133887 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true)
Dec 06 09:17:03 np0005548789.localdomain podman[106532]: 2025-12-06 09:17:03.202914691 +0000 UTC m=+0.102549245 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 
17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git)
Dec 06 09:17:03 np0005548789.localdomain podman[106545]: 2025-12-06 09:17:03.254361598 +0000 UTC m=+0.089190346 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.expose-services=, version=17.1.12)
Dec 06 09:17:03 np0005548789.localdomain podman[106563]: 2025-12-06 09:17:03.336446404 +0000 UTC m=+0.130753429 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:17:03 np0005548789.localdomain podman[106563]: unhealthy
Dec 06 09:17:03 np0005548789.localdomain podman[106583]: 2025-12-06 09:17:03.344087268 +0000 UTC m=+0.081514130 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
io.buildah.version=1.41.4, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:17:03 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:17:03 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed with result 'exit-code'.
Dec 06 09:17:03 np0005548789.localdomain podman[106545]: 2025-12-06 09:17:03.384653561 +0000 UTC m=+0.219482319 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team)
Dec 06 09:17:03 np0005548789.localdomain podman[106532]: 2025-12-06 09:17:03.38753385 +0000 UTC m=+0.287168434 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Dec 06 09:17:03 np0005548789.localdomain podman[106532]: unhealthy
Dec 06 09:17:03 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:17:03 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:17:03 np0005548789.localdomain podman[106545]: unhealthy
Dec 06 09:17:03 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:17:03 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:17:03 np0005548789.localdomain podman[106583]: 2025-12-06 09:17:03.53104358 +0000 UTC m=+0.268470502 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team)
Dec 06 09:17:03 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:17:03 np0005548789.localdomain sshd[106623]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:17:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7214 DF PROTO=TCP SPT=60606 DPT=9102 SEQ=595273658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFCEEF0000000001030307) 
Dec 06 09:17:04 np0005548789.localdomain sshd[106664]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:17:05 np0005548789.localdomain sshd[106623]: Received disconnect from 103.157.25.60 port 54418:11: Bye Bye [preauth]
Dec 06 09:17:05 np0005548789.localdomain sshd[106623]: Disconnected from authenticating user root 103.157.25.60 port 54418 [preauth]
Dec 06 09:17:06 np0005548789.localdomain sshd[106664]: Received disconnect from 118.193.38.207 port 36964:11: Bye Bye [preauth]
Dec 06 09:17:06 np0005548789.localdomain sshd[106664]: Disconnected from authenticating user root 118.193.38.207 port 36964 [preauth]
Dec 06 09:17:06 np0005548789.localdomain python3.9[106741]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:17:06 np0005548789.localdomain network[106758]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:17:06 np0005548789.localdomain network[106759]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:17:06 np0005548789.localdomain network[106760]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:17:08 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:17:09 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11973 DF PROTO=TCP SPT=40214 DPT=9882 SEQ=1514282561 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFE1EF0000000001030307) 
Dec 06 09:17:10 np0005548789.localdomain sudo[106957]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggyxwrpiqgtkgzdxsjcelmshcklllcpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012630.4013336-114-107354254147424/AnsiballZ_systemd_service.py
Dec 06 09:17:10 np0005548789.localdomain sudo[106957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:17:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7216 DF PROTO=TCP SPT=60606 DPT=9102 SEQ=595273658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFE6AF0000000001030307) 
Dec 06 09:17:11 np0005548789.localdomain sshd[106960]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:17:11 np0005548789.localdomain python3.9[106959]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:17:11 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:17:11 np0005548789.localdomain systemd-rc-local-generator[106986]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:17:11 np0005548789.localdomain systemd-sysv-generator[106993]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:17:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:17:11 np0005548789.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Dec 06 09:17:11 np0005548789.localdomain sshd[106960]: Received disconnect from 12.156.67.18 port 52368:11: Bye Bye [preauth]
Dec 06 09:17:11 np0005548789.localdomain sshd[106960]: Disconnected from authenticating user root 12.156.67.18 port 52368 [preauth]
Dec 06 09:17:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58176 DF PROTO=TCP SPT=50230 DPT=9101 SEQ=213042245 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52BFF4700000000001030307) 
Dec 06 09:17:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58178 DF PROTO=TCP SPT=50230 DPT=9101 SEQ=213042245 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0006F0000000001030307) 
Dec 06 09:17:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34747 DF PROTO=TCP SPT=35542 DPT=9105 SEQ=1265973637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C009B00000000001030307) 
Dec 06 09:17:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34748 DF PROTO=TCP SPT=35542 DPT=9105 SEQ=1265973637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0196F0000000001030307) 
Dec 06 09:17:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11974 DF PROTO=TCP SPT=40214 DPT=9882 SEQ=1514282561 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C021EF0000000001030307) 
Dec 06 09:17:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:17:27 np0005548789.localdomain podman[107015]: 2025-12-06 09:17:27.17526018 +0000 UTC m=+0.084571503 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, container_name=iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, config_id=tripleo_step3, release=1761123044, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:17:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:17:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:17:27 np0005548789.localdomain podman[107015]: 2025-12-06 09:17:27.224581322 +0000 UTC m=+0.133892625 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:17:27 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:17:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:17:27 np0005548789.localdomain podman[107035]: 2025-12-06 09:17:27.271441448 +0000 UTC m=+0.065476858 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044)
Dec 06 09:17:27 np0005548789.localdomain podman[107057]: 2025-12-06 09:17:27.35564977 +0000 UTC m=+0.088547595 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:17:27 np0005548789.localdomain podman[107035]: 2025-12-06 09:17:27.361418277 +0000 UTC m=+0.155453707 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:17:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:17:27 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:17:27 np0005548789.localdomain podman[107034]: 2025-12-06 09:17:27.331974534 +0000 UTC m=+0.133278547 container health_status b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:17:27 np0005548789.localdomain podman[107057]: 2025-12-06 09:17:27.392809879 +0000 UTC m=+0.125707724 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=collectd, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-collectd)
Dec 06 09:17:27 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:17:27 np0005548789.localdomain podman[107034]: 2025-12-06 09:17:27.420193858 +0000 UTC m=+0.221497881 container exec_died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044)
Dec 06 09:17:27 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Deactivated successfully.
Dec 06 09:17:27 np0005548789.localdomain podman[107097]: Error: container a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 is not running
Dec 06 09:17:27 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Main process exited, code=exited, status=125/n/a
Dec 06 09:17:27 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Failed with result 'exit-code'.
Dec 06 09:17:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:17:27 np0005548789.localdomain podman[107114]: 2025-12-06 09:17:27.60127485 +0000 UTC m=+0.076865868 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:17:27 np0005548789.localdomain podman[107114]: 2025-12-06 09:17:27.97640352 +0000 UTC m=+0.451994488 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true)
Dec 06 09:17:27 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:17:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58180 DF PROTO=TCP SPT=50230 DPT=9101 SEQ=213042245 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C02FEF0000000001030307) 
Dec 06 09:17:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34749 DF PROTO=TCP SPT=35542 DPT=9105 SEQ=1265973637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C039EF0000000001030307) 
Dec 06 09:17:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:17:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:17:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:17:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:17:33 np0005548789.localdomain podman[107137]: 2025-12-06 09:17:33.68424033 +0000 UTC m=+0.092161067 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4)
Dec 06 09:17:33 np0005548789.localdomain podman[107137]: 2025-12-06 09:17:33.72926033 +0000 UTC m=+0.137181047 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, 
managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:17:33 np0005548789.localdomain podman[107137]: unhealthy
Dec 06 09:17:33 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:17:33 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:17:33 np0005548789.localdomain podman[107138]: 2025-12-06 09:17:33.745749145 +0000 UTC m=+0.150412351 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr)
Dec 06 09:17:33 np0005548789.localdomain podman[107143]: 2025-12-06 09:17:33.793941822 +0000 UTC m=+0.192585294 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:17:33 np0005548789.localdomain podman[107139]: 2025-12-06 09:17:33.849694192 +0000 UTC m=+0.251733929 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step5, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:17:33 np0005548789.localdomain podman[107143]: 2025-12-06 09:17:33.86200821 +0000 UTC m=+0.260651682 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:17:33 np0005548789.localdomain podman[107143]: unhealthy
Dec 06 09:17:33 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:17:33 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:17:33 np0005548789.localdomain podman[107139]: 2025-12-06 09:17:33.895367133 +0000 UTC m=+0.297406870 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, config_id=tripleo_step5)
Dec 06 09:17:33 np0005548789.localdomain podman[107139]: unhealthy
Dec 06 09:17:33 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:17:33 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed with result 'exit-code'.
Dec 06 09:17:33 np0005548789.localdomain podman[107138]: 2025-12-06 09:17:33.955227127 +0000 UTC m=+0.359890303 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible)
Dec 06 09:17:33 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:17:34 np0005548789.localdomain systemd[1]: tmp-crun.8Fg5GR.mount: Deactivated successfully.
Dec 06 09:17:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54799 DF PROTO=TCP SPT=39736 DPT=9102 SEQ=3229812551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0442F0000000001030307) 
Dec 06 09:17:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16957 DF PROTO=TCP SPT=47574 DPT=9102 SEQ=3862954808 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C04FEF0000000001030307) 
Dec 06 09:17:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54801 DF PROTO=TCP SPT=39736 DPT=9102 SEQ=3229812551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C05BEF0000000001030307) 
Dec 06 09:17:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10083 DF PROTO=TCP SPT=51286 DPT=9101 SEQ=2421426605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C069A00000000001030307) 
Dec 06 09:17:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10085 DF PROTO=TCP SPT=51286 DPT=9101 SEQ=2421426605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C075AF0000000001030307) 
Dec 06 09:17:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1823 DF PROTO=TCP SPT=33042 DPT=9105 SEQ=3654695548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C07EAF0000000001030307) 
Dec 06 09:17:51 np0005548789.localdomain sshd[107226]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:17:53 np0005548789.localdomain sshd[107226]: Received disconnect from 103.234.151.178 port 59994:11: Bye Bye [preauth]
Dec 06 09:17:53 np0005548789.localdomain sshd[107226]: Disconnected from authenticating user root 103.234.151.178 port 59994 [preauth]
Dec 06 09:17:53 np0005548789.localdomain podman[107002]: time="2025-12-06T09:17:53Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL"
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: libpod-a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.scope: Deactivated successfully.
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: libpod-a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.scope: Consumed 5.215s CPU time.
Dec 06 09:17:53 np0005548789.localdomain podman[107002]: 2025-12-06 09:17:53.626787085 +0000 UTC m=+42.104093093 container died a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.timer: Deactivated successfully.
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Failed to open /run/systemd/transient/a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: No such file or directory
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: tmp-crun.3KDGoB.mount: Deactivated successfully.
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9-userdata-shm.mount: Deactivated successfully.
Dec 06 09:17:53 np0005548789.localdomain podman[107002]: 2025-12-06 09:17:53.681868094 +0000 UTC m=+42.159174072 container cleanup a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z)
Dec 06 09:17:53 np0005548789.localdomain podman[107002]: ceilometer_agent_compute
Dec 06 09:17:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1824 DF PROTO=TCP SPT=33042 DPT=9105 SEQ=3654695548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C08E6F0000000001030307) 
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.timer: Failed to open /run/systemd/transient/a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.timer: No such file or directory
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Failed to open /run/systemd/transient/a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: No such file or directory
Dec 06 09:17:53 np0005548789.localdomain podman[107229]: 2025-12-06 09:17:53.717810766 +0000 UTC m=+0.077625541 container cleanup a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: libpod-conmon-a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.scope: Deactivated successfully.
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.timer: Failed to open /run/systemd/transient/a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.timer: No such file or directory
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: Failed to open /run/systemd/transient/a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9.service: No such file or directory
Dec 06 09:17:53 np0005548789.localdomain podman[107246]: 2025-12-06 09:17:53.819803472 +0000 UTC m=+0.066966454 container cleanup a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, config_id=tripleo_step4, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 06 09:17:53 np0005548789.localdomain podman[107246]: ceilometer_agent_compute
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully.
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Dec 06 09:17:53 np0005548789.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.064s CPU time, no IO.
Dec 06 09:17:53 np0005548789.localdomain sudo[106957]: pam_unix(sudo:session): session closed for user root
Dec 06 09:17:54 np0005548789.localdomain sudo[107346]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxyivokxyimuidfuxuwelilnifmiixqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012673.9737697-114-42874041112005/AnsiballZ_systemd_service.py
Dec 06 09:17:54 np0005548789.localdomain sudo[107346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:17:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-15de5573c617e73fedd1daaecfac821d4b4021582e250a3cae6d24e4b8e4cd51-merged.mount: Deactivated successfully.
Dec 06 09:17:54 np0005548789.localdomain python3.9[107348]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:17:54 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:17:54 np0005548789.localdomain systemd-rc-local-generator[107372]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:17:54 np0005548789.localdomain systemd-sysv-generator[107378]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:17:54 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:17:55 np0005548789.localdomain systemd[1]: Stopping ceilometer_agent_ipmi container...
Dec 06 09:17:55 np0005548789.localdomain systemd[1]: tmp-crun.dwxIby.mount: Deactivated successfully.
Dec 06 09:17:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64633 DF PROTO=TCP SPT=38780 DPT=9882 SEQ=1054856800 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C095F00000000001030307) 
Dec 06 09:17:55 np0005548789.localdomain sshd[107403]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:17:57 np0005548789.localdomain sshd[107403]: Received disconnect from 64.227.156.63 port 60066:11: Bye Bye [preauth]
Dec 06 09:17:57 np0005548789.localdomain sshd[107403]: Disconnected from authenticating user root 64.227.156.63 port 60066 [preauth]
Dec 06 09:17:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:17:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:17:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:17:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:17:57 np0005548789.localdomain systemd[1]: tmp-crun.SidGPq.mount: Deactivated successfully.
Dec 06 09:17:57 np0005548789.localdomain podman[107407]: 2025-12-06 09:17:57.683808137 +0000 UTC m=+0.085377747 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, build-date=2025-11-18T23:44:13Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:17:57 np0005548789.localdomain podman[107407]: 2025-12-06 09:17:57.728559859 +0000 UTC m=+0.130129439 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible)
Dec 06 09:17:57 np0005548789.localdomain systemd[1]: tmp-crun.doAp3b.mount: Deactivated successfully.
Dec 06 09:17:57 np0005548789.localdomain podman[107408]: Error: container b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 is not running
Dec 06 09:17:57 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:17:57 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Main process exited, code=exited, status=125/n/a
Dec 06 09:17:57 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Failed with result 'exit-code'.
Dec 06 09:17:57 np0005548789.localdomain podman[107405]: 2025-12-06 09:17:57.778289354 +0000 UTC m=+0.182978130 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, name=rhosp17/openstack-cron, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:17:57 np0005548789.localdomain podman[107405]: 2025-12-06 09:17:57.786441684 +0000 UTC m=+0.191130520 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true)
Dec 06 09:17:57 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:17:57 np0005548789.localdomain podman[107406]: 2025-12-06 09:17:57.741928769 +0000 UTC m=+0.143463288 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, maintainer=OpenStack TripleO Team)
Dec 06 09:17:57 np0005548789.localdomain podman[107406]: 2025-12-06 09:17:57.872593815 +0000 UTC m=+0.274128364 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=collectd, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd)
Dec 06 09:17:57 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:17:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:17:58 np0005548789.localdomain podman[107474]: 2025-12-06 09:17:58.665427411 +0000 UTC m=+0.076437715 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, container_name=nova_migration_target, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:36:58Z, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute)
Dec 06 09:17:59 np0005548789.localdomain podman[107474]: 2025-12-06 09:17:59.047130152 +0000 UTC m=+0.458140386 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, vcs-type=git)
Dec 06 09:17:59 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:17:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10087 DF PROTO=TCP SPT=51286 DPT=9101 SEQ=2421426605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0A5EF0000000001030307) 
Dec 06 09:18:01 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1825 DF PROTO=TCP SPT=33042 DPT=9105 SEQ=3654695548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0ADF00000000001030307) 
Dec 06 09:18:02 np0005548789.localdomain sudo[107498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:18:02 np0005548789.localdomain sudo[107498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:18:02 np0005548789.localdomain sudo[107498]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:02 np0005548789.localdomain sudo[107513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:18:02 np0005548789.localdomain sudo[107513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:18:02 np0005548789.localdomain sudo[107513]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:03 np0005548789.localdomain sudo[107560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:18:03 np0005548789.localdomain sudo[107560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:18:03 np0005548789.localdomain sudo[107560]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:18:03 np0005548789.localdomain podman[107575]: 2025-12-06 09:18:03.894091143 +0000 UTC m=+0.056007168 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64)
Dec 06 09:18:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:18:03 np0005548789.localdomain podman[107575]: 2025-12-06 09:18:03.911589919 +0000 UTC m=+0.073506004 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, release=1761123044, io.openshift.expose-services=, container_name=ovn_controller, distribution-scope=public, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:18:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:18:03 np0005548789.localdomain podman[107575]: unhealthy
Dec 06 09:18:03 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:18:03 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:18:03 np0005548789.localdomain podman[107593]: 2025-12-06 09:18:03.965023507 +0000 UTC m=+0.057358439 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4)
Dec 06 09:18:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:18:04 np0005548789.localdomain podman[107603]: 2025-12-06 09:18:04.013274477 +0000 UTC m=+0.086434741 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=nova_compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Dec 06 09:18:04 np0005548789.localdomain podman[107593]: 2025-12-06 09:18:04.033477746 +0000 UTC m=+0.125812678 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, tcib_managed=true)
Dec 06 09:18:04 np0005548789.localdomain podman[107593]: unhealthy
Dec 06 09:18:04 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:18:04 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:18:04 np0005548789.localdomain podman[107603]: 2025-12-06 09:18:04.059178754 +0000 UTC m=+0.132339028 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:18:04 np0005548789.localdomain podman[107603]: unhealthy
Dec 06 09:18:04 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:18:04 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed with result 'exit-code'.
Dec 06 09:18:04 np0005548789.localdomain podman[107626]: 2025-12-06 09:18:04.12362901 +0000 UTC m=+0.123422085 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 
17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container)
Dec 06 09:18:04 np0005548789.localdomain podman[107626]: 2025-12-06 09:18:04.343153119 +0000 UTC m=+0.342946204 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, config_id=tripleo_step1)
Dec 06 09:18:04 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:18:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42592 DF PROTO=TCP SPT=58746 DPT=9102 SEQ=2208224687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0B96F0000000001030307) 
Dec 06 09:18:05 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:18:05 np0005548789.localdomain recover_tripleo_nova_virtqemud[107665]: 61814
Dec 06 09:18:05 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:18:05 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:18:05 np0005548789.localdomain sshd[107666]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:06 np0005548789.localdomain sshd[107668]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:07 np0005548789.localdomain sshd[107668]: Received disconnect from 81.192.46.35 port 41434:11: Bye Bye [preauth]
Dec 06 09:18:07 np0005548789.localdomain sshd[107668]: Disconnected from authenticating user root 81.192.46.35 port 41434 [preauth]
Dec 06 09:18:07 np0005548789.localdomain sshd[107666]: Received disconnect from 103.192.152.59 port 36408:11: Bye Bye [preauth]
Dec 06 09:18:07 np0005548789.localdomain sshd[107666]: Disconnected from authenticating user root 103.192.152.59 port 36408 [preauth]
Dec 06 09:18:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7219 DF PROTO=TCP SPT=60606 DPT=9102 SEQ=595273658 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0C5EF0000000001030307) 
Dec 06 09:18:10 np0005548789.localdomain sshd[107670]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42594 DF PROTO=TCP SPT=58746 DPT=9102 SEQ=2208224687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0D12F0000000001030307) 
Dec 06 09:18:11 np0005548789.localdomain sshd[107670]: Invalid user ubuntu from 92.118.39.95 port 52024
Dec 06 09:18:11 np0005548789.localdomain sshd[107670]: Connection closed by invalid user ubuntu 92.118.39.95 port 52024 [preauth]
Dec 06 09:18:12 np0005548789.localdomain sshd[107672]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:13 np0005548789.localdomain sshd[107672]: Received disconnect from 12.156.67.18 port 53548:11: Bye Bye [preauth]
Dec 06 09:18:13 np0005548789.localdomain sshd[107672]: Disconnected from authenticating user root 12.156.67.18 port 53548 [preauth]
Dec 06 09:18:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5262 DF PROTO=TCP SPT=59642 DPT=9101 SEQ=1211313988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0DED00000000001030307) 
Dec 06 09:18:16 np0005548789.localdomain sshd[107674]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5264 DF PROTO=TCP SPT=59642 DPT=9101 SEQ=1211313988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0EAEF0000000001030307) 
Dec 06 09:18:18 np0005548789.localdomain sshd[107674]: Received disconnect from 118.193.38.207 port 47488:11: Bye Bye [preauth]
Dec 06 09:18:18 np0005548789.localdomain sshd[107674]: Disconnected from authenticating user root 118.193.38.207 port 47488 [preauth]
Dec 06 09:18:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21345 DF PROTO=TCP SPT=43012 DPT=9105 SEQ=1515895537 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C0F3EF0000000001030307) 
Dec 06 09:18:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21346 DF PROTO=TCP SPT=43012 DPT=9105 SEQ=1515895537 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C103AF0000000001030307) 
Dec 06 09:18:25 np0005548789.localdomain sshd[107676]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12084 DF PROTO=TCP SPT=57334 DPT=9882 SEQ=2216587260 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C10BEF0000000001030307) 
Dec 06 09:18:26 np0005548789.localdomain sshd[107676]: Received disconnect from 103.157.25.60 port 56088:11: Bye Bye [preauth]
Dec 06 09:18:26 np0005548789.localdomain sshd[107676]: Disconnected from authenticating user root 103.157.25.60 port 56088 [preauth]
Dec 06 09:18:27 np0005548789.localdomain sshd[107678]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:18:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:18:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:18:27 np0005548789.localdomain systemd[1]: tmp-crun.PdrVzd.mount: Deactivated successfully.
Dec 06 09:18:27 np0005548789.localdomain podman[107679]: 2025-12-06 09:18:27.940363151 +0000 UTC m=+0.092072263 container health_status 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, 
build-date=2025-11-18T22:49:32Z, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:18:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:18:27 np0005548789.localdomain podman[107681]: Error: container b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 is not running
Dec 06 09:18:27 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Main process exited, code=exited, status=125/n/a
Dec 06 09:18:27 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Failed with result 'exit-code'.
Dec 06 09:18:27 np0005548789.localdomain podman[107679]: 2025-12-06 09:18:27.981179422 +0000 UTC m=+0.132888534 container exec_died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, version=17.1.12, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Dec 06 09:18:27 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Deactivated successfully.
Dec 06 09:18:28 np0005548789.localdomain systemd[1]: tmp-crun.LzHasY.mount: Deactivated successfully.
Dec 06 09:18:28 np0005548789.localdomain podman[107680]: 2025-12-06 09:18:28.04958655 +0000 UTC m=+0.199752255 container health_status b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:18:28 np0005548789.localdomain podman[107680]: 2025-12-06 09:18:28.087232074 +0000 UTC m=+0.237397719 container exec_died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, container_name=iscsid, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, release=1761123044, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:18:28 np0005548789.localdomain podman[107720]: 2025-12-06 09:18:28.09688938 +0000 UTC m=+0.134677189 container health_status 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, vcs-type=git, version=17.1.12, container_name=collectd)
Dec 06 09:18:28 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Deactivated successfully.
Dec 06 09:18:28 np0005548789.localdomain podman[107720]: 2025-12-06 09:18:28.110364312 +0000 UTC m=+0.148152122 container exec_died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z)
Dec 06 09:18:28 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Deactivated successfully.
Dec 06 09:18:29 np0005548789.localdomain sshd[107678]: Invalid user ubuntu from 45.140.17.124 port 64934
Dec 06 09:18:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:18:29 np0005548789.localdomain systemd[1]: tmp-crun.ooehbT.mount: Deactivated successfully.
Dec 06 09:18:29 np0005548789.localdomain podman[107749]: 2025-12-06 09:18:29.581862753 +0000 UTC m=+0.093220979 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, architecture=x86_64, version=17.1.12, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4)
Dec 06 09:18:29 np0005548789.localdomain sshd[107678]: Connection reset by invalid user ubuntu 45.140.17.124 port 64934 [preauth]
Dec 06 09:18:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5266 DF PROTO=TCP SPT=59642 DPT=9101 SEQ=1211313988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C11BEF0000000001030307) 
Dec 06 09:18:29 np0005548789.localdomain podman[107749]: 2025-12-06 09:18:29.960206832 +0000 UTC m=+0.471565048 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Dec 06 09:18:29 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:18:30 np0005548789.localdomain sshd[107773]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:30 np0005548789.localdomain sshd[107775]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:31 np0005548789.localdomain sshd[107773]: Connection reset by authenticating user root 45.140.17.124 port 64940 [preauth]
Dec 06 09:18:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21347 DF PROTO=TCP SPT=43012 DPT=9105 SEQ=1515895537 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C123F00000000001030307) 
Dec 06 09:18:32 np0005548789.localdomain sshd[107777]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:18:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:18:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:18:34 np0005548789.localdomain podman[107779]: 2025-12-06 09:18:34.16713884 +0000 UTC m=+0.078430525 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:18:34 np0005548789.localdomain podman[107779]: 2025-12-06 09:18:34.210255032 +0000 UTC m=+0.121546747 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, architecture=x86_64, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team)
Dec 06 09:18:34 np0005548789.localdomain sshd[107775]: Received disconnect from 179.33.210.213 port 37644:11: Bye Bye [preauth]
Dec 06 09:18:34 np0005548789.localdomain sshd[107775]: Disconnected from authenticating user root 179.33.210.213 port 37644 [preauth]
Dec 06 09:18:34 np0005548789.localdomain podman[107779]: unhealthy
Dec 06 09:18:34 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:18:34 np0005548789.localdomain podman[107780]: 2025-12-06 09:18:34.226810129 +0000 UTC m=+0.136182966 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-19T00:14:25Z, architecture=x86_64, tcib_managed=true, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent)
Dec 06 09:18:34 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:18:34 np0005548789.localdomain podman[107780]: 2025-12-06 09:18:34.271390076 +0000 UTC m=+0.180762953 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 09:18:34 np0005548789.localdomain podman[107780]: unhealthy
Dec 06 09:18:34 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:18:34 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:18:34 np0005548789.localdomain podman[107781]: 2025-12-06 09:18:34.289659876 +0000 UTC m=+0.192882635 container health_status 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4)
Dec 06 09:18:34 np0005548789.localdomain podman[107781]: 2025-12-06 09:18:34.30546021 +0000 UTC m=+0.208682949 container exec_died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:18:34 np0005548789.localdomain podman[107781]: unhealthy
Dec 06 09:18:34 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:18:34 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed with result 'exit-code'.
Dec 06 09:18:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50264 DF PROTO=TCP SPT=54836 DPT=9102 SEQ=3526283303 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C12EAF0000000001030307) 
Dec 06 09:18:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:18:34 np0005548789.localdomain podman[107839]: 2025-12-06 09:18:34.92001644 +0000 UTC m=+0.082611343 container health_status 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:18:35 np0005548789.localdomain podman[107839]: 2025-12-06 09:18:35.12029045 +0000 UTC m=+0.282885323 container exec_died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, container_name=metrics_qdr, version=17.1.12, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 06 09:18:35 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Deactivated successfully.
Dec 06 09:18:37 np0005548789.localdomain podman[107389]: time="2025-12-06T09:18:37Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL"
Dec 06 09:18:37 np0005548789.localdomain systemd[1]: tmp-crun.6bjeXq.mount: Deactivated successfully.
Dec 06 09:18:37 np0005548789.localdomain systemd[1]: libpod-b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.scope: Deactivated successfully.
Dec 06 09:18:37 np0005548789.localdomain systemd[1]: libpod-b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.scope: Consumed 5.022s CPU time.
Dec 06 09:18:37 np0005548789.localdomain podman[107389]: 2025-12-06 09:18:37.266291729 +0000 UTC m=+42.106054036 container died b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Dec 06 09:18:37 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.timer: Deactivated successfully.
Dec 06 09:18:37 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.
Dec 06 09:18:37 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Failed to open /run/systemd/transient/b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: No such file or directory
Dec 06 09:18:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6-userdata-shm.mount: Deactivated successfully.
Dec 06 09:18:37 np0005548789.localdomain podman[107389]: 2025-12-06 09:18:37.327494895 +0000 UTC m=+42.167257122 container cleanup b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 06 09:18:37 np0005548789.localdomain podman[107389]: ceilometer_agent_ipmi
Dec 06 09:18:37 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.timer: Failed to open /run/systemd/transient/b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.timer: No such file or directory
Dec 06 09:18:37 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Failed to open /run/systemd/transient/b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: No such file or directory
Dec 06 09:18:37 np0005548789.localdomain podman[107870]: 2025-12-06 09:18:37.405648811 +0000 UTC m=+0.126994934 container cleanup b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:18:37 np0005548789.localdomain systemd[1]: libpod-conmon-b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.scope: Deactivated successfully.
Dec 06 09:18:37 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.timer: Failed to open /run/systemd/transient/b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.timer: No such file or directory
Dec 06 09:18:37 np0005548789.localdomain systemd[1]: b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: Failed to open /run/systemd/transient/b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6.service: No such file or directory
Dec 06 09:18:37 np0005548789.localdomain podman[107883]: 2025-12-06 09:18:37.507493073 +0000 UTC m=+0.069059748 container cleanup b63991e076084f33fdbae970a3ed856f42cb83fed28fd3dc61558f92f8ff65a6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi)
Dec 06 09:18:37 np0005548789.localdomain podman[107883]: ceilometer_agent_ipmi
Dec 06 09:18:37 np0005548789.localdomain systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully.
Dec 06 09:18:37 np0005548789.localdomain systemd[1]: Stopped ceilometer_agent_ipmi container.
Dec 06 09:18:37 np0005548789.localdomain sudo[107346]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:37 np0005548789.localdomain sudo[107986]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqbzriknbkoqjsmipgtrvevuotsmplyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012717.6791122-114-178043421518948/AnsiballZ_systemd_service.py
Dec 06 09:18:37 np0005548789.localdomain sudo[107986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c7edd91eaf927f0cc6c745dda6c529d67c09cf793c73be3335bece938eeb713d-merged.mount: Deactivated successfully.
Dec 06 09:18:38 np0005548789.localdomain python3.9[107988]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:38 np0005548789.localdomain sshd[107777]: Invalid user test1 from 45.140.17.124 port 64950
Dec 06 09:18:38 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:18:38 np0005548789.localdomain systemd-sysv-generator[108016]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:18:38 np0005548789.localdomain systemd-rc-local-generator[108012]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:18:38 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:18:38 np0005548789.localdomain sshd[107777]: Connection reset by invalid user test1 45.140.17.124 port 64950 [preauth]
Dec 06 09:18:38 np0005548789.localdomain systemd[1]: Stopping collectd container...
Dec 06 09:18:38 np0005548789.localdomain sshd[108040]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:38 np0005548789.localdomain systemd[1]: tmp-crun.fkRsYw.mount: Deactivated successfully.
Dec 06 09:18:39 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27001 DF PROTO=TCP SPT=51632 DPT=9882 SEQ=2718330891 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C13FF00000000001030307) 
Dec 06 09:18:40 np0005548789.localdomain sshd[108040]: Invalid user admin from 45.140.17.124 port 59974
Dec 06 09:18:40 np0005548789.localdomain sshd[108040]: Connection reset by invalid user admin 45.140.17.124 port 59974 [preauth]
Dec 06 09:18:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50266 DF PROTO=TCP SPT=54836 DPT=9102 SEQ=3526283303 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1466F0000000001030307) 
Dec 06 09:18:40 np0005548789.localdomain sshd[108043]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: libpod-2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.scope: Deactivated successfully.
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: libpod-2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.scope: Consumed 2.024s CPU time.
Dec 06 09:18:42 np0005548789.localdomain podman[108029]: 2025-12-06 09:18:42.036474655 +0000 UTC m=+3.289200135 container stop 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, container_name=collectd, name=rhosp17/openstack-collectd, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public)
Dec 06 09:18:42 np0005548789.localdomain podman[108029]: 2025-12-06 09:18:42.067177656 +0000 UTC m=+3.319903126 container died 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, architecture=x86_64, release=1761123044, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com)
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: tmp-crun.31wJvF.mount: Deactivated successfully.
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.timer: Deactivated successfully.
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Failed to open /run/systemd/transient/2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: No such file or directory
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: tmp-crun.bZ5Qzu.mount: Deactivated successfully.
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185-userdata-shm.mount: Deactivated successfully.
Dec 06 09:18:42 np0005548789.localdomain podman[108029]: 2025-12-06 09:18:42.134436238 +0000 UTC m=+3.387161718 container cleanup 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, release=1761123044, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true)
Dec 06 09:18:42 np0005548789.localdomain podman[108029]: collectd
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.timer: Failed to open /run/systemd/transient/2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.timer: No such file or directory
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Failed to open /run/systemd/transient/2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: No such file or directory
Dec 06 09:18:42 np0005548789.localdomain podman[108045]: 2025-12-06 09:18:42.157418263 +0000 UTC m=+0.104781964 container cleanup 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.component=openstack-collectd-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: tripleo_collectd.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: libpod-conmon-2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.scope: Deactivated successfully.
Dec 06 09:18:42 np0005548789.localdomain podman[108076]: error opening file `/run/crun/2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185/status`: No such file or directory
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.timer: Failed to open /run/systemd/transient/2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.timer: No such file or directory
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: Failed to open /run/systemd/transient/2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185.service: No such file or directory
Dec 06 09:18:42 np0005548789.localdomain podman[108065]: 2025-12-06 09:18:42.276128721 +0000 UTC m=+0.081334854 container cleanup 2057e5294552232f9f4337cf76c4aeca0fbda37fb4011fc7748fc47e299ca185 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4767aaabc3de112d8791c290aa2b669d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Dec 06 09:18:42 np0005548789.localdomain podman[108065]: collectd
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: tripleo_collectd.service: Failed with result 'exit-code'.
Dec 06 09:18:42 np0005548789.localdomain systemd[1]: Stopped collectd container.
Dec 06 09:18:42 np0005548789.localdomain sudo[107986]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:42 np0005548789.localdomain sudo[108167]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edajgoogiwryrxblsjyqtouaifqiqamp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012722.4418604-114-100555232535677/AnsiballZ_systemd_service.py
Dec 06 09:18:42 np0005548789.localdomain sudo[108167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:43 np0005548789.localdomain python3.9[108169]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d980d54738e5f040d62ff40bb9abb4b1931da0f4c80c1ba3031e7feabd416146-merged.mount: Deactivated successfully.
Dec 06 09:18:43 np0005548789.localdomain sshd[108043]: Connection reset by authenticating user root 45.140.17.124 port 59994 [preauth]
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:18:44 np0005548789.localdomain systemd-rc-local-generator[108192]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:18:44 np0005548789.localdomain systemd-sysv-generator[108197]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:18:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12871 DF PROTO=TCP SPT=37808 DPT=9101 SEQ=2036028519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C154000000000001030307) 
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: Stopping iscsid container...
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: libpod-b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.scope: Deactivated successfully.
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: libpod-b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.scope: Consumed 1.024s CPU time.
Dec 06 09:18:44 np0005548789.localdomain podman[108210]: 2025-12-06 09:18:44.516363749 +0000 UTC m=+0.064636163 container died b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git)
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.timer: Deactivated successfully.
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Failed to open /run/systemd/transient/b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: No such file or directory
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4-userdata-shm.mount: Deactivated successfully.
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ef5a2dfca972201470637fe24151a27f03799cbb5a942d988431f305c9ea334c-merged.mount: Deactivated successfully.
Dec 06 09:18:44 np0005548789.localdomain podman[108210]: 2025-12-06 09:18:44.570250301 +0000 UTC m=+0.118522685 container cleanup b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, version=17.1.12, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Dec 06 09:18:44 np0005548789.localdomain podman[108210]: iscsid
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.timer: Failed to open /run/systemd/transient/b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.timer: No such file or directory
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Failed to open /run/systemd/transient/b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: No such file or directory
Dec 06 09:18:44 np0005548789.localdomain podman[108224]: 2025-12-06 09:18:44.596247667 +0000 UTC m=+0.071648787 container cleanup b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, release=1761123044)
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: libpod-conmon-b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.scope: Deactivated successfully.
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.timer: Failed to open /run/systemd/transient/b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.timer: No such file or directory
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: Failed to open /run/systemd/transient/b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4.service: No such file or directory
Dec 06 09:18:44 np0005548789.localdomain podman[108237]: 2025-12-06 09:18:44.697018456 +0000 UTC m=+0.070947815 container cleanup b42304674c227698d01aa8793300474cc86d1ff521bfa05eefa217b5f5fc74a4 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Dec 06 09:18:44 np0005548789.localdomain podman[108237]: iscsid
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: tripleo_iscsid.service: Deactivated successfully.
Dec 06 09:18:44 np0005548789.localdomain systemd[1]: Stopped iscsid container.
Dec 06 09:18:44 np0005548789.localdomain sudo[108167]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:45 np0005548789.localdomain sudo[108338]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdqfzskzpdmyekyqigwhkffwqyqzgihd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012724.8652375-114-176570233151747/AnsiballZ_systemd_service.py
Dec 06 09:18:45 np0005548789.localdomain sudo[108338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:45 np0005548789.localdomain python3.9[108340]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:45 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:18:45 np0005548789.localdomain systemd-rc-local-generator[108363]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:18:45 np0005548789.localdomain systemd-sysv-generator[108368]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:18:45 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:18:45 np0005548789.localdomain systemd[1]: Stopping logrotate_crond container...
Dec 06 09:18:45 np0005548789.localdomain crond[68752]: (CRON) INFO (Shutting down)
Dec 06 09:18:45 np0005548789.localdomain systemd[1]: libpod-04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.scope: Deactivated successfully.
Dec 06 09:18:45 np0005548789.localdomain podman[108381]: 2025-12-06 09:18:45.867149478 +0000 UTC m=+0.073315188 container died 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, container_name=logrotate_crond, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, distribution-scope=public, build-date=2025-11-18T22:49:32Z)
Dec 06 09:18:45 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.timer: Deactivated successfully.
Dec 06 09:18:45 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.
Dec 06 09:18:45 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Failed to open /run/systemd/transient/04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: No such file or directory
Dec 06 09:18:45 np0005548789.localdomain podman[108381]: 2025-12-06 09:18:45.995819893 +0000 UTC m=+0.201985543 container cleanup 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git)
Dec 06 09:18:45 np0005548789.localdomain podman[108381]: logrotate_crond
Dec 06 09:18:46 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.timer: Failed to open /run/systemd/transient/04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.timer: No such file or directory
Dec 06 09:18:46 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Failed to open /run/systemd/transient/04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: No such file or directory
Dec 06 09:18:46 np0005548789.localdomain podman[108394]: 2025-12-06 09:18:46.021864921 +0000 UTC m=+0.154797466 container cleanup 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, container_name=logrotate_crond, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron)
Dec 06 09:18:46 np0005548789.localdomain systemd[1]: libpod-conmon-04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.scope: Deactivated successfully.
Dec 06 09:18:46 np0005548789.localdomain podman[108426]: error opening file `/run/crun/04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc/status`: No such file or directory
Dec 06 09:18:46 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.timer: Failed to open /run/systemd/transient/04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.timer: No such file or directory
Dec 06 09:18:46 np0005548789.localdomain systemd[1]: 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: Failed to open /run/systemd/transient/04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc.service: No such file or directory
Dec 06 09:18:46 np0005548789.localdomain podman[108414]: 2025-12-06 09:18:46.135213297 +0000 UTC m=+0.075290140 container cleanup 04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:18:46 np0005548789.localdomain podman[108414]: logrotate_crond
Dec 06 09:18:46 np0005548789.localdomain systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully.
Dec 06 09:18:46 np0005548789.localdomain systemd[1]: Stopped logrotate_crond container.
Dec 06 09:18:46 np0005548789.localdomain sudo[108338]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:46 np0005548789.localdomain sudo[108517]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwamnehtvbflsjhoegemjoxnqlfegheo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012726.3021889-114-114187479239336/AnsiballZ_systemd_service.py
Dec 06 09:18:46 np0005548789.localdomain sudo[108517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:46 np0005548789.localdomain python3.9[108519]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:46 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:18:46 np0005548789.localdomain systemd-rc-local-generator[108542]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:18:46 np0005548789.localdomain systemd-sysv-generator[108546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-04f6b29afd29691f16a28d2049f2bd2128d6ac4e86c833f5e34438bbec3e10bc-userdata-shm.mount: Deactivated successfully.
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cd6425452938e99a947d98ed440c416f97c1a47fc1a973380479b4612f15ab3d-merged.mount: Deactivated successfully.
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: Stopping metrics_qdr container...
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: tmp-crun.VN62HF.mount: Deactivated successfully.
Dec 06 09:18:47 np0005548789.localdomain kernel: qdrouterd[54519]: segfault at 0 ip 00007fb4f238b7cb sp 00007ffe18e60290 error 4 in libc.so.6[7fb4f2328000+175000]
Dec 06 09:18:47 np0005548789.localdomain kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: Created slice Slice /system/systemd-coredump.
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: Started Process Core Dump (PID 108573/UID 0).
Dec 06 09:18:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12873 DF PROTO=TCP SPT=37808 DPT=9101 SEQ=2036028519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C15FF00000000001030307) 
Dec 06 09:18:47 np0005548789.localdomain systemd-coredump[108574]: Resource limits disable core dumping for process 54519 (qdrouterd).
Dec 06 09:18:47 np0005548789.localdomain systemd-coredump[108574]: Process 54519 (qdrouterd) of user 42465 dumped core.
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: systemd-coredump@0-108573-0.service: Deactivated successfully.
Dec 06 09:18:47 np0005548789.localdomain podman[108560]: 2025-12-06 09:18:47.451626293 +0000 UTC m=+0.238700660 container died 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, distribution-scope=public)
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: libpod-203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.scope: Deactivated successfully.
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: libpod-203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.scope: Consumed 27.794s CPU time.
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.timer: Deactivated successfully.
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Failed to open /run/systemd/transient/203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: No such file or directory
Dec 06 09:18:47 np0005548789.localdomain podman[108560]: 2025-12-06 09:18:47.512706245 +0000 UTC m=+0.299780622 container cleanup 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z)
Dec 06 09:18:47 np0005548789.localdomain podman[108560]: metrics_qdr
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.timer: Failed to open /run/systemd/transient/203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.timer: No such file or directory
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Failed to open /run/systemd/transient/203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: No such file or directory
Dec 06 09:18:47 np0005548789.localdomain podman[108578]: 2025-12-06 09:18:47.575883592 +0000 UTC m=+0.111450288 container cleanup 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
container_name=metrics_qdr, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, release=1761123044, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1)
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: libpod-conmon-203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.scope: Deactivated successfully.
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.timer: Failed to open /run/systemd/transient/203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.timer: No such file or directory
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: Failed to open /run/systemd/transient/203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c.service: No such file or directory
Dec 06 09:18:47 np0005548789.localdomain podman[108592]: 2025-12-06 09:18:47.668518842 +0000 UTC m=+0.066571713 container cleanup 203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'e8f60832f8f2382eeceefcaaff307d45'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:18:47 np0005548789.localdomain podman[108592]: metrics_qdr
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'.
Dec 06 09:18:47 np0005548789.localdomain systemd[1]: Stopped metrics_qdr container.
Dec 06 09:18:47 np0005548789.localdomain sudo[108517]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:48 np0005548789.localdomain sudo[108693]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xttruyghtfdnbhwgdotywnnplqwszvka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012727.831048-114-99754640952207/AnsiballZ_systemd_service.py
Dec 06 09:18:48 np0005548789.localdomain sudo[108693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-beaf327340ccd7215a759765519263eed11e8999b460cf785f7dbab3207ce8ee-merged.mount: Deactivated successfully.
Dec 06 09:18:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-203277eef6de030daaca698f56ad8c5cc24d6633f7bc188a739eccc264c31a0c-userdata-shm.mount: Deactivated successfully.
Dec 06 09:18:48 np0005548789.localdomain python3.9[108695]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:48 np0005548789.localdomain sudo[108693]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:48 np0005548789.localdomain sudo[108786]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqeelsrlwdchjreivvsjhfvfpbvchntw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012728.597864-114-34606334262654/AnsiballZ_systemd_service.py
Dec 06 09:18:48 np0005548789.localdomain sudo[108786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:49 np0005548789.localdomain python3.9[108788]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:49 np0005548789.localdomain sudo[108786]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:49 np0005548789.localdomain sshd[108849]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:18:49 np0005548789.localdomain sudo[108881]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gskkrawyqnsjrybuowlyckvzgrzxlucd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012729.3106263-114-277812908777508/AnsiballZ_systemd_service.py
Dec 06 09:18:49 np0005548789.localdomain sudo[108881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25244 DF PROTO=TCP SPT=57414 DPT=9105 SEQ=1289446335 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1692F0000000001030307) 
Dec 06 09:18:49 np0005548789.localdomain python3.9[108883]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:50 np0005548789.localdomain sudo[108881]: pam_unix(sudo:session): session closed for user root
Dec 06 09:18:51 np0005548789.localdomain sudo[108974]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppjudjslqterpmtxnenmuslfjcfmltah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012731.0058763-114-214249403271047/AnsiballZ_systemd_service.py
Dec 06 09:18:51 np0005548789.localdomain sudo[108974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:18:51 np0005548789.localdomain python3.9[108976]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:18:51 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:18:51 np0005548789.localdomain systemd-rc-local-generator[109003]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:18:51 np0005548789.localdomain systemd-sysv-generator[109008]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:18:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:18:52 np0005548789.localdomain systemd[1]: Stopping nova_compute container...
Dec 06 09:18:52 np0005548789.localdomain systemd[1]: tmp-crun.9XsxYW.mount: Deactivated successfully.
Dec 06 09:18:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25245 DF PROTO=TCP SPT=57414 DPT=9105 SEQ=1289446335 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C178EF0000000001030307) 
Dec 06 09:18:55 np0005548789.localdomain sshd[108849]: Connection closed by 45.78.222.162 port 42876 [preauth]
Dec 06 09:18:57 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35040 DF PROTO=TCP SPT=33504 DPT=9882 SEQ=1702904302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C185AF0000000001030307) 
Dec 06 09:18:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12875 DF PROTO=TCP SPT=37808 DPT=9101 SEQ=2036028519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C18FF00000000001030307) 
Dec 06 09:19:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:19:00 np0005548789.localdomain systemd[1]: tmp-crun.z7cxYs.mount: Deactivated successfully.
Dec 06 09:19:00 np0005548789.localdomain podman[109029]: 2025-12-06 09:19:00.396714151 +0000 UTC m=+0.298264755 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_migration_target, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:19:00 np0005548789.localdomain podman[109029]: 2025-12-06 09:19:00.774168971 +0000 UTC m=+0.675719535 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4)
Dec 06 09:19:00 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:19:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25246 DF PROTO=TCP SPT=57414 DPT=9105 SEQ=1289446335 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C199EF0000000001030307) 
Dec 06 09:19:03 np0005548789.localdomain sudo[109054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:19:03 np0005548789.localdomain sudo[109054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:19:03 np0005548789.localdomain sudo[109054]: pam_unix(sudo:session): session closed for user root
Dec 06 09:19:03 np0005548789.localdomain sudo[109069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:19:03 np0005548789.localdomain sudo[109069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:19:04 np0005548789.localdomain sudo[109069]: pam_unix(sudo:session): session closed for user root
Dec 06 09:19:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:19:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:19:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:19:04 np0005548789.localdomain podman[109116]: Error: container 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 is not running
Dec 06 09:19:04 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Main process exited, code=exited, status=125/n/a
Dec 06 09:19:04 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed with result 'exit-code'.
Dec 06 09:19:04 np0005548789.localdomain podman[109117]: 2025-12-06 09:19:04.682505776 +0000 UTC m=+0.085679427 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:19:04 np0005548789.localdomain podman[109117]: 2025-12-06 09:19:04.701181379 +0000 UTC m=+0.104355100 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=)
Dec 06 09:19:04 np0005548789.localdomain podman[109117]: unhealthy
Dec 06 09:19:04 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:19:04 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:19:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41233 DF PROTO=TCP SPT=37402 DPT=9102 SEQ=1076870088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1A3EF0000000001030307) 
Dec 06 09:19:04 np0005548789.localdomain podman[109115]: 2025-12-06 09:19:04.786953969 +0000 UTC m=+0.194690520 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git)
Dec 06 09:19:04 np0005548789.localdomain podman[109115]: 2025-12-06 09:19:04.800927577 +0000 UTC m=+0.208664128 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git)
Dec 06 09:19:04 np0005548789.localdomain podman[109115]: unhealthy
Dec 06 09:19:04 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:19:04 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:19:05 np0005548789.localdomain sudo[109164]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:19:05 np0005548789.localdomain sudo[109164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:19:05 np0005548789.localdomain sudo[109164]: pam_unix(sudo:session): session closed for user root
Dec 06 09:19:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42597 DF PROTO=TCP SPT=58746 DPT=9102 SEQ=2208224687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1AFEF0000000001030307) 
Dec 06 09:19:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41235 DF PROTO=TCP SPT=37402 DPT=9102 SEQ=1076870088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1BBAF0000000001030307) 
Dec 06 09:19:14 np0005548789.localdomain sshd[109179]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:19:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3911 DF PROTO=TCP SPT=59048 DPT=9101 SEQ=357860565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1C9300000000001030307) 
Dec 06 09:19:15 np0005548789.localdomain sshd[109179]: Received disconnect from 103.234.151.178 port 22596:11: Bye Bye [preauth]
Dec 06 09:19:15 np0005548789.localdomain sshd[109179]: Disconnected from authenticating user root 103.234.151.178 port 22596 [preauth]
Dec 06 09:19:16 np0005548789.localdomain sshd[109181]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:19:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3913 DF PROTO=TCP SPT=59048 DPT=9101 SEQ=357860565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1D5300000000001030307) 
Dec 06 09:19:17 np0005548789.localdomain sshd[109181]: Received disconnect from 81.192.46.35 port 39752:11: Bye Bye [preauth]
Dec 06 09:19:17 np0005548789.localdomain sshd[109181]: Disconnected from authenticating user root 81.192.46.35 port 39752 [preauth]
Dec 06 09:19:18 np0005548789.localdomain sshd[109183]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:19:18 np0005548789.localdomain sshd[109183]: Received disconnect from 12.156.67.18 port 54460:11: Bye Bye [preauth]
Dec 06 09:19:18 np0005548789.localdomain sshd[109183]: Disconnected from authenticating user root 12.156.67.18 port 54460 [preauth]
Dec 06 09:19:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55718 DF PROTO=TCP SPT=40516 DPT=9105 SEQ=4098139791 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1DE6F0000000001030307) 
Dec 06 09:19:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55719 DF PROTO=TCP SPT=40516 DPT=9105 SEQ=4098139791 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1EE2F0000000001030307) 
Dec 06 09:19:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35043 DF PROTO=TCP SPT=33504 DPT=9882 SEQ=1702904302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C1F5EF0000000001030307) 
Dec 06 09:19:29 np0005548789.localdomain sshd[109185]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:19:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3915 DF PROTO=TCP SPT=59048 DPT=9101 SEQ=357860565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C205EF0000000001030307) 
Dec 06 09:19:29 np0005548789.localdomain sshd[109187]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:19:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:19:30 np0005548789.localdomain podman[109189]: 2025-12-06 09:19:30.912949779 +0000 UTC m=+0.076986901 container health_status 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, version=17.1.12)
Dec 06 09:19:31 np0005548789.localdomain sshd[109185]: Received disconnect from 64.227.156.63 port 43624:11: Bye Bye [preauth]
Dec 06 09:19:31 np0005548789.localdomain sshd[109185]: Disconnected from authenticating user root 64.227.156.63 port 43624 [preauth]
Dec 06 09:19:31 np0005548789.localdomain podman[109189]: 2025-12-06 09:19:31.294618399 +0000 UTC m=+0.458655511 container exec_died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, container_name=nova_migration_target, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute)
Dec 06 09:19:31 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Deactivated successfully.
Dec 06 09:19:31 np0005548789.localdomain sshd[109187]: Received disconnect from 118.193.38.207 port 52364:11: Bye Bye [preauth]
Dec 06 09:19:31 np0005548789.localdomain sshd[109187]: Disconnected from authenticating user root 118.193.38.207 port 52364 [preauth]
Dec 06 09:19:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55720 DF PROTO=TCP SPT=40516 DPT=9105 SEQ=4098139791 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C20DF00000000001030307) 
Dec 06 09:19:34 np0005548789.localdomain podman[109016]: time="2025-12-06T09:19:34Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL"
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: tmp-crun.IStCZ5.mount: Deactivated successfully.
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: session-c11.scope: Deactivated successfully.
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: libpod-41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.scope: Deactivated successfully.
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: libpod-41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.scope: Consumed 36.098s CPU time.
Dec 06 09:19:34 np0005548789.localdomain podman[109016]: 2025-12-06 09:19:34.185366949 +0000 UTC m=+42.111065654 container died 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, release=1761123044, url=https://www.redhat.com, container_name=nova_compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.timer: Deactivated successfully.
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed to open /run/systemd/transient/41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: No such file or directory
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4fa26257b93de11d5a1a515bc1294a83a2d0558581107701a6e94f33d44fcb5e-merged.mount: Deactivated successfully.
Dec 06 09:19:34 np0005548789.localdomain podman[109016]: 2025-12-06 09:19:34.246278727 +0000 UTC m=+42.171977412 container cleanup 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Dec 06 09:19:34 np0005548789.localdomain podman[109016]: nova_compute
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.timer: Failed to open /run/systemd/transient/41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.timer: No such file or directory
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed to open /run/systemd/transient/41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: No such file or directory
Dec 06 09:19:34 np0005548789.localdomain podman[109213]: 2025-12-06 09:19:34.264586357 +0000 UTC m=+0.065228050 container cleanup 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=nova_compute, release=1761123044, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: libpod-conmon-41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.scope: Deactivated successfully.
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.timer: Failed to open /run/systemd/transient/41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.timer: No such file or directory
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: Failed to open /run/systemd/transient/41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007.service: No such file or directory
Dec 06 09:19:34 np0005548789.localdomain podman[109225]: 2025-12-06 09:19:34.34945265 +0000 UTC m=+0.057562006 container cleanup 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:19:34 np0005548789.localdomain podman[109225]: nova_compute
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: tripleo_nova_compute.service: Deactivated successfully.
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: Stopped nova_compute container.
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: tripleo_nova_compute.service: Consumed 1.075s CPU time, no IO.
Dec 06 09:19:34 np0005548789.localdomain sudo[108974]: pam_unix(sudo:session): session closed for user root
Dec 06 09:19:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20151 DF PROTO=TCP SPT=45332 DPT=9102 SEQ=2909602329 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C218EF0000000001030307) 
Dec 06 09:19:34 np0005548789.localdomain sudo[109328]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hijdljxdyokzjnnrskmqewhvqvngcrif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012774.531551-114-241120997131655/AnsiballZ_systemd_service.py
Dec 06 09:19:34 np0005548789.localdomain sudo[109328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:19:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:19:34 np0005548789.localdomain podman[109331]: 2025-12-06 09:19:34.916580615 +0000 UTC m=+0.090290428 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:19:34 np0005548789.localdomain podman[109332]: 2025-12-06 09:19:34.963109522 +0000 UTC m=+0.135640880 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, vcs-type=git)
Dec 06 09:19:34 np0005548789.localdomain podman[109331]: 2025-12-06 09:19:34.990322716 +0000 UTC m=+0.164032549 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, vcs-type=git)
Dec 06 09:19:34 np0005548789.localdomain podman[109331]: unhealthy
Dec 06 09:19:35 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:19:35 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:19:35 np0005548789.localdomain podman[109332]: 2025-12-06 09:19:35.002472259 +0000 UTC m=+0.175003587 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red 
Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, container_name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, vcs-type=git)
Dec 06 09:19:35 np0005548789.localdomain podman[109332]: unhealthy
Dec 06 09:19:35 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:19:35 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:19:35 np0005548789.localdomain python3.9[109330]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:19:36 np0005548789.localdomain sshd[109374]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:19:36 np0005548789.localdomain systemd-rc-local-generator[109396]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:19:36 np0005548789.localdomain systemd-sysv-generator[109399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: Stopping nova_migration_target container...
Dec 06 09:19:36 np0005548789.localdomain sshd[69100]: Received signal 15; terminating.
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: tmp-crun.UaOSug.mount: Deactivated successfully.
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: libpod-38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.scope: Deactivated successfully.
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: libpod-38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.scope: Consumed 33.642s CPU time.
Dec 06 09:19:36 np0005548789.localdomain podman[109413]: 2025-12-06 09:19:36.662342474 +0000 UTC m=+0.082212841 container died 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, container_name=nova_migration_target, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute)
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.timer: Deactivated successfully.
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Failed to open /run/systemd/transient/38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: No such file or directory
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b-userdata-shm.mount: Deactivated successfully.
Dec 06 09:19:36 np0005548789.localdomain podman[109413]: 2025-12-06 09:19:36.72191383 +0000 UTC m=+0.141784147 container cleanup 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, version=17.1.12, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, architecture=x86_64, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true)
Dec 06 09:19:36 np0005548789.localdomain podman[109413]: nova_migration_target
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.timer: Failed to open /run/systemd/transient/38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.timer: No such file or directory
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Failed to open /run/systemd/transient/38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: No such file or directory
Dec 06 09:19:36 np0005548789.localdomain podman[109426]: 2025-12-06 09:19:36.751027293 +0000 UTC m=+0.076991912 container cleanup 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step4, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: libpod-conmon-38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.scope: Deactivated successfully.
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.timer: Failed to open /run/systemd/transient/38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.timer: No such file or directory
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: Failed to open /run/systemd/transient/38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b.service: No such file or directory
Dec 06 09:19:36 np0005548789.localdomain podman[109442]: 2025-12-06 09:19:36.855639789 +0000 UTC m=+0.067304584 container cleanup 38053a2512c651e0a8d3b0f4b541280b1332690a87434e686da8738a0222dd0b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Dec 06 09:19:36 np0005548789.localdomain podman[109442]: nova_migration_target
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully.
Dec 06 09:19:36 np0005548789.localdomain systemd[1]: Stopped nova_migration_target container.
Dec 06 09:19:36 np0005548789.localdomain sudo[109328]: pam_unix(sudo:session): session closed for user root
Dec 06 09:19:37 np0005548789.localdomain sudo[109543]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agwpwyjhhcfvipdebgadjvdyquftugdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012777.0116653-114-77447859350094/AnsiballZ_systemd_service.py
Dec 06 09:19:37 np0005548789.localdomain sudo[109543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:19:37 np0005548789.localdomain python3.9[109545]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:19:37 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:19:37 np0005548789.localdomain systemd-sysv-generator[109571]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:19:37 np0005548789.localdomain systemd-rc-local-generator[109567]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:19:37 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:19:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-16e4342af8bf5958b38bc295034feee0dd1522d1c796e48d3acbadb880cc49ff-merged.mount: Deactivated successfully.
Dec 06 09:19:37 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:19:37 np0005548789.localdomain systemd[1]: Stopping nova_virtlogd_wrapper container...
Dec 06 09:19:37 np0005548789.localdomain recover_tripleo_nova_virtqemud[109588]: 61814
Dec 06 09:19:37 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 06 09:19:37 np0005548789.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 06 09:19:38 np0005548789.localdomain systemd[1]: libpod-c55a3fa9476956f37d3ecfbe7a06aced3ea8b321c934918e4f504b9cf2d8fc82.scope: Deactivated successfully.
Dec 06 09:19:38 np0005548789.localdomain podman[109587]: 2025-12-06 09:19:38.018389155 +0000 UTC m=+0.050608483 container stop c55a3fa9476956f37d3ecfbe7a06aced3ea8b321c934918e4f504b9cf2d8fc82 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, build-date=2025-11-19T00:35:22Z, container_name=nova_virtlogd_wrapper, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible)
Dec 06 09:19:38 np0005548789.localdomain podman[109587]: 2025-12-06 09:19:38.052108538 +0000 UTC m=+0.084327906 container died c55a3fa9476956f37d3ecfbe7a06aced3ea8b321c934918e4f504b9cf2d8fc82 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, vcs-type=git, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtlogd_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:19:38 np0005548789.localdomain podman[109587]: 2025-12-06 09:19:38.089813175 +0000 UTC m=+0.122032503 container cleanup c55a3fa9476956f37d3ecfbe7a06aced3ea8b321c934918e4f504b9cf2d8fc82 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, container_name=nova_virtlogd_wrapper, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:19:38 np0005548789.localdomain podman[109587]: nova_virtlogd_wrapper
Dec 06 09:19:38 np0005548789.localdomain sshd[109374]: Received disconnect from 103.192.152.59 port 59832:11: Bye Bye [preauth]
Dec 06 09:19:38 np0005548789.localdomain sshd[109374]: Disconnected from authenticating user root 103.192.152.59 port 59832 [preauth]
Dec 06 09:19:38 np0005548789.localdomain podman[109603]: 2025-12-06 09:19:38.154547429 +0000 UTC m=+0.119989379 container cleanup c55a3fa9476956f37d3ecfbe7a06aced3ea8b321c934918e4f504b9cf2d8fc82 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true)
Dec 06 09:19:38 np0005548789.localdomain systemd[1]: tmp-crun.2g2Tyh.mount: Deactivated successfully.
Dec 06 09:19:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ef97a8ce410352459d2cd2d839f0c4f3d007fd27d6a886085c43dfe3ff9df394-merged.mount: Deactivated successfully.
Dec 06 09:19:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c55a3fa9476956f37d3ecfbe7a06aced3ea8b321c934918e4f504b9cf2d8fc82-userdata-shm.mount: Deactivated successfully.
Dec 06 09:19:39 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60842 DF PROTO=TCP SPT=46442 DPT=9882 SEQ=1051052648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C22BF00000000001030307) 
Dec 06 09:19:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20153 DF PROTO=TCP SPT=45332 DPT=9102 SEQ=2909602329 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C230AF0000000001030307) 
Dec 06 09:19:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45909 DF PROTO=TCP SPT=52054 DPT=9101 SEQ=2336290836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C23E600000000001030307) 
Dec 06 09:19:44 np0005548789.localdomain systemd[1]: Stopping User Manager for UID 0...
Dec 06 09:19:44 np0005548789.localdomain systemd[84400]: Activating special unit Exit the Session...
Dec 06 09:19:44 np0005548789.localdomain systemd[84400]: Removed slice User Background Tasks Slice.
Dec 06 09:19:44 np0005548789.localdomain systemd[84400]: Stopped target Main User Target.
Dec 06 09:19:44 np0005548789.localdomain systemd[84400]: Stopped target Basic System.
Dec 06 09:19:44 np0005548789.localdomain systemd[84400]: Stopped target Paths.
Dec 06 09:19:44 np0005548789.localdomain systemd[84400]: Stopped target Sockets.
Dec 06 09:19:44 np0005548789.localdomain systemd[84400]: Stopped target Timers.
Dec 06 09:19:44 np0005548789.localdomain systemd[84400]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 09:19:44 np0005548789.localdomain systemd[84400]: Closed D-Bus User Message Bus Socket.
Dec 06 09:19:44 np0005548789.localdomain systemd[84400]: Stopped Create User's Volatile Files and Directories.
Dec 06 09:19:44 np0005548789.localdomain systemd[84400]: Removed slice User Application Slice.
Dec 06 09:19:44 np0005548789.localdomain systemd[84400]: Reached target Shutdown.
Dec 06 09:19:44 np0005548789.localdomain systemd[84400]: Finished Exit the Session.
Dec 06 09:19:44 np0005548789.localdomain systemd[84400]: Reached target Exit the Session.
Dec 06 09:19:44 np0005548789.localdomain systemd[1]: user@0.service: Deactivated successfully.
Dec 06 09:19:44 np0005548789.localdomain systemd[1]: Stopped User Manager for UID 0.
Dec 06 09:19:44 np0005548789.localdomain systemd[1]: user@0.service: Consumed 3.449s CPU time, no IO.
Dec 06 09:19:44 np0005548789.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 06 09:19:44 np0005548789.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 06 09:19:44 np0005548789.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 06 09:19:44 np0005548789.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 06 09:19:44 np0005548789.localdomain systemd[1]: Removed slice User Slice of UID 0.
Dec 06 09:19:44 np0005548789.localdomain systemd[1]: user-0.slice: Consumed 4.412s CPU time.
Dec 06 09:19:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45911 DF PROTO=TCP SPT=52054 DPT=9101 SEQ=2336290836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C24A6F0000000001030307) 
Dec 06 09:19:49 np0005548789.localdomain sshd[109619]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:19:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15920 DF PROTO=TCP SPT=47802 DPT=9105 SEQ=1421469768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2536F0000000001030307) 
Dec 06 09:19:51 np0005548789.localdomain sshd[109619]: Received disconnect from 103.157.25.60 port 57758:11: Bye Bye [preauth]
Dec 06 09:19:51 np0005548789.localdomain sshd[109619]: Disconnected from authenticating user root 103.157.25.60 port 57758 [preauth]
Dec 06 09:19:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15921 DF PROTO=TCP SPT=47802 DPT=9105 SEQ=1421469768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2632F0000000001030307) 
Dec 06 09:19:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60843 DF PROTO=TCP SPT=46442 DPT=9882 SEQ=1051052648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C26BEF0000000001030307) 
Dec 06 09:19:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45913 DF PROTO=TCP SPT=52054 DPT=9101 SEQ=2336290836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C279EF0000000001030307) 
Dec 06 09:20:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15922 DF PROTO=TCP SPT=47802 DPT=9105 SEQ=1421469768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C283EF0000000001030307) 
Dec 06 09:20:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55772 DF PROTO=TCP SPT=59922 DPT=9102 SEQ=2564949314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C28E2F0000000001030307) 
Dec 06 09:20:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:20:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:20:05 np0005548789.localdomain podman[109621]: 2025-12-06 09:20:05.188972662 +0000 UTC m=+0.098596924 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1761123044)
Dec 06 09:20:05 np0005548789.localdomain podman[109622]: 2025-12-06 09:20:05.232683182 +0000 UTC m=+0.139605661 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn)
Dec 06 09:20:05 np0005548789.localdomain podman[109622]: 2025-12-06 09:20:05.253090527 +0000 UTC m=+0.160013036 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible)
Dec 06 09:20:05 np0005548789.localdomain podman[109622]: unhealthy
Dec 06 09:20:05 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:20:05 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:20:05 np0005548789.localdomain sudo[109650]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:20:05 np0005548789.localdomain sudo[109650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:20:05 np0005548789.localdomain sudo[109650]: pam_unix(sudo:session): session closed for user root
Dec 06 09:20:05 np0005548789.localdomain podman[109621]: 2025-12-06 09:20:05.308808386 +0000 UTC m=+0.218432668 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 09:20:05 np0005548789.localdomain podman[109621]: unhealthy
Dec 06 09:20:05 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:20:05 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:20:05 np0005548789.localdomain sudo[109675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:20:05 np0005548789.localdomain sudo[109675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:20:06 np0005548789.localdomain sudo[109675]: pam_unix(sudo:session): session closed for user root
Dec 06 09:20:06 np0005548789.localdomain sudo[109722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:20:06 np0005548789.localdomain sudo[109722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:20:06 np0005548789.localdomain sudo[109722]: pam_unix(sudo:session): session closed for user root
Dec 06 09:20:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41238 DF PROTO=TCP SPT=37402 DPT=9102 SEQ=1076870088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C299EF0000000001030307) 
Dec 06 09:20:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:20:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.1 total, 600.0 interval
                                                          Cumulative writes: 5761 writes, 25K keys, 5761 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5761 writes, 760 syncs, 7.58 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:20:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55774 DF PROTO=TCP SPT=59922 DPT=9102 SEQ=2564949314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2A6180000000001030307) 
Dec 06 09:20:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:20:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.2 total, 600.0 interval
                                                          Cumulative writes: 4879 writes, 21K keys, 4879 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4879 writes, 669 syncs, 7.29 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:20:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57534 DF PROTO=TCP SPT=48814 DPT=9101 SEQ=595854149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2B3900000000001030307) 
Dec 06 09:20:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57536 DF PROTO=TCP SPT=48814 DPT=9101 SEQ=595854149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2BFB00000000001030307) 
Dec 06 09:20:17 np0005548789.localdomain sshd[109737]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:20:18 np0005548789.localdomain sshd[109737]: Invalid user ubuntu from 92.118.39.95 port 38780
Dec 06 09:20:18 np0005548789.localdomain sshd[109737]: Connection closed by invalid user ubuntu 92.118.39.95 port 38780 [preauth]
Dec 06 09:20:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30377 DF PROTO=TCP SPT=43298 DPT=9105 SEQ=2476358954 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2C8AF0000000001030307) 
Dec 06 09:20:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30378 DF PROTO=TCP SPT=43298 DPT=9105 SEQ=2476358954 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2D8700000000001030307) 
Dec 06 09:20:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31339 DF PROTO=TCP SPT=46294 DPT=9882 SEQ=296207786 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2DFEF0000000001030307) 
Dec 06 09:20:26 np0005548789.localdomain sshd[109739]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:20:27 np0005548789.localdomain sshd[109739]: Received disconnect from 81.192.46.35 port 38074:11: Bye Bye [preauth]
Dec 06 09:20:27 np0005548789.localdomain sshd[109739]: Disconnected from authenticating user root 81.192.46.35 port 38074 [preauth]
Dec 06 09:20:27 np0005548789.localdomain sshd[109741]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:20:28 np0005548789.localdomain sshd[109741]: Received disconnect from 12.156.67.18 port 60092:11: Bye Bye [preauth]
Dec 06 09:20:28 np0005548789.localdomain sshd[109741]: Disconnected from authenticating user root 12.156.67.18 port 60092 [preauth]
Dec 06 09:20:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57538 DF PROTO=TCP SPT=48814 DPT=9101 SEQ=595854149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2EFEF0000000001030307) 
Dec 06 09:20:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30379 DF PROTO=TCP SPT=43298 DPT=9105 SEQ=2476358954 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C2F7EF0000000001030307) 
Dec 06 09:20:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30919 DF PROTO=TCP SPT=44688 DPT=9102 SEQ=3698087328 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C303700000000001030307) 
Dec 06 09:20:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:20:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:20:35 np0005548789.localdomain podman[109744]: 2025-12-06 09:20:35.68440968 +0000 UTC m=+0.088747322 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, container_name=ovn_metadata_agent, architecture=x86_64, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:20:35 np0005548789.localdomain podman[109744]: 2025-12-06 09:20:35.698983926 +0000 UTC m=+0.103321568 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, url=https://www.redhat.com)
Dec 06 09:20:35 np0005548789.localdomain podman[109744]: unhealthy
Dec 06 09:20:35 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:20:35 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:20:35 np0005548789.localdomain podman[109743]: 2025-12-06 09:20:35.785945792 +0000 UTC m=+0.190238523 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ovn_controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:20:35 np0005548789.localdomain podman[109743]: 2025-12-06 09:20:35.831191239 +0000 UTC m=+0.235483920 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, vcs-type=git)
Dec 06 09:20:35 np0005548789.localdomain podman[109743]: unhealthy
Dec 06 09:20:35 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:20:35 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:20:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20156 DF PROTO=TCP SPT=45332 DPT=9102 SEQ=2909602329 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C30FEF0000000001030307) 
Dec 06 09:20:38 np0005548789.localdomain sshd[109782]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:20:40 np0005548789.localdomain sshd[109782]: Received disconnect from 103.234.151.178 port 48722:11: Bye Bye [preauth]
Dec 06 09:20:40 np0005548789.localdomain sshd[109782]: Disconnected from authenticating user root 103.234.151.178 port 48722 [preauth]
Dec 06 09:20:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30921 DF PROTO=TCP SPT=44688 DPT=9102 SEQ=3698087328 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C31B2F0000000001030307) 
Dec 06 09:20:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33984 DF PROTO=TCP SPT=53444 DPT=9101 SEQ=497035705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C328C00000000001030307) 
Dec 06 09:20:45 np0005548789.localdomain sshd[109784]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:20:46 np0005548789.localdomain sshd[109784]: Received disconnect from 118.193.38.207 port 60384:11: Bye Bye [preauth]
Dec 06 09:20:46 np0005548789.localdomain sshd[109784]: Disconnected from authenticating user root 118.193.38.207 port 60384 [preauth]
Dec 06 09:20:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33986 DF PROTO=TCP SPT=53444 DPT=9101 SEQ=497035705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C334AF0000000001030307) 
Dec 06 09:20:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14537 DF PROTO=TCP SPT=57488 DPT=9105 SEQ=3212763973 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C33DF00000000001030307) 
Dec 06 09:20:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14538 DF PROTO=TCP SPT=57488 DPT=9105 SEQ=3212763973 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C34DAF0000000001030307) 
Dec 06 09:20:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41178 DF PROTO=TCP SPT=56252 DPT=9882 SEQ=3025860389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C355EF0000000001030307) 
Dec 06 09:20:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33988 DF PROTO=TCP SPT=53444 DPT=9101 SEQ=497035705 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C363F00000000001030307) 
Dec 06 09:21:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14539 DF PROTO=TCP SPT=57488 DPT=9105 SEQ=3212763973 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C36DEF0000000001030307) 
Dec 06 09:21:02 np0005548789.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing.
Dec 06 09:21:02 np0005548789.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 61031 (conmon) with signal SIGKILL.
Dec 06 09:21:02 np0005548789.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL
Dec 06 09:21:02 np0005548789.localdomain systemd[1]: libpod-conmon-c55a3fa9476956f37d3ecfbe7a06aced3ea8b321c934918e4f504b9cf2d8fc82.scope: Deactivated successfully.
Dec 06 09:21:02 np0005548789.localdomain podman[109798]: error opening file `/run/crun/c55a3fa9476956f37d3ecfbe7a06aced3ea8b321c934918e4f504b9cf2d8fc82/status`: No such file or directory
Dec 06 09:21:02 np0005548789.localdomain podman[109786]: 2025-12-06 09:21:02.384332619 +0000 UTC m=+0.040545184 container cleanup c55a3fa9476956f37d3ecfbe7a06aced3ea8b321c934918e4f504b9cf2d8fc82 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtlogd_wrapper, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 06 09:21:02 np0005548789.localdomain podman[109786]: nova_virtlogd_wrapper
Dec 06 09:21:02 np0005548789.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'.
Dec 06 09:21:02 np0005548789.localdomain systemd[1]: Stopped nova_virtlogd_wrapper container.
Dec 06 09:21:02 np0005548789.localdomain sudo[109543]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:02 np0005548789.localdomain sudo[109889]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gimbbwguqjwlsuqxghoysotieyitniba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012862.5305955-114-251106968989525/AnsiballZ_systemd_service.py
Dec 06 09:21:02 np0005548789.localdomain sudo[109889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:21:03 np0005548789.localdomain python3.9[109891]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:21:03 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:21:03 np0005548789.localdomain systemd-rc-local-generator[109918]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:21:03 np0005548789.localdomain systemd-sysv-generator[109921]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:21:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:21:03 np0005548789.localdomain systemd[1]: Stopping nova_virtnodedevd container...
Dec 06 09:21:03 np0005548789.localdomain systemd[1]: libpod-77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5.scope: Deactivated successfully.
Dec 06 09:21:03 np0005548789.localdomain systemd[1]: libpod-77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5.scope: Consumed 1.535s CPU time.
Dec 06 09:21:03 np0005548789.localdomain podman[109932]: 2025-12-06 09:21:03.557459693 +0000 UTC m=+0.077343973 container died 77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtnodedevd, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 09:21:03 np0005548789.localdomain systemd[1]: tmp-crun.wsNh1F.mount: Deactivated successfully.
Dec 06 09:21:03 np0005548789.localdomain podman[109932]: 2025-12-06 09:21:03.592867268 +0000 UTC m=+0.112751518 container cleanup 77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, container_name=nova_virtnodedevd, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, vendor=Red Hat, Inc., vcs-type=git)
Dec 06 09:21:03 np0005548789.localdomain podman[109932]: nova_virtnodedevd
Dec 06 09:21:03 np0005548789.localdomain podman[109946]: 2025-12-06 09:21:03.629271494 +0000 UTC m=+0.062705943 container cleanup 77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtnodedevd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step3, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Dec 06 09:21:03 np0005548789.localdomain systemd[1]: libpod-conmon-77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5.scope: Deactivated successfully.
Dec 06 09:21:03 np0005548789.localdomain podman[109975]: error opening file `/run/crun/77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5/status`: No such file or directory
Dec 06 09:21:03 np0005548789.localdomain podman[109962]: 2025-12-06 09:21:03.719456908 +0000 UTC m=+0.056140332 container cleanup 77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, container_name=nova_virtnodedevd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, config_id=tripleo_step3, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1)
Dec 06 09:21:03 np0005548789.localdomain podman[109962]: nova_virtnodedevd
Dec 06 09:21:03 np0005548789.localdomain systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully.
Dec 06 09:21:03 np0005548789.localdomain systemd[1]: Stopped nova_virtnodedevd container.
Dec 06 09:21:03 np0005548789.localdomain sudo[109889]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:04 np0005548789.localdomain sudo[110066]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jheassjffpzuaajdryteodlwtfoilzqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012864.081343-114-18855967148644/AnsiballZ_systemd_service.py
Dec 06 09:21:04 np0005548789.localdomain sudo[110066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:21:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-29aef16efd0d0d4913740c423de7a8c374d7bce415829f4c5401764f17811d20-merged.mount: Deactivated successfully.
Dec 06 09:21:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5-userdata-shm.mount: Deactivated successfully.
Dec 06 09:21:04 np0005548789.localdomain python3.9[110068]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:21:04 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:21:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39352 DF PROTO=TCP SPT=36072 DPT=9102 SEQ=2593195835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C378AF0000000001030307) 
Dec 06 09:21:04 np0005548789.localdomain systemd-rc-local-generator[110097]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:21:04 np0005548789.localdomain systemd-sysv-generator[110101]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:21:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:21:05 np0005548789.localdomain systemd[1]: Stopping nova_virtproxyd container...
Dec 06 09:21:05 np0005548789.localdomain systemd[1]: libpod-abf33a7ce64d174f5aeca10ae9ef2b118248dbae4e0fd3e1c43527aa9d26cefa.scope: Deactivated successfully.
Dec 06 09:21:05 np0005548789.localdomain podman[110109]: 2025-12-06 09:21:05.170591175 +0000 UTC m=+0.086522844 container died abf33a7ce64d174f5aeca10ae9ef2b118248dbae4e0fd3e1c43527aa9d26cefa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 09:21:05 np0005548789.localdomain podman[110109]: 2025-12-06 09:21:05.219592337 +0000 UTC m=+0.135524006 container cleanup abf33a7ce64d174f5aeca10ae9ef2b118248dbae4e0fd3e1c43527aa9d26cefa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, container_name=nova_virtproxyd, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': 
['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044)
Dec 06 09:21:05 np0005548789.localdomain podman[110109]: nova_virtproxyd
Dec 06 09:21:05 np0005548789.localdomain podman[110122]: 2025-12-06 09:21:05.251846856 +0000 UTC m=+0.069477091 container cleanup abf33a7ce64d174f5aeca10ae9ef2b118248dbae4e0fd3e1c43527aa9d26cefa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, container_name=nova_virtproxyd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 09:21:05 np0005548789.localdomain systemd[1]: libpod-conmon-abf33a7ce64d174f5aeca10ae9ef2b118248dbae4e0fd3e1c43527aa9d26cefa.scope: Deactivated successfully.
Dec 06 09:21:05 np0005548789.localdomain podman[110151]: error opening file `/run/crun/abf33a7ce64d174f5aeca10ae9ef2b118248dbae4e0fd3e1c43527aa9d26cefa/status`: No such file or directory
Dec 06 09:21:05 np0005548789.localdomain podman[110139]: 2025-12-06 09:21:05.337605184 +0000 UTC m=+0.058928687 container cleanup abf33a7ce64d174f5aeca10ae9ef2b118248dbae4e0fd3e1c43527aa9d26cefa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, config_id=tripleo_step3, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtproxyd, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=)
Dec 06 09:21:05 np0005548789.localdomain podman[110139]: nova_virtproxyd
Dec 06 09:21:05 np0005548789.localdomain systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully.
Dec 06 09:21:05 np0005548789.localdomain systemd[1]: Stopped nova_virtproxyd container.
Dec 06 09:21:05 np0005548789.localdomain sudo[110066]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:05 np0005548789.localdomain systemd[1]: tmp-crun.oyeqKz.mount: Deactivated successfully.
Dec 06 09:21:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-1c3c28666d804509c2b20602368c8cd77799587988a2d1a1789a75cb16c60a47-merged.mount: Deactivated successfully.
Dec 06 09:21:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-abf33a7ce64d174f5aeca10ae9ef2b118248dbae4e0fd3e1c43527aa9d26cefa-userdata-shm.mount: Deactivated successfully.
Dec 06 09:21:05 np0005548789.localdomain sshd[110200]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:21:05 np0005548789.localdomain sudo[110245]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snroewqlhbwhzmnedcwvslyrzsajsqju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012865.504527-114-190558596794423/AnsiballZ_systemd_service.py
Dec 06 09:21:05 np0005548789.localdomain sudo[110245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:21:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:21:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:21:05 np0005548789.localdomain podman[110248]: 2025-12-06 09:21:05.919131302 +0000 UTC m=+0.085925935 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team)
Dec 06 09:21:05 np0005548789.localdomain podman[110248]: 2025-12-06 09:21:05.937126793 +0000 UTC m=+0.103921416 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, architecture=x86_64, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Dec 06 09:21:05 np0005548789.localdomain podman[110248]: unhealthy
Dec 06 09:21:05 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:21:05 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:21:06 np0005548789.localdomain podman[110264]: 2025-12-06 09:21:06.008533753 +0000 UTC m=+0.081788939 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, 
com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, container_name=ovn_controller, tcib_managed=true)
Dec 06 09:21:06 np0005548789.localdomain podman[110264]: 2025-12-06 09:21:06.026158213 +0000 UTC m=+0.099413389 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, release=1761123044, distribution-scope=public, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12)
Dec 06 09:21:06 np0005548789.localdomain podman[110264]: unhealthy
Dec 06 09:21:06 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:21:06 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:21:06 np0005548789.localdomain python3.9[110247]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:21:06 np0005548789.localdomain sudo[110290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:21:06 np0005548789.localdomain sudo[110290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:21:06 np0005548789.localdomain sudo[110290]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:06 np0005548789.localdomain sudo[110305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:21:06 np0005548789.localdomain sudo[110305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:21:07 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:21:07 np0005548789.localdomain systemd-sysv-generator[110360]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:21:07 np0005548789.localdomain systemd-rc-local-generator[110352]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:21:07 np0005548789.localdomain sshd[110200]: Received disconnect from 103.192.152.59 port 36182:11: Bye Bye [preauth]
Dec 06 09:21:07 np0005548789.localdomain sshd[110200]: Disconnected from authenticating user root 103.192.152.59 port 36182 [preauth]
Dec 06 09:21:07 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:21:07 np0005548789.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 06 09:21:07 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Main process exited, code=killed, status=15/TERM
Dec 06 09:21:07 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Failed with result 'signal'.
Dec 06 09:21:07 np0005548789.localdomain systemd[1]: Stopped Check and recover tripleo_nova_virtqemud.
Dec 06 09:21:07 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully.
Dec 06 09:21:07 np0005548789.localdomain systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m.
Dec 06 09:21:07 np0005548789.localdomain systemd[1]: Stopping nova_virtqemud container...
Dec 06 09:21:07 np0005548789.localdomain sudo[110305]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:07 np0005548789.localdomain systemd[1]: libpod-e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a.scope: Deactivated successfully.
Dec 06 09:21:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55777 DF PROTO=TCP SPT=59922 DPT=9102 SEQ=2564949314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C383EF0000000001030307) 
Dec 06 09:21:07 np0005548789.localdomain systemd[1]: libpod-e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a.scope: Consumed 2.878s CPU time.
Dec 06 09:21:07 np0005548789.localdomain podman[110392]: 2025-12-06 09:21:07.637258903 +0000 UTC m=+0.069287755 container died e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, container_name=nova_virtqemud, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible)
Dec 06 09:21:07 np0005548789.localdomain podman[110392]: 2025-12-06 09:21:07.660543497 +0000 UTC m=+0.092572329 container cleanup e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, container_name=nova_virtqemud, distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, version=17.1.12, batch=17.1_20251118.1)
Dec 06 09:21:07 np0005548789.localdomain podman[110392]: nova_virtqemud
Dec 06 09:21:07 np0005548789.localdomain podman[110407]: 2025-12-06 09:21:07.711691176 +0000 UTC m=+0.060338922 container cleanup e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:35:22Z, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtqemud, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=)
Dec 06 09:21:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-9893e3b6825fe8589fe9eca74d23476479efd735e739d26cb203759d2b267e35-merged.mount: Deactivated successfully.
Dec 06 09:21:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a-userdata-shm.mount: Deactivated successfully.
Dec 06 09:21:10 np0005548789.localdomain sudo[110423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:21:10 np0005548789.localdomain sudo[110423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:21:10 np0005548789.localdomain sudo[110423]: pam_unix(sudo:session): session closed for user root
Dec 06 09:21:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39354 DF PROTO=TCP SPT=36072 DPT=9102 SEQ=2593195835 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3906F0000000001030307) 
Dec 06 09:21:11 np0005548789.localdomain sshd[110438]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:21:12 np0005548789.localdomain sshd[110438]: Received disconnect from 64.227.156.63 port 43230:11: Bye Bye [preauth]
Dec 06 09:21:12 np0005548789.localdomain sshd[110438]: Disconnected from authenticating user root 64.227.156.63 port 43230 [preauth]
Dec 06 09:21:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31921 DF PROTO=TCP SPT=38962 DPT=9101 SEQ=2796151943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C39DF00000000001030307) 
Dec 06 09:21:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31923 DF PROTO=TCP SPT=38962 DPT=9101 SEQ=2796151943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3A9F00000000001030307) 
Dec 06 09:21:17 np0005548789.localdomain sshd[110440]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:21:19 np0005548789.localdomain sshd[110442]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:21:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15708 DF PROTO=TCP SPT=57760 DPT=9105 SEQ=1755571789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3B32F0000000001030307) 
Dec 06 09:21:19 np0005548789.localdomain sshd[110444]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:21:21 np0005548789.localdomain sshd[110442]: Received disconnect from 103.157.25.60 port 59444:11: Bye Bye [preauth]
Dec 06 09:21:21 np0005548789.localdomain sshd[110442]: Disconnected from authenticating user root 103.157.25.60 port 59444 [preauth]
Dec 06 09:21:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15709 DF PROTO=TCP SPT=57760 DPT=9105 SEQ=1755571789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3C2EF0000000001030307) 
Dec 06 09:21:23 np0005548789.localdomain sshd[110444]: Received disconnect from 179.33.210.213 port 40714:11: Bye Bye [preauth]
Dec 06 09:21:23 np0005548789.localdomain sshd[110444]: Disconnected from authenticating user root 179.33.210.213 port 40714 [preauth]
Dec 06 09:21:27 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46996 DF PROTO=TCP SPT=40800 DPT=9882 SEQ=4001645799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3CFAF0000000001030307) 
Dec 06 09:21:29 np0005548789.localdomain sshd[110440]: Connection closed by 45.78.222.162 port 44504 [preauth]
Dec 06 09:21:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31925 DF PROTO=TCP SPT=38962 DPT=9101 SEQ=2796151943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3D9F00000000001030307) 
Dec 06 09:21:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15710 DF PROTO=TCP SPT=57760 DPT=9105 SEQ=1755571789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3E3EF0000000001030307) 
Dec 06 09:21:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60336 DF PROTO=TCP SPT=48272 DPT=9102 SEQ=419265507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3EDAF0000000001030307) 
Dec 06 09:21:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:21:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:21:36 np0005548789.localdomain systemd[1]: tmp-crun.0EjixX.mount: Deactivated successfully.
Dec 06 09:21:36 np0005548789.localdomain podman[110446]: 2025-12-06 09:21:36.369489425 +0000 UTC m=+0.279884741 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., container_name=ovn_controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Dec 06 09:21:36 np0005548789.localdomain podman[110446]: 2025-12-06 09:21:36.382026449 +0000 UTC m=+0.292421735 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller)
Dec 06 09:21:36 np0005548789.localdomain podman[110446]: unhealthy
Dec 06 09:21:36 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:21:36 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:21:36 np0005548789.localdomain podman[110447]: 2025-12-06 09:21:36.427178103 +0000 UTC m=+0.334608088 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, architecture=x86_64, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git)
Dec 06 09:21:36 np0005548789.localdomain podman[110447]: 2025-12-06 09:21:36.466651394 +0000 UTC m=+0.374081439 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, architecture=x86_64)
Dec 06 09:21:36 np0005548789.localdomain podman[110447]: unhealthy
Dec 06 09:21:36 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:21:36 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:21:37 np0005548789.localdomain sshd[110485]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:21:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30924 DF PROTO=TCP SPT=44688 DPT=9102 SEQ=3698087328 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C3F9EF0000000001030307) 
Dec 06 09:21:38 np0005548789.localdomain sshd[110485]: Received disconnect from 12.156.67.18 port 43802:11: Bye Bye [preauth]
Dec 06 09:21:38 np0005548789.localdomain sshd[110485]: Disconnected from authenticating user root 12.156.67.18 port 43802 [preauth]
Dec 06 09:21:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60338 DF PROTO=TCP SPT=48272 DPT=9102 SEQ=419265507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4056F0000000001030307) 
Dec 06 09:21:40 np0005548789.localdomain sshd[110487]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:21:41 np0005548789.localdomain sshd[110487]: Received disconnect from 81.192.46.35 port 36390:11: Bye Bye [preauth]
Dec 06 09:21:41 np0005548789.localdomain sshd[110487]: Disconnected from authenticating user root 81.192.46.35 port 36390 [preauth]
Dec 06 09:21:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14973 DF PROTO=TCP SPT=60808 DPT=9101 SEQ=2516561203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C413210000000001030307) 
Dec 06 09:21:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14975 DF PROTO=TCP SPT=60808 DPT=9101 SEQ=2516561203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C41F2F0000000001030307) 
Dec 06 09:21:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32181 DF PROTO=TCP SPT=55830 DPT=9105 SEQ=3315581072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4282F0000000001030307) 
Dec 06 09:21:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32182 DF PROTO=TCP SPT=55830 DPT=9105 SEQ=3315581072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C437F00000000001030307) 
Dec 06 09:21:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46999 DF PROTO=TCP SPT=40800 DPT=9882 SEQ=4001645799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C43FEF0000000001030307) 
Dec 06 09:21:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14977 DF PROTO=TCP SPT=60808 DPT=9101 SEQ=2516561203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C44FEF0000000001030307) 
Dec 06 09:22:01 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32183 DF PROTO=TCP SPT=55830 DPT=9105 SEQ=3315581072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C457EF0000000001030307) 
Dec 06 09:22:01 np0005548789.localdomain sshd[110489]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:22:02 np0005548789.localdomain sshd[110491]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:22:03 np0005548789.localdomain sshd[110489]: Received disconnect from 103.234.151.178 port 11306:11: Bye Bye [preauth]
Dec 06 09:22:03 np0005548789.localdomain sshd[110489]: Disconnected from authenticating user root 103.234.151.178 port 11306 [preauth]
Dec 06 09:22:03 np0005548789.localdomain sshd[110491]: Received disconnect from 118.193.38.207 port 40706:11: Bye Bye [preauth]
Dec 06 09:22:03 np0005548789.localdomain sshd[110491]: Disconnected from authenticating user root 118.193.38.207 port 40706 [preauth]
Dec 06 09:22:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4883 DF PROTO=TCP SPT=56360 DPT=9102 SEQ=336027206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C462EF0000000001030307) 
Dec 06 09:22:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:22:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:22:06 np0005548789.localdomain systemd[1]: tmp-crun.0U0JEP.mount: Deactivated successfully.
Dec 06 09:22:06 np0005548789.localdomain podman[110493]: 2025-12-06 09:22:06.937208396 +0000 UTC m=+0.093970832 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, distribution-scope=public, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller)
Dec 06 09:22:06 np0005548789.localdomain podman[110493]: 2025-12-06 09:22:06.980312197 +0000 UTC m=+0.137074583 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4)
Dec 06 09:22:06 np0005548789.localdomain systemd[1]: tmp-crun.Iejkwv.mount: Deactivated successfully.
Dec 06 09:22:06 np0005548789.localdomain podman[110493]: unhealthy
Dec 06 09:22:06 np0005548789.localdomain podman[110494]: 2025-12-06 09:22:06.994183473 +0000 UTC m=+0.141737906 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:22:07 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:22:07 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:22:07 np0005548789.localdomain podman[110494]: 2025-12-06 09:22:07.015217278 +0000 UTC m=+0.162771751 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:22:07 np0005548789.localdomain podman[110494]: unhealthy
Dec 06 09:22:07 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:22:07 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:22:09 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50985 DF PROTO=TCP SPT=35922 DPT=9882 SEQ=755790178 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C473F00000000001030307) 
Dec 06 09:22:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4885 DF PROTO=TCP SPT=56360 DPT=9102 SEQ=336027206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C47AAF0000000001030307) 
Dec 06 09:22:10 np0005548789.localdomain sudo[110534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:22:10 np0005548789.localdomain sudo[110534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:22:10 np0005548789.localdomain sudo[110534]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:10 np0005548789.localdomain sudo[110549]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 09:22:10 np0005548789.localdomain sudo[110549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:22:11 np0005548789.localdomain sudo[110549]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:11 np0005548789.localdomain sudo[110585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:22:11 np0005548789.localdomain sudo[110585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:22:11 np0005548789.localdomain sudo[110585]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:11 np0005548789.localdomain sudo[110600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:22:11 np0005548789.localdomain sudo[110600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:22:12 np0005548789.localdomain sudo[110600]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:12 np0005548789.localdomain sudo[110647]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:22:12 np0005548789.localdomain sudo[110647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:22:12 np0005548789.localdomain sudo[110647]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40564 DF PROTO=TCP SPT=50504 DPT=9101 SEQ=166074965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C488520000000001030307) 
Dec 06 09:22:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40566 DF PROTO=TCP SPT=50504 DPT=9101 SEQ=166074965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4946F0000000001030307) 
Dec 06 09:22:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44567 DF PROTO=TCP SPT=51520 DPT=9105 SEQ=2758505337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C49D6F0000000001030307) 
Dec 06 09:22:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44568 DF PROTO=TCP SPT=51520 DPT=9105 SEQ=2758505337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4AD2F0000000001030307) 
Dec 06 09:22:26 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9738 DF PROTO=TCP SPT=39894 DPT=9882 SEQ=2425240548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4B9EF0000000001030307) 
Dec 06 09:22:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40568 DF PROTO=TCP SPT=50504 DPT=9101 SEQ=166074965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4C3EF0000000001030307) 
Dec 06 09:22:29 np0005548789.localdomain sshd[110662]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:22:30 np0005548789.localdomain sshd[110662]: Invalid user ubuntu from 92.118.39.95 port 53770
Dec 06 09:22:30 np0005548789.localdomain sshd[110662]: Connection closed by invalid user ubuntu 92.118.39.95 port 53770 [preauth]
Dec 06 09:22:31 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing.
Dec 06 09:22:31 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud.service: Killing process 61810 (conmon) with signal SIGKILL.
Dec 06 09:22:31 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL
Dec 06 09:22:31 np0005548789.localdomain systemd[1]: libpod-conmon-e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a.scope: Deactivated successfully.
Dec 06 09:22:31 np0005548789.localdomain podman[110675]: error opening file `/run/crun/e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a/status`: No such file or directory
Dec 06 09:22:31 np0005548789.localdomain systemd[1]: tmp-crun.yip6c4.mount: Deactivated successfully.
Dec 06 09:22:31 np0005548789.localdomain podman[110664]: 2025-12-06 09:22:31.918061813 +0000 UTC m=+0.071046418 container cleanup e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, managed_by=tripleo_ansible, container_name=nova_virtqemud, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, config_id=tripleo_step3)
Dec 06 09:22:31 np0005548789.localdomain podman[110664]: nova_virtqemud
Dec 06 09:22:31 np0005548789.localdomain systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'.
Dec 06 09:22:31 np0005548789.localdomain systemd[1]: Stopped nova_virtqemud container.
Dec 06 09:22:31 np0005548789.localdomain sudo[110245]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44569 DF PROTO=TCP SPT=51520 DPT=9105 SEQ=2758505337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4CDEF0000000001030307) 
Dec 06 09:22:32 np0005548789.localdomain sudo[110767]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcwkxdnfocgbrogbbjdlcxdwqygsznfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012952.0615416-114-200031645917085/AnsiballZ_systemd_service.py
Dec 06 09:22:32 np0005548789.localdomain sudo[110767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:22:32 np0005548789.localdomain python3.9[110769]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:22:32 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:22:32 np0005548789.localdomain systemd-rc-local-generator[110800]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:22:32 np0005548789.localdomain systemd-sysv-generator[110803]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:22:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:22:33 np0005548789.localdomain sshd[110809]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:22:33 np0005548789.localdomain sudo[110767]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:33 np0005548789.localdomain sudo[110900]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tomdktjssugnwgoyfdtxiggbaccwshlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012953.1697688-114-190670127223979/AnsiballZ_systemd_service.py
Dec 06 09:22:33 np0005548789.localdomain sudo[110900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:22:33 np0005548789.localdomain python3.9[110902]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:22:33 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:22:33 np0005548789.localdomain systemd-rc-local-generator[110929]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:22:33 np0005548789.localdomain systemd-sysv-generator[110934]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:22:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:22:34 np0005548789.localdomain systemd[1]: Stopping nova_virtsecretd container...
Dec 06 09:22:34 np0005548789.localdomain systemd[1]: libpod-2914dfad5be61e80048556735bb44e4f1907a2e2df52ff8faede941ddfde7367.scope: Deactivated successfully.
Dec 06 09:22:34 np0005548789.localdomain podman[110943]: 2025-12-06 09:22:34.170845255 +0000 UTC m=+0.064836689 container died 2914dfad5be61e80048556735bb44e4f1907a2e2df52ff8faede941ddfde7367 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, version=17.1.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, container_name=nova_virtsecretd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, url=https://www.redhat.com, architecture=x86_64)
Dec 06 09:22:34 np0005548789.localdomain podman[110943]: 2025-12-06 09:22:34.215603127 +0000 UTC m=+0.109594521 container cleanup 2914dfad5be61e80048556735bb44e4f1907a2e2df52ff8faede941ddfde7367 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, container_name=nova_virtsecretd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Dec 06 09:22:34 np0005548789.localdomain podman[110943]: nova_virtsecretd
Dec 06 09:22:34 np0005548789.localdomain podman[110957]: 2025-12-06 09:22:34.252785088 +0000 UTC m=+0.073851656 container cleanup 2914dfad5be61e80048556735bb44e4f1907a2e2df52ff8faede941ddfde7367 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, config_id=tripleo_step3, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, container_name=nova_virtsecretd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4)
Dec 06 09:22:34 np0005548789.localdomain systemd[1]: libpod-conmon-2914dfad5be61e80048556735bb44e4f1907a2e2df52ff8faede941ddfde7367.scope: Deactivated successfully.
Dec 06 09:22:34 np0005548789.localdomain podman[110985]: error opening file `/run/crun/2914dfad5be61e80048556735bb44e4f1907a2e2df52ff8faede941ddfde7367/status`: No such file or directory
Dec 06 09:22:34 np0005548789.localdomain podman[110974]: 2025-12-06 09:22:34.34649375 +0000 UTC m=+0.067903602 container cleanup 2914dfad5be61e80048556735bb44e4f1907a2e2df52ff8faede941ddfde7367 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044, version=17.1.12, architecture=x86_64, vcs-type=git, container_name=nova_virtsecretd, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 09:22:34 np0005548789.localdomain podman[110974]: nova_virtsecretd
Dec 06 09:22:34 np0005548789.localdomain systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully.
Dec 06 09:22:34 np0005548789.localdomain systemd[1]: Stopped nova_virtsecretd container.
Dec 06 09:22:34 np0005548789.localdomain sudo[110900]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23535 DF PROTO=TCP SPT=58814 DPT=9102 SEQ=2706882849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4D8300000000001030307) 
Dec 06 09:22:34 np0005548789.localdomain sudo[111076]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxfwebsuulzldqzipndmfjfhpbisvncx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012954.4976804-114-206789891568876/AnsiballZ_systemd_service.py
Dec 06 09:22:34 np0005548789.localdomain sudo[111076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:22:34 np0005548789.localdomain sshd[110809]: Received disconnect from 103.192.152.59 port 54982:11: Bye Bye [preauth]
Dec 06 09:22:34 np0005548789.localdomain sshd[110809]: Disconnected from authenticating user root 103.192.152.59 port 54982 [preauth]
Dec 06 09:22:35 np0005548789.localdomain python3.9[111078]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:22:35 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:22:35 np0005548789.localdomain systemd-rc-local-generator[111106]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:22:35 np0005548789.localdomain systemd-sysv-generator[111109]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:22:35 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:22:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5fc0aec4e92feba574efc8a2831ff4547cdf669c9ba74f30ce1106436a335beb-merged.mount: Deactivated successfully.
Dec 06 09:22:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2914dfad5be61e80048556735bb44e4f1907a2e2df52ff8faede941ddfde7367-userdata-shm.mount: Deactivated successfully.
Dec 06 09:22:35 np0005548789.localdomain systemd[1]: Stopping nova_virtstoraged container...
Dec 06 09:22:35 np0005548789.localdomain systemd[1]: libpod-92a0134fb6ae7fa0506c791c4569a09e7c0cdb7fcb636d7ea6233a4978e1275d.scope: Deactivated successfully.
Dec 06 09:22:35 np0005548789.localdomain podman[111118]: 2025-12-06 09:22:35.560497387 +0000 UTC m=+0.075031021 container died 92a0134fb6ae7fa0506c791c4569a09e7c0cdb7fcb636d7ea6233a4978e1275d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtstoraged, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team)
Dec 06 09:22:35 np0005548789.localdomain podman[111118]: 2025-12-06 09:22:35.602889737 +0000 UTC m=+0.117423351 container cleanup 92a0134fb6ae7fa0506c791c4569a09e7c0cdb7fcb636d7ea6233a4978e1275d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtstoraged, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1)
Dec 06 09:22:35 np0005548789.localdomain podman[111118]: nova_virtstoraged
Dec 06 09:22:35 np0005548789.localdomain podman[111132]: 2025-12-06 09:22:35.677261626 +0000 UTC m=+0.105543806 container cleanup 92a0134fb6ae7fa0506c791c4569a09e7c0cdb7fcb636d7ea6233a4978e1275d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, batch=17.1_20251118.1, container_name=nova_virtstoraged, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, config_id=tripleo_step3, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com)
Dec 06 09:22:35 np0005548789.localdomain systemd[1]: libpod-conmon-92a0134fb6ae7fa0506c791c4569a09e7c0cdb7fcb636d7ea6233a4978e1275d.scope: Deactivated successfully.
Dec 06 09:22:35 np0005548789.localdomain podman[111159]: error opening file `/run/crun/92a0134fb6ae7fa0506c791c4569a09e7c0cdb7fcb636d7ea6233a4978e1275d/status`: No such file or directory
Dec 06 09:22:35 np0005548789.localdomain podman[111148]: 2025-12-06 09:22:35.785265878 +0000 UTC m=+0.069543874 container cleanup 92a0134fb6ae7fa0506c791c4569a09e7c0cdb7fcb636d7ea6233a4978e1275d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtstoraged, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '179caa3982511c1fd3314b961771f96c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', 
'/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1)
Dec 06 09:22:35 np0005548789.localdomain podman[111148]: nova_virtstoraged
Dec 06 09:22:35 np0005548789.localdomain systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully.
Dec 06 09:22:35 np0005548789.localdomain systemd[1]: Stopped nova_virtstoraged container.
Dec 06 09:22:35 np0005548789.localdomain sudo[111076]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:36 np0005548789.localdomain sudo[111252]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmmibqazwcvinavtdmfgsmdtuxzvvart ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012955.9595382-114-25367471951402/AnsiballZ_systemd_service.py
Dec 06 09:22:36 np0005548789.localdomain sudo[111252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:22:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c3718780a2d803326e6f6ad6743b607a5e1167128961c2a398f3fd685de43043-merged.mount: Deactivated successfully.
Dec 06 09:22:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-92a0134fb6ae7fa0506c791c4569a09e7c0cdb7fcb636d7ea6233a4978e1275d-userdata-shm.mount: Deactivated successfully.
Dec 06 09:22:36 np0005548789.localdomain python3.9[111254]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:22:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:22:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:22:37 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:22:37 np0005548789.localdomain systemd-rc-local-generator[111314]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:22:37 np0005548789.localdomain systemd-sysv-generator[111317]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:22:37 np0005548789.localdomain podman[111258]: 2025-12-06 09:22:37.734650538 +0000 UTC m=+0.148505723 container health_status 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent)
Dec 06 09:22:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60341 DF PROTO=TCP SPT=48272 DPT=9102 SEQ=419265507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4E3EF0000000001030307) 
Dec 06 09:22:37 np0005548789.localdomain podman[111258]: 2025-12-06 09:22:37.748089859 +0000 UTC m=+0.161945074 container exec_died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 06 09:22:37 np0005548789.localdomain podman[111258]: unhealthy
Dec 06 09:22:37 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:22:37 np0005548789.localdomain podman[111257]: 2025-12-06 09:22:37.68319786 +0000 UTC m=+0.098096178 container health_status 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 09:22:37 np0005548789.localdomain podman[111257]: 2025-12-06 09:22:37.815456984 +0000 UTC m=+0.230355272 container exec_died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ovn_controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, 
release=1761123044, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 06 09:22:37 np0005548789.localdomain podman[111257]: unhealthy
Dec 06 09:22:37 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:22:37 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed with result 'exit-code'.
Dec 06 09:22:37 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:22:37 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed with result 'exit-code'.
Dec 06 09:22:37 np0005548789.localdomain systemd[1]: Stopping ovn_controller container...
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: libpod-1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.scope: Deactivated successfully.
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: libpod-1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.scope: Consumed 2.722s CPU time.
Dec 06 09:22:38 np0005548789.localdomain podman[111333]: 2025-12-06 09:22:38.009067631 +0000 UTC m=+0.075275489 container died 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red 
Hat, Inc., build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1)
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.timer: Deactivated successfully.
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed to open /run/systemd/transient/1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: No such file or directory
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076-userdata-shm.mount: Deactivated successfully.
Dec 06 09:22:38 np0005548789.localdomain podman[111333]: 2025-12-06 09:22:38.041321509 +0000 UTC m=+0.107529327 container cleanup 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc.)
Dec 06 09:22:38 np0005548789.localdomain podman[111333]: ovn_controller
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.timer: Failed to open /run/systemd/transient/1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.timer: No such file or directory
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed to open /run/systemd/transient/1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: No such file or directory
Dec 06 09:22:38 np0005548789.localdomain podman[111347]: 2025-12-06 09:22:38.067434659 +0000 UTC m=+0.052116478 container cleanup 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, vendor=Red Hat, Inc., release=1761123044, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4)
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: libpod-conmon-1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.scope: Deactivated successfully.
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.timer: Failed to open /run/systemd/transient/1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.timer: No such file or directory
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: Failed to open /run/systemd/transient/1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076.service: No such file or directory
Dec 06 09:22:38 np0005548789.localdomain podman[111362]: 2025-12-06 09:22:38.157440859 +0000 UTC m=+0.066306154 container cleanup 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272)
Dec 06 09:22:38 np0005548789.localdomain podman[111362]: ovn_controller
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: tripleo_ovn_controller.service: Deactivated successfully.
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: Stopped ovn_controller container.
Dec 06 09:22:38 np0005548789.localdomain sudo[111252]: pam_unix(sudo:session): session closed for user root
Dec 06 09:22:38 np0005548789.localdomain sudo[111464]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zekqenmweqkqetibyiosrfjvajzuxzpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765012958.3011425-114-25308801783148/AnsiballZ_systemd_service.py
Dec 06 09:22:38 np0005548789.localdomain sudo[111464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:22:38 np0005548789.localdomain python3.9[111466]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5c9f3dee13341691faa8d64763052b5a13a4b5ff224f1a26e18c82d56bd99001-merged.mount: Deactivated successfully.
Dec 06 09:22:38 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:22:39 np0005548789.localdomain systemd-rc-local-generator[111492]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:22:39 np0005548789.localdomain systemd-sysv-generator[111495]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:22:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:22:39 np0005548789.localdomain systemd[1]: Stopping ovn_metadata_agent container...
Dec 06 09:22:39 np0005548789.localdomain systemd[1]: libpod-87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.scope: Deactivated successfully.
Dec 06 09:22:39 np0005548789.localdomain systemd[1]: libpod-87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.scope: Consumed 11.209s CPU time.
Dec 06 09:22:39 np0005548789.localdomain podman[111508]: 2025-12-06 09:22:39.806689078 +0000 UTC m=+0.550984441 container died 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Dec 06 09:22:39 np0005548789.localdomain systemd[1]: tmp-crun.0VaXjp.mount: Deactivated successfully.
Dec 06 09:22:39 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.timer: Deactivated successfully.
Dec 06 09:22:39 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.
Dec 06 09:22:39 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed to open /run/systemd/transient/87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: No such file or directory
Dec 06 09:22:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-31be4bfa33f1fd50cae755746783d85d8683e10cd2caa7fdf7edb677e543b7f9-merged.mount: Deactivated successfully.
Dec 06 09:22:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54-userdata-shm.mount: Deactivated successfully.
Dec 06 09:22:39 np0005548789.localdomain podman[111508]: 2025-12-06 09:22:39.936489088 +0000 UTC m=+0.680784461 container cleanup 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:22:39 np0005548789.localdomain podman[111508]: ovn_metadata_agent
Dec 06 09:22:39 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.timer: Failed to open /run/systemd/transient/87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.timer: No such file or directory
Dec 06 09:22:39 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed to open /run/systemd/transient/87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: No such file or directory
Dec 06 09:22:39 np0005548789.localdomain podman[111521]: 2025-12-06 09:22:39.963431164 +0000 UTC m=+0.143125599 container cleanup 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:22:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23537 DF PROTO=TCP SPT=58814 DPT=9102 SEQ=2706882849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4EFEF0000000001030307) 
Dec 06 09:22:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21841 DF PROTO=TCP SPT=58354 DPT=9101 SEQ=2676575745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C4FD800000000001030307) 
Dec 06 09:22:44 np0005548789.localdomain sshd[111538]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:22:45 np0005548789.localdomain sshd[111538]: Received disconnect from 12.156.67.18 port 41002:11: Bye Bye [preauth]
Dec 06 09:22:45 np0005548789.localdomain sshd[111538]: Disconnected from authenticating user root 12.156.67.18 port 41002 [preauth]
Dec 06 09:22:47 np0005548789.localdomain sshd[111540]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:22:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21843 DF PROTO=TCP SPT=58354 DPT=9101 SEQ=2676575745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5096F0000000001030307) 
Dec 06 09:22:48 np0005548789.localdomain sshd[111540]: Received disconnect from 103.157.25.60 port 32880:11: Bye Bye [preauth]
Dec 06 09:22:48 np0005548789.localdomain sshd[111540]: Disconnected from authenticating user root 103.157.25.60 port 32880 [preauth]
Dec 06 09:22:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26066 DF PROTO=TCP SPT=41034 DPT=9105 SEQ=1222753639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C512B00000000001030307) 
Dec 06 09:22:50 np0005548789.localdomain sshd[111542]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:22:50 np0005548789.localdomain sshd[111544]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:22:51 np0005548789.localdomain sshd[111544]: Received disconnect from 81.192.46.35 port 34706:11: Bye Bye [preauth]
Dec 06 09:22:51 np0005548789.localdomain sshd[111544]: Disconnected from authenticating user root 81.192.46.35 port 34706 [preauth]
Dec 06 09:22:52 np0005548789.localdomain sshd[111542]: Received disconnect from 64.227.156.63 port 49150:11: Bye Bye [preauth]
Dec 06 09:22:52 np0005548789.localdomain sshd[111542]: Disconnected from authenticating user root 64.227.156.63 port 49150 [preauth]
Dec 06 09:22:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26067 DF PROTO=TCP SPT=41034 DPT=9105 SEQ=1222753639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5226F0000000001030307) 
Dec 06 09:22:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9741 DF PROTO=TCP SPT=39894 DPT=9882 SEQ=2425240548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C529EF0000000001030307) 
Dec 06 09:22:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21845 DF PROTO=TCP SPT=58354 DPT=9101 SEQ=2676575745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C539EF0000000001030307) 
Dec 06 09:23:01 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26068 DF PROTO=TCP SPT=41034 DPT=9105 SEQ=1222753639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C541F00000000001030307) 
Dec 06 09:23:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62057 DF PROTO=TCP SPT=55046 DPT=9102 SEQ=1063964906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C54D6F0000000001030307) 
Dec 06 09:23:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4888 DF PROTO=TCP SPT=56360 DPT=9102 SEQ=336027206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C559EF0000000001030307) 
Dec 06 09:23:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62059 DF PROTO=TCP SPT=55046 DPT=9102 SEQ=1063964906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5652F0000000001030307) 
Dec 06 09:23:12 np0005548789.localdomain sudo[111546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:23:13 np0005548789.localdomain sudo[111546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:23:13 np0005548789.localdomain sudo[111546]: pam_unix(sudo:session): session closed for user root
Dec 06 09:23:13 np0005548789.localdomain sudo[111561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:23:13 np0005548789.localdomain sudo[111561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:23:13 np0005548789.localdomain sudo[111561]: pam_unix(sudo:session): session closed for user root
Dec 06 09:23:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63284 DF PROTO=TCP SPT=54782 DPT=9101 SEQ=1038312291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C572B00000000001030307) 
Dec 06 09:23:14 np0005548789.localdomain sudo[111609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:23:14 np0005548789.localdomain sudo[111609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:23:14 np0005548789.localdomain sudo[111609]: pam_unix(sudo:session): session closed for user root
Dec 06 09:23:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63286 DF PROTO=TCP SPT=54782 DPT=9101 SEQ=1038312291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C57EAF0000000001030307) 
Dec 06 09:23:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32693 DF PROTO=TCP SPT=59226 DPT=9105 SEQ=3382669033 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C587F00000000001030307) 
Dec 06 09:23:19 np0005548789.localdomain sshd[111624]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:23:21 np0005548789.localdomain sshd[111624]: Received disconnect from 118.193.38.207 port 55676:11: Bye Bye [preauth]
Dec 06 09:23:21 np0005548789.localdomain sshd[111624]: Disconnected from authenticating user root 118.193.38.207 port 55676 [preauth]
Dec 06 09:23:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32694 DF PROTO=TCP SPT=59226 DPT=9105 SEQ=3382669033 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C597AF0000000001030307) 
Dec 06 09:23:24 np0005548789.localdomain sshd[111626]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:23:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61831 DF PROTO=TCP SPT=47160 DPT=9882 SEQ=2720142757 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C59FEF0000000001030307) 
Dec 06 09:23:26 np0005548789.localdomain sshd[111626]: Received disconnect from 103.234.151.178 port 37440:11: Bye Bye [preauth]
Dec 06 09:23:26 np0005548789.localdomain sshd[111626]: Disconnected from authenticating user root 103.234.151.178 port 37440 [preauth]
Dec 06 09:23:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63288 DF PROTO=TCP SPT=54782 DPT=9101 SEQ=1038312291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5ADEF0000000001030307) 
Dec 06 09:23:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32695 DF PROTO=TCP SPT=59226 DPT=9105 SEQ=3382669033 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5B7F00000000001030307) 
Dec 06 09:23:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45721 DF PROTO=TCP SPT=53764 DPT=9102 SEQ=922920744 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5C2700000000001030307) 
Dec 06 09:23:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23540 DF PROTO=TCP SPT=58814 DPT=9102 SEQ=2706882849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5CDEF0000000001030307) 
Dec 06 09:23:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45723 DF PROTO=TCP SPT=53764 DPT=9102 SEQ=922920744 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5DA2F0000000001030307) 
Dec 06 09:23:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25076 DF PROTO=TCP SPT=48722 DPT=9101 SEQ=2915572739 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5E7E00000000001030307) 
Dec 06 09:23:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25078 DF PROTO=TCP SPT=48722 DPT=9101 SEQ=2915572739 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5F3EF0000000001030307) 
Dec 06 09:23:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23654 DF PROTO=TCP SPT=44542 DPT=9105 SEQ=1476196052 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C5FCEF0000000001030307) 
Dec 06 09:23:50 np0005548789.localdomain sshd[111628]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:23:51 np0005548789.localdomain sshd[111628]: Received disconnect from 12.156.67.18 port 60802:11: Bye Bye [preauth]
Dec 06 09:23:51 np0005548789.localdomain sshd[111628]: Disconnected from authenticating user root 12.156.67.18 port 60802 [preauth]
Dec 06 09:23:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23655 DF PROTO=TCP SPT=44542 DPT=9105 SEQ=1476196052 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C60CAF0000000001030307) 
Dec 06 09:23:56 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1316 DF PROTO=TCP SPT=40114 DPT=9882 SEQ=4058485434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C619700000000001030307) 
Dec 06 09:23:58 np0005548789.localdomain sshd[111630]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:23:59 np0005548789.localdomain sshd[111630]: Received disconnect from 81.192.46.35 port 33022:11: Bye Bye [preauth]
Dec 06 09:23:59 np0005548789.localdomain sshd[111630]: Disconnected from authenticating user root 81.192.46.35 port 33022 [preauth]
Dec 06 09:23:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25080 DF PROTO=TCP SPT=48722 DPT=9101 SEQ=2915572739 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C623F00000000001030307) 
Dec 06 09:24:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23656 DF PROTO=TCP SPT=44542 DPT=9105 SEQ=1476196052 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C62DEF0000000001030307) 
Dec 06 09:24:03 np0005548789.localdomain sshd[111632]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:24:04 np0005548789.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing.
Dec 06 09:24:04 np0005548789.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 69389 (conmon) with signal SIGKILL.
Dec 06 09:24:04 np0005548789.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL
Dec 06 09:24:04 np0005548789.localdomain systemd[1]: libpod-conmon-87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.scope: Deactivated successfully.
Dec 06 09:24:04 np0005548789.localdomain podman[111646]: error opening file `/run/crun/87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54/status`: No such file or directory
Dec 06 09:24:04 np0005548789.localdomain systemd[1]: tmp-crun.NBN1rV.mount: Deactivated successfully.
Dec 06 09:24:04 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.timer: Failed to open /run/systemd/transient/87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.timer: No such file or directory
Dec 06 09:24:04 np0005548789.localdomain systemd[1]: 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: Failed to open /run/systemd/transient/87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54.service: No such file or directory
Dec 06 09:24:04 np0005548789.localdomain podman[111634]: 2025-12-06 09:24:04.183065716 +0000 UTC m=+0.092260549 container cleanup 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:24:04 np0005548789.localdomain podman[111634]: ovn_metadata_agent
Dec 06 09:24:04 np0005548789.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'.
Dec 06 09:24:04 np0005548789.localdomain systemd[1]: Stopped ovn_metadata_agent container.
Dec 06 09:24:04 np0005548789.localdomain sudo[111464]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:04 np0005548789.localdomain sudo[111739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-togmpgxjlyhbzzcpicumeywhewezofqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013044.3446894-114-171966385799594/AnsiballZ_systemd_service.py
Dec 06 09:24:04 np0005548789.localdomain sudo[111739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57763 DF PROTO=TCP SPT=45836 DPT=9102 SEQ=3722726399 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C637B00000000001030307) 
Dec 06 09:24:04 np0005548789.localdomain python3.9[111741]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:24:04 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:24:05 np0005548789.localdomain systemd-rc-local-generator[111767]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:24:05 np0005548789.localdomain systemd-sysv-generator[111773]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:24:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:24:05 np0005548789.localdomain sudo[111739]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:05 np0005548789.localdomain sshd[111632]: Received disconnect from 103.192.152.59 port 59698:11: Bye Bye [preauth]
Dec 06 09:24:05 np0005548789.localdomain sshd[111632]: Disconnected from authenticating user root 103.192.152.59 port 59698 [preauth]
Dec 06 09:24:06 np0005548789.localdomain sudo[111870]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxnkpvvhfwihdtirakitfdfrpkuyscoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013046.1893127-564-143720749065390/AnsiballZ_file.py
Dec 06 09:24:06 np0005548789.localdomain sudo[111870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:06 np0005548789.localdomain python3.9[111872]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:06 np0005548789.localdomain sudo[111870]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:07 np0005548789.localdomain sudo[111962]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijnssysltciyxnsxpijejarqyuihafgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013046.909242-564-122036894704400/AnsiballZ_file.py
Dec 06 09:24:07 np0005548789.localdomain sudo[111962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:07 np0005548789.localdomain python3.9[111964]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:07 np0005548789.localdomain sudo[111962]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:07 np0005548789.localdomain sudo[112054]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqjsdycrrczoicesxhytekqpcypfzkhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013047.4606423-564-198980553045266/AnsiballZ_file.py
Dec 06 09:24:07 np0005548789.localdomain sudo[112054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62062 DF PROTO=TCP SPT=55046 DPT=9102 SEQ=1063964906 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C643EF0000000001030307) 
Dec 06 09:24:07 np0005548789.localdomain python3.9[112056]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:07 np0005548789.localdomain sudo[112054]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:08 np0005548789.localdomain sudo[112146]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjkcdwdtmmpwtcnyiwycywxuqlsgalws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013048.033683-564-251433880672528/AnsiballZ_file.py
Dec 06 09:24:08 np0005548789.localdomain sudo[112146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:08 np0005548789.localdomain python3.9[112148]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:08 np0005548789.localdomain sudo[112146]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:08 np0005548789.localdomain sudo[112238]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvdbrtidiwoxbjeujquwdjgkjppihamy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013048.5848162-564-251689806220849/AnsiballZ_file.py
Dec 06 09:24:08 np0005548789.localdomain sudo[112238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:09 np0005548789.localdomain python3.9[112240]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:09 np0005548789.localdomain sudo[112238]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:09 np0005548789.localdomain sudo[112330]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwqamqdkwsryhcplpuvhaxsalexgbheo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013049.1591985-564-239404625657/AnsiballZ_file.py
Dec 06 09:24:09 np0005548789.localdomain sudo[112330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:09 np0005548789.localdomain python3.9[112332]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:09 np0005548789.localdomain sudo[112330]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:09 np0005548789.localdomain sshd[112392]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:24:09 np0005548789.localdomain sudo[112424]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jymxdajpymkwegmkojkenwcxkppfqibz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013049.7377942-564-473484014122/AnsiballZ_file.py
Dec 06 09:24:09 np0005548789.localdomain sudo[112424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:10 np0005548789.localdomain python3.9[112426]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:10 np0005548789.localdomain sudo[112424]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:10 np0005548789.localdomain sudo[112516]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvyzittjzbefgxyefhnhqpevyjimakjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013050.2563493-564-192646472247615/AnsiballZ_file.py
Dec 06 09:24:10 np0005548789.localdomain sudo[112516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:10 np0005548789.localdomain python3.9[112518]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:10 np0005548789.localdomain sudo[112516]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57765 DF PROTO=TCP SPT=45836 DPT=9102 SEQ=3722726399 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C64F700000000001030307) 
Dec 06 09:24:11 np0005548789.localdomain sudo[112608]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkopmnbudsumpavtwapgocjfbewcprlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013050.7922049-564-224294694307982/AnsiballZ_file.py
Dec 06 09:24:11 np0005548789.localdomain sudo[112608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:11 np0005548789.localdomain python3.9[112610]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:11 np0005548789.localdomain sudo[112608]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:11 np0005548789.localdomain sudo[112700]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtmlwzokvlpuephwqvahgfxdxnwkzttj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013051.3727138-564-171191493861814/AnsiballZ_file.py
Dec 06 09:24:11 np0005548789.localdomain sudo[112700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:11 np0005548789.localdomain python3.9[112702]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:11 np0005548789.localdomain sudo[112700]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:12 np0005548789.localdomain sudo[112792]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlyagcfqqdsbqatubhfrhnfcnxuafxgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013051.9353015-564-81273155523088/AnsiballZ_file.py
Dec 06 09:24:12 np0005548789.localdomain sudo[112792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:12 np0005548789.localdomain python3.9[112794]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:12 np0005548789.localdomain sudo[112792]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:12 np0005548789.localdomain sudo[112884]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yttdmphsydxbrzqflrrwkdtoxpedjrcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013052.5177886-564-26392747776239/AnsiballZ_file.py
Dec 06 09:24:12 np0005548789.localdomain sudo[112884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:12 np0005548789.localdomain python3.9[112886]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:12 np0005548789.localdomain sudo[112884]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:13 np0005548789.localdomain sudo[112976]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxqlxranzonvvkzqxfayiawwxkkdksmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013053.0962794-564-84738910731604/AnsiballZ_file.py
Dec 06 09:24:13 np0005548789.localdomain sudo[112976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:13 np0005548789.localdomain python3.9[112978]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:13 np0005548789.localdomain sudo[112976]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:13 np0005548789.localdomain sshd[113056]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:24:13 np0005548789.localdomain sudo[113070]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwzonsrafavzuxwjxuyrbxsmalhilrsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013053.656377-564-101845878175252/AnsiballZ_file.py
Dec 06 09:24:13 np0005548789.localdomain sudo[113070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:14 np0005548789.localdomain python3.9[113072]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:14 np0005548789.localdomain sudo[113070]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12344 DF PROTO=TCP SPT=56144 DPT=9101 SEQ=1440162264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C65D100000000001030307) 
Dec 06 09:24:14 np0005548789.localdomain sshd[112392]: Received disconnect from 179.33.210.213 port 43860:11: Bye Bye [preauth]
Dec 06 09:24:14 np0005548789.localdomain sshd[112392]: Disconnected from authenticating user root 179.33.210.213 port 43860 [preauth]
Dec 06 09:24:14 np0005548789.localdomain sudo[113162]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkhfkwushjhnhebypzgtrtanqupjaxvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013054.1676803-564-119531200398934/AnsiballZ_file.py
Dec 06 09:24:14 np0005548789.localdomain sudo[113162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:14 np0005548789.localdomain sudo[113165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:24:14 np0005548789.localdomain sudo[113165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:24:14 np0005548789.localdomain sudo[113165]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:14 np0005548789.localdomain sudo[113180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 09:24:14 np0005548789.localdomain sudo[113180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:24:14 np0005548789.localdomain python3.9[113164]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:14 np0005548789.localdomain sudo[113162]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:14 np0005548789.localdomain sudo[113296]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbfrzneptgpurlshccdasowuhcuhaqit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013054.761255-564-129558471597231/AnsiballZ_file.py
Dec 06 09:24:14 np0005548789.localdomain sudo[113296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:15 np0005548789.localdomain python3.9[113308]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:15 np0005548789.localdomain sudo[113296]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:15 np0005548789.localdomain podman[113385]: 2025-12-06 09:24:15.389613375 +0000 UTC m=+0.079787567 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, release=1763362218, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, RELEASE=main, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 09:24:15 np0005548789.localdomain podman[113385]: 2025-12-06 09:24:15.488498506 +0000 UTC m=+0.178672678 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=1763362218, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_BRANCH=main, ceph=True, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container)
Dec 06 09:24:15 np0005548789.localdomain sshd[113056]: Received disconnect from 103.157.25.60 port 34554:11: Bye Bye [preauth]
Dec 06 09:24:15 np0005548789.localdomain sshd[113056]: Disconnected from authenticating user root 103.157.25.60 port 34554 [preauth]
Dec 06 09:24:15 np0005548789.localdomain sudo[113478]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvkvidmgyrdpfsmbrpbdysodjbdgdvfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013055.305766-564-7260451220429/AnsiballZ_file.py
Dec 06 09:24:15 np0005548789.localdomain sudo[113478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:15 np0005548789.localdomain python3.9[113486]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:15 np0005548789.localdomain sudo[113180]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:15 np0005548789.localdomain sudo[113478]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:15 np0005548789.localdomain sudo[113531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:24:15 np0005548789.localdomain sudo[113531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:24:15 np0005548789.localdomain sudo[113531]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:15 np0005548789.localdomain sudo[113559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:24:15 np0005548789.localdomain sudo[113559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:24:16 np0005548789.localdomain sudo[113636]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krqxcklsaaagqxsbtpiiifpzcdmuiwrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013055.8270729-564-112693322041552/AnsiballZ_file.py
Dec 06 09:24:16 np0005548789.localdomain sudo[113636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:16 np0005548789.localdomain python3.9[113638]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:16 np0005548789.localdomain sudo[113636]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:16 np0005548789.localdomain sudo[113559]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:16 np0005548789.localdomain sudo[113759]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqprdkldnqanslyslqrgsisfsptwedqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013056.3852334-564-261957006672312/AnsiballZ_file.py
Dec 06 09:24:16 np0005548789.localdomain sudo[113759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:16 np0005548789.localdomain python3.9[113761]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:16 np0005548789.localdomain sudo[113759]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:17 np0005548789.localdomain sudo[113808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:24:17 np0005548789.localdomain sudo[113808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:24:17 np0005548789.localdomain sudo[113808]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:17 np0005548789.localdomain sudo[113866]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smnwjumesyjxnuftsnxzuasvpemtrxib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013057.0665855-564-101891052546866/AnsiballZ_file.py
Dec 06 09:24:17 np0005548789.localdomain sudo[113866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12346 DF PROTO=TCP SPT=56144 DPT=9101 SEQ=1440162264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6692F0000000001030307) 
Dec 06 09:24:17 np0005548789.localdomain python3.9[113868]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:17 np0005548789.localdomain sudo[113866]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:17 np0005548789.localdomain sudo[113958]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qejjbtudoeuncvwkxkxohhjqwqycungo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013057.6484487-564-164840044884975/AnsiballZ_file.py
Dec 06 09:24:17 np0005548789.localdomain sudo[113958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:18 np0005548789.localdomain python3.9[113960]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:18 np0005548789.localdomain sudo[113958]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:19 np0005548789.localdomain sudo[114050]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzqzlsrazrqnjdohzdxkeidaoemhpdlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013058.9729228-1014-18607492351389/AnsiballZ_file.py
Dec 06 09:24:19 np0005548789.localdomain sudo[114050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:19 np0005548789.localdomain python3.9[114052]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:19 np0005548789.localdomain sudo[114050]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25514 DF PROTO=TCP SPT=55834 DPT=9100 SEQ=2732588667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C671EF0000000001030307) 
Dec 06 09:24:19 np0005548789.localdomain sudo[114142]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezfgvtlgujrjoxntquflcwjwoogdfwog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013059.5576758-1014-155719560593964/AnsiballZ_file.py
Dec 06 09:24:19 np0005548789.localdomain sudo[114142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:20 np0005548789.localdomain python3.9[114144]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:20 np0005548789.localdomain sudo[114142]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:20 np0005548789.localdomain sudo[114234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkrqcbzlceapuypleurxrnadcszsctvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013060.128064-1014-22605448198252/AnsiballZ_file.py
Dec 06 09:24:20 np0005548789.localdomain sudo[114234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:20 np0005548789.localdomain python3.9[114236]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:20 np0005548789.localdomain sudo[114234]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:20 np0005548789.localdomain sudo[114326]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrfiedubjmdoenrapsiuoeoucybhdsue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013060.6850984-1014-128884857588239/AnsiballZ_file.py
Dec 06 09:24:20 np0005548789.localdomain sudo[114326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:21 np0005548789.localdomain python3.9[114328]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:21 np0005548789.localdomain sudo[114326]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:21 np0005548789.localdomain sudo[114418]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jywndwedinxfzxwyuxnpnuwqqlcngrtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013061.2146993-1014-275095480821953/AnsiballZ_file.py
Dec 06 09:24:21 np0005548789.localdomain sudo[114418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:21 np0005548789.localdomain python3.9[114420]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:21 np0005548789.localdomain sudo[114418]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:22 np0005548789.localdomain sudo[114510]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qshezeliuydmamgrzhgveoyzdpjbnfcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013061.782923-1014-124071474535714/AnsiballZ_file.py
Dec 06 09:24:22 np0005548789.localdomain sudo[114510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:22 np0005548789.localdomain python3.9[114512]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:22 np0005548789.localdomain sudo[114510]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:22 np0005548789.localdomain sudo[114602]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogqzhsxtjhrtjvddbkosnkjiikyqbnyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013062.3406298-1014-30553421183822/AnsiballZ_file.py
Dec 06 09:24:22 np0005548789.localdomain sudo[114602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:22 np0005548789.localdomain python3.9[114604]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:22 np0005548789.localdomain sudo[114602]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:23 np0005548789.localdomain sudo[114694]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilgunwbxgsxexqmjxauybqssedsxkobz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013062.9132593-1014-39747460317236/AnsiballZ_file.py
Dec 06 09:24:23 np0005548789.localdomain sudo[114694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:23 np0005548789.localdomain python3.9[114696]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:23 np0005548789.localdomain sudo[114694]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22459 DF PROTO=TCP SPT=52630 DPT=9105 SEQ=3910658383 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C681EF0000000001030307) 
Dec 06 09:24:23 np0005548789.localdomain sudo[114786]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llxgznnfdbbyrfhxfgswcpqwoeyftfkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013063.6688626-1014-72039623480000/AnsiballZ_file.py
Dec 06 09:24:23 np0005548789.localdomain sudo[114786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:24 np0005548789.localdomain python3.9[114788]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:24 np0005548789.localdomain sudo[114786]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:24 np0005548789.localdomain sudo[114878]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hezjnnfcroqgzfjaatdbggnekyvtfnxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013064.1841557-1014-232863297913314/AnsiballZ_file.py
Dec 06 09:24:24 np0005548789.localdomain sudo[114878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:24 np0005548789.localdomain python3.9[114880]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:24 np0005548789.localdomain sudo[114878]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:24 np0005548789.localdomain sudo[114970]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lydzhunaneaduzohmxsozghpgbnmxwut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013064.7266114-1014-248146883479172/AnsiballZ_file.py
Dec 06 09:24:24 np0005548789.localdomain sudo[114970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:25 np0005548789.localdomain python3.9[114972]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:25 np0005548789.localdomain sudo[114970]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:25 np0005548789.localdomain sudo[115062]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezmmszqmugiptzngpxcmlhfdxvlrgwgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013065.315774-1014-66367239501092/AnsiballZ_file.py
Dec 06 09:24:25 np0005548789.localdomain sudo[115062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:25 np0005548789.localdomain sshd[115065]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:24:25 np0005548789.localdomain python3.9[115064]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1319 DF PROTO=TCP SPT=40114 DPT=9882 SEQ=4058485434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C689F00000000001030307) 
Dec 06 09:24:25 np0005548789.localdomain sudo[115062]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:26 np0005548789.localdomain sudo[115156]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cezimofghdhfacbvhdunmpumgmyzakml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013065.9058778-1014-102798977635307/AnsiballZ_file.py
Dec 06 09:24:26 np0005548789.localdomain sudo[115156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:26 np0005548789.localdomain python3.9[115158]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:26 np0005548789.localdomain sudo[115156]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:26 np0005548789.localdomain sudo[115248]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhjkwkdaerxaxkatpfbgibmwstdfldxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013066.473749-1014-226984604817398/AnsiballZ_file.py
Dec 06 09:24:26 np0005548789.localdomain sudo[115248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:26 np0005548789.localdomain python3.9[115250]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:26 np0005548789.localdomain sudo[115248]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:27 np0005548789.localdomain sshd[115065]: Received disconnect from 64.227.156.63 port 52216:11: Bye Bye [preauth]
Dec 06 09:24:27 np0005548789.localdomain sshd[115065]: Disconnected from authenticating user root 64.227.156.63 port 52216 [preauth]
Dec 06 09:24:27 np0005548789.localdomain sudo[115340]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bezbxsvcjdgageafzcqckcebyzsdvsiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013067.0719192-1014-103734201015789/AnsiballZ_file.py
Dec 06 09:24:27 np0005548789.localdomain sudo[115340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:27 np0005548789.localdomain python3.9[115342]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:27 np0005548789.localdomain sudo[115340]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:27 np0005548789.localdomain sudo[115432]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vozfrxdxevkinnnkvbemboosmiousrdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013067.6554608-1014-167230416619886/AnsiballZ_file.py
Dec 06 09:24:27 np0005548789.localdomain sudo[115432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:28 np0005548789.localdomain python3.9[115434]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:28 np0005548789.localdomain sudo[115432]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:28 np0005548789.localdomain sudo[115524]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-spsgayglpcciencgvaazfwzbujocltuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013068.174377-1014-187528701432421/AnsiballZ_file.py
Dec 06 09:24:28 np0005548789.localdomain sudo[115524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:28 np0005548789.localdomain python3.9[115526]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:28 np0005548789.localdomain sudo[115524]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:29 np0005548789.localdomain sudo[115616]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-labshludlqpiljxkvodmfhvwmchivpuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013068.7853804-1014-256491707193341/AnsiballZ_file.py
Dec 06 09:24:29 np0005548789.localdomain sudo[115616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12348 DF PROTO=TCP SPT=56144 DPT=9101 SEQ=1440162264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C699EF0000000001030307) 
Dec 06 09:24:30 np0005548789.localdomain python3.9[115618]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:30 np0005548789.localdomain sudo[115616]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:30 np0005548789.localdomain sudo[115708]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsqhekgwyygfdumoryqppmgogahaedkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013070.1641335-1014-189099970650210/AnsiballZ_file.py
Dec 06 09:24:30 np0005548789.localdomain sudo[115708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:30 np0005548789.localdomain python3.9[115710]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:30 np0005548789.localdomain sudo[115708]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:31 np0005548789.localdomain sudo[115800]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwbdijfgcwddbpvsythuzhazwazmimfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013070.7089367-1014-58613097140447/AnsiballZ_file.py
Dec 06 09:24:31 np0005548789.localdomain sudo[115800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:31 np0005548789.localdomain python3.9[115802]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:31 np0005548789.localdomain sudo[115800]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22460 DF PROTO=TCP SPT=52630 DPT=9105 SEQ=3910658383 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6A1EF0000000001030307) 
Dec 06 09:24:32 np0005548789.localdomain sudo[115892]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwjdbvemmplnupypsveafejgzppjxklg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013071.8770835-1014-80997325381758/AnsiballZ_file.py
Dec 06 09:24:32 np0005548789.localdomain sudo[115892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:32 np0005548789.localdomain python3.9[115894]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:24:32 np0005548789.localdomain sudo[115892]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:33 np0005548789.localdomain sudo[115984]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycmonhmnyrwkjktfrhiqueikqjylqbyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013072.8744237-1461-17879871415581/AnsiballZ_command.py
Dec 06 09:24:33 np0005548789.localdomain sudo[115984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:33 np0005548789.localdomain python3.9[115986]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:33 np0005548789.localdomain sudo[115984]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:34 np0005548789.localdomain python3.9[116078]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:24:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39987 DF PROTO=TCP SPT=53286 DPT=9102 SEQ=3523987682 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6ACF00000000001030307) 
Dec 06 09:24:34 np0005548789.localdomain sudo[116168]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yymblllumgkcesvhurbokwokphqulibz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013074.4966877-1515-178811548223059/AnsiballZ_systemd_service.py
Dec 06 09:24:34 np0005548789.localdomain sudo[116168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:35 np0005548789.localdomain python3.9[116170]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:24:35 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:24:35 np0005548789.localdomain sshd[116171]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:24:35 np0005548789.localdomain systemd-rc-local-generator[116193]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:24:35 np0005548789.localdomain systemd-sysv-generator[116199]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:24:35 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:24:35 np0005548789.localdomain sudo[116168]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:36 np0005548789.localdomain sshd[116171]: Received disconnect from 118.193.38.207 port 56160:11: Bye Bye [preauth]
Dec 06 09:24:36 np0005548789.localdomain sshd[116171]: Disconnected from authenticating user root 118.193.38.207 port 56160 [preauth]
Dec 06 09:24:37 np0005548789.localdomain sudo[116298]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmtqsqsucviswlkbbwnzpgiobutvcjal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013077.2329147-1539-190029029289567/AnsiballZ_command.py
Dec 06 09:24:37 np0005548789.localdomain sudo[116298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:37 np0005548789.localdomain python3.9[116300]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:37 np0005548789.localdomain sudo[116298]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:37 np0005548789.localdomain sshd[116348]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:24:38 np0005548789.localdomain sudo[116393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksvfnlsqrquljkuexxolzpxiadzfbrsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013077.7816083-1539-161217602462166/AnsiballZ_command.py
Dec 06 09:24:38 np0005548789.localdomain sudo[116393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:38 np0005548789.localdomain python3.9[116395]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:38 np0005548789.localdomain sudo[116393]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:38 np0005548789.localdomain sshd[116348]: Invalid user ubuntu from 92.118.39.95 port 40536
Dec 06 09:24:38 np0005548789.localdomain sshd[116348]: Connection closed by invalid user ubuntu 92.118.39.95 port 40536 [preauth]
Dec 06 09:24:38 np0005548789.localdomain sudo[116486]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-goknzygmvrvldmvonclnzvgpmnlatwgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013078.4149396-1539-100608048167800/AnsiballZ_command.py
Dec 06 09:24:38 np0005548789.localdomain sudo[116486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:38 np0005548789.localdomain python3.9[116488]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:39 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6232 DF PROTO=TCP SPT=43008 DPT=9882 SEQ=843075189 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6BDEF0000000001030307) 
Dec 06 09:24:39 np0005548789.localdomain sudo[116486]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:40 np0005548789.localdomain sudo[116579]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcjsegocarkducsctkpaumdunvaeisrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013080.014824-1539-24619898956182/AnsiballZ_command.py
Dec 06 09:24:40 np0005548789.localdomain sudo[116579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:40 np0005548789.localdomain python3.9[116581]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:40 np0005548789.localdomain sudo[116579]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39989 DF PROTO=TCP SPT=53286 DPT=9102 SEQ=3523987682 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6C4AF0000000001030307) 
Dec 06 09:24:40 np0005548789.localdomain sudo[116672]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kiwqwrgpmcowyktcvldqhqxjbibxdstx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013080.665785-1539-114157676776230/AnsiballZ_command.py
Dec 06 09:24:40 np0005548789.localdomain sudo[116672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:41 np0005548789.localdomain python3.9[116674]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:41 np0005548789.localdomain sudo[116672]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:41 np0005548789.localdomain sudo[116765]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvgoooddcayuuircybclrbgqnrmehrws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013081.2850535-1539-209036133383361/AnsiballZ_command.py
Dec 06 09:24:41 np0005548789.localdomain sudo[116765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:41 np0005548789.localdomain python3.9[116767]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:41 np0005548789.localdomain sudo[116765]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:42 np0005548789.localdomain sudo[116858]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewzcpgizalouvnildfurkonhcahwewym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013081.8536844-1539-16897003549992/AnsiballZ_command.py
Dec 06 09:24:42 np0005548789.localdomain sudo[116858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:42 np0005548789.localdomain python3.9[116860]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:42 np0005548789.localdomain sudo[116858]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:42 np0005548789.localdomain sudo[116951]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjhersezayeqkwbkpechoveezwewhjtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013082.398231-1539-16172126669775/AnsiballZ_command.py
Dec 06 09:24:42 np0005548789.localdomain sudo[116951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:42 np0005548789.localdomain python3.9[116953]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:42 np0005548789.localdomain sudo[116951]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:43 np0005548789.localdomain sudo[117044]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bskixfhgraarpwekvanfpoomjykzfmxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013082.9575908-1539-206897526463532/AnsiballZ_command.py
Dec 06 09:24:43 np0005548789.localdomain sudo[117044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:43 np0005548789.localdomain python3.9[117046]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:43 np0005548789.localdomain sudo[117044]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:43 np0005548789.localdomain sudo[117137]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppbtlgprniqmnccxhqsoeihvuawuxcmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013083.5176504-1539-196903294666064/AnsiballZ_command.py
Dec 06 09:24:43 np0005548789.localdomain sudo[117137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:43 np0005548789.localdomain python3.9[117139]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:43 np0005548789.localdomain sudo[117137]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8017 DF PROTO=TCP SPT=33954 DPT=9101 SEQ=1281717090 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6D2400000000001030307) 
Dec 06 09:24:44 np0005548789.localdomain sudo[117230]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uurggtirkrssyrhfnzfueygjjqxsaqof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013084.1125684-1539-262450400928313/AnsiballZ_command.py
Dec 06 09:24:44 np0005548789.localdomain sudo[117230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:44 np0005548789.localdomain python3.9[117232]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:44 np0005548789.localdomain sudo[117230]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:44 np0005548789.localdomain sudo[117323]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibckrwhcboelxyiewxhvrnramsnnnhvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013084.6930563-1539-274542051294050/AnsiballZ_command.py
Dec 06 09:24:44 np0005548789.localdomain sudo[117323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:45 np0005548789.localdomain python3.9[117325]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:45 np0005548789.localdomain sudo[117323]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:45 np0005548789.localdomain sudo[117416]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cecxkpwfzaoclvarwhbjhjxgkjhajnnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013085.2818837-1539-133925414515039/AnsiballZ_command.py
Dec 06 09:24:45 np0005548789.localdomain sudo[117416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:45 np0005548789.localdomain python3.9[117418]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:45 np0005548789.localdomain sudo[117416]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:46 np0005548789.localdomain sudo[117509]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shvczlvferxbrotpgipledsyptmedzpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013085.8617353-1539-275250204228066/AnsiballZ_command.py
Dec 06 09:24:46 np0005548789.localdomain sudo[117509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:46 np0005548789.localdomain python3.9[117511]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:46 np0005548789.localdomain sudo[117509]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:46 np0005548789.localdomain sudo[117602]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-defslyrrvuysphdusrqtcwtozjnuiylg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013086.4865272-1539-258608618608611/AnsiballZ_command.py
Dec 06 09:24:46 np0005548789.localdomain sudo[117602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:46 np0005548789.localdomain python3.9[117604]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:46 np0005548789.localdomain sudo[117602]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:47 np0005548789.localdomain sshd[117620]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:24:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8019 DF PROTO=TCP SPT=33954 DPT=9101 SEQ=1281717090 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6DE300000000001030307) 
Dec 06 09:24:47 np0005548789.localdomain sudo[117697]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwngjijshykvcfwndunjnrjsfslugamr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013087.2059212-1539-125943782242584/AnsiballZ_command.py
Dec 06 09:24:47 np0005548789.localdomain sudo[117697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:47 np0005548789.localdomain python3.9[117699]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:47 np0005548789.localdomain sudo[117697]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:48 np0005548789.localdomain sudo[117790]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtoezcyzkhvdrysxxcvggbblsfffbofx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013087.7967746-1539-65964035336891/AnsiballZ_command.py
Dec 06 09:24:48 np0005548789.localdomain sudo[117790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:48 np0005548789.localdomain python3.9[117792]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:48 np0005548789.localdomain sudo[117790]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:48 np0005548789.localdomain sudo[117883]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svxeixxvqijectfnrbxowyeyxilmwixo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013088.3747618-1539-269965522485141/AnsiballZ_command.py
Dec 06 09:24:48 np0005548789.localdomain sudo[117883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:48 np0005548789.localdomain sshd[117620]: Received disconnect from 103.234.151.178 port 63584:11: Bye Bye [preauth]
Dec 06 09:24:48 np0005548789.localdomain sshd[117620]: Disconnected from authenticating user root 103.234.151.178 port 63584 [preauth]
Dec 06 09:24:48 np0005548789.localdomain python3.9[117885]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:48 np0005548789.localdomain sudo[117883]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:49 np0005548789.localdomain sudo[117976]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edmhfzrmnlpmvxxfjdntnnhuflfkzrxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013088.904273-1539-259547298306706/AnsiballZ_command.py
Dec 06 09:24:49 np0005548789.localdomain sudo[117976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:49 np0005548789.localdomain python3.9[117978]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:49 np0005548789.localdomain sudo[117976]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:49 np0005548789.localdomain sudo[118069]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eogblhtrukpkdjekchzrptjdxpibfpsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013089.4373446-1539-87454746626949/AnsiballZ_command.py
Dec 06 09:24:49 np0005548789.localdomain sudo[118069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55083 DF PROTO=TCP SPT=41472 DPT=9105 SEQ=338037815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6E76F0000000001030307) 
Dec 06 09:24:49 np0005548789.localdomain python3.9[118071]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:49 np0005548789.localdomain sudo[118069]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:50 np0005548789.localdomain sudo[118162]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nscqwxufjbzqdqdrseyndxeokmetvmqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013090.0145814-1539-227848360856569/AnsiballZ_command.py
Dec 06 09:24:50 np0005548789.localdomain sudo[118162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:24:50 np0005548789.localdomain python3.9[118164]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:24:50 np0005548789.localdomain sudo[118162]: pam_unix(sudo:session): session closed for user root
Dec 06 09:24:50 np0005548789.localdomain sshd[106134]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:24:50 np0005548789.localdomain systemd[1]: session-38.scope: Deactivated successfully.
Dec 06 09:24:50 np0005548789.localdomain systemd[1]: session-38.scope: Consumed 48.016s CPU time.
Dec 06 09:24:50 np0005548789.localdomain systemd-logind[766]: Session 38 logged out. Waiting for processes to exit.
Dec 06 09:24:50 np0005548789.localdomain systemd-logind[766]: Removed session 38.
Dec 06 09:24:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55084 DF PROTO=TCP SPT=41472 DPT=9105 SEQ=338037815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C6F72F0000000001030307) 
Dec 06 09:24:57 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40289 DF PROTO=TCP SPT=54604 DPT=9882 SEQ=3151936849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C703EF0000000001030307) 
Dec 06 09:24:58 np0005548789.localdomain sshd[118180]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:24:58 np0005548789.localdomain sshd[118180]: Received disconnect from 12.156.67.18 port 39104:11: Bye Bye [preauth]
Dec 06 09:24:58 np0005548789.localdomain sshd[118180]: Disconnected from authenticating user root 12.156.67.18 port 39104 [preauth]
Dec 06 09:24:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8021 DF PROTO=TCP SPT=33954 DPT=9101 SEQ=1281717090 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C70DF00000000001030307) 
Dec 06 09:25:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55085 DF PROTO=TCP SPT=41472 DPT=9105 SEQ=338037815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C717EF0000000001030307) 
Dec 06 09:25:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62322 DF PROTO=TCP SPT=51760 DPT=9102 SEQ=1466230694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C7222F0000000001030307) 
Dec 06 09:25:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57768 DF PROTO=TCP SPT=45836 DPT=9102 SEQ=3722726399 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C72DEF0000000001030307) 
Dec 06 09:25:08 np0005548789.localdomain sshd[118182]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:25:09 np0005548789.localdomain sshd[118182]: Received disconnect from 81.192.46.35 port 59578:11: Bye Bye [preauth]
Dec 06 09:25:09 np0005548789.localdomain sshd[118182]: Disconnected from authenticating user root 81.192.46.35 port 59578 [preauth]
Dec 06 09:25:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62324 DF PROTO=TCP SPT=51760 DPT=9102 SEQ=1466230694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C739F00000000001030307) 
Dec 06 09:25:11 np0005548789.localdomain sshd[118184]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:25:11 np0005548789.localdomain sshd[118184]: Accepted publickey for zuul from 192.168.122.30 port 37034 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:25:11 np0005548789.localdomain systemd-logind[766]: New session 39 of user zuul.
Dec 06 09:25:11 np0005548789.localdomain systemd[1]: Started Session 39 of User zuul.
Dec 06 09:25:11 np0005548789.localdomain sshd[118184]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:25:12 np0005548789.localdomain python3.9[118277]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 06 09:25:13 np0005548789.localdomain python3.9[118381]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:25:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44462 DF PROTO=TCP SPT=41906 DPT=9101 SEQ=641563132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C747700000000001030307) 
Dec 06 09:25:14 np0005548789.localdomain sudo[118471]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myjdqrvmlrouhrqtniajxvrxhzorbmpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013113.9471533-94-270970279109760/AnsiballZ_command.py
Dec 06 09:25:14 np0005548789.localdomain sudo[118471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:14 np0005548789.localdomain python3.9[118473]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:25:14 np0005548789.localdomain sudo[118471]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:15 np0005548789.localdomain sudo[118564]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwnfznnzbnrdjwaxdzwsilwunckqgemw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013114.8799682-130-266356115065165/AnsiballZ_stat.py
Dec 06 09:25:15 np0005548789.localdomain sudo[118564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:15 np0005548789.localdomain python3.9[118566]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:25:15 np0005548789.localdomain sudo[118564]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:16 np0005548789.localdomain sudo[118656]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdmvgiujldhseymdkbcqprajsimdqhvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013115.641644-154-211743576684137/AnsiballZ_file.py
Dec 06 09:25:16 np0005548789.localdomain sudo[118656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:16 np0005548789.localdomain python3.9[118658]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:25:16 np0005548789.localdomain sudo[118656]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:16 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9670 DF PROTO=TCP SPT=35906 DPT=9105 SEQ=3358184643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C750960000000001030307) 
Dec 06 09:25:16 np0005548789.localdomain sudo[118748]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kymtbmqqibytkswdswhwxkxqcftjqzvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013116.419006-178-186212007016195/AnsiballZ_stat.py
Dec 06 09:25:16 np0005548789.localdomain sudo[118748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:17 np0005548789.localdomain python3.9[118750]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:25:17 np0005548789.localdomain sudo[118748]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:17 np0005548789.localdomain sudo[118778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:25:17 np0005548789.localdomain sudo[118778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:25:17 np0005548789.localdomain sudo[118778]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:17 np0005548789.localdomain sudo[118793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:25:17 np0005548789.localdomain sudo[118793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:25:17 np0005548789.localdomain sudo[118851]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlqnsyqrdimzorwyfxzoollnhxggqqfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013116.419006-178-186212007016195/AnsiballZ_copy.py
Dec 06 09:25:17 np0005548789.localdomain sudo[118851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:17 np0005548789.localdomain python3.9[118853]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013116.419006-178-186212007016195/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:25:17 np0005548789.localdomain sudo[118851]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:18 np0005548789.localdomain sudo[118793]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:18 np0005548789.localdomain sudo[118976]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogknrhbyxgycmnphnrxliwlunrhtthlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013118.0192344-223-146701611641028/AnsiballZ_setup.py
Dec 06 09:25:18 np0005548789.localdomain sudo[118976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:18 np0005548789.localdomain python3.9[118978]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:25:18 np0005548789.localdomain sudo[118976]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:18 np0005548789.localdomain sudo[118983]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:25:18 np0005548789.localdomain sudo[118983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:25:18 np0005548789.localdomain sudo[118983]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:19 np0005548789.localdomain sudo[119087]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsejxlacvnfbxxrxgafcntbhsedjnvox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013119.0558422-247-144962134182167/AnsiballZ_file.py
Dec 06 09:25:19 np0005548789.localdomain sudo[119087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:19 np0005548789.localdomain python3.9[119089]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:25:19 np0005548789.localdomain sudo[119087]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9672 DF PROTO=TCP SPT=35906 DPT=9105 SEQ=3358184643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C75CAF0000000001030307) 
Dec 06 09:25:20 np0005548789.localdomain sudo[119179]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uplaugijsovzzhvrozianoolitnxqbgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013119.728273-274-7104306624035/AnsiballZ_file.py
Dec 06 09:25:20 np0005548789.localdomain sudo[119179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:20 np0005548789.localdomain python3.9[119181]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:25:20 np0005548789.localdomain sudo[119179]: pam_unix(sudo:session): session closed for user root
Dec 06 09:25:20 np0005548789.localdomain python3.9[119271]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:25:21 np0005548789.localdomain network[119288]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:25:21 np0005548789.localdomain network[119289]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:25:21 np0005548789.localdomain network[119290]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:25:22 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:25:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9673 DF PROTO=TCP SPT=35906 DPT=9105 SEQ=3358184643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C76C6F0000000001030307) 
Dec 06 09:25:25 np0005548789.localdomain python3.9[119487]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:25:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40292 DF PROTO=TCP SPT=54604 DPT=9882 SEQ=3151936849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C773EF0000000001030307) 
Dec 06 09:25:25 np0005548789.localdomain python3.9[119577]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:25:26 np0005548789.localdomain sudo[119671]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkxabiqhkkknhelbjfjbbvgxfqtxkxzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013126.3696005-376-63594016665423/AnsiballZ_command.py
Dec 06 09:25:26 np0005548789.localdomain sudo[119671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:25:26 np0005548789.localdomain python3.9[119673]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream
                                                            set -euxo pipefail
                                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                                            python3 -m venv ./venv
                                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main
                                                            # This is required for FIPS enabled until trunk.rdoproject.org
                                                            # is not being served from a centos7 host, tracked by
                                                            # https://issues.redhat.com/browse/RHOSZUUL-1517
                                                            dnf -y install crypto-policies
                                                            update-crypto-policies --set FIPS:NO-ENFORCE-EMS
                                                            ./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream
                                                            
                                                            # Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible
                                                            # with rhel 9.2 openssh
                                                            dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save
                                                            # FIXME: perform dnf upgrade for other packages in EDPM ansible
                                                            # here we only ensuring that decontainerized libvirt can start
                                                            dnf -y upgrade openstack-selinux
                                                            rm -f /run/virtlogd.pid
                                                            
                                                            rm -rf repo-setup-main
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:25:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44466 DF PROTO=TCP SPT=41906 DPT=9101 SEQ=641563132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C783F00000000001030307) 
Dec 06 09:25:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9674 DF PROTO=TCP SPT=35906 DPT=9105 SEQ=3358184643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C78BEF0000000001030307) 
Dec 06 09:25:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60545 DF PROTO=TCP SPT=44536 DPT=9102 SEQ=2235910770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C7972F0000000001030307) 
Dec 06 09:25:35 np0005548789.localdomain sshd[119704]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:25:36 np0005548789.localdomain sshd[45532]: Received signal 15; terminating.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: Stopping OpenSSH server daemon...
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: sshd.service: Deactivated successfully.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: sshd.service: Unit process 119704 (sshd) remains running after unit stopped.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: sshd.service: Unit process 119705 (sshd) remains running after unit stopped.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: Stopped OpenSSH server daemon.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: sshd.service: Consumed 8.991s CPU time, read 0B from disk, written 72.0K to disk.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: Stopped target sshd-keygen.target.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: Stopping sshd-keygen.target...
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: Reached target sshd-keygen.target.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: Starting OpenSSH server daemon...
Dec 06 09:25:36 np0005548789.localdomain sshd[119718]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:25:36 np0005548789.localdomain sshd[119718]: Server listening on 0.0.0.0 port 22.
Dec 06 09:25:36 np0005548789.localdomain sshd[119718]: Server listening on :: port 22.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: Started OpenSSH server daemon.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: run-ra9a8bdac78f142938721a263baf7b7ca.service: Deactivated successfully.
Dec 06 09:25:36 np0005548789.localdomain systemd[1]: run-r6956e17b585c4974a9e5f0de9a738586.service: Deactivated successfully.
Dec 06 09:25:36 np0005548789.localdomain sshd[119704]: Received disconnect from 103.192.152.59 port 53390:11: Bye Bye [preauth]
Dec 06 09:25:36 np0005548789.localdomain sshd[119704]: Disconnected from authenticating user root 103.192.152.59 port 53390 [preauth]
Dec 06 09:25:37 np0005548789.localdomain systemd[1]: Stopping OpenSSH server daemon...
Dec 06 09:25:37 np0005548789.localdomain sshd[119718]: Received signal 15; terminating.
Dec 06 09:25:37 np0005548789.localdomain systemd[1]: sshd.service: Deactivated successfully.
Dec 06 09:25:37 np0005548789.localdomain systemd[1]: Stopped OpenSSH server daemon.
Dec 06 09:25:37 np0005548789.localdomain systemd[1]: Stopped target sshd-keygen.target.
Dec 06 09:25:37 np0005548789.localdomain systemd[1]: Stopping sshd-keygen.target...
Dec 06 09:25:37 np0005548789.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:25:37 np0005548789.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:25:37 np0005548789.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:25:37 np0005548789.localdomain systemd[1]: Reached target sshd-keygen.target.
Dec 06 09:25:37 np0005548789.localdomain systemd[1]: Starting OpenSSH server daemon...
Dec 06 09:25:37 np0005548789.localdomain sshd[119889]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:25:37 np0005548789.localdomain sshd[119889]: Server listening on 0.0.0.0 port 22.
Dec 06 09:25:37 np0005548789.localdomain sshd[119889]: Server listening on :: port 22.
Dec 06 09:25:37 np0005548789.localdomain systemd[1]: Started OpenSSH server daemon.
Dec 06 09:25:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39992 DF PROTO=TCP SPT=53286 DPT=9102 SEQ=3523987682 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C7A3EF0000000001030307) 
Dec 06 09:25:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60547 DF PROTO=TCP SPT=44536 DPT=9102 SEQ=2235910770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C7AEF00000000001030307) 
Dec 06 09:25:42 np0005548789.localdomain sshd[119895]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:25:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20795 DF PROTO=TCP SPT=39928 DPT=9101 SEQ=1490110567 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C7BCA00000000001030307) 
Dec 06 09:25:44 np0005548789.localdomain sshd[119895]: Received disconnect from 103.157.25.60 port 36240:11: Bye Bye [preauth]
Dec 06 09:25:44 np0005548789.localdomain sshd[119895]: Disconnected from authenticating user root 103.157.25.60 port 36240 [preauth]
Dec 06 09:25:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20797 DF PROTO=TCP SPT=39928 DPT=9101 SEQ=1490110567 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C7C8AF0000000001030307) 
Dec 06 09:25:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33320 DF PROTO=TCP SPT=57988 DPT=9105 SEQ=3101264450 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C7D1AF0000000001030307) 
Dec 06 09:25:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33321 DF PROTO=TCP SPT=57988 DPT=9105 SEQ=3101264450 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C7E16F0000000001030307) 
Dec 06 09:25:54 np0005548789.localdomain sshd[119992]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:25:55 np0005548789.localdomain sshd[119992]: Received disconnect from 118.193.38.207 port 48312:11: Bye Bye [preauth]
Dec 06 09:25:55 np0005548789.localdomain sshd[119992]: Disconnected from authenticating user root 118.193.38.207 port 48312 [preauth]
Dec 06 09:25:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16256 DF PROTO=TCP SPT=58812 DPT=9882 SEQ=829556207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C7E9EF0000000001030307) 
Dec 06 09:25:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20799 DF PROTO=TCP SPT=39928 DPT=9101 SEQ=1490110567 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C7F7EF0000000001030307) 
Dec 06 09:26:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33322 DF PROTO=TCP SPT=57988 DPT=9105 SEQ=3101264450 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C801F00000000001030307) 
Dec 06 09:26:04 np0005548789.localdomain sshd[120026]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:26:04 np0005548789.localdomain sshd[120028]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:26:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5459 DF PROTO=TCP SPT=45040 DPT=9102 SEQ=3298331109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C80C6F0000000001030307) 
Dec 06 09:26:04 np0005548789.localdomain sshd[120028]: Received disconnect from 12.156.67.18 port 56738:11: Bye Bye [preauth]
Dec 06 09:26:04 np0005548789.localdomain sshd[120028]: Disconnected from authenticating user root 12.156.67.18 port 56738 [preauth]
Dec 06 09:26:04 np0005548789.localdomain sshd[120030]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:26:06 np0005548789.localdomain sshd[120030]: Received disconnect from 64.227.156.63 port 55808:11: Bye Bye [preauth]
Dec 06 09:26:06 np0005548789.localdomain sshd[120030]: Disconnected from authenticating user root 64.227.156.63 port 55808 [preauth]
Dec 06 09:26:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62327 DF PROTO=TCP SPT=51760 DPT=9102 SEQ=1466230694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C817EF0000000001030307) 
Dec 06 09:26:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5461 DF PROTO=TCP SPT=45040 DPT=9102 SEQ=3298331109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8242F0000000001030307) 
Dec 06 09:26:14 np0005548789.localdomain sshd[120037]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:26:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25799 DF PROTO=TCP SPT=51538 DPT=9101 SEQ=2217985075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C831D00000000001030307) 
Dec 06 09:26:15 np0005548789.localdomain sshd[120026]: Received disconnect from 45.78.222.162 port 52680:11: Bye Bye [preauth]
Dec 06 09:26:15 np0005548789.localdomain sshd[120026]: Disconnected from authenticating user root 45.78.222.162 port 52680 [preauth]
Dec 06 09:26:15 np0005548789.localdomain sshd[120037]: Received disconnect from 103.234.151.178 port 26182:11: Bye Bye [preauth]
Dec 06 09:26:15 np0005548789.localdomain sshd[120037]: Disconnected from authenticating user root 103.234.151.178 port 26182 [preauth]
Dec 06 09:26:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25801 DF PROTO=TCP SPT=51538 DPT=9101 SEQ=2217985075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C83DEF0000000001030307) 
Dec 06 09:26:18 np0005548789.localdomain sshd[120039]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:26:19 np0005548789.localdomain sudo[120041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:26:19 np0005548789.localdomain sudo[120041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:26:19 np0005548789.localdomain sudo[120041]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:19 np0005548789.localdomain sudo[120056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:26:19 np0005548789.localdomain sudo[120056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:26:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49986 DF PROTO=TCP SPT=45616 DPT=9105 SEQ=2957778523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C846F00000000001030307) 
Dec 06 09:26:19 np0005548789.localdomain sudo[120056]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:20 np0005548789.localdomain sshd[120039]: Received disconnect from 81.192.46.35 port 57896:11: Bye Bye [preauth]
Dec 06 09:26:20 np0005548789.localdomain sshd[120039]: Disconnected from authenticating user root 81.192.46.35 port 57896 [preauth]
Dec 06 09:26:20 np0005548789.localdomain sudo[120103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:26:20 np0005548789.localdomain sudo[120103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:26:20 np0005548789.localdomain sudo[120103]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49987 DF PROTO=TCP SPT=45616 DPT=9105 SEQ=2957778523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C856AF0000000001030307) 
Dec 06 09:26:26 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49351 DF PROTO=TCP SPT=53212 DPT=9882 SEQ=3580497107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8636F0000000001030307) 
Dec 06 09:26:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25803 DF PROTO=TCP SPT=51538 DPT=9101 SEQ=2217985075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C86DEF0000000001030307) 
Dec 06 09:26:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49988 DF PROTO=TCP SPT=45616 DPT=9105 SEQ=2957778523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C877EF0000000001030307) 
Dec 06 09:26:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8903 DF PROTO=TCP SPT=33538 DPT=9102 SEQ=2493667773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C881AF0000000001030307) 
Dec 06 09:26:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60550 DF PROTO=TCP SPT=44536 DPT=9102 SEQ=2235910770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C88DEF0000000001030307) 
Dec 06 09:26:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8905 DF PROTO=TCP SPT=33538 DPT=9102 SEQ=2493667773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8996F0000000001030307) 
Dec 06 09:26:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33152 DF PROTO=TCP SPT=48004 DPT=9101 SEQ=3197119332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8A7000000000001030307) 
Dec 06 09:26:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33154 DF PROTO=TCP SPT=48004 DPT=9101 SEQ=3197119332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8B2F00000000001030307) 
Dec 06 09:26:49 np0005548789.localdomain kernel: SELinux:  Converting 2754 SID table entries...
Dec 06 09:26:49 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:26:49 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:26:49 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:26:49 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:26:49 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:26:49 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:26:49 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:26:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33185 DF PROTO=TCP SPT=43664 DPT=9100 SEQ=1086256800 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8BBEF0000000001030307) 
Dec 06 09:26:50 np0005548789.localdomain sudo[119671]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:51 np0005548789.localdomain sshd[120466]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:26:52 np0005548789.localdomain sshd[120466]: Invalid user ubuntu from 92.118.39.95 port 55514
Dec 06 09:26:52 np0005548789.localdomain sshd[120466]: Connection closed by invalid user ubuntu 92.118.39.95 port 55514 [preauth]
Dec 06 09:26:53 np0005548789.localdomain sudo[120543]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvcmzhsjagnhwljktsngnmafiqunddee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013213.0180209-403-171046103207149/AnsiballZ_file.py
Dec 06 09:26:53 np0005548789.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=17 res=1
Dec 06 09:26:53 np0005548789.localdomain sudo[120543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:26:53 np0005548789.localdomain python3.9[120545]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:26:53 np0005548789.localdomain sudo[120543]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4096 DF PROTO=TCP SPT=55170 DPT=9105 SEQ=1281169954 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8CBEF0000000001030307) 
Dec 06 09:26:53 np0005548789.localdomain sudo[120635]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvowpkxvgghfobissdpjxmygvyewsjtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013213.6757448-427-138852573395214/AnsiballZ_stat.py
Dec 06 09:26:53 np0005548789.localdomain sudo[120635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:26:54 np0005548789.localdomain python3.9[120637]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:26:54 np0005548789.localdomain sudo[120635]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:54 np0005548789.localdomain sudo[120708]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-todcddhfjtnnfybfczhwyybwwlpthhxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013213.6757448-427-138852573395214/AnsiballZ_copy.py
Dec 06 09:26:54 np0005548789.localdomain sudo[120708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:26:54 np0005548789.localdomain python3.9[120710]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013213.6757448-427-138852573395214/.source.fact _original_basename=.7hkiu422 follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:26:54 np0005548789.localdomain sudo[120708]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:55 np0005548789.localdomain python3.9[120800]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:26:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49354 DF PROTO=TCP SPT=53212 DPT=9882 SEQ=3580497107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8D3EF0000000001030307) 
Dec 06 09:26:56 np0005548789.localdomain sudo[120896]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epetmgpcoflwrhwmfabnmshljogrdzss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013216.1866326-502-261353046662939/AnsiballZ_setup.py
Dec 06 09:26:56 np0005548789.localdomain sudo[120896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:26:56 np0005548789.localdomain python3.9[120898]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:26:57 np0005548789.localdomain sudo[120896]: pam_unix(sudo:session): session closed for user root
Dec 06 09:26:57 np0005548789.localdomain sudo[120950]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eguvhlzvayiqaguekoywbocztnhcljrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013216.1866326-502-261353046662939/AnsiballZ_dnf.py
Dec 06 09:26:57 np0005548789.localdomain sudo[120950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:26:57 np0005548789.localdomain python3.9[120952]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:26:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33156 DF PROTO=TCP SPT=48004 DPT=9101 SEQ=3197119332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8E3EF0000000001030307) 
Dec 06 09:27:00 np0005548789.localdomain sshd[120957]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:01 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:27:01 np0005548789.localdomain systemd-rc-local-generator[120987]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:27:01 np0005548789.localdomain systemd-sysv-generator[120990]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:27:01 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:27:01 np0005548789.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 09:27:01 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4097 DF PROTO=TCP SPT=55170 DPT=9105 SEQ=1281169954 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8EBEF0000000001030307) 
Dec 06 09:27:02 np0005548789.localdomain sudo[120950]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:03 np0005548789.localdomain sudo[121093]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-crcwhrhevyktpofalevobypclcmuwips ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013223.1705563-538-242409404888843/AnsiballZ_command.py
Dec 06 09:27:03 np0005548789.localdomain sudo[121093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:03 np0005548789.localdomain python3.9[121095]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:27:04 np0005548789.localdomain sshd[120957]: Received disconnect from 179.33.210.213 port 47174:11: Bye Bye [preauth]
Dec 06 09:27:04 np0005548789.localdomain sshd[120957]: Disconnected from authenticating user root 179.33.210.213 port 47174 [preauth]
Dec 06 09:27:04 np0005548789.localdomain sudo[121093]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60754 DF PROTO=TCP SPT=35742 DPT=9102 SEQ=3388254926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C8F6EF0000000001030307) 
Dec 06 09:27:05 np0005548789.localdomain sudo[121332]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlsvnptqbdwpginqzwdthnhcokbzhsys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013225.2961478-562-201187188650242/AnsiballZ_selinux.py
Dec 06 09:27:05 np0005548789.localdomain sudo[121332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:06 np0005548789.localdomain python3.9[121334]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 06 09:27:06 np0005548789.localdomain sudo[121332]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:06 np0005548789.localdomain sudo[121424]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sldpkzboneosianlkyfodvcpmishtgks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013226.5743194-595-256607054410312/AnsiballZ_command.py
Dec 06 09:27:06 np0005548789.localdomain sudo[121424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:07 np0005548789.localdomain python3.9[121426]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 06 09:27:07 np0005548789.localdomain sudo[121424]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:07 np0005548789.localdomain sudo[121517]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdgxhcrwtygdrezinbvnjzbtjqxulzxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013227.7136922-619-260760706923928/AnsiballZ_file.py
Dec 06 09:27:07 np0005548789.localdomain sudo[121517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:08 np0005548789.localdomain python3.9[121519]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:27:08 np0005548789.localdomain sudo[121517]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:08 np0005548789.localdomain sudo[121609]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyhfswrjjnhxwjeasudehzozxaoywqcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013228.417159-643-46631954564401/AnsiballZ_mount.py
Dec 06 09:27:08 np0005548789.localdomain sudo[121609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:09 np0005548789.localdomain python3.9[121611]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 06 09:27:09 np0005548789.localdomain sudo[121609]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:09 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18284 DF PROTO=TCP SPT=45648 DPT=9882 SEQ=3869491977 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C907EF0000000001030307) 
Dec 06 09:27:10 np0005548789.localdomain sudo[121701]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffymbyvpdchlurxpxraanfurnwjagftx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013230.0048807-727-181323747015202/AnsiballZ_file.py
Dec 06 09:27:10 np0005548789.localdomain sudo[121701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:10 np0005548789.localdomain python3.9[121703]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:27:10 np0005548789.localdomain sudo[121701]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60756 DF PROTO=TCP SPT=35742 DPT=9102 SEQ=3388254926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C90EAF0000000001030307) 
Dec 06 09:27:10 np0005548789.localdomain sudo[121793]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tszntvezeogapxennhvzrcxaamuaekom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013230.659795-751-215459042769421/AnsiballZ_stat.py
Dec 06 09:27:10 np0005548789.localdomain sudo[121793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:11 np0005548789.localdomain python3.9[121795]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:27:11 np0005548789.localdomain sudo[121793]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:11 np0005548789.localdomain sudo[121866]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xocrscekroaoadppflprvtaixveetmja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013230.659795-751-215459042769421/AnsiballZ_copy.py
Dec 06 09:27:11 np0005548789.localdomain sudo[121866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:11 np0005548789.localdomain python3.9[121868]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013230.659795-751-215459042769421/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:27:11 np0005548789.localdomain sudo[121866]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:11 np0005548789.localdomain sshd[121883]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:12 np0005548789.localdomain sshd[121883]: Received disconnect from 12.156.67.18 port 47680:11: Bye Bye [preauth]
Dec 06 09:27:12 np0005548789.localdomain sshd[121883]: Disconnected from authenticating user root 12.156.67.18 port 47680 [preauth]
Dec 06 09:27:12 np0005548789.localdomain sudo[121960]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phvqernvwnunirwbzxyqplwpskezygjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013232.4303977-823-64390528404547/AnsiballZ_stat.py
Dec 06 09:27:12 np0005548789.localdomain sudo[121960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:12 np0005548789.localdomain python3.9[121962]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:27:12 np0005548789.localdomain sudo[121960]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:14 np0005548789.localdomain sshd[122024]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63011 DF PROTO=TCP SPT=35002 DPT=9101 SEQ=9154961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C91C300000000001030307) 
Dec 06 09:27:14 np0005548789.localdomain sudo[122056]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtmzwggqqjdfawogdljcfdioxsvtgxkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013233.5542626-862-43914937114058/AnsiballZ_getent.py
Dec 06 09:27:14 np0005548789.localdomain sudo[122056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:14 np0005548789.localdomain python3.9[122058]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 06 09:27:14 np0005548789.localdomain sudo[122056]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:15 np0005548789.localdomain sudo[122149]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibaszkrnmdjqhfddyrroytdlujqybfqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013234.972931-892-45843170062170/AnsiballZ_getent.py
Dec 06 09:27:15 np0005548789.localdomain sudo[122149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:15 np0005548789.localdomain python3.9[122151]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 06 09:27:15 np0005548789.localdomain sudo[122149]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:15 np0005548789.localdomain sshd[122024]: Received disconnect from 103.157.25.60 port 37918:11: Bye Bye [preauth]
Dec 06 09:27:15 np0005548789.localdomain sshd[122024]: Disconnected from authenticating user root 103.157.25.60 port 37918 [preauth]
Dec 06 09:27:16 np0005548789.localdomain sshd[122168]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:16 np0005548789.localdomain sudo[122244]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vemyqelrplislalpuykowonpfmwsndvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013236.0404403-916-144227574038707/AnsiballZ_group.py
Dec 06 09:27:16 np0005548789.localdomain sudo[122244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:16 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1709 DF PROTO=TCP SPT=59634 DPT=9105 SEQ=3366381085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C925560000000001030307) 
Dec 06 09:27:16 np0005548789.localdomain python3.9[122246]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 06 09:27:16 np0005548789.localdomain groupmod[122247]: group changed in /etc/group (group hugetlbfs/985, new gid: 42477)
Dec 06 09:27:16 np0005548789.localdomain groupmod[122247]: group changed in /etc/passwd (group hugetlbfs/985, new gid: 42477)
Dec 06 09:27:16 np0005548789.localdomain sudo[122244]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:17 np0005548789.localdomain sudo[122342]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sauizvtydyqbylanfmvigymsbljhiakr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013236.969972-943-252828736244856/AnsiballZ_file.py
Dec 06 09:27:17 np0005548789.localdomain sudo[122342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:17 np0005548789.localdomain python3.9[122344]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 06 09:27:17 np0005548789.localdomain sudo[122342]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:17 np0005548789.localdomain sshd[122168]: Received disconnect from 103.192.152.59 port 42262:11: Bye Bye [preauth]
Dec 06 09:27:17 np0005548789.localdomain sshd[122168]: Disconnected from authenticating user root 103.192.152.59 port 42262 [preauth]
Dec 06 09:27:18 np0005548789.localdomain sudo[122434]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imwwiiquhkgawdaxvqfyszkrpixwaajn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013237.8927429-976-46554076767932/AnsiballZ_dnf.py
Dec 06 09:27:18 np0005548789.localdomain sudo[122434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:18 np0005548789.localdomain python3.9[122436]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:27:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1711 DF PROTO=TCP SPT=59634 DPT=9105 SEQ=3366381085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9316F0000000001030307) 
Dec 06 09:27:20 np0005548789.localdomain sudo[122439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:27:20 np0005548789.localdomain sudo[122439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:27:20 np0005548789.localdomain sudo[122439]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:20 np0005548789.localdomain sudo[122454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:27:20 np0005548789.localdomain sudo[122454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:27:21 np0005548789.localdomain sudo[122454]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:21 np0005548789.localdomain sudo[122434]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:21 np0005548789.localdomain sudo[122587]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfohwgsvyqtjhjsfrvtkglwzjprscjwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013241.63934-1000-124682667306719/AnsiballZ_file.py
Dec 06 09:27:21 np0005548789.localdomain sudo[122587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:22 np0005548789.localdomain sudo[122590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:27:22 np0005548789.localdomain sudo[122590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:27:22 np0005548789.localdomain sudo[122590]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:22 np0005548789.localdomain python3.9[122589]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:27:22 np0005548789.localdomain sudo[122587]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:22 np0005548789.localdomain sudo[122694]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enxpmjpcogstuaptuzzqksmhujytxnln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013242.2870924-1024-49473799356540/AnsiballZ_stat.py
Dec 06 09:27:22 np0005548789.localdomain sudo[122694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:22 np0005548789.localdomain python3.9[122696]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:27:22 np0005548789.localdomain sudo[122694]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:23 np0005548789.localdomain sudo[122767]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdphkywxhykhbqsaigwlghdnpezrvmzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013242.2870924-1024-49473799356540/AnsiballZ_copy.py
Dec 06 09:27:23 np0005548789.localdomain sudo[122767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:23 np0005548789.localdomain python3.9[122769]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013242.2870924-1024-49473799356540/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:27:23 np0005548789.localdomain sudo[122767]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1712 DF PROTO=TCP SPT=59634 DPT=9105 SEQ=3366381085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9412F0000000001030307) 
Dec 06 09:27:27 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3035 DF PROTO=TCP SPT=57632 DPT=9882 SEQ=2170270327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C94DF00000000001030307) 
Dec 06 09:27:27 np0005548789.localdomain sshd[122784]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:28 np0005548789.localdomain sshd[122784]: Received disconnect from 81.192.46.35 port 56210:11: Bye Bye [preauth]
Dec 06 09:27:28 np0005548789.localdomain sshd[122784]: Disconnected from authenticating user root 81.192.46.35 port 56210 [preauth]
Dec 06 09:27:28 np0005548789.localdomain sudo[122861]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nutddezvhktctklhaikcymtlgaxitnju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013248.02025-1069-80240600796860/AnsiballZ_systemd.py
Dec 06 09:27:28 np0005548789.localdomain sudo[122861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:28 np0005548789.localdomain python3.9[122863]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:27:28 np0005548789.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 06 09:27:28 np0005548789.localdomain systemd[1]: Stopped Load Kernel Modules.
Dec 06 09:27:28 np0005548789.localdomain systemd[1]: Stopping Load Kernel Modules...
Dec 06 09:27:28 np0005548789.localdomain systemd[1]: Starting Load Kernel Modules...
Dec 06 09:27:28 np0005548789.localdomain systemd-modules-load[122867]: Module 'msr' is built in
Dec 06 09:27:28 np0005548789.localdomain systemd[1]: Finished Load Kernel Modules.
Dec 06 09:27:29 np0005548789.localdomain sudo[122861]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63015 DF PROTO=TCP SPT=35002 DPT=9101 SEQ=9154961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C957EF0000000001030307) 
Dec 06 09:27:30 np0005548789.localdomain sudo[122957]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipbnsugiatfabljmyawnxzizphjoxzsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013250.0094578-1093-62300419915202/AnsiballZ_stat.py
Dec 06 09:27:30 np0005548789.localdomain sudo[122957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:30 np0005548789.localdomain python3.9[122959]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:27:30 np0005548789.localdomain sudo[122957]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:30 np0005548789.localdomain sudo[123030]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdrgoryltpkauuwfvgybnfphfujztypo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013250.0094578-1093-62300419915202/AnsiballZ_copy.py
Dec 06 09:27:30 np0005548789.localdomain sudo[123030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:30 np0005548789.localdomain python3.9[123032]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013250.0094578-1093-62300419915202/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:27:31 np0005548789.localdomain sudo[123030]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1713 DF PROTO=TCP SPT=59634 DPT=9105 SEQ=3366381085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C961EF0000000001030307) 
Dec 06 09:27:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51167 DF PROTO=TCP SPT=54836 DPT=9102 SEQ=3373859316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C96BF00000000001030307) 
Dec 06 09:27:34 np0005548789.localdomain sshd[123047]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:36 np0005548789.localdomain sudo[123124]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-broamvlrnfilbuxwbhwzgyoqjvvuqzlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013255.8787556-1147-60946695319138/AnsiballZ_dnf.py
Dec 06 09:27:36 np0005548789.localdomain sudo[123124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:36 np0005548789.localdomain sshd[123047]: Received disconnect from 118.193.38.207 port 57044:11: Bye Bye [preauth]
Dec 06 09:27:36 np0005548789.localdomain sshd[123047]: Disconnected from authenticating user root 118.193.38.207 port 57044 [preauth]
Dec 06 09:27:36 np0005548789.localdomain python3.9[123126]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:27:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8908 DF PROTO=TCP SPT=33538 DPT=9102 SEQ=2493667773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C977EF0000000001030307) 
Dec 06 09:27:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51169 DF PROTO=TCP SPT=54836 DPT=9102 SEQ=3373859316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C983AF0000000001030307) 
Dec 06 09:27:40 np0005548789.localdomain sshd[123129]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:42 np0005548789.localdomain sshd[123129]: Received disconnect from 103.234.151.178 port 52320:11: Bye Bye [preauth]
Dec 06 09:27:42 np0005548789.localdomain sshd[123129]: Disconnected from authenticating user root 103.234.151.178 port 52320 [preauth]
Dec 06 09:27:44 np0005548789.localdomain sudo[123124]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8657 DF PROTO=TCP SPT=45986 DPT=9101 SEQ=2014193925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C991600000000001030307) 
Dec 06 09:27:44 np0005548789.localdomain sshd[123220]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:45 np0005548789.localdomain python3.9[123221]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:27:45 np0005548789.localdomain python3.9[123314]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 06 09:27:46 np0005548789.localdomain sshd[123220]: Received disconnect from 64.227.156.63 port 58850:11: Bye Bye [preauth]
Dec 06 09:27:46 np0005548789.localdomain sshd[123220]: Disconnected from authenticating user root 64.227.156.63 port 58850 [preauth]
Dec 06 09:27:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8659 DF PROTO=TCP SPT=45986 DPT=9101 SEQ=2014193925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C99D6F0000000001030307) 
Dec 06 09:27:47 np0005548789.localdomain python3.9[123404]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:27:48 np0005548789.localdomain sudo[123494]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdodyrxabprfvqgtrfqztguehdfrvkmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013268.2392564-1270-118127443459693/AnsiballZ_systemd.py
Dec 06 09:27:48 np0005548789.localdomain sudo[123494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:48 np0005548789.localdomain python3.9[123496]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:27:48 np0005548789.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 06 09:27:48 np0005548789.localdomain systemd[1]: tuned.service: Deactivated successfully.
Dec 06 09:27:48 np0005548789.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 06 09:27:48 np0005548789.localdomain systemd[1]: tuned.service: Consumed 1.810s CPU time, no IO.
Dec 06 09:27:48 np0005548789.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 06 09:27:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15776 DF PROTO=TCP SPT=52430 DPT=9105 SEQ=2927662920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9A66F0000000001030307) 
Dec 06 09:27:50 np0005548789.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Dec 06 09:27:50 np0005548789.localdomain sudo[123494]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:52 np0005548789.localdomain sshd[123568]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:52 np0005548789.localdomain python3.9[123600]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 06 09:27:53 np0005548789.localdomain sshd[123568]: Connection closed by authenticating user root 161.248.200.221 port 35758 [preauth]
Dec 06 09:27:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15777 DF PROTO=TCP SPT=52430 DPT=9105 SEQ=2927662920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9B62F0000000001030307) 
Dec 06 09:27:53 np0005548789.localdomain sshd[123615]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:54 np0005548789.localdomain sshd[123615]: Invalid user admin from 161.248.200.221 port 35764
Dec 06 09:27:55 np0005548789.localdomain sshd[123615]: Connection closed by invalid user admin 161.248.200.221 port 35764 [preauth]
Dec 06 09:27:55 np0005548789.localdomain sshd[123617]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3038 DF PROTO=TCP SPT=57632 DPT=9882 SEQ=2170270327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9BDF00000000001030307) 
Dec 06 09:27:56 np0005548789.localdomain sudo[123694]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-piklkhregdevpmstnpfjpxkzulsdaqml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013276.0283701-1441-235124162922065/AnsiballZ_systemd.py
Dec 06 09:27:56 np0005548789.localdomain sudo[123694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:56 np0005548789.localdomain sshd[123617]: Invalid user dspace from 161.248.200.221 port 44630
Dec 06 09:27:56 np0005548789.localdomain python3.9[123696]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:27:56 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:27:56 np0005548789.localdomain sshd[123617]: Connection closed by invalid user dspace 161.248.200.221 port 44630 [preauth]
Dec 06 09:27:56 np0005548789.localdomain systemd-rc-local-generator[123719]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:27:56 np0005548789.localdomain systemd-sysv-generator[123723]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:27:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:27:56 np0005548789.localdomain sshd[123735]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:56 np0005548789.localdomain sudo[123694]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:57 np0005548789.localdomain sudo[123826]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brufvrawlurqvnnoyaqnvugiwdetarmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013277.1045384-1441-247563364478861/AnsiballZ_systemd.py
Dec 06 09:27:57 np0005548789.localdomain sudo[123826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:57 np0005548789.localdomain python3.9[123828]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:27:57 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:27:57 np0005548789.localdomain systemd-rc-local-generator[123852]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:27:57 np0005548789.localdomain systemd-sysv-generator[123858]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:27:57 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:27:58 np0005548789.localdomain sshd[123735]: Invalid user kali from 161.248.200.221 port 44636
Dec 06 09:27:58 np0005548789.localdomain sudo[123826]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:58 np0005548789.localdomain sshd[123735]: Connection closed by invalid user kali 161.248.200.221 port 44636 [preauth]
Dec 06 09:27:58 np0005548789.localdomain sshd[123881]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:27:59 np0005548789.localdomain sshd[123881]: Invalid user postgres from 161.248.200.221 port 44652
Dec 06 09:27:59 np0005548789.localdomain sudo[123958]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsfsgaqcqmrphanaqtomlxipqcwdudcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013279.3850079-1489-269814160932418/AnsiballZ_command.py
Dec 06 09:27:59 np0005548789.localdomain sudo[123958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8661 DF PROTO=TCP SPT=45986 DPT=9101 SEQ=2014193925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9CDF00000000001030307) 
Dec 06 09:27:59 np0005548789.localdomain python3.9[123960]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:27:59 np0005548789.localdomain sudo[123958]: pam_unix(sudo:session): session closed for user root
Dec 06 09:27:59 np0005548789.localdomain sshd[123881]: Connection closed by invalid user postgres 161.248.200.221 port 44652 [preauth]
Dec 06 09:28:00 np0005548789.localdomain sshd[124009]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:00 np0005548789.localdomain sudo[124053]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxkyzjymloszifqidvlfzffyqwaevxlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013280.083296-1513-71341924287179/AnsiballZ_command.py
Dec 06 09:28:00 np0005548789.localdomain sudo[124053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:00 np0005548789.localdomain python3.9[124055]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:28:00 np0005548789.localdomain kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k FS
Dec 06 09:28:00 np0005548789.localdomain sudo[124053]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:00 np0005548789.localdomain sudo[124146]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yunkxgioirrpaogmhvolzbbkrlqdmzjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013280.7580009-1537-126762951370178/AnsiballZ_command.py
Dec 06 09:28:00 np0005548789.localdomain sudo[124146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:01 np0005548789.localdomain python3.9[124148]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:28:01 np0005548789.localdomain sshd[124009]: Invalid user kafka from 161.248.200.221 port 44660
Dec 06 09:28:01 np0005548789.localdomain sshd[124009]: Connection closed by invalid user kafka 161.248.200.221 port 44660 [preauth]
Dec 06 09:28:01 np0005548789.localdomain sshd[124153]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:01 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15778 DF PROTO=TCP SPT=52430 DPT=9105 SEQ=2927662920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9D5EF0000000001030307) 
Dec 06 09:28:02 np0005548789.localdomain sudo[124146]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:02 np0005548789.localdomain sshd[124153]: Invalid user web from 161.248.200.221 port 44668
Dec 06 09:28:03 np0005548789.localdomain sshd[124153]: Connection closed by invalid user web 161.248.200.221 port 44668 [preauth]
Dec 06 09:28:03 np0005548789.localdomain sshd[124217]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:03 np0005548789.localdomain sudo[124249]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdkmaognpowoobpqwilwwbdewyrmhmxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013283.1571774-1561-201736289921524/AnsiballZ_command.py
Dec 06 09:28:03 np0005548789.localdomain sudo[124249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:03 np0005548789.localdomain python3.9[124251]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:28:03 np0005548789.localdomain sudo[124249]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:04 np0005548789.localdomain sudo[124342]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrrvugchrxlcckpiujdsfbmrvvxctjud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013283.8117397-1585-200430931861426/AnsiballZ_systemd.py
Dec 06 09:28:04 np0005548789.localdomain sudo[124342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:04 np0005548789.localdomain python3.9[124344]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:28:04 np0005548789.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 06 09:28:04 np0005548789.localdomain systemd[1]: Stopped Apply Kernel Variables.
Dec 06 09:28:04 np0005548789.localdomain systemd[1]: Stopping Apply Kernel Variables...
Dec 06 09:28:04 np0005548789.localdomain systemd[1]: Starting Apply Kernel Variables...
Dec 06 09:28:04 np0005548789.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 06 09:28:04 np0005548789.localdomain sshd[124217]: Invalid user guest from 161.248.200.221 port 44676
Dec 06 09:28:04 np0005548789.localdomain systemd[1]: Finished Apply Kernel Variables.
Dec 06 09:28:04 np0005548789.localdomain sudo[124342]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:04 np0005548789.localdomain sshd[124217]: Connection closed by invalid user guest 161.248.200.221 port 44676 [preauth]
Dec 06 09:28:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41929 DF PROTO=TCP SPT=36116 DPT=9102 SEQ=2957430391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9E12F0000000001030307) 
Dec 06 09:28:04 np0005548789.localdomain sshd[124366]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:05 np0005548789.localdomain sshd[118184]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:28:05 np0005548789.localdomain systemd[1]: session-39.scope: Deactivated successfully.
Dec 06 09:28:05 np0005548789.localdomain systemd[1]: session-39.scope: Consumed 1min 55.959s CPU time.
Dec 06 09:28:05 np0005548789.localdomain systemd-logind[766]: Session 39 logged out. Waiting for processes to exit.
Dec 06 09:28:05 np0005548789.localdomain systemd-logind[766]: Removed session 39.
Dec 06 09:28:05 np0005548789.localdomain sshd[124366]: Invalid user vpn from 161.248.200.221 port 55668
Dec 06 09:28:06 np0005548789.localdomain sshd[124366]: Connection closed by invalid user vpn 161.248.200.221 port 55668 [preauth]
Dec 06 09:28:06 np0005548789.localdomain sshd[124369]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:07 np0005548789.localdomain sshd[124369]: Invalid user cassandra from 161.248.200.221 port 55674
Dec 06 09:28:07 np0005548789.localdomain sshd[124369]: Connection closed by invalid user cassandra 161.248.200.221 port 55674 [preauth]
Dec 06 09:28:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60759 DF PROTO=TCP SPT=35742 DPT=9102 SEQ=3388254926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9EDEF0000000001030307) 
Dec 06 09:28:08 np0005548789.localdomain sshd[124371]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:09 np0005548789.localdomain sshd[124371]: Invalid user ubuntu from 161.248.200.221 port 55686
Dec 06 09:28:09 np0005548789.localdomain sshd[124371]: Connection closed by invalid user ubuntu 161.248.200.221 port 55686 [preauth]
Dec 06 09:28:09 np0005548789.localdomain sshd[124373]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:10 np0005548789.localdomain sshd[124373]: Invalid user solr from 161.248.200.221 port 55698
Dec 06 09:28:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41931 DF PROTO=TCP SPT=36116 DPT=9102 SEQ=2957430391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52C9F8EF0000000001030307) 
Dec 06 09:28:11 np0005548789.localdomain sshd[124373]: Connection closed by invalid user solr 161.248.200.221 port 55698 [preauth]
Dec 06 09:28:11 np0005548789.localdomain sshd[124375]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:12 np0005548789.localdomain sshd[124375]: Invalid user linaro from 161.248.200.221 port 55712
Dec 06 09:28:12 np0005548789.localdomain sshd[124375]: Connection closed by invalid user linaro 161.248.200.221 port 55712 [preauth]
Dec 06 09:28:12 np0005548789.localdomain sshd[124377]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:13 np0005548789.localdomain sshd[124379]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:13 np0005548789.localdomain sshd[124379]: Accepted publickey for zuul from 192.168.122.30 port 53372 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:28:13 np0005548789.localdomain systemd-logind[766]: New session 40 of user zuul.
Dec 06 09:28:13 np0005548789.localdomain systemd[1]: Started Session 40 of User zuul.
Dec 06 09:28:13 np0005548789.localdomain sshd[124379]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:28:13 np0005548789.localdomain sshd[124377]: Invalid user deploy from 161.248.200.221 port 55714
Dec 06 09:28:14 np0005548789.localdomain python3.9[124472]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:28:14 np0005548789.localdomain sshd[124377]: Connection closed by invalid user deploy 161.248.200.221 port 55714 [preauth]
Dec 06 09:28:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41231 DF PROTO=TCP SPT=60810 DPT=9101 SEQ=476356393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA06900000000001030307) 
Dec 06 09:28:14 np0005548789.localdomain sshd[124477]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:15 np0005548789.localdomain sshd[124477]: Invalid user user1 from 161.248.200.221 port 55730
Dec 06 09:28:15 np0005548789.localdomain python3.9[124568]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:28:15 np0005548789.localdomain sshd[124477]: Connection closed by invalid user user1 161.248.200.221 port 55730 [preauth]
Dec 06 09:28:15 np0005548789.localdomain sshd[124573]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:16 np0005548789.localdomain sshd[124573]: Invalid user dev from 161.248.200.221 port 43238
Dec 06 09:28:17 np0005548789.localdomain sshd[124573]: Connection closed by invalid user dev 161.248.200.221 port 43238 [preauth]
Dec 06 09:28:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41233 DF PROTO=TCP SPT=60810 DPT=9101 SEQ=476356393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA12AF0000000001030307) 
Dec 06 09:28:17 np0005548789.localdomain sshd[124589]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:18 np0005548789.localdomain sshd[124591]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:18 np0005548789.localdomain sshd[124589]: Invalid user www from 161.248.200.221 port 43248
Dec 06 09:28:18 np0005548789.localdomain sshd[124591]: Received disconnect from 12.156.67.18 port 52722:11: Bye Bye [preauth]
Dec 06 09:28:18 np0005548789.localdomain sshd[124591]: Disconnected from authenticating user root 12.156.67.18 port 52722 [preauth]
Dec 06 09:28:18 np0005548789.localdomain sshd[124589]: Connection closed by invalid user www 161.248.200.221 port 43248 [preauth]
Dec 06 09:28:18 np0005548789.localdomain sudo[124668]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebrbcvptetvpywmfiiwgfeepfmziyzuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013298.4744883-111-217061369213711/AnsiballZ_command.py
Dec 06 09:28:18 np0005548789.localdomain sudo[124668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:19 np0005548789.localdomain sshd[124671]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:19 np0005548789.localdomain python3.9[124670]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:28:19 np0005548789.localdomain sudo[124668]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13460 DF PROTO=TCP SPT=39288 DPT=9105 SEQ=3189584631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA1BB00000000001030307) 
Dec 06 09:28:20 np0005548789.localdomain sshd[124671]: Invalid user ubuntu from 161.248.200.221 port 43252
Dec 06 09:28:20 np0005548789.localdomain python3.9[124763]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:28:20 np0005548789.localdomain sshd[124671]: Connection closed by invalid user ubuntu 161.248.200.221 port 43252 [preauth]
Dec 06 09:28:20 np0005548789.localdomain sshd[124796]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:20 np0005548789.localdomain sudo[124859]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppxocaoxozmydrtewisbxdoaoxvydpzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013300.5611331-171-142645556524242/AnsiballZ_setup.py
Dec 06 09:28:20 np0005548789.localdomain sudo[124859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:21 np0005548789.localdomain python3.9[124861]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:28:21 np0005548789.localdomain sudo[124859]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:21 np0005548789.localdomain sshd[124796]: Connection closed by authenticating user root 161.248.200.221 port 43260 [preauth]
Dec 06 09:28:22 np0005548789.localdomain sudo[124913]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhnbgpikoioeqpfhqnrffrlnimiwwkxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013300.5611331-171-142645556524242/AnsiballZ_dnf.py
Dec 06 09:28:22 np0005548789.localdomain sudo[124913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:22 np0005548789.localdomain sudo[124916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:28:22 np0005548789.localdomain sudo[124916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:28:22 np0005548789.localdomain sudo[124916]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:22 np0005548789.localdomain sshd[124935]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:22 np0005548789.localdomain sudo[124931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:28:22 np0005548789.localdomain sudo[124931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:28:22 np0005548789.localdomain python3.9[124915]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:28:22 np0005548789.localdomain sudo[124931]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:23 np0005548789.localdomain sshd[124935]: Connection closed by authenticating user root 161.248.200.221 port 43268 [preauth]
Dec 06 09:28:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13461 DF PROTO=TCP SPT=39288 DPT=9105 SEQ=3189584631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA2B6F0000000001030307) 
Dec 06 09:28:23 np0005548789.localdomain sshd[124983]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:24 np0005548789.localdomain sudo[124985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:28:24 np0005548789.localdomain sudo[124985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:28:24 np0005548789.localdomain sudo[124985]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:24 np0005548789.localdomain sshd[124983]: Invalid user hduser from 161.248.200.221 port 43276
Dec 06 09:28:25 np0005548789.localdomain sshd[124983]: Connection closed by invalid user hduser 161.248.200.221 port 43276 [preauth]
Dec 06 09:28:25 np0005548789.localdomain sudo[124913]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:25 np0005548789.localdomain sshd[125000]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:25 np0005548789.localdomain sudo[125091]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtozswzhfddhpfcayevegrmdkqwnxtwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013305.575577-207-157225955161926/AnsiballZ_setup.py
Dec 06 09:28:25 np0005548789.localdomain sudo[125091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53129 DF PROTO=TCP SPT=36238 DPT=9882 SEQ=998623718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA33F00000000001030307) 
Dec 06 09:28:26 np0005548789.localdomain python3.9[125093]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:28:26 np0005548789.localdomain sudo[125091]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:26 np0005548789.localdomain sshd[125000]: Connection closed by authenticating user root 161.248.200.221 port 47424 [preauth]
Dec 06 09:28:26 np0005548789.localdomain sshd[125203]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:27 np0005548789.localdomain sudo[125248]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onlqjafiqgmuszrnqsmbyjzypvppblha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013306.7794344-240-254087695024971/AnsiballZ_file.py
Dec 06 09:28:27 np0005548789.localdomain sudo[125248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:27 np0005548789.localdomain python3.9[125250]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:28:27 np0005548789.localdomain sudo[125248]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:28 np0005548789.localdomain sshd[125203]: Invalid user hadoop from 161.248.200.221 port 47432
Dec 06 09:28:28 np0005548789.localdomain sshd[125203]: Connection closed by invalid user hadoop 161.248.200.221 port 47432 [preauth]
Dec 06 09:28:28 np0005548789.localdomain sudo[125340]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nviddavpynzozubaqdqdedvvymruqalm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013307.5435402-264-197620746427279/AnsiballZ_command.py
Dec 06 09:28:28 np0005548789.localdomain sudo[125340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:28 np0005548789.localdomain sshd[125343]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:28 np0005548789.localdomain python3.9[125342]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:28:28 np0005548789.localdomain sudo[125340]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:29 np0005548789.localdomain sudo[125445]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxuyxrzcvrufhkhcovjnexmnyauuihaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013308.778268-288-241374105287596/AnsiballZ_stat.py
Dec 06 09:28:29 np0005548789.localdomain sudo[125445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:29 np0005548789.localdomain python3.9[125447]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:28:29 np0005548789.localdomain sudo[125445]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41235 DF PROTO=TCP SPT=60810 DPT=9101 SEQ=476356393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA41EF0000000001030307) 
Dec 06 09:28:29 np0005548789.localdomain sudo[125493]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdahpcdgnfqfbmrlegsnftadobrmdbgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013308.778268-288-241374105287596/AnsiballZ_file.py
Dec 06 09:28:29 np0005548789.localdomain sudo[125493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:29 np0005548789.localdomain sshd[125343]: Invalid user admin from 161.248.200.221 port 47440
Dec 06 09:28:29 np0005548789.localdomain python3.9[125495]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:28:29 np0005548789.localdomain sudo[125493]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:29 np0005548789.localdomain sshd[125343]: Connection closed by invalid user admin 161.248.200.221 port 47440 [preauth]
Dec 06 09:28:30 np0005548789.localdomain sshd[125511]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:30 np0005548789.localdomain sudo[125587]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xywbgmybfqznweanrzyvhnrcvwmyspcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013310.1091344-324-60315192490320/AnsiballZ_stat.py
Dec 06 09:28:30 np0005548789.localdomain sudo[125587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:30 np0005548789.localdomain python3.9[125589]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:28:30 np0005548789.localdomain sudo[125587]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:31 np0005548789.localdomain sudo[125660]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhtaazphcylmvhirkrlxbkecscftmcjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013310.1091344-324-60315192490320/AnsiballZ_copy.py
Dec 06 09:28:31 np0005548789.localdomain sudo[125660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:31 np0005548789.localdomain sshd[125511]: Invalid user steam from 161.248.200.221 port 47444
Dec 06 09:28:31 np0005548789.localdomain python3.9[125662]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013310.1091344-324-60315192490320/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:28:31 np0005548789.localdomain sudo[125660]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:31 np0005548789.localdomain sshd[125511]: Connection closed by invalid user steam 161.248.200.221 port 47444 [preauth]
Dec 06 09:28:31 np0005548789.localdomain sshd[125709]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:31 np0005548789.localdomain sudo[125754]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbaprcghkwwzlvyyvcahxemuzgcrhpzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013311.515567-372-70710575752446/AnsiballZ_ini_file.py
Dec 06 09:28:31 np0005548789.localdomain sudo[125754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13462 DF PROTO=TCP SPT=39288 DPT=9105 SEQ=3189584631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA4BEF0000000001030307) 
Dec 06 09:28:32 np0005548789.localdomain python3.9[125756]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:28:32 np0005548789.localdomain sudo[125754]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:32 np0005548789.localdomain systemd-journald[47810]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Dec 06 09:28:32 np0005548789.localdomain systemd-journald[47810]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 09:28:32 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:28:32 np0005548789.localdomain sudo[125847]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yoissjnrshpxueidcgecuhilsmanrcmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013312.2276518-372-152152498301175/AnsiballZ_ini_file.py
Dec 06 09:28:32 np0005548789.localdomain sudo[125847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:32 np0005548789.localdomain python3.9[125849]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:28:32 np0005548789.localdomain sudo[125847]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:32 np0005548789.localdomain auditd[725]: Audit daemon rotating log files
Dec 06 09:28:32 np0005548789.localdomain sshd[125709]: Invalid user maria from 161.248.200.221 port 47450
Dec 06 09:28:32 np0005548789.localdomain sshd[125709]: Connection closed by invalid user maria 161.248.200.221 port 47450 [preauth]
Dec 06 09:28:33 np0005548789.localdomain sudo[125939]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdugtdzbilgvifvsbrhmcfgvljklhllw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013312.8147078-372-225407194115930/AnsiballZ_ini_file.py
Dec 06 09:28:33 np0005548789.localdomain sudo[125939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:33 np0005548789.localdomain sshd[125942]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:33 np0005548789.localdomain python3.9[125941]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:28:33 np0005548789.localdomain sudo[125939]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:33 np0005548789.localdomain sudo[126033]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnbphnrartnzbltsvsxayltuxurpddtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013313.384203-372-103258001927925/AnsiballZ_ini_file.py
Dec 06 09:28:33 np0005548789.localdomain sudo[126033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:33 np0005548789.localdomain python3.9[126035]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:28:33 np0005548789.localdomain sudo[126033]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:34 np0005548789.localdomain sshd[125942]: Invalid user ftptest from 161.248.200.221 port 47456
Dec 06 09:28:34 np0005548789.localdomain sshd[125942]: Connection closed by invalid user ftptest 161.248.200.221 port 47456 [preauth]
Dec 06 09:28:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36607 DF PROTO=TCP SPT=32972 DPT=9102 SEQ=3652515637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA566F0000000001030307) 
Dec 06 09:28:34 np0005548789.localdomain sshd[126050]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:35 np0005548789.localdomain sshd[126052]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:35 np0005548789.localdomain sshd[126050]: Invalid user vps from 161.248.200.221 port 45040
Dec 06 09:28:36 np0005548789.localdomain sshd[126050]: Connection closed by invalid user vps 161.248.200.221 port 45040 [preauth]
Dec 06 09:28:36 np0005548789.localdomain sshd[126052]: Received disconnect from 81.192.46.35 port 54534:11: Bye Bye [preauth]
Dec 06 09:28:36 np0005548789.localdomain sshd[126052]: Disconnected from authenticating user root 81.192.46.35 port 54534 [preauth]
Dec 06 09:28:36 np0005548789.localdomain sshd[126113]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:37 np0005548789.localdomain python3.9[126131]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:28:37 np0005548789.localdomain sudo[126223]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvseplhekaattbeixtzhwohbkqqjzmlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013317.3486629-492-210128920690666/AnsiballZ_dnf.py
Dec 06 09:28:37 np0005548789.localdomain sudo[126223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:37 np0005548789.localdomain sshd[126113]: Connection closed by authenticating user root 161.248.200.221 port 45052 [preauth]
Dec 06 09:28:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51172 DF PROTO=TCP SPT=54836 DPT=9102 SEQ=3373859316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA61F00000000001030307) 
Dec 06 09:28:37 np0005548789.localdomain python3.9[126225]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:28:37 np0005548789.localdomain sshd[126227]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:39 np0005548789.localdomain sshd[126227]: Invalid user deploy from 161.248.200.221 port 45066
Dec 06 09:28:39 np0005548789.localdomain sshd[126227]: Connection closed by invalid user deploy 161.248.200.221 port 45066 [preauth]
Dec 06 09:28:39 np0005548789.localdomain sshd[126230]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:40 np0005548789.localdomain sshd[126230]: Invalid user ubnt from 161.248.200.221 port 45080
Dec 06 09:28:40 np0005548789.localdomain sudo[126223]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36609 DF PROTO=TCP SPT=32972 DPT=9102 SEQ=3652515637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA6E2F0000000001030307) 
Dec 06 09:28:40 np0005548789.localdomain sshd[126230]: Connection closed by invalid user ubnt 161.248.200.221 port 45080 [preauth]
Dec 06 09:28:41 np0005548789.localdomain sshd[126246]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:41 np0005548789.localdomain sudo[126323]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mddhapnolmvnpxvnnpfkvolzefazzpom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013321.2411513-516-122177645497719/AnsiballZ_dnf.py
Dec 06 09:28:41 np0005548789.localdomain sudo[126323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:41 np0005548789.localdomain sshd[126326]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:41 np0005548789.localdomain python3.9[126325]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:28:42 np0005548789.localdomain sshd[126246]: Connection closed by authenticating user root 161.248.200.221 port 45092 [preauth]
Dec 06 09:28:42 np0005548789.localdomain sshd[126330]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:43 np0005548789.localdomain sshd[126326]: Received disconnect from 103.157.25.60 port 39590:11: Bye Bye [preauth]
Dec 06 09:28:43 np0005548789.localdomain sshd[126326]: Disconnected from authenticating user root 103.157.25.60 port 39590 [preauth]
Dec 06 09:28:43 np0005548789.localdomain sshd[126330]: Invalid user ansadmin from 161.248.200.221 port 45096
Dec 06 09:28:44 np0005548789.localdomain sshd[126330]: Connection closed by invalid user ansadmin 161.248.200.221 port 45096 [preauth]
Dec 06 09:28:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60358 DF PROTO=TCP SPT=54274 DPT=9101 SEQ=1618947386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA7BC00000000001030307) 
Dec 06 09:28:44 np0005548789.localdomain sshd[126332]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:44 np0005548789.localdomain sudo[126323]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:45 np0005548789.localdomain sshd[126332]: Invalid user orangepi from 161.248.200.221 port 45104
Dec 06 09:28:45 np0005548789.localdomain sudo[126423]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abniwibypaztjgrjuqxllapxgvqfenpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013325.1516652-546-21798884195803/AnsiballZ_dnf.py
Dec 06 09:28:45 np0005548789.localdomain sudo[126423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:45 np0005548789.localdomain sshd[126332]: Connection closed by invalid user orangepi 161.248.200.221 port 45104 [preauth]
Dec 06 09:28:45 np0005548789.localdomain python3.9[126425]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:28:45 np0005548789.localdomain sshd[126427]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:47 np0005548789.localdomain sshd[126427]: Connection closed by authenticating user root 161.248.200.221 port 57610 [preauth]
Dec 06 09:28:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60360 DF PROTO=TCP SPT=54274 DPT=9101 SEQ=1618947386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA87B00000000001030307) 
Dec 06 09:28:47 np0005548789.localdomain sshd[126430]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:48 np0005548789.localdomain sshd[126430]: Invalid user odoo from 161.248.200.221 port 57620
Dec 06 09:28:48 np0005548789.localdomain sshd[126430]: Connection closed by invalid user odoo 161.248.200.221 port 57620 [preauth]
Dec 06 09:28:48 np0005548789.localdomain sudo[126423]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:49 np0005548789.localdomain sshd[126452]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:49 np0005548789.localdomain sudo[126529]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzycevchuxebhqcdfpwoaqokpadftowr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013329.157195-573-4706848397754/AnsiballZ_dnf.py
Dec 06 09:28:49 np0005548789.localdomain sudo[126529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:49 np0005548789.localdomain python3.9[126531]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:28:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11914 DF PROTO=TCP SPT=43406 DPT=9105 SEQ=512472526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CA90F00000000001030307) 
Dec 06 09:28:50 np0005548789.localdomain sshd[126452]: Invalid user pi from 161.248.200.221 port 57636
Dec 06 09:28:50 np0005548789.localdomain sshd[126452]: Connection closed by invalid user pi 161.248.200.221 port 57636 [preauth]
Dec 06 09:28:50 np0005548789.localdomain sshd[126534]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:50 np0005548789.localdomain sshd[126536]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:51 np0005548789.localdomain sshd[126534]: Invalid user cloud from 161.248.200.221 port 57648
Dec 06 09:28:51 np0005548789.localdomain sshd[126534]: Connection closed by invalid user cloud 161.248.200.221 port 57648 [preauth]
Dec 06 09:28:52 np0005548789.localdomain sshd[126536]: Received disconnect from 118.193.38.207 port 45856:11: Bye Bye [preauth]
Dec 06 09:28:52 np0005548789.localdomain sshd[126536]: Disconnected from authenticating user root 118.193.38.207 port 45856 [preauth]
Dec 06 09:28:52 np0005548789.localdomain sshd[126538]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:52 np0005548789.localdomain sudo[126529]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:53 np0005548789.localdomain sshd[126538]: Invalid user oracle from 161.248.200.221 port 57662
Dec 06 09:28:53 np0005548789.localdomain sudo[126629]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghbwdsmqkihuyjtocqtpourknqfflrpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013333.209952-609-97816158928599/AnsiballZ_dnf.py
Dec 06 09:28:53 np0005548789.localdomain sudo[126629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:53 np0005548789.localdomain sshd[126538]: Connection closed by invalid user oracle 161.248.200.221 port 57662 [preauth]
Dec 06 09:28:53 np0005548789.localdomain python3.9[126631]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:28:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11915 DF PROTO=TCP SPT=43406 DPT=9105 SEQ=512472526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CAA0AF0000000001030307) 
Dec 06 09:28:53 np0005548789.localdomain sshd[126633]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:55 np0005548789.localdomain sshd[126633]: Connection closed by authenticating user root 161.248.200.221 port 57670 [preauth]
Dec 06 09:28:55 np0005548789.localdomain sshd[126636]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:55 np0005548789.localdomain sshd[126638]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:56 np0005548789.localdomain sshd[126636]: Invalid user odroid from 161.248.200.221 port 57452
Dec 06 09:28:56 np0005548789.localdomain sshd[126636]: Connection closed by invalid user odroid 161.248.200.221 port 57452 [preauth]
Dec 06 09:28:56 np0005548789.localdomain sudo[126629]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:56 np0005548789.localdomain sshd[126654]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:57 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38103 DF PROTO=TCP SPT=44998 DPT=9882 SEQ=629359971 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CAAD700000000001030307) 
Dec 06 09:28:57 np0005548789.localdomain sudo[126731]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvhopzwnhxfnjsgevqcczejdhpqalzra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013337.3114383-636-124267835774187/AnsiballZ_dnf.py
Dec 06 09:28:57 np0005548789.localdomain sudo[126731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:57 np0005548789.localdomain sshd[126638]: Received disconnect from 103.192.152.59 port 56358:11: Bye Bye [preauth]
Dec 06 09:28:57 np0005548789.localdomain sshd[126638]: Disconnected from authenticating user root 103.192.152.59 port 56358 [preauth]
Dec 06 09:28:57 np0005548789.localdomain python3.9[126733]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:28:58 np0005548789.localdomain sshd[126654]: Invalid user moxa from 161.248.200.221 port 57458
Dec 06 09:28:58 np0005548789.localdomain sshd[126654]: Connection closed by invalid user moxa 161.248.200.221 port 57458 [preauth]
Dec 06 09:28:58 np0005548789.localdomain sshd[126736]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:28:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60362 DF PROTO=TCP SPT=54274 DPT=9101 SEQ=1618947386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CAB7EF0000000001030307) 
Dec 06 09:28:59 np0005548789.localdomain sshd[126736]: Connection closed by authenticating user root 161.248.200.221 port 57472 [preauth]
Dec 06 09:29:00 np0005548789.localdomain sshd[126738]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:00 np0005548789.localdomain sudo[126731]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:01 np0005548789.localdomain sshd[126738]: Invalid user ubuntu from 161.248.200.221 port 57480
Dec 06 09:29:01 np0005548789.localdomain sshd[126738]: Connection closed by invalid user ubuntu 161.248.200.221 port 57480 [preauth]
Dec 06 09:29:01 np0005548789.localdomain sshd[126754]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:01 np0005548789.localdomain sudo[126831]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muqmqoifoxrzcexeukztndfmzcezbnci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013341.63078-663-22975615466060/AnsiballZ_dnf.py
Dec 06 09:29:01 np0005548789.localdomain sudo[126831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:02 np0005548789.localdomain python3.9[126833]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:29:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11916 DF PROTO=TCP SPT=43406 DPT=9105 SEQ=512472526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CAC1F00000000001030307) 
Dec 06 09:29:02 np0005548789.localdomain sshd[126836]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:02 np0005548789.localdomain sshd[126754]: Connection closed by authenticating user root 161.248.200.221 port 57482 [preauth]
Dec 06 09:29:03 np0005548789.localdomain sshd[126838]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:03 np0005548789.localdomain sshd[126836]: Invalid user ubuntu from 92.118.39.95 port 42282
Dec 06 09:29:03 np0005548789.localdomain sshd[126836]: Connection closed by invalid user ubuntu 92.118.39.95 port 42282 [preauth]
Dec 06 09:29:04 np0005548789.localdomain sshd[126838]: Invalid user admin from 161.248.200.221 port 57494
Dec 06 09:29:04 np0005548789.localdomain sshd[126838]: Connection closed by invalid user admin 161.248.200.221 port 57494 [preauth]
Dec 06 09:29:04 np0005548789.localdomain sshd[126840]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26227 DF PROTO=TCP SPT=44978 DPT=9102 SEQ=2002269931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CACBAF0000000001030307) 
Dec 06 09:29:05 np0005548789.localdomain sshd[126848]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:05 np0005548789.localdomain sshd[126840]: Connection closed by authenticating user root 161.248.200.221 port 44944 [preauth]
Dec 06 09:29:06 np0005548789.localdomain sshd[126850]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:07 np0005548789.localdomain sshd[126848]: Received disconnect from 103.234.151.178 port 14908:11: Bye Bye [preauth]
Dec 06 09:29:07 np0005548789.localdomain sshd[126848]: Disconnected from authenticating user root 103.234.151.178 port 14908 [preauth]
Dec 06 09:29:07 np0005548789.localdomain sshd[126850]: Invalid user odoo18 from 161.248.200.221 port 44946
Dec 06 09:29:07 np0005548789.localdomain sshd[126850]: Connection closed by invalid user odoo18 161.248.200.221 port 44946 [preauth]
Dec 06 09:29:07 np0005548789.localdomain sshd[126853]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41934 DF PROTO=TCP SPT=36116 DPT=9102 SEQ=2957430391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CAD7EF0000000001030307) 
Dec 06 09:29:08 np0005548789.localdomain sshd[126853]: Invalid user devopsadmin from 161.248.200.221 port 44956
Dec 06 09:29:09 np0005548789.localdomain sshd[126853]: Connection closed by invalid user devopsadmin 161.248.200.221 port 44956 [preauth]
Dec 06 09:29:09 np0005548789.localdomain sshd[126856]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:10 np0005548789.localdomain sshd[126856]: Invalid user centos from 161.248.200.221 port 44970
Dec 06 09:29:10 np0005548789.localdomain sshd[126856]: Connection closed by invalid user centos 161.248.200.221 port 44970 [preauth]
Dec 06 09:29:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26229 DF PROTO=TCP SPT=44978 DPT=9102 SEQ=2002269931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CAE36F0000000001030307) 
Dec 06 09:29:10 np0005548789.localdomain sshd[126904]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:11 np0005548789.localdomain sshd[126925]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:11 np0005548789.localdomain sudo[126831]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:12 np0005548789.localdomain sshd[126904]: Connection closed by authenticating user root 161.248.200.221 port 44978 [preauth]
Dec 06 09:29:12 np0005548789.localdomain sshd[126941]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:13 np0005548789.localdomain sudo[127018]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdmicdvezdhqpkygdxztiumsoiyxtmoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013352.8206294-699-9992966641496/AnsiballZ_file.py
Dec 06 09:29:13 np0005548789.localdomain sudo[127018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:13 np0005548789.localdomain python3.9[127020]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:29:13 np0005548789.localdomain sudo[127018]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:13 np0005548789.localdomain sshd[126925]: Connection reset by authenticating user root 45.135.232.92 port 60328 [preauth]
Dec 06 09:29:13 np0005548789.localdomain sshd[127106]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:13 np0005548789.localdomain sudo[127124]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irwiydmnbttmdgnrdtazvuvyulczkqai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013353.443281-723-221557040268258/AnsiballZ_stat.py
Dec 06 09:29:13 np0005548789.localdomain sudo[127124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:13 np0005548789.localdomain sshd[126941]: Connection closed by authenticating user root 161.248.200.221 port 44992 [preauth]
Dec 06 09:29:13 np0005548789.localdomain python3.9[127126]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:29:13 np0005548789.localdomain sudo[127124]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:14 np0005548789.localdomain sshd[127168]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:14 np0005548789.localdomain sudo[127200]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddvesotbsdforomsjmnfvyzdaqievtds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013353.443281-723-221557040268258/AnsiballZ_copy.py
Dec 06 09:29:14 np0005548789.localdomain sudo[127200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14834 DF PROTO=TCP SPT=39114 DPT=9101 SEQ=167283131 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CAF0F00000000001030307) 
Dec 06 09:29:14 np0005548789.localdomain python3.9[127202]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1765013353.443281-723-221557040268258/.source.json _original_basename=.bi8r63uj follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:29:14 np0005548789.localdomain sudo[127200]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:15 np0005548789.localdomain sshd[127168]: Invalid user admin from 161.248.200.221 port 44994
Dec 06 09:29:15 np0005548789.localdomain sudo[127292]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxeiebotuorfrupdtczymiswfrpfeysr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013354.9456036-777-275031263365340/AnsiballZ_podman_image.py
Dec 06 09:29:15 np0005548789.localdomain sshd[127168]: Connection closed by invalid user admin 161.248.200.221 port 44994 [preauth]
Dec 06 09:29:15 np0005548789.localdomain sudo[127292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:15 np0005548789.localdomain sshd[127106]: Connection reset by authenticating user root 45.135.232.92 port 60338 [preauth]
Dec 06 09:29:15 np0005548789.localdomain python3.9[127294]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 06 09:29:15 np0005548789.localdomain sshd[127305]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:15 np0005548789.localdomain sshd[127319]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:16 np0005548789.localdomain sshd[127305]: Invalid user test from 161.248.200.221 port 38740
Dec 06 09:29:16 np0005548789.localdomain sshd[127305]: Connection closed by invalid user test 161.248.200.221 port 38740 [preauth]
Dec 06 09:29:17 np0005548789.localdomain sshd[127323]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14836 DF PROTO=TCP SPT=39114 DPT=9101 SEQ=167283131 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CAFCEF0000000001030307) 
Dec 06 09:29:18 np0005548789.localdomain sshd[127323]: Invalid user user from 161.248.200.221 port 38742
Dec 06 09:29:18 np0005548789.localdomain sshd[127319]: Connection reset by authenticating user root 45.135.232.92 port 21678 [preauth]
Dec 06 09:29:18 np0005548789.localdomain sshd[127323]: Connection closed by invalid user user 161.248.200.221 port 38742 [preauth]
Dec 06 09:29:18 np0005548789.localdomain sshd[127331]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:18 np0005548789.localdomain sshd[127338]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14840 DF PROTO=TCP SPT=34580 DPT=9105 SEQ=105657166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB062F0000000001030307) 
Dec 06 09:29:19 np0005548789.localdomain sshd[127338]: Invalid user z from 161.248.200.221 port 38754
Dec 06 09:29:19 np0005548789.localdomain sshd[127341]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:20 np0005548789.localdomain sshd[127338]: Connection closed by invalid user z 161.248.200.221 port 38754 [preauth]
Dec 06 09:29:20 np0005548789.localdomain sshd[127355]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:21 np0005548789.localdomain sshd[127331]: Connection reset by authenticating user root 45.135.232.92 port 21724 [preauth]
Dec 06 09:29:21 np0005548789.localdomain sshd[127355]: Invalid user teste from 161.248.200.221 port 38762
Dec 06 09:29:21 np0005548789.localdomain sshd[127341]: Received disconnect from 64.227.156.63 port 46406:11: Bye Bye [preauth]
Dec 06 09:29:21 np0005548789.localdomain sshd[127341]: Disconnected from authenticating user root 64.227.156.63 port 46406 [preauth]
Dec 06 09:29:21 np0005548789.localdomain sshd[127375]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:21 np0005548789.localdomain podman[127308]: 2025-12-06 09:29:15.691597694 +0000 UTC m=+0.043209802 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 06 09:29:21 np0005548789.localdomain sshd[127355]: Connection closed by invalid user teste 161.248.200.221 port 38762 [preauth]
Dec 06 09:29:21 np0005548789.localdomain sshd[127431]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:21 np0005548789.localdomain sudo[127292]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:22 np0005548789.localdomain sshd[127431]: Invalid user user from 161.248.200.221 port 38778
Dec 06 09:29:23 np0005548789.localdomain sshd[127431]: Connection closed by invalid user user 161.248.200.221 port 38778 [preauth]
Dec 06 09:29:23 np0005548789.localdomain sshd[127375]: Connection reset by authenticating user root 45.135.232.92 port 21740 [preauth]
Dec 06 09:29:23 np0005548789.localdomain sshd[127514]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:23 np0005548789.localdomain sudo[127524]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyanzgxukhygffdbvfvidqzftltustxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013362.6415708-810-173754080536191/AnsiballZ_podman_image.py
Dec 06 09:29:23 np0005548789.localdomain sudo[127524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:23 np0005548789.localdomain python3.9[127526]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 06 09:29:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14841 DF PROTO=TCP SPT=34580 DPT=9105 SEQ=105657166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB15EF0000000001030307) 
Dec 06 09:29:23 np0005548789.localdomain sshd[127551]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:24 np0005548789.localdomain sudo[127553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:29:24 np0005548789.localdomain sudo[127553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:29:24 np0005548789.localdomain sudo[127553]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:24 np0005548789.localdomain sudo[127568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:29:24 np0005548789.localdomain sudo[127568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:29:24 np0005548789.localdomain sshd[127551]: Received disconnect from 12.156.67.18 port 40898:11: Bye Bye [preauth]
Dec 06 09:29:24 np0005548789.localdomain sshd[127551]: Disconnected from authenticating user root 12.156.67.18 port 40898 [preauth]
Dec 06 09:29:24 np0005548789.localdomain sshd[127514]: Invalid user elastic from 161.248.200.221 port 38784
Dec 06 09:29:24 np0005548789.localdomain sshd[127514]: Connection closed by invalid user elastic 161.248.200.221 port 38784 [preauth]
Dec 06 09:29:24 np0005548789.localdomain sshd[127608]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:25 np0005548789.localdomain sudo[127568]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:25 np0005548789.localdomain sudo[127616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:29:25 np0005548789.localdomain sudo[127616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:29:25 np0005548789.localdomain sudo[127616]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:25 np0005548789.localdomain sudo[127631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 09:29:25 np0005548789.localdomain sudo[127631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:29:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38106 DF PROTO=TCP SPT=44998 DPT=9882 SEQ=629359971 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB1DEF0000000001030307) 
Dec 06 09:29:26 np0005548789.localdomain sshd[127608]: Invalid user alex from 161.248.200.221 port 38288
Dec 06 09:29:26 np0005548789.localdomain sshd[127608]: Connection closed by invalid user alex 161.248.200.221 port 38288 [preauth]
Dec 06 09:29:26 np0005548789.localdomain sshd[127670]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:27 np0005548789.localdomain sshd[127670]: Invalid user linux from 161.248.200.221 port 38298
Dec 06 09:29:27 np0005548789.localdomain sshd[127670]: Connection closed by invalid user linux 161.248.200.221 port 38298 [preauth]
Dec 06 09:29:28 np0005548789.localdomain sshd[127684]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:29 np0005548789.localdomain sshd[127684]: Connection closed by authenticating user root 161.248.200.221 port 38308 [preauth]
Dec 06 09:29:29 np0005548789.localdomain sshd[127686]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:29 np0005548789.localdomain sudo[127631]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14838 DF PROTO=TCP SPT=39114 DPT=9101 SEQ=167283131 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB2DEF0000000001030307) 
Dec 06 09:29:30 np0005548789.localdomain sshd[127686]: Invalid user testuser from 161.248.200.221 port 38314
Dec 06 09:29:30 np0005548789.localdomain sshd[127686]: Connection closed by invalid user testuser 161.248.200.221 port 38314 [preauth]
Dec 06 09:29:30 np0005548789.localdomain podman[127538]: 2025-12-06 09:29:23.810098684 +0000 UTC m=+0.036690083 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 06 09:29:31 np0005548789.localdomain sudo[127748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:29:31 np0005548789.localdomain sudo[127748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:29:31 np0005548789.localdomain sudo[127748]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:31 np0005548789.localdomain sshd[127781]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:31 np0005548789.localdomain sudo[127524]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14842 DF PROTO=TCP SPT=34580 DPT=9105 SEQ=105657166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB35F00000000001030307) 
Dec 06 09:29:32 np0005548789.localdomain sshd[127781]: Connection closed by authenticating user root 161.248.200.221 port 38324 [preauth]
Dec 06 09:29:32 np0005548789.localdomain sudo[127872]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujepczxaoknsacsocgtmgahdlgwsinuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013372.2700539-846-167774954821055/AnsiballZ_podman_image.py
Dec 06 09:29:32 np0005548789.localdomain sudo[127872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:32 np0005548789.localdomain python3.9[127874]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 06 09:29:32 np0005548789.localdomain sshd[127887]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:33 np0005548789.localdomain sshd[127887]: Invalid user test from 161.248.200.221 port 38330
Dec 06 09:29:34 np0005548789.localdomain sshd[127887]: Connection closed by invalid user test 161.248.200.221 port 38330 [preauth]
Dec 06 09:29:34 np0005548789.localdomain sshd[127914]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:34 np0005548789.localdomain podman[127888]: 2025-12-06 09:29:32.816493596 +0000 UTC m=+0.044544664 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Dec 06 09:29:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46070 DF PROTO=TCP SPT=43424 DPT=9102 SEQ=3390634261 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB40AF0000000001030307) 
Dec 06 09:29:34 np0005548789.localdomain sudo[127872]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:35 np0005548789.localdomain sshd[127914]: Invalid user oracle from 161.248.200.221 port 38346
Dec 06 09:29:35 np0005548789.localdomain sudo[128052]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmblegzrhjpsjgppjgdezuumdvxcrjmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013375.1700995-873-213794144705539/AnsiballZ_podman_image.py
Dec 06 09:29:35 np0005548789.localdomain sudo[128052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:35 np0005548789.localdomain sshd[127914]: Connection closed by invalid user oracle 161.248.200.221 port 38346 [preauth]
Dec 06 09:29:35 np0005548789.localdomain python3.9[128054]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 06 09:29:35 np0005548789.localdomain sshd[128079]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:36 np0005548789.localdomain podman[128066]: 2025-12-06 09:29:35.763366962 +0000 UTC m=+0.029144342 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 09:29:36 np0005548789.localdomain sshd[128079]: Invalid user jenkins from 161.248.200.221 port 37942
Dec 06 09:29:37 np0005548789.localdomain sudo[128052]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:37 np0005548789.localdomain sshd[128079]: Connection closed by invalid user jenkins 161.248.200.221 port 37942 [preauth]
Dec 06 09:29:37 np0005548789.localdomain sshd[128157]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:37 np0005548789.localdomain sudo[128234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhewzlnxeyrrxezrasxsafltfqngtdam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013377.446562-900-24182909034360/AnsiballZ_podman_image.py
Dec 06 09:29:37 np0005548789.localdomain sudo[128234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:37 np0005548789.localdomain python3.9[128236]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 06 09:29:38 np0005548789.localdomain sshd[128157]: Invalid user solr from 161.248.200.221 port 37946
Dec 06 09:29:38 np0005548789.localdomain sshd[128157]: Connection closed by invalid user solr 161.248.200.221 port 37946 [preauth]
Dec 06 09:29:38 np0005548789.localdomain sshd[128262]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:39 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52445 DF PROTO=TCP SPT=39700 DPT=9882 SEQ=3386255856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB51F00000000001030307) 
Dec 06 09:29:39 np0005548789.localdomain sshd[128262]: Invalid user postgres from 161.248.200.221 port 37962
Dec 06 09:29:40 np0005548789.localdomain sshd[128262]: Connection closed by invalid user postgres 161.248.200.221 port 37962 [preauth]
Dec 06 09:29:40 np0005548789.localdomain sshd[128276]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46072 DF PROTO=TCP SPT=43424 DPT=9102 SEQ=3390634261 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB586F0000000001030307) 
Dec 06 09:29:41 np0005548789.localdomain podman[128248]: 2025-12-06 09:29:38.0137414 +0000 UTC m=+0.044833372 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 06 09:29:41 np0005548789.localdomain sudo[128234]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:41 np0005548789.localdomain sshd[128276]: Invalid user ubuntu from 161.248.200.221 port 37964
Dec 06 09:29:41 np0005548789.localdomain sudo[128430]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-haedthshctocontobcqoisgmszzkfkyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013381.5058002-900-32435169074771/AnsiballZ_podman_image.py
Dec 06 09:29:41 np0005548789.localdomain sudo[128430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:41 np0005548789.localdomain sshd[128276]: Connection closed by invalid user ubuntu 161.248.200.221 port 37964 [preauth]
Dec 06 09:29:41 np0005548789.localdomain python3.9[128432]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 06 09:29:41 np0005548789.localdomain sshd[128443]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:42 np0005548789.localdomain sshd[128448]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:42 np0005548789.localdomain sshd[128462]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:42 np0005548789.localdomain sshd[128448]: Received disconnect from 179.33.210.213 port 47580:11: Bye Bye [preauth]
Dec 06 09:29:42 np0005548789.localdomain sshd[128448]: Disconnected from authenticating user root 179.33.210.213 port 47580 [preauth]
Dec 06 09:29:42 np0005548789.localdomain sshd[128443]: Invalid user ecs-user from 161.248.200.221 port 37978
Dec 06 09:29:43 np0005548789.localdomain sshd[128443]: Connection closed by invalid user ecs-user 161.248.200.221 port 37978 [preauth]
Dec 06 09:29:43 np0005548789.localdomain podman[128447]: 2025-12-06 09:29:42.054575691 +0000 UTC m=+0.040404606 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Dec 06 09:29:43 np0005548789.localdomain sshd[128509]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:43 np0005548789.localdomain sshd[128462]: Received disconnect from 81.192.46.35 port 52850:11: Bye Bye [preauth]
Dec 06 09:29:43 np0005548789.localdomain sshd[128462]: Disconnected from authenticating user root 81.192.46.35 port 52850 [preauth]
Dec 06 09:29:43 np0005548789.localdomain sudo[128430]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22771 DF PROTO=TCP SPT=43078 DPT=9101 SEQ=4011535923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB66200000000001030307) 
Dec 06 09:29:44 np0005548789.localdomain sshd[128509]: Invalid user test from 161.248.200.221 port 37986
Dec 06 09:29:44 np0005548789.localdomain sshd[128509]: Connection closed by invalid user test 161.248.200.221 port 37986 [preauth]
Dec 06 09:29:45 np0005548789.localdomain sshd[128560]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:45 np0005548789.localdomain sshd[124379]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:29:45 np0005548789.localdomain systemd[1]: session-40.scope: Deactivated successfully.
Dec 06 09:29:45 np0005548789.localdomain systemd[1]: session-40.scope: Consumed 1min 25.599s CPU time.
Dec 06 09:29:45 np0005548789.localdomain systemd-logind[766]: Session 40 logged out. Waiting for processes to exit.
Dec 06 09:29:45 np0005548789.localdomain systemd-logind[766]: Removed session 40.
Dec 06 09:29:46 np0005548789.localdomain sshd[128560]: Invalid user oracle from 161.248.200.221 port 49420
Dec 06 09:29:46 np0005548789.localdomain sshd[128560]: Connection closed by invalid user oracle 161.248.200.221 port 49420 [preauth]
Dec 06 09:29:46 np0005548789.localdomain sshd[128563]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22773 DF PROTO=TCP SPT=43078 DPT=9101 SEQ=4011535923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB722F0000000001030307) 
Dec 06 09:29:47 np0005548789.localdomain sshd[128563]: Connection closed by authenticating user root 161.248.200.221 port 49422 [preauth]
Dec 06 09:29:48 np0005548789.localdomain sshd[128565]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:49 np0005548789.localdomain sshd[128565]: Invalid user fa from 161.248.200.221 port 49436
Dec 06 09:29:49 np0005548789.localdomain sshd[128565]: Connection closed by invalid user fa 161.248.200.221 port 49436 [preauth]
Dec 06 09:29:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8048 DF PROTO=TCP SPT=36620 DPT=9105 SEQ=3809635373 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB7B2F0000000001030307) 
Dec 06 09:29:49 np0005548789.localdomain sshd[128567]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:50 np0005548789.localdomain sshd[128567]: Invalid user esuser from 161.248.200.221 port 49442
Dec 06 09:29:50 np0005548789.localdomain sshd[128569]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:51 np0005548789.localdomain sshd[128567]: Connection closed by invalid user esuser 161.248.200.221 port 49442 [preauth]
Dec 06 09:29:51 np0005548789.localdomain sshd[128569]: Accepted publickey for zuul from 192.168.122.30 port 52224 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:29:51 np0005548789.localdomain systemd-logind[766]: New session 41 of user zuul.
Dec 06 09:29:51 np0005548789.localdomain systemd[1]: Started Session 41 of User zuul.
Dec 06 09:29:51 np0005548789.localdomain sshd[128569]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:29:51 np0005548789.localdomain sshd[128605]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:52 np0005548789.localdomain python3.9[128664]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:29:52 np0005548789.localdomain sshd[128605]: Connection closed by authenticating user root 161.248.200.221 port 49452 [preauth]
Dec 06 09:29:52 np0005548789.localdomain sshd[128715]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:53 np0005548789.localdomain sudo[128886]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpasmjaiygmbwgcbtmqesrxlreivhoqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013392.720298-69-129246177842261/AnsiballZ_getent.py
Dec 06 09:29:53 np0005548789.localdomain sudo[128886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8049 DF PROTO=TCP SPT=36620 DPT=9105 SEQ=3809635373 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB8AEF0000000001030307) 
Dec 06 09:29:54 np0005548789.localdomain sshd[128715]: Connection closed by authenticating user root 161.248.200.221 port 49460 [preauth]
Dec 06 09:29:54 np0005548789.localdomain python3.9[128888]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 06 09:29:54 np0005548789.localdomain sudo[128886]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:54 np0005548789.localdomain sshd[129025]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:54 np0005548789.localdomain sudo[129103]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbnwlrjbvllwmxxnsfvupppphdsxlinh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013394.6569927-105-261948721262112/AnsiballZ_setup.py
Dec 06 09:29:54 np0005548789.localdomain sudo[129103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:55 np0005548789.localdomain python3.9[129105]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:29:55 np0005548789.localdomain sshd[129025]: Invalid user devuser from 161.248.200.221 port 49466
Dec 06 09:29:55 np0005548789.localdomain sudo[129103]: pam_unix(sudo:session): session closed for user root
Dec 06 09:29:55 np0005548789.localdomain sshd[129025]: Connection closed by invalid user devuser 161.248.200.221 port 49466 [preauth]
Dec 06 09:29:55 np0005548789.localdomain sshd[129127]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:56 np0005548789.localdomain sudo[129159]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmutdfaiezqospttullhfvsqskzuxaxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013394.6569927-105-261948721262112/AnsiballZ_dnf.py
Dec 06 09:29:56 np0005548789.localdomain sudo[129159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:29:56 np0005548789.localdomain python3.9[129161]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:29:56 np0005548789.localdomain sshd[129127]: Invalid user vagrant from 161.248.200.221 port 33470
Dec 06 09:29:56 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28777 DF PROTO=TCP SPT=51968 DPT=9882 SEQ=3764173009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CB97AF0000000001030307) 
Dec 06 09:29:57 np0005548789.localdomain sshd[129127]: Connection closed by invalid user vagrant 161.248.200.221 port 33470 [preauth]
Dec 06 09:29:57 np0005548789.localdomain sshd[129164]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:59 np0005548789.localdomain sshd[129164]: Connection closed by authenticating user root 161.248.200.221 port 33478 [preauth]
Dec 06 09:29:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22775 DF PROTO=TCP SPT=43078 DPT=9101 SEQ=4011535923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBA1F00000000001030307) 
Dec 06 09:29:59 np0005548789.localdomain sshd[129422]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:29:59 np0005548789.localdomain sudo[129159]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:00 np0005548789.localdomain sudo[129513]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xffmvkwuzwmpodezcacbtnoowmdvvzsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013400.3358421-147-135928356111227/AnsiballZ_dnf.py
Dec 06 09:30:00 np0005548789.localdomain sudo[129513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:00 np0005548789.localdomain python3.9[129515]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:30:00 np0005548789.localdomain sshd[129422]: Invalid user deploy from 161.248.200.221 port 33482
Dec 06 09:30:01 np0005548789.localdomain sshd[129422]: Connection closed by invalid user deploy 161.248.200.221 port 33482 [preauth]
Dec 06 09:30:01 np0005548789.localdomain sshd[129518]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8050 DF PROTO=TCP SPT=36620 DPT=9105 SEQ=3809635373 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBABEF0000000001030307) 
Dec 06 09:30:02 np0005548789.localdomain sshd[129518]: Invalid user docker from 161.248.200.221 port 33486
Dec 06 09:30:02 np0005548789.localdomain sshd[129518]: Connection closed by invalid user docker 161.248.200.221 port 33486 [preauth]
Dec 06 09:30:03 np0005548789.localdomain sshd[129520]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:04 np0005548789.localdomain sshd[129520]: Invalid user admin from 161.248.200.221 port 33496
Dec 06 09:30:04 np0005548789.localdomain sudo[129513]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:04 np0005548789.localdomain sshd[129534]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:04 np0005548789.localdomain sshd[129520]: Connection closed by invalid user admin 161.248.200.221 port 33496 [preauth]
Dec 06 09:30:04 np0005548789.localdomain sshd[129582]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52667 DF PROTO=TCP SPT=37752 DPT=9102 SEQ=1262336413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBB5EF0000000001030307) 
Dec 06 09:30:05 np0005548789.localdomain sudo[129627]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdhcegeximeeagomifbgidomxhhfnfro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013404.2687645-171-75015117629623/AnsiballZ_systemd.py
Dec 06 09:30:05 np0005548789.localdomain sudo[129627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:05 np0005548789.localdomain sshd[129534]: Received disconnect from 103.157.25.60 port 41272:11: Bye Bye [preauth]
Dec 06 09:30:05 np0005548789.localdomain sshd[129534]: Disconnected from authenticating user root 103.157.25.60 port 41272 [preauth]
Dec 06 09:30:05 np0005548789.localdomain python3.9[129629]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:30:05 np0005548789.localdomain sshd[129582]: Connection closed by authenticating user root 161.248.200.221 port 33500 [preauth]
Dec 06 09:30:06 np0005548789.localdomain sshd[129632]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:06 np0005548789.localdomain sudo[129627]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26232 DF PROTO=TCP SPT=44978 DPT=9102 SEQ=2002269931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBC1EF0000000001030307) 
Dec 06 09:30:07 np0005548789.localdomain sshd[129632]: Invalid user max from 161.248.200.221 port 36736
Dec 06 09:30:08 np0005548789.localdomain sshd[129632]: Connection closed by invalid user max 161.248.200.221 port 36736 [preauth]
Dec 06 09:30:08 np0005548789.localdomain sshd[129725]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:08 np0005548789.localdomain python3.9[129724]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:30:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:30:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.1 total, 600.0 interval
                                                          Cumulative writes: 5761 writes, 25K keys, 5761 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5761 writes, 760 syncs, 7.58 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:30:09 np0005548789.localdomain sshd[129725]: Invalid user deploy from 161.248.200.221 port 36742
Dec 06 09:30:09 np0005548789.localdomain sudo[129816]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbtxzgddynxafsirxtbveempkgnlahlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013409.1811793-225-251347273924491/AnsiballZ_sefcontext.py
Dec 06 09:30:09 np0005548789.localdomain sudo[129816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:09 np0005548789.localdomain sshd[129725]: Connection closed by invalid user deploy 161.248.200.221 port 36742 [preauth]
Dec 06 09:30:09 np0005548789.localdomain python3.9[129818]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 06 09:30:10 np0005548789.localdomain sshd[129819]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52669 DF PROTO=TCP SPT=37752 DPT=9102 SEQ=1262336413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBCDB00000000001030307) 
Dec 06 09:30:11 np0005548789.localdomain sshd[129819]: Invalid user odoo from 161.248.200.221 port 36756
Dec 06 09:30:11 np0005548789.localdomain sshd[129819]: Connection closed by invalid user odoo 161.248.200.221 port 36756 [preauth]
Dec 06 09:30:11 np0005548789.localdomain sshd[129825]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:11 np0005548789.localdomain kernel: SELinux:  Converting 2756 SID table entries...
Dec 06 09:30:11 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:30:11 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:30:11 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:30:11 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:30:11 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:30:11 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:30:11 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:30:11 np0005548789.localdomain sudo[129816]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:12 np0005548789.localdomain sshd[129878]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:12 np0005548789.localdomain sshd[129825]: Invalid user a from 161.248.200.221 port 36758
Dec 06 09:30:12 np0005548789.localdomain python3.9[129919]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:30:12 np0005548789.localdomain sshd[129825]: Connection closed by invalid user a 161.248.200.221 port 36758 [preauth]
Dec 06 09:30:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:30:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.2 total, 600.0 interval
                                                          Cumulative writes: 4879 writes, 21K keys, 4879 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4879 writes, 669 syncs, 7.29 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:30:13 np0005548789.localdomain sshd[129940]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:13 np0005548789.localdomain sudo[130017]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjgxcapkibxpejojcjorfytkihvareng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013413.2514286-279-50255511683148/AnsiballZ_dnf.py
Dec 06 09:30:13 np0005548789.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=18 res=1
Dec 06 09:30:13 np0005548789.localdomain sudo[130017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:13 np0005548789.localdomain python3.9[130019]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:30:13 np0005548789.localdomain sshd[129878]: Received disconnect from 118.193.38.207 port 57380:11: Bye Bye [preauth]
Dec 06 09:30:13 np0005548789.localdomain sshd[129878]: Disconnected from authenticating user root 118.193.38.207 port 57380 [preauth]
Dec 06 09:30:14 np0005548789.localdomain sshd[129940]: Invalid user bitnami from 161.248.200.221 port 36762
Dec 06 09:30:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63456 DF PROTO=TCP SPT=46218 DPT=9101 SEQ=4105108448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBDB500000000001030307) 
Dec 06 09:30:14 np0005548789.localdomain sshd[129940]: Connection closed by invalid user bitnami 161.248.200.221 port 36762 [preauth]
Dec 06 09:30:14 np0005548789.localdomain sshd[130022]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:15 np0005548789.localdomain sshd[130022]: Invalid user test from 161.248.200.221 port 44548
Dec 06 09:30:15 np0005548789.localdomain sshd[130022]: Connection closed by invalid user test 161.248.200.221 port 44548 [preauth]
Dec 06 09:30:16 np0005548789.localdomain sshd[130024]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:17 np0005548789.localdomain sshd[130024]: Invalid user ec2-user from 161.248.200.221 port 44562
Dec 06 09:30:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63458 DF PROTO=TCP SPT=46218 DPT=9101 SEQ=4105108448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBE76F0000000001030307) 
Dec 06 09:30:17 np0005548789.localdomain sudo[130017]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:17 np0005548789.localdomain sshd[130024]: Connection closed by invalid user ec2-user 161.248.200.221 port 44562 [preauth]
Dec 06 09:30:17 np0005548789.localdomain sshd[130040]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:18 np0005548789.localdomain sudo[130117]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpiqmtpwifujjufrksvzfsnifaexfkmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013417.983705-303-105414729152288/AnsiballZ_command.py
Dec 06 09:30:18 np0005548789.localdomain sudo[130117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:18 np0005548789.localdomain python3.9[130119]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:30:18 np0005548789.localdomain sshd[130040]: Invalid user postgres from 161.248.200.221 port 44564
Dec 06 09:30:19 np0005548789.localdomain sshd[130040]: Connection closed by invalid user postgres 161.248.200.221 port 44564 [preauth]
Dec 06 09:30:19 np0005548789.localdomain sshd[130273]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:19 np0005548789.localdomain sudo[130117]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40440 DF PROTO=TCP SPT=56598 DPT=9105 SEQ=3260359937 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CBF06F0000000001030307) 
Dec 06 09:30:19 np0005548789.localdomain sudo[130364]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvkwzrktwxmgzhhqryhqaqneimfqfaug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013419.603433-327-255433343299639/AnsiballZ_file.py
Dec 06 09:30:19 np0005548789.localdomain sudo[130364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:20 np0005548789.localdomain python3.9[130366]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 09:30:20 np0005548789.localdomain sudo[130364]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:20 np0005548789.localdomain sshd[130273]: Invalid user debian from 161.248.200.221 port 44572
Dec 06 09:30:20 np0005548789.localdomain sshd[130273]: Connection closed by invalid user debian 161.248.200.221 port 44572 [preauth]
Dec 06 09:30:20 np0005548789.localdomain python3.9[130456]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:30:20 np0005548789.localdomain sshd[130459]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:21 np0005548789.localdomain sudo[130550]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obdqmhxzvslhjdxsnnszywvqiqgttkyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013421.1652255-381-210107624322824/AnsiballZ_dnf.py
Dec 06 09:30:21 np0005548789.localdomain sudo[130550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:21 np0005548789.localdomain python3.9[130552]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:30:22 np0005548789.localdomain sshd[130459]: Connection closed by authenticating user root 161.248.200.221 port 44576 [preauth]
Dec 06 09:30:22 np0005548789.localdomain sshd[130555]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:22 np0005548789.localdomain sshd[130557]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:23 np0005548789.localdomain sshd[130555]: Invalid user pi from 161.248.200.221 port 44588
Dec 06 09:30:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40441 DF PROTO=TCP SPT=56598 DPT=9105 SEQ=3260359937 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC002F0000000001030307) 
Dec 06 09:30:23 np0005548789.localdomain sshd[130555]: Connection closed by invalid user pi 161.248.200.221 port 44588 [preauth]
Dec 06 09:30:24 np0005548789.localdomain sshd[130559]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:24 np0005548789.localdomain sshd[130561]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:24 np0005548789.localdomain sshd[130557]: Received disconnect from 103.234.151.178 port 41044:11: Bye Bye [preauth]
Dec 06 09:30:24 np0005548789.localdomain sshd[130557]: Disconnected from authenticating user root 103.234.151.178 port 41044 [preauth]
Dec 06 09:30:24 np0005548789.localdomain sudo[130550]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:25 np0005548789.localdomain sshd[130559]: Invalid user linuxadmin from 161.248.200.221 port 44592
Dec 06 09:30:25 np0005548789.localdomain sshd[130559]: Connection closed by invalid user linuxadmin 161.248.200.221 port 44592 [preauth]
Dec 06 09:30:25 np0005548789.localdomain sudo[130652]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfbrybclejwyjskvaqqgqajyqtrpwgng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013425.241892-405-266862786451651/AnsiballZ_dnf.py
Dec 06 09:30:25 np0005548789.localdomain sudo[130652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:25 np0005548789.localdomain sshd[130655]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28780 DF PROTO=TCP SPT=51968 DPT=9882 SEQ=3764173009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC07EF0000000001030307) 
Dec 06 09:30:25 np0005548789.localdomain python3.9[130654]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:30:26 np0005548789.localdomain sshd[130561]: Received disconnect from 103.192.152.59 port 53598:11: Bye Bye [preauth]
Dec 06 09:30:26 np0005548789.localdomain sshd[130561]: Disconnected from authenticating user root 103.192.152.59 port 53598 [preauth]
Dec 06 09:30:26 np0005548789.localdomain sshd[130655]: Invalid user test from 161.248.200.221 port 53016
Dec 06 09:30:26 np0005548789.localdomain sshd[130655]: Connection closed by invalid user test 161.248.200.221 port 53016 [preauth]
Dec 06 09:30:27 np0005548789.localdomain sshd[130659]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:27 np0005548789.localdomain sshd[130661]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:28 np0005548789.localdomain sshd[130661]: Received disconnect from 12.156.67.18 port 54804:11: Bye Bye [preauth]
Dec 06 09:30:28 np0005548789.localdomain sshd[130661]: Disconnected from authenticating user root 12.156.67.18 port 54804 [preauth]
Dec 06 09:30:28 np0005548789.localdomain sshd[130659]: Connection closed by authenticating user root 161.248.200.221 port 53020 [preauth]
Dec 06 09:30:28 np0005548789.localdomain sudo[130652]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:28 np0005548789.localdomain sshd[130663]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:29 np0005548789.localdomain sudo[130754]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpntbcsiupvpzfcmxpusenxiqlrpurrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013429.141228-429-125068492269701/AnsiballZ_systemd.py
Dec 06 09:30:29 np0005548789.localdomain sudo[130754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:29 np0005548789.localdomain python3.9[130756]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 09:30:29 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:30:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63460 DF PROTO=TCP SPT=46218 DPT=9101 SEQ=4105108448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC17EF0000000001030307) 
Dec 06 09:30:29 np0005548789.localdomain systemd-rc-local-generator[130784]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:30:29 np0005548789.localdomain systemd-sysv-generator[130789]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:30:29 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:30:30 np0005548789.localdomain sshd[130663]: Invalid user testuser from 161.248.200.221 port 53026
Dec 06 09:30:30 np0005548789.localdomain sudo[130754]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:30 np0005548789.localdomain sshd[130663]: Connection closed by invalid user testuser 161.248.200.221 port 53026 [preauth]
Dec 06 09:30:30 np0005548789.localdomain sshd[130810]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:31 np0005548789.localdomain sudo[130812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:30:31 np0005548789.localdomain sudo[130812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:30:31 np0005548789.localdomain sudo[130812]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:31 np0005548789.localdomain sudo[130827]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:30:31 np0005548789.localdomain sudo[130827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:30:31 np0005548789.localdomain sudo[130932]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbnovsiplxzwxhzrfdsufospbmbqsvhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013431.5978441-459-198153746597258/AnsiballZ_stat.py
Dec 06 09:30:31 np0005548789.localdomain sudo[130932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40442 DF PROTO=TCP SPT=56598 DPT=9105 SEQ=3260359937 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC1FEF0000000001030307) 
Dec 06 09:30:31 np0005548789.localdomain sshd[130810]: Invalid user deployer from 161.248.200.221 port 53028
Dec 06 09:30:32 np0005548789.localdomain sudo[130827]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:32 np0005548789.localdomain python3.9[130937]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:30:32 np0005548789.localdomain sudo[130932]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:32 np0005548789.localdomain sshd[130810]: Connection closed by invalid user deployer 161.248.200.221 port 53028 [preauth]
Dec 06 09:30:32 np0005548789.localdomain sshd[130998]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:32 np0005548789.localdomain sudo[131043]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uausqsdpaebzhvdongcpyztbifyijbbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013432.3223605-486-102467820769836/AnsiballZ_ini_file.py
Dec 06 09:30:32 np0005548789.localdomain sudo[131043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:32 np0005548789.localdomain python3.9[131045]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:32 np0005548789.localdomain sudo[131043]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:33 np0005548789.localdomain sudo[131137]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvwexuthsuvzsbpvvkxdjqpoxprnvedy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013433.1218228-510-40430286500594/AnsiballZ_ini_file.py
Dec 06 09:30:33 np0005548789.localdomain sudo[131137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:33 np0005548789.localdomain sshd[130998]: Invalid user jenkins from 161.248.200.221 port 53036
Dec 06 09:30:33 np0005548789.localdomain python3.9[131139]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:33 np0005548789.localdomain sudo[131137]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:33 np0005548789.localdomain sshd[130998]: Connection closed by invalid user jenkins 161.248.200.221 port 53036 [preauth]
Dec 06 09:30:33 np0005548789.localdomain sudo[131229]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orcsgafsccibgmxueuqrazcxoohzyuge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013433.7660358-534-152321038624155/AnsiballZ_ini_file.py
Dec 06 09:30:34 np0005548789.localdomain sudo[131229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:34 np0005548789.localdomain python3.9[131231]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:34 np0005548789.localdomain sudo[131229]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35371 DF PROTO=TCP SPT=42664 DPT=9102 SEQ=2431858908 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC2B2F0000000001030307) 
Dec 06 09:30:34 np0005548789.localdomain sudo[131321]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-daljhfhugxggfpcvoymqlkyqxjaglptr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013434.746427-564-252547989457630/AnsiballZ_stat.py
Dec 06 09:30:34 np0005548789.localdomain sudo[131321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:35 np0005548789.localdomain python3.9[131323]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:30:35 np0005548789.localdomain sudo[131321]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:35 np0005548789.localdomain sudo[131337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:30:35 np0005548789.localdomain sudo[131337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:30:35 np0005548789.localdomain sudo[131337]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:35 np0005548789.localdomain sudo[131409]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uawvkwkdxpjdqivvyounnalbnvzhjqtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013434.746427-564-252547989457630/AnsiballZ_copy.py
Dec 06 09:30:35 np0005548789.localdomain sudo[131409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:35 np0005548789.localdomain python3.9[131411]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013434.746427-564-252547989457630/.source _original_basename=.1xvvien5 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:35 np0005548789.localdomain sudo[131409]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:36 np0005548789.localdomain sudo[131501]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xawbzhuiupbmflalbblqsgifakysdmgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013436.037976-609-238675976373811/AnsiballZ_file.py
Dec 06 09:30:36 np0005548789.localdomain sudo[131501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:36 np0005548789.localdomain python3.9[131503]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:36 np0005548789.localdomain sudo[131501]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:37 np0005548789.localdomain sudo[131593]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfcnrmqpijzqsmvabejpacexibplkzgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013436.719895-633-154705629344091/AnsiballZ_edpm_os_net_config_mappings.py
Dec 06 09:30:37 np0005548789.localdomain sudo[131593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:37 np0005548789.localdomain python3.9[131595]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 06 09:30:37 np0005548789.localdomain sudo[131593]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:37 np0005548789.localdomain sudo[131685]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tupszropmwuiknaaunfacglhrbinpwhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013437.5826526-660-114941807881508/AnsiballZ_file.py
Dec 06 09:30:37 np0005548789.localdomain sudo[131685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:37 np0005548789.localdomain python3.9[131687]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:38 np0005548789.localdomain sudo[131685]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:38 np0005548789.localdomain sudo[131777]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zopufsxhcdfrpyqytjquyffpfswwzojd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013438.4440038-690-203314695616937/AnsiballZ_stat.py
Dec 06 09:30:38 np0005548789.localdomain sudo[131777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:38 np0005548789.localdomain python3.9[131779]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:30:38 np0005548789.localdomain sudo[131777]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:39 np0005548789.localdomain sudo[131850]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cetenjehphtinqxnfnkeiaolpzmeaquj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013438.4440038-690-203314695616937/AnsiballZ_copy.py
Dec 06 09:30:39 np0005548789.localdomain sudo[131850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:39 np0005548789.localdomain python3.9[131852]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013438.4440038-690-203314695616937/.source.yaml _original_basename=.kv_jn06l follow=False checksum=06d744ebe702728c19f6d1a8f97158d086012058 force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:39 np0005548789.localdomain sudo[131850]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:39 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62219 DF PROTO=TCP SPT=50060 DPT=9882 SEQ=6242818 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC3DF00000000001030307) 
Dec 06 09:30:40 np0005548789.localdomain sudo[131942]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szsqdbbanlgvandvtlfdihaluxihsyww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013439.6781807-735-266316895836970/AnsiballZ_slurp.py
Dec 06 09:30:40 np0005548789.localdomain sudo[131942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:40 np0005548789.localdomain python3.9[131944]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 06 09:30:40 np0005548789.localdomain sudo[131942]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35373 DF PROTO=TCP SPT=42664 DPT=9102 SEQ=2431858908 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC42EF0000000001030307) 
Dec 06 09:30:41 np0005548789.localdomain sudo[132047]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llnokbqosptumgnqbaehmvzgbsnvcjun ; ANSIBLE_ASYNC_DIR='~/.ansible_async' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013441.1552155-762-249806775900804/async_wrapper.py j406301226221 300 /home/zuul/.ansible/tmp/ansible-tmp-1765013441.1552155-762-249806775900804/AnsiballZ_edpm_os_net_config.py _
Dec 06 09:30:41 np0005548789.localdomain sudo[132047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:42 np0005548789.localdomain ansible-async_wrapper.py[132049]: Invoked with j406301226221 300 /home/zuul/.ansible/tmp/ansible-tmp-1765013441.1552155-762-249806775900804/AnsiballZ_edpm_os_net_config.py _
Dec 06 09:30:42 np0005548789.localdomain ansible-async_wrapper.py[132052]: Starting module and watcher
Dec 06 09:30:42 np0005548789.localdomain ansible-async_wrapper.py[132052]: Start watching 132053 (300)
Dec 06 09:30:42 np0005548789.localdomain ansible-async_wrapper.py[132053]: Start module (132053)
Dec 06 09:30:42 np0005548789.localdomain ansible-async_wrapper.py[132049]: Return async_wrapper task started.
Dec 06 09:30:42 np0005548789.localdomain sudo[132047]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:42 np0005548789.localdomain python3.9[132054]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False
Dec 06 09:30:42 np0005548789.localdomain ansible-async_wrapper.py[132053]: Module complete (132053)
Dec 06 09:30:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50038 DF PROTO=TCP SPT=55042 DPT=9101 SEQ=3105452605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC50800000000001030307) 
Dec 06 09:30:45 np0005548789.localdomain sudo[132144]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwazkjbsnmrmuqcgjhrcyzctnytyrobx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013445.2267601-762-253070417440878/AnsiballZ_async_status.py
Dec 06 09:30:45 np0005548789.localdomain sudo[132144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:45 np0005548789.localdomain python3.9[132146]: ansible-ansible.legacy.async_status Invoked with jid=j406301226221.132049 mode=status _async_dir=/root/.ansible_async
Dec 06 09:30:45 np0005548789.localdomain sudo[132144]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:46 np0005548789.localdomain sudo[132203]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-teiwdcjlssqzrqxnfrzlxedjjcgfkihc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013445.2267601-762-253070417440878/AnsiballZ_async_status.py
Dec 06 09:30:46 np0005548789.localdomain sudo[132203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:46 np0005548789.localdomain sshd[132206]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:46 np0005548789.localdomain python3.9[132205]: ansible-ansible.legacy.async_status Invoked with jid=j406301226221.132049 mode=cleanup _async_dir=/root/.ansible_async
Dec 06 09:30:46 np0005548789.localdomain sudo[132203]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:46 np0005548789.localdomain sudo[132297]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjecysnyacudkuxutesxtmfuzxjerase ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013446.5525827-828-211742844102186/AnsiballZ_stat.py
Dec 06 09:30:46 np0005548789.localdomain sudo[132297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:46 np0005548789.localdomain sshd[132206]: Received disconnect from 81.192.46.35 port 51170:11: Bye Bye [preauth]
Dec 06 09:30:46 np0005548789.localdomain sshd[132206]: Disconnected from authenticating user root 81.192.46.35 port 51170 [preauth]
Dec 06 09:30:47 np0005548789.localdomain python3.9[132299]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:30:47 np0005548789.localdomain sudo[132297]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:47 np0005548789.localdomain ansible-async_wrapper.py[132052]: Done in kid B.
Dec 06 09:30:47 np0005548789.localdomain sudo[132370]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klaijytmwqecxnpmtmrzpvcimugkckxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013446.5525827-828-211742844102186/AnsiballZ_copy.py
Dec 06 09:30:47 np0005548789.localdomain sudo[132370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50040 DF PROTO=TCP SPT=55042 DPT=9101 SEQ=3105452605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC5C6F0000000001030307) 
Dec 06 09:30:47 np0005548789.localdomain python3.9[132372]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013446.5525827-828-211742844102186/.source.returncode _original_basename=.483a2nfj follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:47 np0005548789.localdomain sudo[132370]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:48 np0005548789.localdomain sudo[132462]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jisbrojobplhvwryunepflirdzcvrfoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013447.8015974-876-142619129736780/AnsiballZ_stat.py
Dec 06 09:30:48 np0005548789.localdomain sudo[132462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:48 np0005548789.localdomain python3.9[132464]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:30:48 np0005548789.localdomain sudo[132462]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:48 np0005548789.localdomain sudo[132535]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlqdepazncoawnnjyxedvptqkjymnvrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013447.8015974-876-142619129736780/AnsiballZ_copy.py
Dec 06 09:30:48 np0005548789.localdomain sudo[132535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:48 np0005548789.localdomain python3.9[132537]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013447.8015974-876-142619129736780/.source.cfg _original_basename=.jttoc4pf follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:48 np0005548789.localdomain sudo[132535]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:49 np0005548789.localdomain sudo[132627]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlxqtczmffldwztbuqskswdvbasdanow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013448.9160829-921-104060115179398/AnsiballZ_systemd.py
Dec 06 09:30:49 np0005548789.localdomain sudo[132627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:49 np0005548789.localdomain python3.9[132629]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:30:49 np0005548789.localdomain systemd[1]: Reloading Network Manager...
Dec 06 09:30:49 np0005548789.localdomain NetworkManager[5973]: <info>  [1765013449.5901] audit: op="reload" arg="0" pid=132633 uid=0 result="success"
Dec 06 09:30:49 np0005548789.localdomain NetworkManager[5973]: <info>  [1765013449.5911] config: signal: SIGHUP (no changes from disk)
Dec 06 09:30:49 np0005548789.localdomain systemd[1]: Reloaded Network Manager.
Dec 06 09:30:49 np0005548789.localdomain sudo[132627]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61084 DF PROTO=TCP SPT=57008 DPT=9105 SEQ=58946200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC65AF0000000001030307) 
Dec 06 09:30:49 np0005548789.localdomain sshd[128569]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:30:49 np0005548789.localdomain systemd[1]: session-41.scope: Deactivated successfully.
Dec 06 09:30:49 np0005548789.localdomain systemd[1]: session-41.scope: Consumed 34.557s CPU time.
Dec 06 09:30:49 np0005548789.localdomain systemd-logind[766]: Session 41 logged out. Waiting for processes to exit.
Dec 06 09:30:49 np0005548789.localdomain systemd-logind[766]: Removed session 41.
Dec 06 09:30:52 np0005548789.localdomain sshd[132648]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61085 DF PROTO=TCP SPT=57008 DPT=9105 SEQ=58946200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC75700000000001030307) 
Dec 06 09:30:53 np0005548789.localdomain sshd[132650]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:53 np0005548789.localdomain sshd[132648]: Received disconnect from 64.227.156.63 port 53144:11: Bye Bye [preauth]
Dec 06 09:30:53 np0005548789.localdomain sshd[132648]: Disconnected from authenticating user root 64.227.156.63 port 53144 [preauth]
Dec 06 09:30:55 np0005548789.localdomain sshd[132650]: Received disconnect from 45.78.222.162 port 51852:11: Bye Bye [preauth]
Dec 06 09:30:55 np0005548789.localdomain sshd[132650]: Disconnected from authenticating user root 45.78.222.162 port 51852 [preauth]
Dec 06 09:30:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62220 DF PROTO=TCP SPT=50060 DPT=9882 SEQ=6242818 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC7DEF0000000001030307) 
Dec 06 09:30:56 np0005548789.localdomain sshd[132652]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:56 np0005548789.localdomain sshd[132652]: Accepted publickey for zuul from 192.168.122.30 port 60518 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:30:56 np0005548789.localdomain systemd-logind[766]: New session 42 of user zuul.
Dec 06 09:30:56 np0005548789.localdomain systemd[1]: Started Session 42 of User zuul.
Dec 06 09:30:56 np0005548789.localdomain sshd[132652]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:30:57 np0005548789.localdomain python3.9[132745]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:30:58 np0005548789.localdomain python3.9[132839]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:30:59 np0005548789.localdomain sshd[132917]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50042 DF PROTO=TCP SPT=55042 DPT=9101 SEQ=3105452605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC8BEF0000000001030307) 
Dec 06 09:30:59 np0005548789.localdomain sshd[132918]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:30:59 np0005548789.localdomain sshd[132918]: error: kex_exchange_identification: read: Connection reset by peer
Dec 06 09:30:59 np0005548789.localdomain sshd[132918]: Connection reset by 45.140.17.97 port 60592
Dec 06 09:31:01 np0005548789.localdomain python3.9[132994]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:31:01 np0005548789.localdomain sshd[132652]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:31:01 np0005548789.localdomain systemd[1]: session-42.scope: Deactivated successfully.
Dec 06 09:31:01 np0005548789.localdomain systemd[1]: session-42.scope: Consumed 2.043s CPU time.
Dec 06 09:31:01 np0005548789.localdomain systemd-logind[766]: Session 42 logged out. Waiting for processes to exit.
Dec 06 09:31:01 np0005548789.localdomain systemd-logind[766]: Removed session 42.
Dec 06 09:31:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61086 DF PROTO=TCP SPT=57008 DPT=9105 SEQ=58946200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CC95F00000000001030307) 
Dec 06 09:31:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24569 DF PROTO=TCP SPT=43864 DPT=9102 SEQ=3374735870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CCA0700000000001030307) 
Dec 06 09:31:06 np0005548789.localdomain sshd[133010]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:31:06 np0005548789.localdomain sshd[133012]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:31:07 np0005548789.localdomain sshd[133012]: Accepted publickey for zuul from 192.168.122.30 port 50656 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:31:07 np0005548789.localdomain systemd-logind[766]: New session 43 of user zuul.
Dec 06 09:31:07 np0005548789.localdomain systemd[1]: Started Session 43 of User zuul.
Dec 06 09:31:07 np0005548789.localdomain sshd[133012]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:31:07 np0005548789.localdomain sshd[133010]: Invalid user ubuntu from 92.118.39.95 port 57286
Dec 06 09:31:07 np0005548789.localdomain sshd[133010]: Connection closed by invalid user ubuntu 92.118.39.95 port 57286 [preauth]
Dec 06 09:31:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52672 DF PROTO=TCP SPT=37752 DPT=9102 SEQ=1262336413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CCABF00000000001030307) 
Dec 06 09:31:08 np0005548789.localdomain python3.9[133105]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:31:09 np0005548789.localdomain python3.9[133199]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:31:10 np0005548789.localdomain sudo[133293]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxymlxmbnnrogeolfygbenwshldtxzia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013470.17933-81-157482280201728/AnsiballZ_setup.py
Dec 06 09:31:10 np0005548789.localdomain sudo[133293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:10 np0005548789.localdomain python3.9[133295]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:31:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24571 DF PROTO=TCP SPT=43864 DPT=9102 SEQ=3374735870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CCB82F0000000001030307) 
Dec 06 09:31:11 np0005548789.localdomain sudo[133293]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:11 np0005548789.localdomain sudo[133347]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dksbmeyyclwdvhlwhlvewzuwzwigwwnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013470.17933-81-157482280201728/AnsiballZ_dnf.py
Dec 06 09:31:11 np0005548789.localdomain sudo[133347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:11 np0005548789.localdomain python3.9[133349]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:31:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12430 DF PROTO=TCP SPT=38062 DPT=9101 SEQ=2424013065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CCC5B00000000001030307) 
Dec 06 09:31:14 np0005548789.localdomain sudo[133347]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:15 np0005548789.localdomain sudo[133441]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cupblnprusjbotytjagrxmaeuiuqlasw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013474.9100702-117-162687717209856/AnsiballZ_setup.py
Dec 06 09:31:15 np0005548789.localdomain sudo[133441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:15 np0005548789.localdomain python3.9[133443]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:31:16 np0005548789.localdomain sudo[133441]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12432 DF PROTO=TCP SPT=38062 DPT=9101 SEQ=2424013065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CCD1AF0000000001030307) 
Dec 06 09:31:17 np0005548789.localdomain sudo[133596]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkrhyzotiwoiagyplzplnsqkpptovtlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013477.0426035-150-180595735248185/AnsiballZ_file.py
Dec 06 09:31:17 np0005548789.localdomain sudo[133596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:17 np0005548789.localdomain python3.9[133598]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:31:17 np0005548789.localdomain sudo[133596]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:18 np0005548789.localdomain sudo[133688]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbvqozeptxnaqrguumlpfdxrvqsshxsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013477.842059-174-231741610273463/AnsiballZ_command.py
Dec 06 09:31:18 np0005548789.localdomain sudo[133688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:18 np0005548789.localdomain python3.9[133690]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:31:18 np0005548789.localdomain sudo[133688]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:19 np0005548789.localdomain sudo[133793]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svcphjbxgniowvthsewzpfxnomnwxtkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013478.706023-198-135035629543124/AnsiballZ_stat.py
Dec 06 09:31:19 np0005548789.localdomain sudo[133793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:19 np0005548789.localdomain python3.9[133795]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:31:19 np0005548789.localdomain sudo[133793]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:19 np0005548789.localdomain sudo[133841]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjkbrsnzillgunmavetrshgshlpypqhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013478.706023-198-135035629543124/AnsiballZ_file.py
Dec 06 09:31:19 np0005548789.localdomain sudo[133841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:19 np0005548789.localdomain python3.9[133843]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:31:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42092 DF PROTO=TCP SPT=46140 DPT=9105 SEQ=3322961043 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CCDAEF0000000001030307) 
Dec 06 09:31:19 np0005548789.localdomain sudo[133841]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:20 np0005548789.localdomain sudo[133933]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arzbzwjwxadufwrafwsxiafffnepizvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013479.9626708-234-217206109885772/AnsiballZ_stat.py
Dec 06 09:31:20 np0005548789.localdomain sudo[133933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:20 np0005548789.localdomain python3.9[133935]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:31:20 np0005548789.localdomain sudo[133933]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:20 np0005548789.localdomain sudo[133981]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tewglvzjgxasujupnykdugbyvbqpjgvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013479.9626708-234-217206109885772/AnsiballZ_file.py
Dec 06 09:31:20 np0005548789.localdomain sudo[133981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:20 np0005548789.localdomain python3.9[133983]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:31:20 np0005548789.localdomain sudo[133981]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:21 np0005548789.localdomain sudo[134073]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrcxdpthewlllnzloesyqxevhvuzeufl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013481.112778-273-107400823047949/AnsiballZ_ini_file.py
Dec 06 09:31:21 np0005548789.localdomain sudo[134073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:21 np0005548789.localdomain python3.9[134075]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:31:21 np0005548789.localdomain sudo[134073]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:22 np0005548789.localdomain sudo[134165]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvoekszzkmpjrjrfnmsqhjpyipaucasz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013481.8103933-273-104839808938646/AnsiballZ_ini_file.py
Dec 06 09:31:22 np0005548789.localdomain sudo[134165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:22 np0005548789.localdomain python3.9[134167]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:31:22 np0005548789.localdomain sudo[134165]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:22 np0005548789.localdomain sudo[134257]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqmcrgazlxqfgctfaehgusvfrxrqahqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013482.40475-273-230951077950592/AnsiballZ_ini_file.py
Dec 06 09:31:22 np0005548789.localdomain sudo[134257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:22 np0005548789.localdomain python3.9[134259]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:31:22 np0005548789.localdomain sudo[134257]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:23 np0005548789.localdomain sudo[134349]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srojcxcyclehqqhtlvpnbyicqrcvnscj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013482.9831758-273-2992061683190/AnsiballZ_ini_file.py
Dec 06 09:31:23 np0005548789.localdomain sudo[134349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:23 np0005548789.localdomain python3.9[134351]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:31:23 np0005548789.localdomain sudo[134349]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42093 DF PROTO=TCP SPT=46140 DPT=9105 SEQ=3322961043 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CCEAAF0000000001030307) 
Dec 06 09:31:24 np0005548789.localdomain sudo[134441]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysnqdboctrtutnhgrmbqstrizywwnxqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013483.7606153-366-266892210949361/AnsiballZ_dnf.py
Dec 06 09:31:24 np0005548789.localdomain sudo[134441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:24 np0005548789.localdomain python3.9[134443]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:31:26 np0005548789.localdomain sshd[134446]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:31:27 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28301 DF PROTO=TCP SPT=45950 DPT=9882 SEQ=1914039869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CCF7700000000001030307) 
Dec 06 09:31:27 np0005548789.localdomain sudo[134441]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:28 np0005548789.localdomain sshd[134507]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:31:28 np0005548789.localdomain sudo[134539]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpwgzqtyncwpthlbhjuakjfqjnfimcmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013487.8991501-399-82431479111288/AnsiballZ_setup.py
Dec 06 09:31:28 np0005548789.localdomain sudo[134539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:28 np0005548789.localdomain sshd[134446]: Received disconnect from 118.193.38.207 port 52644:11: Bye Bye [preauth]
Dec 06 09:31:28 np0005548789.localdomain sshd[134446]: Disconnected from authenticating user root 118.193.38.207 port 52644 [preauth]
Dec 06 09:31:28 np0005548789.localdomain python3.9[134541]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:31:28 np0005548789.localdomain sudo[134539]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:28 np0005548789.localdomain sudo[134633]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cszjjrgfjubequuooqsrcwmeafcgoxih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013488.650447-423-205097405968242/AnsiballZ_stat.py
Dec 06 09:31:28 np0005548789.localdomain sudo[134633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:29 np0005548789.localdomain python3.9[134635]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:31:29 np0005548789.localdomain sudo[134633]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:29 np0005548789.localdomain sudo[134725]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmfpiflndsxwbhwiqbmuucnccxmfjoxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013489.3732517-450-220434605277277/AnsiballZ_stat.py
Dec 06 09:31:29 np0005548789.localdomain sudo[134725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:29 np0005548789.localdomain sshd[134507]: Received disconnect from 103.157.25.60 port 42942:11: Bye Bye [preauth]
Dec 06 09:31:29 np0005548789.localdomain sshd[134507]: Disconnected from authenticating user root 103.157.25.60 port 42942 [preauth]
Dec 06 09:31:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12434 DF PROTO=TCP SPT=38062 DPT=9101 SEQ=2424013065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CD01F00000000001030307) 
Dec 06 09:31:29 np0005548789.localdomain python3.9[134727]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:31:29 np0005548789.localdomain sudo[134725]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:30 np0005548789.localdomain sudo[134817]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rziokncgbwwkigerixmpviblbrkkbhqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013490.1761281-480-215727641898866/AnsiballZ_command.py
Dec 06 09:31:30 np0005548789.localdomain sudo[134817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:30 np0005548789.localdomain python3.9[134819]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:31:30 np0005548789.localdomain sudo[134817]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:31 np0005548789.localdomain sudo[134910]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-peyxavstdesznombzbkxgkzrczsbxhrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013490.9711258-510-12364970354477/AnsiballZ_service_facts.py
Dec 06 09:31:31 np0005548789.localdomain sudo[134910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:31 np0005548789.localdomain python3.9[134912]: ansible-service_facts Invoked
Dec 06 09:31:31 np0005548789.localdomain sshd[134915]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:31:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42094 DF PROTO=TCP SPT=46140 DPT=9105 SEQ=3322961043 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CD0BF00000000001030307) 
Dec 06 09:31:32 np0005548789.localdomain sshd[134915]: Received disconnect from 12.156.67.18 port 34318:11: Bye Bye [preauth]
Dec 06 09:31:32 np0005548789.localdomain sshd[134915]: Disconnected from authenticating user root 12.156.67.18 port 34318 [preauth]
Dec 06 09:31:32 np0005548789.localdomain network[134931]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:31:32 np0005548789.localdomain network[134932]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:31:32 np0005548789.localdomain network[134933]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:31:33 np0005548789.localdomain sshd[134960]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:31:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53742 DF PROTO=TCP SPT=32768 DPT=9102 SEQ=4197401112 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CD156F0000000001030307) 
Dec 06 09:31:34 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:31:35 np0005548789.localdomain sudo[135003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:31:35 np0005548789.localdomain sudo[135003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:31:35 np0005548789.localdomain sudo[135003]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:35 np0005548789.localdomain sudo[135021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:31:35 np0005548789.localdomain sudo[135021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:31:35 np0005548789.localdomain sshd[134960]: Received disconnect from 179.33.210.213 port 32948:11: Bye Bye [preauth]
Dec 06 09:31:35 np0005548789.localdomain sshd[134960]: Disconnected from authenticating user root 179.33.210.213 port 32948 [preauth]
Dec 06 09:31:36 np0005548789.localdomain sudo[135021]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:36 np0005548789.localdomain sudo[134910]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:36 np0005548789.localdomain sudo[135122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:31:36 np0005548789.localdomain sudo[135122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:31:36 np0005548789.localdomain sudo[135122]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:36 np0005548789.localdomain sudo[135137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- inventory --format=json-pretty --filter-for-batch
Dec 06 09:31:36 np0005548789.localdomain sudo[135137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:31:37 np0005548789.localdomain podman[135193]: 
Dec 06 09:31:37 np0005548789.localdomain podman[135193]: 2025-12-06 09:31:37.03302051 +0000 UTC m=+0.078648836 container create f7dea120b01e5da8ca5fc0c0f656130ff391ea6f4ac52f14f31e2b6fc7790976 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_booth, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_BRANCH=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vcs-type=git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 09:31:37 np0005548789.localdomain systemd[1]: Started libpod-conmon-f7dea120b01e5da8ca5fc0c0f656130ff391ea6f4ac52f14f31e2b6fc7790976.scope.
Dec 06 09:31:37 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:31:37 np0005548789.localdomain podman[135193]: 2025-12-06 09:31:36.999713061 +0000 UTC m=+0.045341397 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 09:31:37 np0005548789.localdomain podman[135193]: 2025-12-06 09:31:37.105019142 +0000 UTC m=+0.150647468 container init f7dea120b01e5da8ca5fc0c0f656130ff391ea6f4ac52f14f31e2b6fc7790976 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_booth, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, CEPH_POINT_RELEASE=, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Dec 06 09:31:37 np0005548789.localdomain podman[135193]: 2025-12-06 09:31:37.113984556 +0000 UTC m=+0.159612882 container start f7dea120b01e5da8ca5fc0c0f656130ff391ea6f4ac52f14f31e2b6fc7790976 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_booth, name=rhceph, build-date=2025-11-26T19:44:28Z, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 09:31:37 np0005548789.localdomain podman[135193]: 2025-12-06 09:31:37.114295096 +0000 UTC m=+0.159923482 container attach f7dea120b01e5da8ca5fc0c0f656130ff391ea6f4ac52f14f31e2b6fc7790976 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_booth, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.buildah.version=1.41.4, vcs-type=git, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, RELEASE=main)
Dec 06 09:31:37 np0005548789.localdomain jolly_booth[135208]: 167 167
Dec 06 09:31:37 np0005548789.localdomain systemd[1]: libpod-f7dea120b01e5da8ca5fc0c0f656130ff391ea6f4ac52f14f31e2b6fc7790976.scope: Deactivated successfully.
Dec 06 09:31:37 np0005548789.localdomain podman[135193]: 2025-12-06 09:31:37.118483104 +0000 UTC m=+0.164111460 container died f7dea120b01e5da8ca5fc0c0f656130ff391ea6f4ac52f14f31e2b6fc7790976 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_booth, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 09:31:37 np0005548789.localdomain podman[135213]: 2025-12-06 09:31:37.22953654 +0000 UTC m=+0.099018689 container remove f7dea120b01e5da8ca5fc0c0f656130ff391ea6f4ac52f14f31e2b6fc7790976 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_booth, architecture=x86_64, name=rhceph, io.openshift.tags=rhceph ceph, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_CLEAN=True, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, distribution-scope=public, version=7, io.openshift.expose-services=)
Dec 06 09:31:37 np0005548789.localdomain systemd[1]: libpod-conmon-f7dea120b01e5da8ca5fc0c0f656130ff391ea6f4ac52f14f31e2b6fc7790976.scope: Deactivated successfully.
Dec 06 09:31:37 np0005548789.localdomain podman[135268]: 
Dec 06 09:31:37 np0005548789.localdomain podman[135268]: 2025-12-06 09:31:37.436824348 +0000 UTC m=+0.064920366 container create 54aa923a29c9494ae51712763b0891f5557af30c5c5bfc7c39c5a9a586c4691d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_joliot, version=7, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, name=rhceph, io.buildah.version=1.41.4, GIT_CLEAN=True, distribution-scope=public, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 09:31:37 np0005548789.localdomain systemd[1]: Started libpod-conmon-54aa923a29c9494ae51712763b0891f5557af30c5c5bfc7c39c5a9a586c4691d.scope.
Dec 06 09:31:37 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:31:37 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dda23162c6a2b4a6cfbe7e1aa75e3125b9693ec8c2c2e213eee44327d1862ee1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 09:31:37 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dda23162c6a2b4a6cfbe7e1aa75e3125b9693ec8c2c2e213eee44327d1862ee1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:31:37 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dda23162c6a2b4a6cfbe7e1aa75e3125b9693ec8c2c2e213eee44327d1862ee1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 09:31:37 np0005548789.localdomain podman[135268]: 2025-12-06 09:31:37.491346076 +0000 UTC m=+0.119442094 container init 54aa923a29c9494ae51712763b0891f5557af30c5c5bfc7c39c5a9a586c4691d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_joliot, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=7, distribution-scope=public, io.openshift.expose-services=, GIT_CLEAN=True, name=rhceph, ceph=True, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 09:31:37 np0005548789.localdomain podman[135268]: 2025-12-06 09:31:37.501221018 +0000 UTC m=+0.129317026 container start 54aa923a29c9494ae51712763b0891f5557af30c5c5bfc7c39c5a9a586c4691d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_joliot, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-type=git, name=rhceph, RELEASE=main, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., io.buildah.version=1.41.4, ceph=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 06 09:31:37 np0005548789.localdomain podman[135268]: 2025-12-06 09:31:37.501372072 +0000 UTC m=+0.129468120 container attach 54aa923a29c9494ae51712763b0891f5557af30c5c5bfc7c39c5a9a586c4691d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_joliot, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, version=7, io.openshift.tags=rhceph ceph, distribution-scope=public, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, vcs-type=git, RELEASE=main)
Dec 06 09:31:37 np0005548789.localdomain podman[135268]: 2025-12-06 09:31:37.403932033 +0000 UTC m=+0.032028121 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 09:31:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35376 DF PROTO=TCP SPT=42664 DPT=9102 SEQ=2431858908 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CD21F00000000001030307) 
Dec 06 09:31:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-051e186c773fa279ab709de94d6a2c9095ecda4c868e7aaf9dbef01413e6f953-merged.mount: Deactivated successfully.
Dec 06 09:31:38 np0005548789.localdomain sudo[135753]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tptwexknlcfsqgoibmjdaqsngmhvwzsw ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1765013497.2921927-555-159576954969950/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1765013497.2921927-555-159576954969950/args
Dec 06 09:31:38 np0005548789.localdomain sudo[135753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:38 np0005548789.localdomain sudo[135753]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]: [
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:     {
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:         "available": false,
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:         "ceph_device": false,
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:         "lsm_data": {},
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:         "lvs": [],
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:         "path": "/dev/sr0",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:         "rejected_reasons": [
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "Has a FileSystem",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "Insufficient space (<5GB)"
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:         ],
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:         "sys_api": {
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "actuators": null,
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "device_nodes": "sr0",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "human_readable_size": "482.00 KB",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "id_bus": "ata",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "model": "QEMU DVD-ROM",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "nr_requests": "2",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "partitions": {},
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "path": "/dev/sr0",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "removable": "1",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "rev": "2.5+",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "ro": "0",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "rotational": "1",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "sas_address": "",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "sas_device_handle": "",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "scheduler_mode": "mq-deadline",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "sectors": 0,
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "sectorsize": "2048",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "size": 493568.0,
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "support_discard": "0",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "type": "disk",
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:             "vendor": "QEMU"
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:         }
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]:     }
Dec 06 09:31:38 np0005548789.localdomain practical_joliot[135312]: ]
Dec 06 09:31:38 np0005548789.localdomain systemd[1]: libpod-54aa923a29c9494ae51712763b0891f5557af30c5c5bfc7c39c5a9a586c4691d.scope: Deactivated successfully.
Dec 06 09:31:38 np0005548789.localdomain podman[135268]: 2025-12-06 09:31:38.357094691 +0000 UTC m=+0.985190719 container died 54aa923a29c9494ae51712763b0891f5557af30c5c5bfc7c39c5a9a586c4691d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_joliot, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, name=rhceph, release=1763362218, build-date=2025-11-26T19:44:28Z, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 09:31:38 np0005548789.localdomain systemd[1]: tmp-crun.xyT7md.mount: Deactivated successfully.
Dec 06 09:31:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-dda23162c6a2b4a6cfbe7e1aa75e3125b9693ec8c2c2e213eee44327d1862ee1-merged.mount: Deactivated successfully.
Dec 06 09:31:38 np0005548789.localdomain podman[136777]: 2025-12-06 09:31:38.439019737 +0000 UTC m=+0.071123586 container remove 54aa923a29c9494ae51712763b0891f5557af30c5c5bfc7c39c5a9a586c4691d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_joliot, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, release=1763362218, GIT_CLEAN=True)
Dec 06 09:31:38 np0005548789.localdomain systemd[1]: libpod-conmon-54aa923a29c9494ae51712763b0891f5557af30c5c5bfc7c39c5a9a586c4691d.scope: Deactivated successfully.
Dec 06 09:31:38 np0005548789.localdomain sudo[135137]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:38 np0005548789.localdomain sudo[136867]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plcwvpwkqmximriwbsidsrpfowmtgycq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013498.5307724-588-117945931861278/AnsiballZ_dnf.py
Dec 06 09:31:38 np0005548789.localdomain sudo[136867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:39 np0005548789.localdomain python3.9[136869]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:31:39 np0005548789.localdomain sudo[136871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:31:39 np0005548789.localdomain sudo[136871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:31:39 np0005548789.localdomain sudo[136871]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53744 DF PROTO=TCP SPT=32768 DPT=9102 SEQ=4197401112 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CD2D300000000001030307) 
Dec 06 09:31:41 np0005548789.localdomain sshd[136887]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:31:42 np0005548789.localdomain sudo[136867]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:43 np0005548789.localdomain sshd[136887]: Received disconnect from 103.234.151.178 port 3636:11: Bye Bye [preauth]
Dec 06 09:31:43 np0005548789.localdomain sshd[136887]: Disconnected from authenticating user root 103.234.151.178 port 3636 [preauth]
Dec 06 09:31:43 np0005548789.localdomain sudo[136978]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldknhpriymhldflapqtejafqgjnjvigk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013502.7289844-627-132938242534731/AnsiballZ_package_facts.py
Dec 06 09:31:43 np0005548789.localdomain sudo[136978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:43 np0005548789.localdomain python3.9[136980]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 06 09:31:43 np0005548789.localdomain sudo[136978]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35408 DF PROTO=TCP SPT=45486 DPT=9101 SEQ=4065589046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CD3AE00000000001030307) 
Dec 06 09:31:44 np0005548789.localdomain sudo[137070]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prdeixvhebuzsgedxhcfygecsbegjekz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013504.6597707-658-179656407375413/AnsiballZ_stat.py
Dec 06 09:31:44 np0005548789.localdomain sudo[137070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:45 np0005548789.localdomain python3.9[137072]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:31:45 np0005548789.localdomain sudo[137070]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:45 np0005548789.localdomain sudo[137145]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wolupsqxmmzqimtbqxxmtlowwwxovued ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013504.6597707-658-179656407375413/AnsiballZ_copy.py
Dec 06 09:31:45 np0005548789.localdomain sudo[137145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:45 np0005548789.localdomain python3.9[137147]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013504.6597707-658-179656407375413/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:31:45 np0005548789.localdomain sudo[137145]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:46 np0005548789.localdomain sudo[137239]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntkcjgarywwtawfhdgxgqdgzkbkkcgpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013506.1301033-703-231410338939100/AnsiballZ_stat.py
Dec 06 09:31:46 np0005548789.localdomain sudo[137239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:46 np0005548789.localdomain python3.9[137241]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:31:46 np0005548789.localdomain sudo[137239]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:46 np0005548789.localdomain sudo[137314]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-guddgfywhusmzrgcwmjsbgicvrztdqzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013506.1301033-703-231410338939100/AnsiballZ_copy.py
Dec 06 09:31:46 np0005548789.localdomain sudo[137314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:47 np0005548789.localdomain python3.9[137316]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013506.1301033-703-231410338939100/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:31:47 np0005548789.localdomain sudo[137314]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35410 DF PROTO=TCP SPT=45486 DPT=9101 SEQ=4065589046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CD46F00000000001030307) 
Dec 06 09:31:48 np0005548789.localdomain sudo[137408]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewtcqkbpsmuirubceemcwmrynbpqyzrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013508.138134-766-223029753237497/AnsiballZ_lineinfile.py
Dec 06 09:31:48 np0005548789.localdomain sudo[137408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:48 np0005548789.localdomain python3.9[137410]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:31:48 np0005548789.localdomain sudo[137408]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33707 DF PROTO=TCP SPT=55974 DPT=9105 SEQ=3389028362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CD4FEF0000000001030307) 
Dec 06 09:31:50 np0005548789.localdomain sudo[137502]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjrfsfyxzxxbdiaqnkmbvsesqdelaluv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013509.8544989-811-67321325161046/AnsiballZ_setup.py
Dec 06 09:31:50 np0005548789.localdomain sudo[137502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:50 np0005548789.localdomain python3.9[137504]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:31:50 np0005548789.localdomain sudo[137502]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:51 np0005548789.localdomain sudo[137556]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hexkftkuxaykwgzolsqhudrtbexdqzhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013509.8544989-811-67321325161046/AnsiballZ_systemd.py
Dec 06 09:31:51 np0005548789.localdomain sudo[137556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:52 np0005548789.localdomain python3.9[137558]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:31:52 np0005548789.localdomain sudo[137556]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:53 np0005548789.localdomain sshd[137575]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:31:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33708 DF PROTO=TCP SPT=55974 DPT=9105 SEQ=3389028362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CD5FB00000000001030307) 
Dec 06 09:31:54 np0005548789.localdomain sshd[137609]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:31:54 np0005548789.localdomain sudo[137654]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqldrblrwawbbjxlivtgkhdmhoowpetj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013514.0188231-859-176983069569348/AnsiballZ_setup.py
Dec 06 09:31:54 np0005548789.localdomain sudo[137654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:54 np0005548789.localdomain python3.9[137656]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:31:54 np0005548789.localdomain sshd[137575]: Received disconnect from 103.192.152.59 port 39878:11: Bye Bye [preauth]
Dec 06 09:31:54 np0005548789.localdomain sshd[137575]: Disconnected from authenticating user root 103.192.152.59 port 39878 [preauth]
Dec 06 09:31:54 np0005548789.localdomain sudo[137654]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:55 np0005548789.localdomain sudo[137708]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byprzsnlscfmuzhjvqyxnidsilegnlen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013514.0188231-859-176983069569348/AnsiballZ_systemd.py
Dec 06 09:31:55 np0005548789.localdomain sudo[137708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:55 np0005548789.localdomain python3.9[137710]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:31:55 np0005548789.localdomain chronyd[25988]: chronyd exiting
Dec 06 09:31:55 np0005548789.localdomain systemd[1]: Stopping NTP client/server...
Dec 06 09:31:55 np0005548789.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Dec 06 09:31:55 np0005548789.localdomain systemd[1]: Stopped NTP client/server.
Dec 06 09:31:55 np0005548789.localdomain systemd[1]: Starting NTP client/server...
Dec 06 09:31:55 np0005548789.localdomain chronyd[137718]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 06 09:31:55 np0005548789.localdomain chronyd[137718]: Frequency -30.244 +/- 0.187 ppm read from /var/lib/chrony/drift
Dec 06 09:31:55 np0005548789.localdomain chronyd[137718]: Loaded seccomp filter (level 2)
Dec 06 09:31:55 np0005548789.localdomain systemd[1]: Started NTP client/server.
Dec 06 09:31:55 np0005548789.localdomain sudo[137708]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:55 np0005548789.localdomain sshd[137609]: Received disconnect from 81.192.46.35 port 49494:11: Bye Bye [preauth]
Dec 06 09:31:55 np0005548789.localdomain sshd[137609]: Disconnected from authenticating user root 81.192.46.35 port 49494 [preauth]
Dec 06 09:31:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28304 DF PROTO=TCP SPT=45950 DPT=9882 SEQ=1914039869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CD67F00000000001030307) 
Dec 06 09:31:56 np0005548789.localdomain sshd[133012]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:31:56 np0005548789.localdomain systemd[1]: session-43.scope: Deactivated successfully.
Dec 06 09:31:56 np0005548789.localdomain systemd[1]: session-43.scope: Consumed 27.337s CPU time.
Dec 06 09:31:56 np0005548789.localdomain systemd-logind[766]: Session 43 logged out. Waiting for processes to exit.
Dec 06 09:31:56 np0005548789.localdomain systemd-logind[766]: Removed session 43.
Dec 06 09:31:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35412 DF PROTO=TCP SPT=45486 DPT=9101 SEQ=4065589046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CD77F00000000001030307) 
Dec 06 09:32:01 np0005548789.localdomain sshd[137734]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:32:01 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33709 DF PROTO=TCP SPT=55974 DPT=9105 SEQ=3389028362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CD7FEF0000000001030307) 
Dec 06 09:32:01 np0005548789.localdomain sshd[137734]: Accepted publickey for zuul from 192.168.122.30 port 34744 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:32:02 np0005548789.localdomain systemd-logind[766]: New session 44 of user zuul.
Dec 06 09:32:02 np0005548789.localdomain systemd[1]: Started Session 44 of User zuul.
Dec 06 09:32:02 np0005548789.localdomain sshd[137734]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:32:03 np0005548789.localdomain python3.9[137827]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:32:04 np0005548789.localdomain sudo[137921]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plkpbyaunhghcnyynhlbpstncanzkezk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013523.8933346-60-178030215159630/AnsiballZ_file.py
Dec 06 09:32:04 np0005548789.localdomain sudo[137921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:04 np0005548789.localdomain python3.9[137923]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:04 np0005548789.localdomain sudo[137921]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55749 DF PROTO=TCP SPT=32946 DPT=9102 SEQ=4129418666 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CD8AAF0000000001030307) 
Dec 06 09:32:05 np0005548789.localdomain sudo[138026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umrxyvmlksqaptfvqqicevbkilufiuul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013524.6550267-84-7769053226362/AnsiballZ_stat.py
Dec 06 09:32:05 np0005548789.localdomain sudo[138026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:05 np0005548789.localdomain python3.9[138028]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:05 np0005548789.localdomain sudo[138026]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:05 np0005548789.localdomain sudo[138074]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kycjxnzbhoqxofvmvjnrumtbydycovtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013524.6550267-84-7769053226362/AnsiballZ_file.py
Dec 06 09:32:05 np0005548789.localdomain sudo[138074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:05 np0005548789.localdomain python3.9[138076]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.3lnedvcb recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:05 np0005548789.localdomain sudo[138074]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:06 np0005548789.localdomain sudo[138166]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thvzwyrpalsiihhxcdzrhnhjowloxxqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013526.2429948-144-32310357103542/AnsiballZ_stat.py
Dec 06 09:32:06 np0005548789.localdomain sudo[138166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:06 np0005548789.localdomain python3.9[138168]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:06 np0005548789.localdomain sudo[138166]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:07 np0005548789.localdomain sudo[138241]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdvakzojsxybzffcociznojwgekdindj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013526.2429948-144-32310357103542/AnsiballZ_copy.py
Dec 06 09:32:07 np0005548789.localdomain sudo[138241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:07 np0005548789.localdomain python3.9[138243]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013526.2429948-144-32310357103542/.source _original_basename=.fol479re follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:07 np0005548789.localdomain sudo[138241]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:07 np0005548789.localdomain sudo[138333]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozfgtcixzowxzqjsrzrittcrkqzelsjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013527.6226726-192-63754162814324/AnsiballZ_file.py
Dec 06 09:32:07 np0005548789.localdomain sudo[138333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:08 np0005548789.localdomain python3.9[138335]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:32:08 np0005548789.localdomain sudo[138333]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:08 np0005548789.localdomain sudo[138425]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glpztinikxiytcambeiwrnfkwhsbidpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013528.2452762-216-5857502528035/AnsiballZ_stat.py
Dec 06 09:32:08 np0005548789.localdomain sudo[138425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:08 np0005548789.localdomain python3.9[138427]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:08 np0005548789.localdomain sudo[138425]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:09 np0005548789.localdomain sudo[138498]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhuwvpgmwdncawqozvnhpnvkegnukomv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013528.2452762-216-5857502528035/AnsiballZ_copy.py
Dec 06 09:32:09 np0005548789.localdomain sudo[138498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:09 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17450 DF PROTO=TCP SPT=58168 DPT=9882 SEQ=21909413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CD9BEF0000000001030307) 
Dec 06 09:32:09 np0005548789.localdomain python3.9[138500]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013528.2452762-216-5857502528035/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:32:09 np0005548789.localdomain sudo[138498]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:09 np0005548789.localdomain sudo[138590]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elyzrtmxvgrzqlwovmzhhjusewkobbik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013529.329488-216-48221136956556/AnsiballZ_stat.py
Dec 06 09:32:09 np0005548789.localdomain sudo[138590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:09 np0005548789.localdomain python3.9[138592]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:09 np0005548789.localdomain sudo[138590]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:10 np0005548789.localdomain sudo[138663]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aaavmdkefisbslcambbvrcvzpvvgburt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013529.329488-216-48221136956556/AnsiballZ_copy.py
Dec 06 09:32:10 np0005548789.localdomain sudo[138663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:10 np0005548789.localdomain python3.9[138665]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013529.329488-216-48221136956556/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:32:10 np0005548789.localdomain sudo[138663]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:10 np0005548789.localdomain sudo[138755]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adjyyqptbbznriynkitguvqtcecpmffk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013530.4622412-303-274087310736956/AnsiballZ_file.py
Dec 06 09:32:10 np0005548789.localdomain sudo[138755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55751 DF PROTO=TCP SPT=32946 DPT=9102 SEQ=4129418666 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CDA26F0000000001030307) 
Dec 06 09:32:10 np0005548789.localdomain python3.9[138757]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:10 np0005548789.localdomain sudo[138755]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:11 np0005548789.localdomain sudo[138847]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-haezbabxzzwmuzccnzxqiamcuaaxaupz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013531.0994515-327-41990860339714/AnsiballZ_stat.py
Dec 06 09:32:11 np0005548789.localdomain sudo[138847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:12 np0005548789.localdomain python3.9[138849]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:12 np0005548789.localdomain sudo[138847]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:12 np0005548789.localdomain sudo[138920]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgcdbobpefbpfegiedsqwhtfbrldmyks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013531.0994515-327-41990860339714/AnsiballZ_copy.py
Dec 06 09:32:12 np0005548789.localdomain sudo[138920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:12 np0005548789.localdomain python3.9[138922]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013531.0994515-327-41990860339714/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:12 np0005548789.localdomain sudo[138920]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:13 np0005548789.localdomain sudo[139012]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edvjfcoovixedtveheeyuvxifxsykrhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013532.8076408-372-59111087322722/AnsiballZ_stat.py
Dec 06 09:32:13 np0005548789.localdomain sudo[139012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:13 np0005548789.localdomain python3.9[139014]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:13 np0005548789.localdomain sudo[139012]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:14 np0005548789.localdomain sudo[139085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtziidlyhvtcjdqsrghwphlhyjitkkbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013532.8076408-372-59111087322722/AnsiballZ_copy.py
Dec 06 09:32:14 np0005548789.localdomain sudo[139085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1368 DF PROTO=TCP SPT=49982 DPT=9101 SEQ=225342361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CDB0100000000001030307) 
Dec 06 09:32:14 np0005548789.localdomain python3.9[139087]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013532.8076408-372-59111087322722/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:14 np0005548789.localdomain sudo[139085]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:15 np0005548789.localdomain sudo[139177]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzswmgbjqrgsvhcpghmmvkpzyfannsml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013534.5630352-417-155073684986153/AnsiballZ_systemd.py
Dec 06 09:32:15 np0005548789.localdomain sudo[139177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:15 np0005548789.localdomain python3.9[139179]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:32:15 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:32:15 np0005548789.localdomain systemd-sysv-generator[139208]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:32:15 np0005548789.localdomain systemd-rc-local-generator[139204]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:32:15 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:32:15 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:32:15 np0005548789.localdomain systemd-rc-local-generator[139239]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:32:15 np0005548789.localdomain systemd-sysv-generator[139243]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:32:15 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:32:16 np0005548789.localdomain systemd[1]: Starting EDPM Container Shutdown...
Dec 06 09:32:16 np0005548789.localdomain systemd[1]: Finished EDPM Container Shutdown.
Dec 06 09:32:16 np0005548789.localdomain sudo[139177]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:16 np0005548789.localdomain sudo[139345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asxaghjlfonooywgkuhletspbppmxhuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013536.3184662-441-48610634782922/AnsiballZ_stat.py
Dec 06 09:32:16 np0005548789.localdomain sudo[139345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:16 np0005548789.localdomain python3.9[139347]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:16 np0005548789.localdomain sudo[139345]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:17 np0005548789.localdomain sudo[139418]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gasgmrsqngjroumtrskhdfrhdvntivmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013536.3184662-441-48610634782922/AnsiballZ_copy.py
Dec 06 09:32:17 np0005548789.localdomain sudo[139418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:17 np0005548789.localdomain python3.9[139420]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013536.3184662-441-48610634782922/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:17 np0005548789.localdomain sudo[139418]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1370 DF PROTO=TCP SPT=49982 DPT=9101 SEQ=225342361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CDBC300000000001030307) 
Dec 06 09:32:17 np0005548789.localdomain sudo[139510]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xluyjtarcvqoxzqiwopfcqngqzebmxdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013537.5362775-486-247914366815403/AnsiballZ_stat.py
Dec 06 09:32:17 np0005548789.localdomain sudo[139510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:18 np0005548789.localdomain python3.9[139512]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:18 np0005548789.localdomain sudo[139510]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:18 np0005548789.localdomain sudo[139583]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gugahbxkguhptxsyjhztfearvwixdpie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013537.5362775-486-247914366815403/AnsiballZ_copy.py
Dec 06 09:32:18 np0005548789.localdomain sudo[139583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:18 np0005548789.localdomain python3.9[139585]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013537.5362775-486-247914366815403/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:18 np0005548789.localdomain sudo[139583]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:18 np0005548789.localdomain sudo[139675]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stlpkxnxecbxlfhvylggmcoivldvwumu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013538.7080455-531-249789184260126/AnsiballZ_systemd.py
Dec 06 09:32:18 np0005548789.localdomain sudo[139675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:19 np0005548789.localdomain python3.9[139677]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:32:19 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:32:19 np0005548789.localdomain systemd-rc-local-generator[139699]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:32:19 np0005548789.localdomain systemd-sysv-generator[139704]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:32:19 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:32:19 np0005548789.localdomain systemd[1]: Starting Create netns directory...
Dec 06 09:32:19 np0005548789.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:32:19 np0005548789.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:32:19 np0005548789.localdomain systemd[1]: Finished Create netns directory.
Dec 06 09:32:19 np0005548789.localdomain sudo[139675]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=812 DF PROTO=TCP SPT=38992 DPT=9105 SEQ=2343405215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CDC52F0000000001030307) 
Dec 06 09:32:20 np0005548789.localdomain python3.9[139808]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:32:20 np0005548789.localdomain network[139825]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:32:20 np0005548789.localdomain network[139826]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:32:20 np0005548789.localdomain network[139827]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:32:21 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:32:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=813 DF PROTO=TCP SPT=38992 DPT=9105 SEQ=2343405215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CDD4EF0000000001030307) 
Dec 06 09:32:24 np0005548789.localdomain sudo[140026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysaasivextcesmtmkrjtuigzlyiakmpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013544.2972224-609-19980025123247/AnsiballZ_stat.py
Dec 06 09:32:24 np0005548789.localdomain sudo[140026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:24 np0005548789.localdomain python3.9[140028]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:24 np0005548789.localdomain sudo[140026]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:25 np0005548789.localdomain sudo[140101]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frgkppaqlfgvrolydhbrbhwjyybruaaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013544.2972224-609-19980025123247/AnsiballZ_copy.py
Dec 06 09:32:25 np0005548789.localdomain sudo[140101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:25 np0005548789.localdomain python3.9[140103]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013544.2972224-609-19980025123247/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:25 np0005548789.localdomain sudo[140101]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:26 np0005548789.localdomain sudo[140194]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wystrbmtxzkbzpqiaqrpieyhjaioavpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013546.2952979-654-134129344278078/AnsiballZ_systemd.py
Dec 06 09:32:26 np0005548789.localdomain sudo[140194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:26 np0005548789.localdomain python3.9[140196]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:32:26 np0005548789.localdomain systemd[1]: Reloading OpenSSH server daemon...
Dec 06 09:32:26 np0005548789.localdomain sshd[119889]: Received SIGHUP; restarting.
Dec 06 09:32:26 np0005548789.localdomain systemd[1]: Reloaded OpenSSH server daemon.
Dec 06 09:32:26 np0005548789.localdomain sshd[119889]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:32:26 np0005548789.localdomain sshd[119889]: Server listening on 0.0.0.0 port 22.
Dec 06 09:32:26 np0005548789.localdomain sshd[119889]: Server listening on :: port 22.
Dec 06 09:32:26 np0005548789.localdomain sudo[140194]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:26 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41129 DF PROTO=TCP SPT=50666 DPT=9882 SEQ=2894400643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CDE1AF0000000001030307) 
Dec 06 09:32:28 np0005548789.localdomain sudo[140290]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfpmdpnihadnqwhbpstdhmicdsnpocnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013548.1020038-678-78746295794742/AnsiballZ_file.py
Dec 06 09:32:28 np0005548789.localdomain sudo[140290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:28 np0005548789.localdomain python3.9[140292]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:28 np0005548789.localdomain sudo[140290]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:29 np0005548789.localdomain sudo[140382]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zykiiqpyiwucjwovldsggweinnpwjtlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013548.794004-702-261466426779047/AnsiballZ_stat.py
Dec 06 09:32:29 np0005548789.localdomain sudo[140382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:29 np0005548789.localdomain sshd[140385]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:32:29 np0005548789.localdomain python3.9[140384]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:29 np0005548789.localdomain sudo[140382]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1372 DF PROTO=TCP SPT=49982 DPT=9101 SEQ=225342361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CDEBEF0000000001030307) 
Dec 06 09:32:29 np0005548789.localdomain sudo[140457]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbianmydktwcckyrdtkyytchldpubkcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013548.794004-702-261466426779047/AnsiballZ_copy.py
Dec 06 09:32:29 np0005548789.localdomain sudo[140457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:29 np0005548789.localdomain python3.9[140459]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013548.794004-702-261466426779047/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:29 np0005548789.localdomain sudo[140457]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:30 np0005548789.localdomain sshd[140385]: Received disconnect from 64.227.156.63 port 50796:11: Bye Bye [preauth]
Dec 06 09:32:30 np0005548789.localdomain sshd[140385]: Disconnected from authenticating user root 64.227.156.63 port 50796 [preauth]
Dec 06 09:32:30 np0005548789.localdomain sudo[140549]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzavjllxvckbtwomvyueexuwczipzxdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013550.309082-756-52244196912513/AnsiballZ_timezone.py
Dec 06 09:32:30 np0005548789.localdomain sudo[140549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:31 np0005548789.localdomain python3.9[140551]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 06 09:32:31 np0005548789.localdomain systemd[1]: Starting Time & Date Service...
Dec 06 09:32:31 np0005548789.localdomain systemd[1]: Started Time & Date Service.
Dec 06 09:32:31 np0005548789.localdomain sudo[140549]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:31 np0005548789.localdomain sudo[140645]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwquymshrtbdsvwrnbduhxpzypabblvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013551.6359758-783-168543785004341/AnsiballZ_file.py
Dec 06 09:32:31 np0005548789.localdomain sudo[140645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:32 np0005548789.localdomain python3.9[140647]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:32 np0005548789.localdomain sudo[140645]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=814 DF PROTO=TCP SPT=38992 DPT=9105 SEQ=2343405215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CDF5F00000000001030307) 
Dec 06 09:32:32 np0005548789.localdomain sudo[140737]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyclxnkbeyoczigpshqbjsvstkcgxrlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013552.3764398-807-128530496483931/AnsiballZ_stat.py
Dec 06 09:32:32 np0005548789.localdomain sudo[140737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:32 np0005548789.localdomain python3.9[140739]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:32 np0005548789.localdomain sudo[140737]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:33 np0005548789.localdomain sudo[140810]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgdkrdpjvpivkjtplglwmpzsykfmcrkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013552.3764398-807-128530496483931/AnsiballZ_copy.py
Dec 06 09:32:33 np0005548789.localdomain sudo[140810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:33 np0005548789.localdomain python3.9[140812]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013552.3764398-807-128530496483931/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:33 np0005548789.localdomain sudo[140810]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:33 np0005548789.localdomain sudo[140902]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tufccouynrvezwwsbvialrcwuwykcheq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013553.5411372-852-221618421100880/AnsiballZ_stat.py
Dec 06 09:32:33 np0005548789.localdomain sudo[140902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:33 np0005548789.localdomain python3.9[140904]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:34 np0005548789.localdomain sudo[140902]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3009 DF PROTO=TCP SPT=56102 DPT=9102 SEQ=3789048056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CDFFEF0000000001030307) 
Dec 06 09:32:34 np0005548789.localdomain sudo[140975]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cazropwdajrjovlhiafatbcxqmrwpqbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013553.5411372-852-221618421100880/AnsiballZ_copy.py
Dec 06 09:32:34 np0005548789.localdomain sudo[140975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:34 np0005548789.localdomain python3.9[140977]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013553.5411372-852-221618421100880/.source.yaml _original_basename=.ng53fc6z follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:34 np0005548789.localdomain sudo[140975]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:35 np0005548789.localdomain sudo[141067]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apsojtorakqylubnmzhepjptjrkkpvak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013555.2275996-898-180336099374098/AnsiballZ_stat.py
Dec 06 09:32:35 np0005548789.localdomain sudo[141067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:35 np0005548789.localdomain python3.9[141069]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:35 np0005548789.localdomain sudo[141067]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:36 np0005548789.localdomain sudo[141142]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tawphvgpuvwvhkhsmufcmfstbwlgxfgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013555.2275996-898-180336099374098/AnsiballZ_copy.py
Dec 06 09:32:36 np0005548789.localdomain sudo[141142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:36 np0005548789.localdomain python3.9[141144]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013555.2275996-898-180336099374098/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:36 np0005548789.localdomain sudo[141142]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:36 np0005548789.localdomain sshd[141145]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:32:37 np0005548789.localdomain sshd[141145]: Received disconnect from 12.156.67.18 port 38542:11: Bye Bye [preauth]
Dec 06 09:32:37 np0005548789.localdomain sshd[141145]: Disconnected from authenticating user root 12.156.67.18 port 38542 [preauth]
Dec 06 09:32:37 np0005548789.localdomain sudo[141236]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmiuozjobgmklobopdpyupnjtjwkrxid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013556.9069176-943-210095922492909/AnsiballZ_command.py
Dec 06 09:32:37 np0005548789.localdomain sudo[141236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:37 np0005548789.localdomain python3.9[141238]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:32:37 np0005548789.localdomain sudo[141236]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53747 DF PROTO=TCP SPT=32768 DPT=9102 SEQ=4197401112 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CE0BEF0000000001030307) 
Dec 06 09:32:38 np0005548789.localdomain sudo[141329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvslwfexnyibghiqaengifiazrtdupea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013557.7945077-967-187161324175567/AnsiballZ_command.py
Dec 06 09:32:38 np0005548789.localdomain sudo[141329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:38 np0005548789.localdomain python3.9[141331]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:32:38 np0005548789.localdomain sudo[141329]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:38 np0005548789.localdomain sudo[141422]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzgmodvwavihwkislmbkrubnkxvbgtzs ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765013558.502565-990-115963890884215/AnsiballZ_edpm_nftables_from_files.py
Dec 06 09:32:38 np0005548789.localdomain sudo[141422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:39 np0005548789.localdomain python3[141424]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 06 09:32:39 np0005548789.localdomain sudo[141422]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:39 np0005548789.localdomain sudo[141449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:32:39 np0005548789.localdomain sudo[141449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:32:39 np0005548789.localdomain sudo[141449]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:39 np0005548789.localdomain sudo[141486]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 09:32:39 np0005548789.localdomain sudo[141486]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:32:39 np0005548789.localdomain sudo[141544]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifbdciotjezjdbkhrbvvbkagutkgxnca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013559.3508754-1014-220633779897061/AnsiballZ_stat.py
Dec 06 09:32:39 np0005548789.localdomain sudo[141544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:39 np0005548789.localdomain python3.9[141546]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:39 np0005548789.localdomain sudo[141544]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:39 np0005548789.localdomain sudo[141486]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:40 np0005548789.localdomain sudo[141608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:32:40 np0005548789.localdomain sudo[141608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:32:40 np0005548789.localdomain sudo[141608]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:40 np0005548789.localdomain sudo[141623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:32:40 np0005548789.localdomain sudo[141623]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:32:40 np0005548789.localdomain sudo[141668]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvyhxdbdltwwgwqufancdcihnmtfvwzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013559.3508754-1014-220633779897061/AnsiballZ_copy.py
Dec 06 09:32:40 np0005548789.localdomain sudo[141668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:40 np0005548789.localdomain python3.9[141670]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013559.3508754-1014-220633779897061/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:40 np0005548789.localdomain sudo[141668]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:40 np0005548789.localdomain sudo[141623]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3011 DF PROTO=TCP SPT=56102 DPT=9102 SEQ=3789048056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CE17AF0000000001030307) 
Dec 06 09:32:40 np0005548789.localdomain sudo[141794]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjbwlqrrfsyyfdzvlgxptzgpdkveeldi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013560.601365-1059-153563874272434/AnsiballZ_stat.py
Dec 06 09:32:40 np0005548789.localdomain sudo[141794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:41 np0005548789.localdomain python3.9[141796]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:41 np0005548789.localdomain sudo[141794]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:41 np0005548789.localdomain sudo[141830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:32:41 np0005548789.localdomain sudo[141830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:32:41 np0005548789.localdomain sudo[141830]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:41 np0005548789.localdomain sudo[141882]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yalxynxxdhrxabkqlckajyurhvdmzrmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013560.601365-1059-153563874272434/AnsiballZ_copy.py
Dec 06 09:32:41 np0005548789.localdomain sudo[141882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:41 np0005548789.localdomain python3.9[141884]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013560.601365-1059-153563874272434/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:41 np0005548789.localdomain sudo[141882]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:42 np0005548789.localdomain sudo[141974]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnsfwtkmlqbdpyvbwcoydpmfeiugqoir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013561.9440055-1105-203600522259937/AnsiballZ_stat.py
Dec 06 09:32:42 np0005548789.localdomain sudo[141974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:42 np0005548789.localdomain sshd[141977]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:32:42 np0005548789.localdomain python3.9[141976]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:42 np0005548789.localdomain sudo[141974]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:42 np0005548789.localdomain sudo[142049]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gklkstuwzrpnnhcighmtdkwwjgcxmqke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013561.9440055-1105-203600522259937/AnsiballZ_copy.py
Dec 06 09:32:42 np0005548789.localdomain sudo[142049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:42 np0005548789.localdomain python3.9[142051]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013561.9440055-1105-203600522259937/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:42 np0005548789.localdomain sudo[142049]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:43 np0005548789.localdomain sudo[142141]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmpvneifkwbigyqtpiirpfxzgtvnlzht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013563.1466417-1149-54972388325639/AnsiballZ_stat.py
Dec 06 09:32:43 np0005548789.localdomain sudo[142141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:43 np0005548789.localdomain python3.9[142143]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:43 np0005548789.localdomain sudo[142141]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:43 np0005548789.localdomain sshd[141977]: Received disconnect from 118.193.38.207 port 50986:11: Bye Bye [preauth]
Dec 06 09:32:43 np0005548789.localdomain sshd[141977]: Disconnected from authenticating user root 118.193.38.207 port 50986 [preauth]
Dec 06 09:32:43 np0005548789.localdomain sudo[142214]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phylujuiaajjcdljbagtrljotjrmsbpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013563.1466417-1149-54972388325639/AnsiballZ_copy.py
Dec 06 09:32:43 np0005548789.localdomain sudo[142214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:44 np0005548789.localdomain python3.9[142216]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013563.1466417-1149-54972388325639/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:44 np0005548789.localdomain sudo[142214]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56704 DF PROTO=TCP SPT=52330 DPT=9101 SEQ=1489643712 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CE25410000000001030307) 
Dec 06 09:32:45 np0005548789.localdomain sudo[142306]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqkkjdqmbtipaiyfbthqqnhexxtnpsxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013564.683013-1195-128072970491707/AnsiballZ_stat.py
Dec 06 09:32:45 np0005548789.localdomain sudo[142306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:45 np0005548789.localdomain python3.9[142308]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:32:45 np0005548789.localdomain sudo[142306]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:45 np0005548789.localdomain sudo[142379]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjaksmqnwnwpmkdvgizcakjyqqsmtbnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013564.683013-1195-128072970491707/AnsiballZ_copy.py
Dec 06 09:32:45 np0005548789.localdomain sudo[142379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:45 np0005548789.localdomain python3.9[142381]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013564.683013-1195-128072970491707/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:45 np0005548789.localdomain sudo[142379]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:46 np0005548789.localdomain sudo[142471]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykhmimlgntbcfgfkyletxfavbyuomotc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013566.001267-1239-161850097961376/AnsiballZ_file.py
Dec 06 09:32:46 np0005548789.localdomain sudo[142471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:46 np0005548789.localdomain python3.9[142473]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:46 np0005548789.localdomain sudo[142471]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:47 np0005548789.localdomain sudo[142563]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atfobczyvoprvpancgdxmemliktgkled ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013567.328633-1263-266834332264198/AnsiballZ_command.py
Dec 06 09:32:47 np0005548789.localdomain sudo[142563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38454 DF PROTO=TCP SPT=54342 DPT=9100 SEQ=289234418 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CE32380000000001030307) 
Dec 06 09:32:47 np0005548789.localdomain python3.9[142565]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:32:47 np0005548789.localdomain sudo[142563]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:48 np0005548789.localdomain sudo[142658]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqmrltcqhdxgtmjjxumzbazgrgujehvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013568.0174224-1287-126328303151767/AnsiballZ_blockinfile.py
Dec 06 09:32:48 np0005548789.localdomain sudo[142658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:48 np0005548789.localdomain python3.9[142660]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:48 np0005548789.localdomain sudo[142658]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:49 np0005548789.localdomain sudo[142751]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyysxozugjbnkqtkatyynivqjwyjcedu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013568.928706-1314-171326169179941/AnsiballZ_file.py
Dec 06 09:32:49 np0005548789.localdomain sudo[142751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:49 np0005548789.localdomain python3.9[142753]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:49 np0005548789.localdomain sudo[142751]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:49 np0005548789.localdomain sudo[142843]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtlzqkzxknunozqsinmtmecupsuqaijd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013569.5274765-1314-204413956703031/AnsiballZ_file.py
Dec 06 09:32:49 np0005548789.localdomain sudo[142843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:49 np0005548789.localdomain python3.9[142845]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:32:49 np0005548789.localdomain sudo[142843]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:50 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33711 DF PROTO=TCP SPT=55974 DPT=9105 SEQ=3389028362 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CE3DEF0000000001030307) 
Dec 06 09:32:50 np0005548789.localdomain sudo[142935]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmgwpjpkieviotyjtcwfpwokmvwzfgpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013570.247761-1359-264510050655187/AnsiballZ_mount.py
Dec 06 09:32:50 np0005548789.localdomain sudo[142935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:50 np0005548789.localdomain python3.9[142937]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 06 09:32:50 np0005548789.localdomain sudo[142935]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:51 np0005548789.localdomain sudo[143028]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fttpbgbetbpqrtwnvvauwbspxcgtxebu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013571.101988-1359-44596936456012/AnsiballZ_mount.py
Dec 06 09:32:51 np0005548789.localdomain sudo[143028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:51 np0005548789.localdomain python3.9[143030]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 06 09:32:51 np0005548789.localdomain sudo[143028]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:52 np0005548789.localdomain sshd[137734]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:32:52 np0005548789.localdomain systemd-logind[766]: Session 44 logged out. Waiting for processes to exit.
Dec 06 09:32:52 np0005548789.localdomain systemd[1]: session-44.scope: Deactivated successfully.
Dec 06 09:32:52 np0005548789.localdomain systemd[1]: session-44.scope: Consumed 27.557s CPU time.
Dec 06 09:32:52 np0005548789.localdomain systemd-logind[766]: Removed session 44.
Dec 06 09:32:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27453 DF PROTO=TCP SPT=59210 DPT=9882 SEQ=1642654696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CE4AD90000000001030307) 
Dec 06 09:32:57 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17452 DF PROTO=TCP SPT=58168 DPT=9882 SEQ=21909413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CE59EF0000000001030307) 
Dec 06 09:32:58 np0005548789.localdomain sshd[143046]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:32:58 np0005548789.localdomain sshd[143046]: Accepted publickey for zuul from 192.168.122.30 port 55654 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:32:58 np0005548789.localdomain systemd-logind[766]: New session 45 of user zuul.
Dec 06 09:32:58 np0005548789.localdomain systemd[1]: Started Session 45 of User zuul.
Dec 06 09:32:58 np0005548789.localdomain sshd[143046]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:32:59 np0005548789.localdomain sudo[143139]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lapnplrrijaebmljbsfjbcpxqivnsnqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013578.6615033-22-149193389674905/AnsiballZ_tempfile.py
Dec 06 09:32:59 np0005548789.localdomain sudo[143139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:59 np0005548789.localdomain python3.9[143141]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 06 09:32:59 np0005548789.localdomain sudo[143139]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:00 np0005548789.localdomain sudo[143231]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drhyepztyqsqetkdwqowkdipktucclql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013580.3062768-94-11946533758119/AnsiballZ_stat.py
Dec 06 09:33:00 np0005548789.localdomain sudo[143231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:00 np0005548789.localdomain python3.9[143233]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:33:00 np0005548789.localdomain sudo[143231]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:01 np0005548789.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 06 09:33:02 np0005548789.localdomain sudo[143327]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igfvydzxedhszoqhagtcnuattmvmbter ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013581.635955-142-53463656694089/AnsiballZ_slurp.py
Dec 06 09:33:02 np0005548789.localdomain sudo[143327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:02 np0005548789.localdomain python3.9[143329]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Dec 06 09:33:02 np0005548789.localdomain sudo[143327]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:02 np0005548789.localdomain sshd[143344]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:33:03 np0005548789.localdomain sudo[143421]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yooniyltskmxuhxbwprvxmbijwgaaovj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013582.9401598-190-217070317977165/AnsiballZ_stat.py
Dec 06 09:33:03 np0005548789.localdomain sudo[143421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:03 np0005548789.localdomain python3.9[143423]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.u691m5zd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:33:03 np0005548789.localdomain sudo[143421]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:03 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14146 DF PROTO=TCP SPT=54290 DPT=9102 SEQ=4089443986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CE71070000000001030307) 
Dec 06 09:33:04 np0005548789.localdomain sudo[143496]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxpyjmozyhoojnvmiwbuwfwziyrgtagc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013582.9401598-190-217070317977165/AnsiballZ_copy.py
Dec 06 09:33:04 np0005548789.localdomain sudo[143496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:04 np0005548789.localdomain sshd[143499]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:33:04 np0005548789.localdomain sshd[143344]: Received disconnect from 103.234.151.178 port 29768:11: Bye Bye [preauth]
Dec 06 09:33:04 np0005548789.localdomain sshd[143344]: Disconnected from authenticating user root 103.234.151.178 port 29768 [preauth]
Dec 06 09:33:04 np0005548789.localdomain python3.9[143498]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.u691m5zd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013582.9401598-190-217070317977165/.source.u691m5zd _original_basename=.cpktzjzx follow=False checksum=3e842c629948eb11ff005810a7264dbaf8a6d16e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:04 np0005548789.localdomain sudo[143496]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:04 np0005548789.localdomain sshd[143499]: Received disconnect from 81.192.46.35 port 47818:11: Bye Bye [preauth]
Dec 06 09:33:04 np0005548789.localdomain sshd[143499]: Disconnected from authenticating user root 81.192.46.35 port 47818 [preauth]
Dec 06 09:33:06 np0005548789.localdomain sudo[143590]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ratyiadmwtzbhffnplcxmnsnpvjxapjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013586.1030514-280-66577239431634/AnsiballZ_setup.py
Dec 06 09:33:06 np0005548789.localdomain sudo[143590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:06 np0005548789.localdomain python3.9[143592]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:33:06 np0005548789.localdomain sudo[143590]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55754 DF PROTO=TCP SPT=32946 DPT=9102 SEQ=4129418666 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CE7FF00000000001030307) 
Dec 06 09:33:08 np0005548789.localdomain sshd[143593]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:33:09 np0005548789.localdomain sshd[143593]: Received disconnect from 103.157.25.60 port 44622:11: Bye Bye [preauth]
Dec 06 09:33:09 np0005548789.localdomain sshd[143593]: Disconnected from authenticating user root 103.157.25.60 port 44622 [preauth]
Dec 06 09:33:09 np0005548789.localdomain sudo[143684]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlhworlpewqmrhdgcbyezpovoodsbbwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013589.3739994-329-268942476458811/AnsiballZ_blockinfile.py
Dec 06 09:33:09 np0005548789.localdomain sudo[143684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:10 np0005548789.localdomain python3.9[143686]: ansible-ansible.builtin.blockinfile Invoked with block=np0005548785.localdomain,192.168.122.103,np0005548785* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC89JzJHuRLDUgmU66VPdPVwYLrvslBwa5i2QfiUzrnpt1lKz8ayq6QMRy5y5GgfjQQhX/YZiAjUSoogVsYDkoDaImXdtfQHFlFMLTlJPiYcA/cGAwMAE/vifpWoztBHUXkJ5YWUojkXzGoR8d7ESx/tTLG/9QrQDsW6JcV18mcFCQZdeWYWGWdLn6ynmQOZ0N4U6mYK1FqE+GKgP6L9PEjkC1ePo81AnYcdQ5Z1IETdcCcJytdvvxH/Zie1PiAaMAgMYhsqu7+DZRRTvg+cEMw3mRVuodIyQEbpZs8MjR3itViRfZ+UqYi6uKDnz1viLL0aACaYhOLzrE7bQ6Sl4j1MnMrWncUOv3Sq2fus+Y6oYmed84E6HUNljte7vVP9jwPclbCAmj5WuC/Av9dSqqHEpPRbKJ4tAuBrO2LBKS7J62FjRYiY807V1viyxUgjK5FmsQyfVr3/YOirluSx54e4XwxxDrAjtrd0x68H7/Mt6HP/79cWKaVbC7XUckYRmE=
                                                            np0005548785.localdomain,192.168.122.103,np0005548785* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPnHRGHw2U3XDUZBfS69ZpwocvZ2haE6Sebzf3BV40dJ
                                                            np0005548785.localdomain,192.168.122.103,np0005548785* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMgorOAtIXk7BOknkR82ERwiBlDoAcpTTo8DwXwOeKFxueIG2AzGwqy/M3AlognMpbS9bigTSmXKYzfS5SNcGD8=
                                                            np0005548786.localdomain,192.168.122.104,np0005548786* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDURzBA/aIGrwPgaIApy0UCTi4wdQhfDEx0QfkSAIn0ZptZcOkaR8BWtl9GijRPEp++Ep4qU04JcwHO1ZULd2UnCdDeg1Imwnf7x9HQBjAr0mH+tE0t4MBLtBbrk8Ep5ggyKATK1CvEl3NuGIS4gSSUWxzkR74Iju/GtrEMuVnMSsOw+auBofiv1ne4zyXqQWZORiK32DSolw1KyXGLyqG+JOpl3Kza5o79S1KUghfRzskZMm/AxFYciPmg4EQK/jL9Izj7qq3v8MaL8baeyqNlPaaRKCh+pkZlYtoPzDhe+vn/jwnDmQgqC1Bh+dkNiKEVlWz3mxoiMoeLY3jP/tMF2M4M8puGakPc2sqJxk1++Tv/lFRO3zBS+V2kECKI5DtQI6XThfLYXxIQl5SHr4yGEoxhMNt6YNQPLp6lg30kHO24YyNNA7LPFYYoOGUCaq5ZVUCF9lagMxcgkN0Bs+ZZqeni+53RqxoutiRZ0m9pIiqxGjrJjbNFXmofgfDBcUE=
                                                            np0005548786.localdomain,192.168.122.104,np0005548786* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILIgwHZ/0Q8K6t9dlBCQwEO6OABCR0J0IF6hfmA44GBM
                                                            np0005548786.localdomain,192.168.122.104,np0005548786* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBItDJKfsljV78XBJL8EuwSxDvfxuZ9Jz6PgjXVap/GJqsza+9ApDVkNpmAVhdxO9qX1PPD9KOxQjcrD2A8MXQ10=
                                                            np0005548787.localdomain,192.168.122.105,np0005548787* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDXe0UZ2kJKcvYaHSnjIOf3QqkGhArLo32nvDm8Pl8ZVNWfdRV8R+e17etAicDq//fxWC+U9jiHp4qI6/0Jm64rPocmJKaA+r79sNpv+598NlGtVUfTYQ34Ze9bgaPkjAwKfPNrzjSDChyfkys4Hm0J7ttog5rvMcuRelxkFmoonOcuzBC+9ufI6qld7br5w4WDookwamkefbMCiwAZxrw2bSjoTu7/TEFbt7SM0lUIdqP5WvxpWK52OkjnakQ0BL4QHdRYz1kBx/vS0TFxXb2pMO291dfkxDl3H2oXXZZYK/LWy3nZyJEX+mD5J6WOEs5HC5GQQ+CNEV0wa2e/gJA7KBsyL5T6RBtH8id22sBHZkzcaDhUz1ZABGAiOx4rdrr4YFFFy/u00nX3ZCuRBPXYh37Pafl7GXcSKyhTmkCZI0591RdNmb1duh9ZIObRmPVp2+WIheAFvS7EU4B0+ZjAEbDJgiSa9VlUrlRFX0ajcFHR8FnwNRcoERO3A3h4/Tc=
                                                            np0005548787.localdomain,192.168.122.105,np0005548787* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGClV/UHC6wrHLH6ofPCeG9Z3WpaSbH42qD4AsTbywke
                                                            np0005548787.localdomain,192.168.122.105,np0005548787* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD+VBma5zUGbc6C8yvVJH1yH01D2HwvgMwJZ3Ew/fQ9uangWsK7hoczIcWgUhEN67mue6bMYPNkv+zbE5QDlLqA=
                                                            np0005548789.localdomain,192.168.122.107,np0005548789* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCwH3rhRTvOINLmLdbeRXeXOiMzz+IXEuW2cXYAe50Wcc3ikH2RVGirWQrwLc8hAoA7UFCXADqEMxPg6/fLsQkbP7kLOpUtam8nuXvgt8VHM4RFl5wh9EOgZ7DWgjA7s3r2eQMcBhv82CjVMLY/YjnLuRNXCsJAqeG32qcKedKH/huEFvkb49U/UnNlxi5BfNrMlY9n5UQXE2rd6EKwP58aP/qQ1ie3p8nwHc36/MJcfEIABlLaoHK/LxnadOFTh93OkqVi7A0VQsKSmKD64nABiN7ML0NReoyRIQI5r3Dawe8v2K9jCBh5jY88TVsYUJqgwoZSSU73sYGHX4uF+PY8wL7qwn6mCzA17GGYeB8Dy0N8qwDqah6kUjpcLwGp7YaKf0FIZPBKcLVMrX6Tnwxer1j3kOIt3tgLZoz3mMfstWfCyvt9t+GEW5MCE+MBkY4Eree3uK7pI+wJ3vFQS9XVP00hjNiLWYmoaaW6rl8xtw7QtGhzmjcWbOxaZvHWE5E=
                                                            np0005548789.localdomain,192.168.122.107,np0005548789* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIM7zsgz8o1LOsRIDgDJ0j4aB+gvG7QE4PuIS5gi3px2U
                                                            np0005548789.localdomain,192.168.122.107,np0005548789* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNB22R613xD5iIn21fw712bqcytUxBHAFZPMSjpWL8XVTi6taleS2y8rpYqGoN21DgQgwO1SxmcqZLfwlh7T5/4=
                                                            np0005548790.localdomain,192.168.122.108,np0005548790* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDmdMCy44p73Ui+o09YQitqR9FILqoJ6AGYYutFVH6wn5m1j6oEoI4XgVFPR3UpG3SXdoiG7m0DRxC/WZZMpZbaQ3ZHbJJioRh1hV5uQtK5k2gtmS8uePng5UprbLncMXf+HIxNRvirU3r6zdgNGAroK0rN0nWESi/FNb2flu9Aw9JAsgIAAouW4IUoeyMGZ1AflhRhsWsQMstM9UEeGU+iTqV7al1URVCSq1finY99m+QC+Pftpd2C/+agboOIiVa63+D/RqqfYqh4C/PYfDbssYjcZzk3P90+HQ6uMKexX3HRnFbyje4eLSBHC0pjr/4pNfk/eSpdHeyMAPsP+QlBztdcPj9OnjcmT9ymeJRKF7GwNIWg3Pn9L2yY50d8l9Zu6rNIDW786XNcbm88yHdCHA5FE1A8XTWQRQ3eUSUsmsvf03pExAouRM4Fj8dvCu6wzG2SuyWqmdT5yCNrUG0e1CeE6PcfTLBeS5CJAwn5HM8aUndQQldWmaUbMPL5Jis=
                                                            np0005548790.localdomain,192.168.122.108,np0005548790* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPoyxTI8+8n9PWFBkZatum98GfJRQMd2qn9CijEFzfEz
                                                            np0005548790.localdomain,192.168.122.108,np0005548790* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNOlnHgYu82mRZ1QroLe1BG6rymOGDqDJGz5MpHZnXnhJ6iIwC87em0cGHiSKgU+UZ4DpWQTIlxwKsn9Jp9Hl1Y=
                                                            np0005548788.localdomain,192.168.122.106,np0005548788* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCxIoAQH9YZnGrAxYR5prFQwo6HY5mwdDjndb+bp2pwvtVLM4ABIdCi+K1wpbhOpoO7BsYOf/tdBqemvSDleNo/ZLh3v3MmoVtoTtQZqLWsAQWFgJCjcGUGB+H3CHhtbp706coVQMlGD+UQqpCBy8WamMB/Ldy+hSHbLHwzuMzj8tO90vUbEyuKgOuu/X3ZFa+Yjo/asQ+PTrVfirh1QvRQ9aK22xH89KbThA/1an4OjnNGLCP752auSQ894B21QLKfqaMGPlpbjU8Wr6MP4zKV9lUzpQiFr6IU6cd4CeIsJDj7FnAZuBSmi8ewgm/r4ZWkmCSlqw8OpMC5soJnm8Q4PJTIFvT9eyyFCh9xmQkMhzE8P332LtYjZ+vXhYFU14e04mOQx5UrtHN8uWJVbOAwtLNAcenHyRtCQGkAZ6f9q0OvSuYr+o3FhHhN5ABu32AKAD8YpkjLypi+PbaiKNQW8XzPAHHbV8CGZ4B09ZWeQY49VA0bPxIYBXd1mEBlXSE=
                                                            np0005548788.localdomain,192.168.122.106,np0005548788* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJBkIOjRpLl815RvOqIZSSNUu/CGLqucfCRUist+ERWP
                                                            np0005548788.localdomain,192.168.122.106,np0005548788* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNyEL9+sMn9BF0LnCanz9jbKQTm6FNV71J4qGFTonom0KXHpLL1p0eyrgFY0iwGH2UtwJ6VWm5bm2RaQJmObwZI=
                                                             create=True mode=0644 path=/tmp/ansible.u691m5zd state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:10 np0005548789.localdomain sudo[143684]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:11 np0005548789.localdomain sudo[143776]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjmxbvcktnhrdqmnqitgswailnudzgan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013590.7735195-377-269032770008226/AnsiballZ_command.py
Dec 06 09:33:11 np0005548789.localdomain sudo[143776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:11 np0005548789.localdomain python3.9[143778]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.u691m5zd' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:33:11 np0005548789.localdomain sudo[143776]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:12 np0005548789.localdomain sudo[143870]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfxhsmephndaejpbuyjuctszimbjiijf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013592.1298392-425-154543651351803/AnsiballZ_file.py
Dec 06 09:33:12 np0005548789.localdomain sudo[143870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:12 np0005548789.localdomain python3.9[143872]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.u691m5zd state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:12 np0005548789.localdomain sudo[143870]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:13 np0005548789.localdomain sshd[143046]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:33:13 np0005548789.localdomain systemd[1]: session-45.scope: Deactivated successfully.
Dec 06 09:33:13 np0005548789.localdomain systemd[1]: session-45.scope: Consumed 4.162s CPU time.
Dec 06 09:33:13 np0005548789.localdomain systemd-logind[766]: Session 45 logged out. Waiting for processes to exit.
Dec 06 09:33:13 np0005548789.localdomain systemd-logind[766]: Removed session 45.
Dec 06 09:33:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24703 DF PROTO=TCP SPT=39358 DPT=9101 SEQ=2857460432 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CE9A730000000001030307) 
Dec 06 09:33:16 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8041 DF PROTO=TCP SPT=38842 DPT=9105 SEQ=4216625217 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CEA3960000000001030307) 
Dec 06 09:33:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25227 DF PROTO=TCP SPT=32874 DPT=9100 SEQ=2169528005 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CEA7680000000001030307) 
Dec 06 09:33:19 np0005548789.localdomain sshd[143888]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:33:19 np0005548789.localdomain sshd[143890]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:33:20 np0005548789.localdomain sshd[143888]: Accepted publickey for zuul from 192.168.122.30 port 36966 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:33:20 np0005548789.localdomain systemd-logind[766]: New session 46 of user zuul.
Dec 06 09:33:20 np0005548789.localdomain systemd[1]: Started Session 46 of User zuul.
Dec 06 09:33:20 np0005548789.localdomain sshd[143888]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:33:21 np0005548789.localdomain python3.9[143983]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:33:22 np0005548789.localdomain sudo[144077]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvjmavjjieanthrtqogmxbkyctdnwgpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013601.7159956-57-252588977949088/AnsiballZ_systemd.py
Dec 06 09:33:22 np0005548789.localdomain sudo[144077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:22 np0005548789.localdomain sshd[144080]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:33:22 np0005548789.localdomain python3.9[144079]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 09:33:22 np0005548789.localdomain sudo[144077]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:23 np0005548789.localdomain sshd[143890]: Connection closed by 45.78.222.162 port 51540 [preauth]
Dec 06 09:33:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61947 DF PROTO=TCP SPT=37094 DPT=9882 SEQ=2579427058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CEC0090000000001030307) 
Dec 06 09:33:24 np0005548789.localdomain sudo[144173]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvjkckaaiwpbrthjdspgdpuitugkpqtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013603.7681172-81-25421303562794/AnsiballZ_systemd.py
Dec 06 09:33:24 np0005548789.localdomain sudo[144173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:24 np0005548789.localdomain sshd[144080]: Received disconnect from 103.192.152.59 port 56166:11: Bye Bye [preauth]
Dec 06 09:33:24 np0005548789.localdomain sshd[144080]: Disconnected from authenticating user root 103.192.152.59 port 56166 [preauth]
Dec 06 09:33:24 np0005548789.localdomain python3.9[144175]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:33:24 np0005548789.localdomain sudo[144173]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:25 np0005548789.localdomain sudo[144266]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuacapxasumhbgfsjusiyweswywrtyft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013604.6520398-108-89217654642320/AnsiballZ_command.py
Dec 06 09:33:25 np0005548789.localdomain sudo[144266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:25 np0005548789.localdomain python3.9[144268]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:33:25 np0005548789.localdomain sudo[144266]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:25 np0005548789.localdomain sudo[144359]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgnmxbkvgamqsygctzwuvjauraoxffyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013605.459498-133-142444027788179/AnsiballZ_stat.py
Dec 06 09:33:25 np0005548789.localdomain sudo[144359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:26 np0005548789.localdomain python3.9[144361]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:33:26 np0005548789.localdomain sudo[144359]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:26 np0005548789.localdomain sudo[144453]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jueytqzfzjsgomampgevhxxhwekmkszt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013606.3052764-156-245104391837708/AnsiballZ_command.py
Dec 06 09:33:26 np0005548789.localdomain sudo[144453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:26 np0005548789.localdomain python3.9[144455]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:33:26 np0005548789.localdomain sudo[144453]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:27 np0005548789.localdomain sudo[144548]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgmbfjuviyvrwjrthcumfljldhqewasx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013606.9975991-180-178595083576526/AnsiballZ_file.py
Dec 06 09:33:27 np0005548789.localdomain sudo[144548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:27 np0005548789.localdomain python3.9[144550]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:27 np0005548789.localdomain sudo[144548]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:28 np0005548789.localdomain sshd[143888]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:33:28 np0005548789.localdomain systemd-logind[766]: Session 46 logged out. Waiting for processes to exit.
Dec 06 09:33:28 np0005548789.localdomain systemd[1]: session-46.scope: Deactivated successfully.
Dec 06 09:33:28 np0005548789.localdomain systemd[1]: session-46.scope: Consumed 3.725s CPU time.
Dec 06 09:33:28 np0005548789.localdomain systemd-logind[766]: Removed session 46.
Dec 06 09:33:33 np0005548789.localdomain sshd[144566]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:33:33 np0005548789.localdomain sshd[144566]: Accepted publickey for zuul from 192.168.122.30 port 49028 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:33:33 np0005548789.localdomain systemd-logind[766]: New session 47 of user zuul.
Dec 06 09:33:33 np0005548789.localdomain systemd[1]: Started Session 47 of User zuul.
Dec 06 09:33:33 np0005548789.localdomain sshd[144566]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:33:33 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27948 DF PROTO=TCP SPT=42274 DPT=9102 SEQ=3809184943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CEE6360000000001030307) 
Dec 06 09:33:34 np0005548789.localdomain python3.9[144659]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:33:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27949 DF PROTO=TCP SPT=42274 DPT=9102 SEQ=3809184943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CEEA2F0000000001030307) 
Dec 06 09:33:35 np0005548789.localdomain sudo[144753]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiwyrzigihrhminknetdsnnajkzzrdye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013615.1627667-63-167391328270039/AnsiballZ_setup.py
Dec 06 09:33:35 np0005548789.localdomain sudo[144753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:35 np0005548789.localdomain python3.9[144755]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:33:36 np0005548789.localdomain sudo[144753]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:36 np0005548789.localdomain sudo[144807]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-femvwckxjhplpeehmgucicvverkwnrad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013615.1627667-63-167391328270039/AnsiballZ_dnf.py
Dec 06 09:33:36 np0005548789.localdomain sudo[144807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:36 np0005548789.localdomain python3.9[144809]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:33:36 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27950 DF PROTO=TCP SPT=42274 DPT=9102 SEQ=3809184943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CEF22F0000000001030307) 
Dec 06 09:33:39 np0005548789.localdomain sudo[144807]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27951 DF PROTO=TCP SPT=42274 DPT=9102 SEQ=3809184943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF01EF0000000001030307) 
Dec 06 09:33:41 np0005548789.localdomain python3.9[144901]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:33:41 np0005548789.localdomain sudo[144903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:33:41 np0005548789.localdomain sudo[144903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:33:41 np0005548789.localdomain sudo[144903]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:41 np0005548789.localdomain sudo[144918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:33:41 np0005548789.localdomain sudo[144918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:33:42 np0005548789.localdomain sudo[144918]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:42 np0005548789.localdomain sudo[145055]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqhrgfjxqevealscpokqquqhsomojipj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013622.2487633-126-152252632929099/AnsiballZ_file.py
Dec 06 09:33:42 np0005548789.localdomain sudo[145055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:42 np0005548789.localdomain sudo[145058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:33:42 np0005548789.localdomain sudo[145058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:33:42 np0005548789.localdomain sudo[145058]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:42 np0005548789.localdomain sshd[145073]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:33:42 np0005548789.localdomain python3.9[145057]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:42 np0005548789.localdomain sudo[145055]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:43 np0005548789.localdomain sshd[145073]: Received disconnect from 12.156.67.18 port 59562:11: Bye Bye [preauth]
Dec 06 09:33:43 np0005548789.localdomain sshd[145073]: Disconnected from authenticating user root 12.156.67.18 port 59562 [preauth]
Dec 06 09:33:43 np0005548789.localdomain sudo[145164]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekfwsvkfvmwpoqglavywllmukhkksupa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013623.026084-150-267528035982673/AnsiballZ_file.py
Dec 06 09:33:43 np0005548789.localdomain sudo[145164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:43 np0005548789.localdomain python3.9[145166]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:43 np0005548789.localdomain sudo[145164]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:44 np0005548789.localdomain sudo[145256]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtrxzqezeqitjydlmcwgjrqpuaewnyyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013623.8220277-174-43776649650269/AnsiballZ_lineinfile.py
Dec 06 09:33:44 np0005548789.localdomain sudo[145256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53827 DF PROTO=TCP SPT=58770 DPT=9101 SEQ=1433347702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF0FA00000000001030307) 
Dec 06 09:33:44 np0005548789.localdomain python3.9[145258]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated
                                                            Core libraries or services have been updated since boot-up:
                                                              * systemd
                                                            
                                                            Reboot is required to fully utilize these updates.
                                                            More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:44 np0005548789.localdomain sudo[145256]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:45 np0005548789.localdomain python3.9[145348]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:33:45 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53828 DF PROTO=TCP SPT=58770 DPT=9101 SEQ=1433347702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF13AF0000000001030307) 
Dec 06 09:33:46 np0005548789.localdomain python3.9[145438]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:33:46 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9032 DF PROTO=TCP SPT=41690 DPT=9105 SEQ=2344888170 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF18C40000000001030307) 
Dec 06 09:33:46 np0005548789.localdomain python3.9[145530]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:33:47 np0005548789.localdomain sshd[144566]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:33:47 np0005548789.localdomain systemd[1]: session-47.scope: Deactivated successfully.
Dec 06 09:33:47 np0005548789.localdomain systemd[1]: session-47.scope: Consumed 8.803s CPU time.
Dec 06 09:33:47 np0005548789.localdomain systemd-logind[766]: Session 47 logged out. Waiting for processes to exit.
Dec 06 09:33:47 np0005548789.localdomain systemd-logind[766]: Removed session 47.
Dec 06 09:33:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53829 DF PROTO=TCP SPT=58770 DPT=9101 SEQ=1433347702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF1BAF0000000001030307) 
Dec 06 09:33:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19904 DF PROTO=TCP SPT=60528 DPT=9100 SEQ=1211413549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF1C980000000001030307) 
Dec 06 09:33:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9033 DF PROTO=TCP SPT=41690 DPT=9105 SEQ=2344888170 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF1CB00000000001030307) 
Dec 06 09:33:48 np0005548789.localdomain sshd[145545]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:33:48 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19905 DF PROTO=TCP SPT=60528 DPT=9100 SEQ=1211413549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF20AF0000000001030307) 
Dec 06 09:33:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9034 DF PROTO=TCP SPT=41690 DPT=9105 SEQ=2344888170 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF24AF0000000001030307) 
Dec 06 09:33:50 np0005548789.localdomain sshd[145545]: Received disconnect from 179.33.210.213 port 33962:11: Bye Bye [preauth]
Dec 06 09:33:50 np0005548789.localdomain sshd[145545]: Disconnected from authenticating user root 179.33.210.213 port 33962 [preauth]
Dec 06 09:33:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9035 DF PROTO=TCP SPT=41690 DPT=9105 SEQ=2344888170 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF346F0000000001030307) 
Dec 06 09:33:54 np0005548789.localdomain sshd[145547]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:33:54 np0005548789.localdomain sshd[145547]: Accepted publickey for zuul from 192.168.122.30 port 52324 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:33:54 np0005548789.localdomain systemd-logind[766]: New session 48 of user zuul.
Dec 06 09:33:54 np0005548789.localdomain systemd[1]: Started Session 48 of User zuul.
Dec 06 09:33:54 np0005548789.localdomain sshd[145547]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:33:55 np0005548789.localdomain python3.9[145640]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:33:56 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36175 DF PROTO=TCP SPT=59930 DPT=9882 SEQ=3639643898 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF412F0000000001030307) 
Dec 06 09:33:57 np0005548789.localdomain sudo[145734]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbdnwnmfhnqsbfjibkccjyqljmxwewjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013636.822687-158-232630241402223/AnsiballZ_file.py
Dec 06 09:33:57 np0005548789.localdomain sudo[145734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:57 np0005548789.localdomain python3.9[145736]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:33:57 np0005548789.localdomain sudo[145734]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:58 np0005548789.localdomain sudo[145826]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jaeehjymcjdoykrtvmyzmuedzkwdtmus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013637.5897832-182-160766225013724/AnsiballZ_stat.py
Dec 06 09:33:58 np0005548789.localdomain sudo[145826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:58 np0005548789.localdomain python3.9[145828]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:33:58 np0005548789.localdomain sudo[145826]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:58 np0005548789.localdomain sshd[145890]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:33:58 np0005548789.localdomain sudo[145901]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syreslcriqouwqiylshdidtfxjkitbmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013637.5897832-182-160766225013724/AnsiballZ_copy.py
Dec 06 09:33:58 np0005548789.localdomain sudo[145901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:59 np0005548789.localdomain python3.9[145903]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013637.5897832-182-160766225013724/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:59 np0005548789.localdomain sudo[145901]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:59 np0005548789.localdomain sudo[145993]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqlgbuaqwawdzkacgiiyovfcuugxhbqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013639.4508965-236-50357741105436/AnsiballZ_file.py
Dec 06 09:33:59 np0005548789.localdomain sudo[145993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53831 DF PROTO=TCP SPT=58770 DPT=9101 SEQ=1433347702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF4BEF0000000001030307) 
Dec 06 09:33:59 np0005548789.localdomain python3.9[145995]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:33:59 np0005548789.localdomain sudo[145993]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:00 np0005548789.localdomain sshd[145890]: Received disconnect from 118.193.38.207 port 60448:11: Bye Bye [preauth]
Dec 06 09:34:00 np0005548789.localdomain sshd[145890]: Disconnected from authenticating user root 118.193.38.207 port 60448 [preauth]
Dec 06 09:34:00 np0005548789.localdomain sudo[146085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qucylrxcgxfdhfdtbrbfhfqvfjzriipm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013640.0716984-253-139812967846941/AnsiballZ_stat.py
Dec 06 09:34:00 np0005548789.localdomain sudo[146085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:00 np0005548789.localdomain python3.9[146087]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:00 np0005548789.localdomain sudo[146085]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:01 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9036 DF PROTO=TCP SPT=41690 DPT=9105 SEQ=2344888170 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF53EF0000000001030307) 
Dec 06 09:34:01 np0005548789.localdomain sudo[146158]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plaupiapppviwshecgjuotlgyssbfyex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013640.0716984-253-139812967846941/AnsiballZ_copy.py
Dec 06 09:34:01 np0005548789.localdomain sudo[146158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:02 np0005548789.localdomain python3.9[146160]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013640.0716984-253-139812967846941/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:02 np0005548789.localdomain sudo[146158]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:02 np0005548789.localdomain sudo[146250]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxbyizvyijckwowhgjflzjevrkfbgrcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013642.4965298-315-252133699195095/AnsiballZ_file.py
Dec 06 09:34:02 np0005548789.localdomain sudo[146250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:02 np0005548789.localdomain python3.9[146252]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:02 np0005548789.localdomain sudo[146250]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:03 np0005548789.localdomain sudo[146342]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kpoyqcywjyczfdavdviyilvirklahtno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013643.613141-341-259875819668189/AnsiballZ_stat.py
Dec 06 09:34:03 np0005548789.localdomain sudo[146342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:04 np0005548789.localdomain chronyd[137718]: Selected source 158.69.193.108 (pool.ntp.org)
Dec 06 09:34:04 np0005548789.localdomain python3.9[146344]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:04 np0005548789.localdomain sudo[146342]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:04 np0005548789.localdomain sudo[146415]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfiupyduvpzymubxbwnfdwctrsqedtob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013643.613141-341-259875819668189/AnsiballZ_copy.py
Dec 06 09:34:04 np0005548789.localdomain sudo[146415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:04 np0005548789.localdomain python3.9[146417]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013643.613141-341-259875819668189/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:04 np0005548789.localdomain sudo[146415]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48060 DF PROTO=TCP SPT=56898 DPT=9102 SEQ=2227287577 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF5F700000000001030307) 
Dec 06 09:34:05 np0005548789.localdomain sudo[146507]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpcnldjafeqdovdzljzouzlrfczscgme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013644.8166153-384-256293178912344/AnsiballZ_file.py
Dec 06 09:34:05 np0005548789.localdomain sudo[146507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:05 np0005548789.localdomain python3.9[146509]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:05 np0005548789.localdomain sudo[146507]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:05 np0005548789.localdomain sudo[146599]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kynslgajhsnttvvncomdbpnprtapxzdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013645.6995184-409-199463690800059/AnsiballZ_stat.py
Dec 06 09:34:05 np0005548789.localdomain sudo[146599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:06 np0005548789.localdomain python3.9[146601]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:06 np0005548789.localdomain sudo[146599]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:06 np0005548789.localdomain sudo[146672]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-praovewzlltbsfokirivklxfgdtvkpxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013645.6995184-409-199463690800059/AnsiballZ_copy.py
Dec 06 09:34:06 np0005548789.localdomain sudo[146672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:06 np0005548789.localdomain sshd[146675]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:34:06 np0005548789.localdomain python3.9[146674]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013645.6995184-409-199463690800059/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:06 np0005548789.localdomain sudo[146672]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:07 np0005548789.localdomain sudo[146766]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twavxksuwimoitukjawqlrnfoshttbqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013646.8776257-452-196182134280497/AnsiballZ_file.py
Dec 06 09:34:07 np0005548789.localdomain sudo[146766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:07 np0005548789.localdomain python3.9[146768]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:07 np0005548789.localdomain sudo[146766]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:07 np0005548789.localdomain sudo[146858]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pksdnbpuyxpdagismrloqsxcvjnxusxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013647.459166-476-14489716808621/AnsiballZ_stat.py
Dec 06 09:34:07 np0005548789.localdomain sudo[146858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:07 np0005548789.localdomain python3.9[146860]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:07 np0005548789.localdomain sudo[146858]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:08 np0005548789.localdomain sshd[146675]: Received disconnect from 64.227.156.63 port 53914:11: Bye Bye [preauth]
Dec 06 09:34:08 np0005548789.localdomain sshd[146675]: Disconnected from authenticating user root 64.227.156.63 port 53914 [preauth]
Dec 06 09:34:08 np0005548789.localdomain sudo[146931]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqzszaoprixbdoozlfbifdtdeerhorby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013647.459166-476-14489716808621/AnsiballZ_copy.py
Dec 06 09:34:08 np0005548789.localdomain sudo[146931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:08 np0005548789.localdomain python3.9[146933]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013647.459166-476-14489716808621/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:08 np0005548789.localdomain sudo[146931]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:08 np0005548789.localdomain sudo[147023]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmswwwbjuuwirzvapjtmqrmfgtdeurtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013648.6507983-523-150646836422193/AnsiballZ_file.py
Dec 06 09:34:08 np0005548789.localdomain sudo[147023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:09 np0005548789.localdomain python3.9[147025]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:09 np0005548789.localdomain sudo[147023]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:09 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36177 DF PROTO=TCP SPT=59930 DPT=9882 SEQ=3639643898 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF71EF0000000001030307) 
Dec 06 09:34:09 np0005548789.localdomain sudo[147115]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aplxcikivxwszwtcuzqtftwaljaikrow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013649.2545688-548-11558135951373/AnsiballZ_stat.py
Dec 06 09:34:09 np0005548789.localdomain sudo[147115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:09 np0005548789.localdomain python3.9[147117]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:09 np0005548789.localdomain sudo[147115]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:09 np0005548789.localdomain sudo[147188]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-laxnrbzuzmjsmhfxcjcwvfbffcobqrxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013649.2545688-548-11558135951373/AnsiballZ_copy.py
Dec 06 09:34:09 np0005548789.localdomain sudo[147188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:10 np0005548789.localdomain python3.9[147190]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013649.2545688-548-11558135951373/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:10 np0005548789.localdomain sudo[147188]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:10 np0005548789.localdomain sudo[147280]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvrddugqderwbksupbvhnyjnxbanlpxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013650.3258624-593-28913171208510/AnsiballZ_file.py
Dec 06 09:34:10 np0005548789.localdomain sudo[147280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48062 DF PROTO=TCP SPT=56898 DPT=9102 SEQ=2227287577 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF772F0000000001030307) 
Dec 06 09:34:10 np0005548789.localdomain python3.9[147282]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:10 np0005548789.localdomain sudo[147280]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:11 np0005548789.localdomain sudo[147372]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igsefqafjygzhmlrzwkczhowxadmlnzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013651.0013497-618-155047141128834/AnsiballZ_stat.py
Dec 06 09:34:11 np0005548789.localdomain sudo[147372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:11 np0005548789.localdomain python3.9[147374]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:11 np0005548789.localdomain sudo[147372]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:12 np0005548789.localdomain sudo[147445]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-patdwdibtaptbcpvnjievcebigdhgryn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013651.0013497-618-155047141128834/AnsiballZ_copy.py
Dec 06 09:34:12 np0005548789.localdomain sudo[147445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:12 np0005548789.localdomain python3.9[147447]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013651.0013497-618-155047141128834/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:12 np0005548789.localdomain sudo[147445]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:12 np0005548789.localdomain sudo[147537]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihjetcounxrojmwbfluyqhedgjyvwfmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013652.6015332-659-69184838028678/AnsiballZ_file.py
Dec 06 09:34:12 np0005548789.localdomain sudo[147537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:13 np0005548789.localdomain python3.9[147539]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:13 np0005548789.localdomain sudo[147537]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:14 np0005548789.localdomain sudo[147629]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvrxivgltxazcjiuexvfdpxknarpiaec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013653.197245-683-153019413141103/AnsiballZ_stat.py
Dec 06 09:34:14 np0005548789.localdomain sudo[147629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58534 DF PROTO=TCP SPT=45046 DPT=9101 SEQ=31981411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF84D00000000001030307) 
Dec 06 09:34:14 np0005548789.localdomain python3.9[147631]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:14 np0005548789.localdomain sudo[147629]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:14 np0005548789.localdomain sudo[147702]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywfrnzqavsgbypbbgfablkbtrpkoksus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013653.197245-683-153019413141103/AnsiballZ_copy.py
Dec 06 09:34:14 np0005548789.localdomain sudo[147702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:15 np0005548789.localdomain python3.9[147704]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013653.197245-683-153019413141103/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=31b29ee7333177b2eb4f4f85549af35c3d0ec3b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:15 np0005548789.localdomain sudo[147702]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:15 np0005548789.localdomain sshd[147705]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:34:15 np0005548789.localdomain sshd[147705]: Received disconnect from 81.192.46.35 port 46138:11: Bye Bye [preauth]
Dec 06 09:34:15 np0005548789.localdomain sshd[147705]: Disconnected from authenticating user root 81.192.46.35 port 46138 [preauth]
Dec 06 09:34:16 np0005548789.localdomain sshd[145547]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:34:16 np0005548789.localdomain systemd[1]: session-48.scope: Deactivated successfully.
Dec 06 09:34:16 np0005548789.localdomain systemd[1]: session-48.scope: Consumed 11.388s CPU time.
Dec 06 09:34:16 np0005548789.localdomain systemd-logind[766]: Session 48 logged out. Waiting for processes to exit.
Dec 06 09:34:16 np0005548789.localdomain systemd-logind[766]: Removed session 48.
Dec 06 09:34:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58536 DF PROTO=TCP SPT=45046 DPT=9101 SEQ=31981411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF90EF0000000001030307) 
Dec 06 09:34:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21556 DF PROTO=TCP SPT=33844 DPT=9105 SEQ=1331080246 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CF99F00000000001030307) 
Dec 06 09:34:21 np0005548789.localdomain sshd[147721]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:34:21 np0005548789.localdomain sshd[147721]: Accepted publickey for zuul from 192.168.122.30 port 53350 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:34:21 np0005548789.localdomain systemd-logind[766]: New session 49 of user zuul.
Dec 06 09:34:21 np0005548789.localdomain systemd[1]: Started Session 49 of User zuul.
Dec 06 09:34:21 np0005548789.localdomain sshd[147721]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:34:22 np0005548789.localdomain sudo[147814]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icnvjkidlrjzekfemegjfvossqvfgztu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013661.857142-27-9947809497581/AnsiballZ_file.py
Dec 06 09:34:22 np0005548789.localdomain sudo[147814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:22 np0005548789.localdomain python3.9[147816]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:22 np0005548789.localdomain sudo[147814]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:23 np0005548789.localdomain sshd[147868]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:34:23 np0005548789.localdomain sudo[147908]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvzahuggxpwqdspbtxaitelufxmpgmzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013662.9781508-63-116259745451274/AnsiballZ_stat.py
Dec 06 09:34:23 np0005548789.localdomain sudo[147908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:23 np0005548789.localdomain python3.9[147910]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:23 np0005548789.localdomain sudo[147908]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21557 DF PROTO=TCP SPT=33844 DPT=9105 SEQ=1331080246 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CFA9AF0000000001030307) 
Dec 06 09:34:24 np0005548789.localdomain sudo[147981]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blhpfyzuqmkuxykuxdxhpiqhkjhwarhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013662.9781508-63-116259745451274/AnsiballZ_copy.py
Dec 06 09:34:24 np0005548789.localdomain sudo[147981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:24 np0005548789.localdomain python3.9[147983]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013662.9781508-63-116259745451274/.source.conf _original_basename=ceph.conf follow=False checksum=74b6793c28400fa0a16ce9abdc4efa82feeb961d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:24 np0005548789.localdomain sudo[147981]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:24 np0005548789.localdomain sudo[148073]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbsblqnyyqinerbbjajpsxdnwqpmmrwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013664.3649476-63-279652171569212/AnsiballZ_stat.py
Dec 06 09:34:24 np0005548789.localdomain sudo[148073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:24 np0005548789.localdomain sshd[147868]: Received disconnect from 103.234.151.178 port 55904:11: Bye Bye [preauth]
Dec 06 09:34:24 np0005548789.localdomain sshd[147868]: Disconnected from authenticating user root 103.234.151.178 port 55904 [preauth]
Dec 06 09:34:24 np0005548789.localdomain python3.9[148075]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:24 np0005548789.localdomain sudo[148073]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:25 np0005548789.localdomain sudo[148146]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muihkuuizfzervtegnpciwadamsghecp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013664.3649476-63-279652171569212/AnsiballZ_copy.py
Dec 06 09:34:25 np0005548789.localdomain sudo[148146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:25 np0005548789.localdomain python3.9[148148]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013664.3649476-63-279652171569212/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=9d631b6552ddeaa0e75a39b18f2bdb583e0e85e3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:25 np0005548789.localdomain sudo[148146]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:25 np0005548789.localdomain sshd[147721]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:34:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36178 DF PROTO=TCP SPT=59930 DPT=9882 SEQ=3639643898 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CFB1EF0000000001030307) 
Dec 06 09:34:25 np0005548789.localdomain systemd-logind[766]: Session 49 logged out. Waiting for processes to exit.
Dec 06 09:34:25 np0005548789.localdomain systemd[1]: session-49.scope: Deactivated successfully.
Dec 06 09:34:25 np0005548789.localdomain systemd[1]: session-49.scope: Consumed 2.250s CPU time.
Dec 06 09:34:25 np0005548789.localdomain systemd-logind[766]: Removed session 49.
Dec 06 09:34:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58538 DF PROTO=TCP SPT=45046 DPT=9101 SEQ=31981411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CFC1EF0000000001030307) 
Dec 06 09:34:31 np0005548789.localdomain sshd[148163]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:34:31 np0005548789.localdomain sshd[148163]: Accepted publickey for zuul from 192.168.122.30 port 59378 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:34:31 np0005548789.localdomain systemd-logind[766]: New session 50 of user zuul.
Dec 06 09:34:31 np0005548789.localdomain systemd[1]: Started Session 50 of User zuul.
Dec 06 09:34:31 np0005548789.localdomain sshd[148163]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:34:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21558 DF PROTO=TCP SPT=33844 DPT=9105 SEQ=1331080246 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CFC9EF0000000001030307) 
Dec 06 09:34:32 np0005548789.localdomain python3.9[148256]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:34:33 np0005548789.localdomain sudo[148350]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhyuaddobtzriguqxwfjrgowpfwbwpxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013672.9727795-63-256683965390697/AnsiballZ_file.py
Dec 06 09:34:33 np0005548789.localdomain sudo[148350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:33 np0005548789.localdomain python3.9[148352]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:33 np0005548789.localdomain sudo[148350]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:34 np0005548789.localdomain sshd[148444]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:34:34 np0005548789.localdomain sudo[148442]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrbrvybtnzbafgcswpemyzxxrpqsedye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013673.7494326-63-222437047657751/AnsiballZ_file.py
Dec 06 09:34:34 np0005548789.localdomain sudo[148442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:34 np0005548789.localdomain python3.9[148446]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:34 np0005548789.localdomain sudo[148442]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51952 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=1622180186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CFD4AF0000000001030307) 
Dec 06 09:34:34 np0005548789.localdomain python3.9[148536]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:34:35 np0005548789.localdomain sshd[148444]: Received disconnect from 103.157.25.60 port 46292:11: Bye Bye [preauth]
Dec 06 09:34:35 np0005548789.localdomain sshd[148444]: Disconnected from authenticating user root 103.157.25.60 port 46292 [preauth]
Dec 06 09:34:36 np0005548789.localdomain sudo[148626]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjzfyjgxhxehaecpavqdcnxpnmriyivw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013675.2199616-132-132863665040772/AnsiballZ_seboolean.py
Dec 06 09:34:36 np0005548789.localdomain sudo[148626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:36 np0005548789.localdomain python3.9[148628]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 06 09:34:36 np0005548789.localdomain sudo[148626]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:37 np0005548789.localdomain sudo[148718]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtfuprtsraqpsrdegcpwfcsyusugodhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013676.9886653-162-210905271972536/AnsiballZ_setup.py
Dec 06 09:34:37 np0005548789.localdomain sudo[148718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:37 np0005548789.localdomain python3.9[148720]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:34:37 np0005548789.localdomain sudo[148718]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:38 np0005548789.localdomain sudo[148772]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzlowaqwbgultybnozmecmlqwujtuglv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013676.9886653-162-210905271972536/AnsiballZ_dnf.py
Dec 06 09:34:38 np0005548789.localdomain sudo[148772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:38 np0005548789.localdomain python3.9[148774]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:34:39 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42254 DF PROTO=TCP SPT=48326 DPT=9882 SEQ=995512109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CFE5EF0000000001030307) 
Dec 06 09:34:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51954 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=1622180186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CFEC6F0000000001030307) 
Dec 06 09:34:41 np0005548789.localdomain sudo[148772]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:42 np0005548789.localdomain sudo[148866]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfnnsvvjzqkaznpmwpaymatrkllfctpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013682.1201394-198-129349681442594/AnsiballZ_systemd.py
Dec 06 09:34:42 np0005548789.localdomain sudo[148866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:42 np0005548789.localdomain sudo[148869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:34:42 np0005548789.localdomain sudo[148869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:34:42 np0005548789.localdomain sudo[148869]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:42 np0005548789.localdomain sudo[148884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 09:34:42 np0005548789.localdomain sudo[148884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:34:43 np0005548789.localdomain python3.9[148868]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:34:43 np0005548789.localdomain podman[148974]: 2025-12-06 09:34:43.766349425 +0000 UTC m=+0.075854292 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, vendor=Red Hat, Inc., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:34:43 np0005548789.localdomain podman[148974]: 2025-12-06 09:34:43.898246831 +0000 UTC m=+0.207751728 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, release=1763362218, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vcs-type=git)
Dec 06 09:34:44 np0005548789.localdomain sudo[148866]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:44 np0005548789.localdomain sudo[148884]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2759 DF PROTO=TCP SPT=33812 DPT=9101 SEQ=3519393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52CFFA000000000001030307) 
Dec 06 09:34:44 np0005548789.localdomain sudo[149057]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:34:44 np0005548789.localdomain sudo[149057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:34:44 np0005548789.localdomain sudo[149057]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:44 np0005548789.localdomain sudo[149085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:34:44 np0005548789.localdomain sudo[149085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:34:44 np0005548789.localdomain sudo[149178]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrkibffoihzprismcaikexoioilnkbsn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765013684.3192594-222-23829535475629/AnsiballZ_edpm_nftables_snippet.py
Dec 06 09:34:44 np0005548789.localdomain sudo[149178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:44 np0005548789.localdomain python3[149180]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                                            rule:
                                                              proto: udp
                                                              dport: 4789
                                                          - rule_name: 119 neutron geneve networks
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              state: ["UNTRACKED"]
                                                          - rule_name: 120 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: OUTPUT
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                          - rule_name: 121 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: PREROUTING
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 06 09:34:44 np0005548789.localdomain sudo[149178]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:44 np0005548789.localdomain sudo[149085]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:45 np0005548789.localdomain sudo[149257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:34:45 np0005548789.localdomain sudo[149257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:34:45 np0005548789.localdomain sudo[149257]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:45 np0005548789.localdomain sudo[149302]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwlcxtincmndfweinifhhccflketstqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013685.351296-249-162716444643397/AnsiballZ_file.py
Dec 06 09:34:45 np0005548789.localdomain sudo[149302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:45 np0005548789.localdomain python3.9[149304]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:45 np0005548789.localdomain sudo[149302]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:46 np0005548789.localdomain sudo[149394]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxhkbrfopuyvxzcwttioyezhyhghmcdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013686.0264676-273-128068839969820/AnsiballZ_stat.py
Dec 06 09:34:46 np0005548789.localdomain sudo[149394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:46 np0005548789.localdomain python3.9[149396]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:46 np0005548789.localdomain sudo[149394]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:46 np0005548789.localdomain sudo[149442]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayrubspwcgdrjzinxnzmxfoazotlzbty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013686.0264676-273-128068839969820/AnsiballZ_file.py
Dec 06 09:34:46 np0005548789.localdomain sudo[149442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:47 np0005548789.localdomain python3.9[149444]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:47 np0005548789.localdomain sudo[149442]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:47 np0005548789.localdomain sshd[149445]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:34:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2761 DF PROTO=TCP SPT=33812 DPT=9101 SEQ=3519393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D005EF0000000001030307) 
Dec 06 09:34:47 np0005548789.localdomain sshd[149445]: Received disconnect from 12.156.67.18 port 52140:11: Bye Bye [preauth]
Dec 06 09:34:47 np0005548789.localdomain sshd[149445]: Disconnected from authenticating user root 12.156.67.18 port 52140 [preauth]
Dec 06 09:34:48 np0005548789.localdomain sudo[149536]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-geiudstymrwgpgmbuimlmszceqmjnqai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013688.0465677-309-230571212905824/AnsiballZ_stat.py
Dec 06 09:34:48 np0005548789.localdomain sudo[149536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:48 np0005548789.localdomain python3.9[149538]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:48 np0005548789.localdomain sudo[149536]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:48 np0005548789.localdomain sudo[149584]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btsbspvqyvtuoiirnaoaboqscetxeirx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013688.0465677-309-230571212905824/AnsiballZ_file.py
Dec 06 09:34:48 np0005548789.localdomain sudo[149584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:48 np0005548789.localdomain python3.9[149586]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.gmuk7otn recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:49 np0005548789.localdomain sudo[149584]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31193 DF PROTO=TCP SPT=49592 DPT=9105 SEQ=1936642673 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D00F2F0000000001030307) 
Dec 06 09:34:49 np0005548789.localdomain sudo[149676]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjmnzaqszsdmytiibtldadhfgdklqnyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013689.2145884-345-193322717258159/AnsiballZ_stat.py
Dec 06 09:34:49 np0005548789.localdomain sudo[149676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:50 np0005548789.localdomain python3.9[149678]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:50 np0005548789.localdomain sudo[149676]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:50 np0005548789.localdomain sudo[149725]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtuftqsobbjbatateasehbizubnhamew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013689.2145884-345-193322717258159/AnsiballZ_file.py
Dec 06 09:34:50 np0005548789.localdomain sudo[149725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:50 np0005548789.localdomain python3.9[149727]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:50 np0005548789.localdomain sudo[149725]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:51 np0005548789.localdomain sudo[149817]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klggmvbncrvcranghnkbpocsofjfysnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013690.8194509-384-277329225100607/AnsiballZ_command.py
Dec 06 09:34:51 np0005548789.localdomain sudo[149817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:51 np0005548789.localdomain python3.9[149819]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:34:51 np0005548789.localdomain sudo[149817]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:51 np0005548789.localdomain sshd[149835]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:34:52 np0005548789.localdomain sudo[149912]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llgycdiipbcdywpumijktxjkzrjwbbop ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765013691.6234667-408-137695128594311/AnsiballZ_edpm_nftables_from_files.py
Dec 06 09:34:52 np0005548789.localdomain sudo[149912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:52 np0005548789.localdomain python3[149914]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 06 09:34:52 np0005548789.localdomain sudo[149912]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:52 np0005548789.localdomain sudo[150004]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mohfcbmhhhomxevilouxdhghzvradauz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013692.4487917-432-909690626717/AnsiballZ_stat.py
Dec 06 09:34:52 np0005548789.localdomain sudo[150004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:52 np0005548789.localdomain python3.9[150006]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:52 np0005548789.localdomain sudo[150004]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:53 np0005548789.localdomain sshd[149835]: Received disconnect from 103.192.152.59 port 40144:11: Bye Bye [preauth]
Dec 06 09:34:53 np0005548789.localdomain sshd[149835]: Disconnected from authenticating user root 103.192.152.59 port 40144 [preauth]
Dec 06 09:34:53 np0005548789.localdomain sudo[150079]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njfestaycvoojwzgjfmciwdlrneqjgqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013692.4487917-432-909690626717/AnsiballZ_copy.py
Dec 06 09:34:53 np0005548789.localdomain sudo[150079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:53 np0005548789.localdomain python3.9[150081]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013692.4487917-432-909690626717/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:53 np0005548789.localdomain sudo[150079]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31194 DF PROTO=TCP SPT=49592 DPT=9105 SEQ=1936642673 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D01EEF0000000001030307) 
Dec 06 09:34:54 np0005548789.localdomain sudo[150171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yipdonnhldgqixhungjhlojblxtwyxwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013693.8711767-477-21938630296334/AnsiballZ_stat.py
Dec 06 09:34:54 np0005548789.localdomain sudo[150171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:54 np0005548789.localdomain python3.9[150173]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:54 np0005548789.localdomain sudo[150171]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:54 np0005548789.localdomain sudo[150246]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsktyqtqfnwntaegcrkblfmtgasbnjzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013693.8711767-477-21938630296334/AnsiballZ_copy.py
Dec 06 09:34:54 np0005548789.localdomain sudo[150246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:54 np0005548789.localdomain python3.9[150248]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013693.8711767-477-21938630296334/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:54 np0005548789.localdomain sudo[150246]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:55 np0005548789.localdomain sudo[150338]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vziyakdbjnhschinqezccdfbrajzpiek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013695.1787555-522-212090974891440/AnsiballZ_stat.py
Dec 06 09:34:55 np0005548789.localdomain sudo[150338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:55 np0005548789.localdomain python3.9[150340]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:55 np0005548789.localdomain sudo[150338]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:56 np0005548789.localdomain sudo[150413]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktfwqjqlyxerpxrwhcanfdvtvfzipyyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013695.1787555-522-212090974891440/AnsiballZ_copy.py
Dec 06 09:34:56 np0005548789.localdomain sudo[150413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:56 np0005548789.localdomain python3.9[150415]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013695.1787555-522-212090974891440/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:56 np0005548789.localdomain sudo[150413]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:56 np0005548789.localdomain sudo[150505]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgkdrywalhjkveezmiivrexkylkfvpmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013696.401759-567-237851580408851/AnsiballZ_stat.py
Dec 06 09:34:56 np0005548789.localdomain sudo[150505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:56 np0005548789.localdomain python3.9[150507]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:56 np0005548789.localdomain sudo[150505]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:57 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56313 DF PROTO=TCP SPT=42750 DPT=9882 SEQ=627142310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D02BAF0000000001030307) 
Dec 06 09:34:57 np0005548789.localdomain sudo[150580]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nopfjkajfpbfigxmbhwdqoxbwlwndmtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013696.401759-567-237851580408851/AnsiballZ_copy.py
Dec 06 09:34:57 np0005548789.localdomain sudo[150580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:57 np0005548789.localdomain python3.9[150582]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013696.401759-567-237851580408851/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:57 np0005548789.localdomain sudo[150580]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:57 np0005548789.localdomain sudo[150672]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yusfrapwzrskxpynmztlooemkwiuqewk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013697.5907106-612-157607836935768/AnsiballZ_stat.py
Dec 06 09:34:57 np0005548789.localdomain sudo[150672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:58 np0005548789.localdomain python3.9[150674]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:58 np0005548789.localdomain sudo[150672]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:58 np0005548789.localdomain sudo[150747]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oikokdxcgncwfbsovaragdlsvhugxmnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013697.5907106-612-157607836935768/AnsiballZ_copy.py
Dec 06 09:34:58 np0005548789.localdomain sudo[150747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:58 np0005548789.localdomain python3.9[150749]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013697.5907106-612-157607836935768/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:58 np0005548789.localdomain sudo[150747]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:59 np0005548789.localdomain sudo[150839]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twvsvzzlwoixkdtwnblzezjgdvypslkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013698.85298-657-180695394271039/AnsiballZ_file.py
Dec 06 09:34:59 np0005548789.localdomain sudo[150839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:59 np0005548789.localdomain python3.9[150841]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:59 np0005548789.localdomain sudo[150839]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2763 DF PROTO=TCP SPT=33812 DPT=9101 SEQ=3519393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D035EF0000000001030307) 
Dec 06 09:35:00 np0005548789.localdomain sudo[150931]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-doxlvsnzczbsdwsnhxnyjhhoifxoutka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013700.329196-681-6635615774930/AnsiballZ_command.py
Dec 06 09:35:00 np0005548789.localdomain sudo[150931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:00 np0005548789.localdomain python3.9[150933]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:00 np0005548789.localdomain sudo[150931]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31195 DF PROTO=TCP SPT=49592 DPT=9105 SEQ=1936642673 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D03FEF0000000001030307) 
Dec 06 09:35:02 np0005548789.localdomain sudo[151026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmslpaaalojbpqcjvwvyejnjzgooaswo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013701.0179515-705-114346137965153/AnsiballZ_blockinfile.py
Dec 06 09:35:02 np0005548789.localdomain sudo[151026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:02 np0005548789.localdomain python3.9[151028]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:02 np0005548789.localdomain sudo[151026]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:03 np0005548789.localdomain sudo[151118]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lraycqqeepjerqcjqrilzvidzpkgndyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013703.2039351-732-197637762161192/AnsiballZ_command.py
Dec 06 09:35:03 np0005548789.localdomain sudo[151118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:03 np0005548789.localdomain python3.9[151120]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:03 np0005548789.localdomain sudo[151118]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:04 np0005548789.localdomain sudo[151211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bituyzgqopjmujvwbhjweiloyysfzydr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013703.8299658-756-161332195707607/AnsiballZ_stat.py
Dec 06 09:35:04 np0005548789.localdomain sudo[151211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:04 np0005548789.localdomain python3.9[151213]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:35:04 np0005548789.localdomain sudo[151211]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17632 DF PROTO=TCP SPT=60550 DPT=9102 SEQ=3977837278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D049EF0000000001030307) 
Dec 06 09:35:04 np0005548789.localdomain sudo[151305]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdyzyklepbaxhjmflogquiawviyoabnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013704.526774-780-246226782874995/AnsiballZ_command.py
Dec 06 09:35:04 np0005548789.localdomain sudo[151305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:04 np0005548789.localdomain python3.9[151307]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:05 np0005548789.localdomain sudo[151305]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:05 np0005548789.localdomain sudo[151400]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mttmvnxgqzxrxcggehecadmqfstmtvql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013705.236193-804-22449965285170/AnsiballZ_file.py
Dec 06 09:35:05 np0005548789.localdomain sudo[151400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:05 np0005548789.localdomain python3.9[151402]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:05 np0005548789.localdomain sudo[151400]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:07 np0005548789.localdomain python3.9[151492]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:35:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48065 DF PROTO=TCP SPT=56898 DPT=9102 SEQ=2227287577 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D055EF0000000001030307) 
Dec 06 09:35:08 np0005548789.localdomain sudo[151583]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eigcsdfvdjkqxoaoczjjnbjvwdohpfob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013707.989845-924-19585374200073/AnsiballZ_command.py
Dec 06 09:35:08 np0005548789.localdomain sudo[151583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:08 np0005548789.localdomain python3.9[151585]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005548789.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:a2:0d:dc:1c" external_ids:ovn-encap-ip=172.19.0.107 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:08 np0005548789.localdomain ovs-vsctl[151586]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005548789.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:a2:0d:dc:1c external_ids:ovn-encap-ip=172.19.0.107 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 06 09:35:08 np0005548789.localdomain sudo[151583]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:08 np0005548789.localdomain sudo[151676]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsxktahnufuydhhtliogbwerchxnzsuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013708.6978152-951-134205087581415/AnsiballZ_command.py
Dec 06 09:35:08 np0005548789.localdomain sudo[151676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:09 np0005548789.localdomain python3.9[151678]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ovs-vsctl show | grep -q "Manager"
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:09 np0005548789.localdomain sudo[151676]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:09 np0005548789.localdomain python3.9[151771]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:35:10 np0005548789.localdomain sudo[151863]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czoxpdkbiptenqmnqoqljimvjojjwivo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013710.1610613-1005-219782594370059/AnsiballZ_file.py
Dec 06 09:35:10 np0005548789.localdomain sudo[151863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:10 np0005548789.localdomain python3.9[151865]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:10 np0005548789.localdomain sudo[151863]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17634 DF PROTO=TCP SPT=60550 DPT=9102 SEQ=3977837278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D061AF0000000001030307) 
Dec 06 09:35:11 np0005548789.localdomain sudo[151955]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yttcdmzqnxlzcdngnrgzjwgmslaxuurl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013710.833455-1029-258849987735520/AnsiballZ_stat.py
Dec 06 09:35:11 np0005548789.localdomain sudo[151955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:11 np0005548789.localdomain python3.9[151957]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:11 np0005548789.localdomain sudo[151955]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:11 np0005548789.localdomain sudo[152003]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcrbedkgsakfgpzauxglrlmaecjmgtep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013710.833455-1029-258849987735520/AnsiballZ_file.py
Dec 06 09:35:11 np0005548789.localdomain sudo[152003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:11 np0005548789.localdomain python3.9[152005]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:11 np0005548789.localdomain sudo[152003]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:12 np0005548789.localdomain sudo[152095]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpiafiwrobvixmtpzuktxtvzkkwrbhur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013711.8630676-1029-226403912898290/AnsiballZ_stat.py
Dec 06 09:35:12 np0005548789.localdomain sudo[152095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:12 np0005548789.localdomain python3.9[152097]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:12 np0005548789.localdomain sudo[152095]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:12 np0005548789.localdomain sudo[152143]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppuryinmcmhgnqbycrlfmsbtmiubrmyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013711.8630676-1029-226403912898290/AnsiballZ_file.py
Dec 06 09:35:12 np0005548789.localdomain sudo[152143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:12 np0005548789.localdomain python3.9[152145]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:12 np0005548789.localdomain sudo[152143]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:13 np0005548789.localdomain sudo[152236]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhlxeokzihuokizekknarizhqdnjfdmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013713.0814142-1098-110384161552027/AnsiballZ_file.py
Dec 06 09:35:13 np0005548789.localdomain sudo[152236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:13 np0005548789.localdomain python3.9[152238]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:13 np0005548789.localdomain sudo[152236]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62160 DF PROTO=TCP SPT=50974 DPT=9101 SEQ=3447700527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D06F310000000001030307) 
Dec 06 09:35:14 np0005548789.localdomain sudo[152328]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jyhgdevvfjpmvvmdvsswyeeotnlbedek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013714.3355384-1122-177843686163308/AnsiballZ_stat.py
Dec 06 09:35:14 np0005548789.localdomain sudo[152328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:14 np0005548789.localdomain python3.9[152330]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:14 np0005548789.localdomain sudo[152328]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:15 np0005548789.localdomain sshd[152377]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:35:15 np0005548789.localdomain sudo[152376]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpwshymvcpeiozaubzrnfjuejrqqimeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013714.3355384-1122-177843686163308/AnsiballZ_file.py
Dec 06 09:35:15 np0005548789.localdomain sudo[152376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:15 np0005548789.localdomain python3.9[152380]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:15 np0005548789.localdomain sudo[152376]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:15 np0005548789.localdomain sudo[152470]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpvvrvpvwenqlrebnhoqjjlokdmgizns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013715.4270616-1158-36632770360034/AnsiballZ_stat.py
Dec 06 09:35:15 np0005548789.localdomain sudo[152470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:15 np0005548789.localdomain python3.9[152472]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:15 np0005548789.localdomain sudo[152470]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:16 np0005548789.localdomain sshd[152377]: Received disconnect from 118.193.38.207 port 41902:11: Bye Bye [preauth]
Dec 06 09:35:16 np0005548789.localdomain sshd[152377]: Disconnected from authenticating user root 118.193.38.207 port 41902 [preauth]
Dec 06 09:35:16 np0005548789.localdomain sudo[152518]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvceoisxjbekiugvacmcqjyytetequsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013715.4270616-1158-36632770360034/AnsiballZ_file.py
Dec 06 09:35:16 np0005548789.localdomain sudo[152518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:16 np0005548789.localdomain python3.9[152520]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:16 np0005548789.localdomain sudo[152518]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62162 DF PROTO=TCP SPT=50974 DPT=9101 SEQ=3447700527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D07B2F0000000001030307) 
Dec 06 09:35:17 np0005548789.localdomain sudo[152610]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akhgpsmnpyseppvriasvlvkhkcauthir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013717.1392837-1194-18158994463032/AnsiballZ_systemd.py
Dec 06 09:35:17 np0005548789.localdomain sudo[152610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:17 np0005548789.localdomain python3.9[152612]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:35:17 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:35:17 np0005548789.localdomain systemd-sysv-generator[152643]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:17 np0005548789.localdomain systemd-rc-local-generator[152637]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:17 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:35:18 np0005548789.localdomain sudo[152610]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:18 np0005548789.localdomain sudo[152741]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-paawarfzeilxwozpvqvfyyoiwogykizl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013718.473227-1218-113363777518170/AnsiballZ_stat.py
Dec 06 09:35:18 np0005548789.localdomain sudo[152741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:18 np0005548789.localdomain python3.9[152743]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:18 np0005548789.localdomain sudo[152741]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:19 np0005548789.localdomain sudo[152789]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tksdqprelgjzryjvvvzrqldsgibpuycl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013718.473227-1218-113363777518170/AnsiballZ_file.py
Dec 06 09:35:19 np0005548789.localdomain sudo[152789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:19 np0005548789.localdomain python3.9[152791]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:19 np0005548789.localdomain sudo[152789]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26543 DF PROTO=TCP SPT=41638 DPT=9105 SEQ=2004128370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D084700000000001030307) 
Dec 06 09:35:19 np0005548789.localdomain sudo[152881]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icegkajrfggrtqmkacgduwuaocyhonjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013719.588957-1254-173542400095545/AnsiballZ_stat.py
Dec 06 09:35:19 np0005548789.localdomain sudo[152881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:20 np0005548789.localdomain python3.9[152883]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:20 np0005548789.localdomain sudo[152881]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:20 np0005548789.localdomain sudo[152929]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clvpxppcumunecalxborlkrrlbhnacfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013719.588957-1254-173542400095545/AnsiballZ_file.py
Dec 06 09:35:20 np0005548789.localdomain sudo[152929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:20 np0005548789.localdomain python3.9[152931]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:20 np0005548789.localdomain sudo[152929]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:20 np0005548789.localdomain sudo[153021]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-siiniuzmaocigmripefxabezvsmuiyqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013720.6970408-1290-151094999186266/AnsiballZ_systemd.py
Dec 06 09:35:20 np0005548789.localdomain sudo[153021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:21 np0005548789.localdomain python3.9[153023]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:35:21 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:35:21 np0005548789.localdomain systemd-sysv-generator[153050]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:21 np0005548789.localdomain systemd-rc-local-generator[153046]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:21 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:35:21 np0005548789.localdomain systemd[1]: Starting Create netns directory...
Dec 06 09:35:21 np0005548789.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:35:21 np0005548789.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:35:21 np0005548789.localdomain systemd[1]: Finished Create netns directory.
Dec 06 09:35:21 np0005548789.localdomain sudo[153021]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:23 np0005548789.localdomain sudo[153157]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmjxgvsnfndtsofaynrklumcxuskjntn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013723.0676908-1320-255023275021608/AnsiballZ_file.py
Dec 06 09:35:23 np0005548789.localdomain sudo[153157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:23 np0005548789.localdomain sshd[153160]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:35:23 np0005548789.localdomain python3.9[153159]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:23 np0005548789.localdomain sudo[153157]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26544 DF PROTO=TCP SPT=41638 DPT=9105 SEQ=2004128370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D094300000000001030307) 
Dec 06 09:35:24 np0005548789.localdomain sudo[153251]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvwmvsomxtddnbscourdyziwgjujrpoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013723.7786076-1344-86884193586755/AnsiballZ_stat.py
Dec 06 09:35:24 np0005548789.localdomain sudo[153251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:24 np0005548789.localdomain python3.9[153253]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:24 np0005548789.localdomain sudo[153251]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:24 np0005548789.localdomain sshd[153160]: Received disconnect from 81.192.46.35 port 44470:11: Bye Bye [preauth]
Dec 06 09:35:24 np0005548789.localdomain sshd[153160]: Disconnected from authenticating user root 81.192.46.35 port 44470 [preauth]
Dec 06 09:35:24 np0005548789.localdomain sudo[153324]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrgwyciijtighqjemeuknzslvagtxmtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013723.7786076-1344-86884193586755/AnsiballZ_copy.py
Dec 06 09:35:24 np0005548789.localdomain sudo[153324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:24 np0005548789.localdomain python3.9[153326]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013723.7786076-1344-86884193586755/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:24 np0005548789.localdomain sudo[153324]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56316 DF PROTO=TCP SPT=42750 DPT=9882 SEQ=627142310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D09BF00000000001030307) 
Dec 06 09:35:25 np0005548789.localdomain sudo[153416]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-beilosewnyqrxiiicjdsplacupizpvcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013725.7291615-1395-263488419891434/AnsiballZ_file.py
Dec 06 09:35:25 np0005548789.localdomain sudo[153416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:26 np0005548789.localdomain python3.9[153418]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:26 np0005548789.localdomain sudo[153416]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:26 np0005548789.localdomain sudo[153508]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukxcaprqwqvezsfyypprposoywuntbev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013726.4736302-1419-168476942466128/AnsiballZ_stat.py
Dec 06 09:35:26 np0005548789.localdomain sudo[153508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:26 np0005548789.localdomain python3.9[153510]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:26 np0005548789.localdomain sudo[153508]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:27 np0005548789.localdomain sudo[153583]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-momctkpelsurdykhtwzykofaehgnhipf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013726.4736302-1419-168476942466128/AnsiballZ_copy.py
Dec 06 09:35:27 np0005548789.localdomain sudo[153583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:28 np0005548789.localdomain python3.9[153585]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013726.4736302-1419-168476942466128/.source.json _original_basename=.4r3uqdxk follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:28 np0005548789.localdomain sudo[153583]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:28 np0005548789.localdomain sudo[153675]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdldlejzzrcesqoshhpeyhlrixhkkbmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013728.3706388-1464-91214135257641/AnsiballZ_file.py
Dec 06 09:35:28 np0005548789.localdomain sudo[153675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:28 np0005548789.localdomain python3.9[153677]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:28 np0005548789.localdomain sudo[153675]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:29 np0005548789.localdomain sudo[153767]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npytshwaewvcbxhugovldeitbkrxaaza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013729.0790696-1488-196457065767791/AnsiballZ_stat.py
Dec 06 09:35:29 np0005548789.localdomain sudo[153767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:29 np0005548789.localdomain sudo[153767]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:29 np0005548789.localdomain sudo[153840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvvtmkiiactjtqtptilbbiosttjsalmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013729.0790696-1488-196457065767791/AnsiballZ_copy.py
Dec 06 09:35:29 np0005548789.localdomain sudo[153840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62164 DF PROTO=TCP SPT=50974 DPT=9101 SEQ=3447700527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D0ABEF0000000001030307) 
Dec 06 09:35:30 np0005548789.localdomain sudo[153840]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:30 np0005548789.localdomain sudo[153932]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jbpxyvpmlbkcsoewksfobrqmuskxvbng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013730.403713-1539-64872617747065/AnsiballZ_container_config_data.py
Dec 06 09:35:30 np0005548789.localdomain sudo[153932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:31 np0005548789.localdomain python3.9[153934]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 06 09:35:31 np0005548789.localdomain sudo[153932]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:31 np0005548789.localdomain sudo[154024]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jskoowfhjrkgzsmgkayobfixkavplwzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013731.2996635-1566-175404632081095/AnsiballZ_container_config_hash.py
Dec 06 09:35:31 np0005548789.localdomain sudo[154024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:31 np0005548789.localdomain python3.9[154026]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:35:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26545 DF PROTO=TCP SPT=41638 DPT=9105 SEQ=2004128370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D0B3EF0000000001030307) 
Dec 06 09:35:31 np0005548789.localdomain sudo[154024]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:32 np0005548789.localdomain sudo[154116]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkbvtxoonzxjonscreivptdevxymxxhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013732.1610594-1593-209086332623999/AnsiballZ_podman_container_info.py
Dec 06 09:35:32 np0005548789.localdomain sudo[154116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:32 np0005548789.localdomain python3.9[154118]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:35:33 np0005548789.localdomain sudo[154116]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17804 DF PROTO=TCP SPT=45458 DPT=9102 SEQ=3201948792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D0BEF00000000001030307) 
Dec 06 09:35:37 np0005548789.localdomain sudo[154235]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocsfzbzkhinslgkefddjweyvwjowtmjx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765013736.1252823-1632-210072081331005/AnsiballZ_edpm_container_manage.py
Dec 06 09:35:37 np0005548789.localdomain sudo[154235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:37 np0005548789.localdomain python3[154237]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:35:37 np0005548789.localdomain python3[154237]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c",
                                                                    "Digest": "sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:38:47.246477714Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 345722821,
                                                                    "VirtualSize": 345722821,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:ba9362d2aeb297e34b0679b2fc8168350c70a5b0ec414daf293bf2bc013e9088",
                                                                              "sha256:aae3b8a85314314b9db80a043fdf3f3b1d0b69927faca0303c73969a23dddd0f"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:22.759131427Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:13:25.258260855Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch openvswitch-ovn-common python3-netifaces python3-openvswitch tcpdump && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:13:28.025145079Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:38:13.535675197Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ovn-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:38:47.244104142Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch-ovn-host && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:38:48.759416475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 06 09:35:37 np0005548789.localdomain podman[154288]: 2025-12-06 09:35:37.897191792 +0000 UTC m=+0.069944319 container remove 1083479e63c7b72a6fd8d844623912e360922ce61d1a1bd165c8ac307f9bd076 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, 
com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 06 09:35:37 np0005548789.localdomain python3[154237]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller
Dec 06 09:35:37 np0005548789.localdomain podman[154302]: 
Dec 06 09:35:37 np0005548789.localdomain podman[154302]: 2025-12-06 09:35:37.975594533 +0000 UTC m=+0.065211091 container create 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:35:37 np0005548789.localdomain podman[154302]: 2025-12-06 09:35:37.936408573 +0000 UTC m=+0.026025171 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 06 09:35:37 np0005548789.localdomain python3[154237]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 06 09:35:38 np0005548789.localdomain sudo[154235]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:38 np0005548789.localdomain sudo[154430]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wunwxiifrsdcagbbigmmwirxnnzvjhed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013738.4741895-1656-221366952600761/AnsiballZ_stat.py
Dec 06 09:35:38 np0005548789.localdomain sudo[154430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:38 np0005548789.localdomain python3.9[154432]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:35:38 np0005548789.localdomain sudo[154430]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:39 np0005548789.localdomain sudo[154524]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryaoejrqkztavdgdpnludypicqhqftlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013739.2222426-1683-103427688641304/AnsiballZ_file.py
Dec 06 09:35:39 np0005548789.localdomain sudo[154524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:39 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2003 DF PROTO=TCP SPT=40512 DPT=9882 SEQ=469531224 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D0D1F00000000001030307) 
Dec 06 09:35:39 np0005548789.localdomain python3.9[154526]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:39 np0005548789.localdomain sudo[154524]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:39 np0005548789.localdomain sudo[154570]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uekmiufeidoxsvmxsswjxbnzfucwcipe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013739.2222426-1683-103427688641304/AnsiballZ_stat.py
Dec 06 09:35:39 np0005548789.localdomain sudo[154570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:40 np0005548789.localdomain python3.9[154572]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:35:40 np0005548789.localdomain sudo[154570]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:40 np0005548789.localdomain sudo[154661]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-potnzxcojyuzrcoztsyzrirnhwzigtcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013740.2090127-1683-273520237774633/AnsiballZ_copy.py
Dec 06 09:35:40 np0005548789.localdomain sudo[154661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17806 DF PROTO=TCP SPT=45458 DPT=9102 SEQ=3201948792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D0D6B00000000001030307) 
Dec 06 09:35:40 np0005548789.localdomain python3.9[154663]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765013740.2090127-1683-273520237774633/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:40 np0005548789.localdomain sudo[154661]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:41 np0005548789.localdomain sudo[154707]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsjvjyevnkilxuklqglovwokmkbwclpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013740.2090127-1683-273520237774633/AnsiballZ_systemd.py
Dec 06 09:35:41 np0005548789.localdomain sudo[154707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:41 np0005548789.localdomain sshd[154710]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:35:41 np0005548789.localdomain python3.9[154709]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:35:41 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:35:41 np0005548789.localdomain systemd-rc-local-generator[154737]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:41 np0005548789.localdomain systemd-sysv-generator[154743]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:35:42 np0005548789.localdomain sudo[154707]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:42 np0005548789.localdomain sudo[154791]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olpbthhkyvmrcgeuupzrtljiyboucwoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013740.2090127-1683-273520237774633/AnsiballZ_systemd.py
Dec 06 09:35:42 np0005548789.localdomain sudo[154791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:42 np0005548789.localdomain sshd[154794]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:35:42 np0005548789.localdomain python3.9[154793]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:35:42 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:35:42 np0005548789.localdomain systemd-rc-local-generator[154824]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:42 np0005548789.localdomain systemd-sysv-generator[154827]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:42 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: Starting ovn_controller container...
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:35:43 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c05aff0a12864d1bd5bcddcfda0418c2fac87ac5e10778af1cef421189be2d3/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:35:43 np0005548789.localdomain podman[154836]: 2025-12-06 09:35:43.321816754 +0000 UTC m=+0.151528859 container init 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: tmp-crun.97ZgYg.mount: Deactivated successfully.
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: + sudo -E kolla_set_configs
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:35:43 np0005548789.localdomain podman[154836]: 2025-12-06 09:35:43.364308608 +0000 UTC m=+0.194020673 container start 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:35:43 np0005548789.localdomain edpm-start-podman-container[154836]: ovn_controller
Dec 06 09:35:43 np0005548789.localdomain sshd[154710]: Received disconnect from 103.234.151.178 port 18490:11: Bye Bye [preauth]
Dec 06 09:35:43 np0005548789.localdomain sshd[154710]: Disconnected from authenticating user root 103.234.151.178 port 18490 [preauth]
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: Created slice User Slice of UID 0.
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: Starting User Manager for UID 0...
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Dec 06 09:35:43 np0005548789.localdomain podman[154859]: 2025-12-06 09:35:43.454688992 +0000 UTC m=+0.087352541 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec 06 09:35:43 np0005548789.localdomain podman[154859]: 2025-12-06 09:35:43.546832641 +0000 UTC m=+0.179496180 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible, container_name=ovn_controller)
Dec 06 09:35:43 np0005548789.localdomain podman[154859]: unhealthy
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Failed with result 'exit-code'.
Dec 06 09:35:43 np0005548789.localdomain edpm-start-podman-container[154835]: Creating additional drop-in dependency for "ovn_controller" (0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5)
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: Queued start job for default target Main User Target.
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:35:43 np0005548789.localdomain systemd-journald[47810]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Dec 06 09:35:43 np0005548789.localdomain systemd-journald[47810]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 09:35:43 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: Created slice User Application Slice.
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: Reached target Paths.
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: Reached target Timers.
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: Starting D-Bus User Message Bus Socket...
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: Starting Create User's Volatile Files and Directories...
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: Listening on D-Bus User Message Bus Socket.
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: Reached target Sockets.
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: Finished Create User's Volatile Files and Directories.
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: Reached target Basic System.
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: Reached target Main User Target.
Dec 06 09:35:43 np0005548789.localdomain systemd[154885]: Startup finished in 141ms.
Dec 06 09:35:43 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:35:43 np0005548789.localdomain systemd-sysv-generator[154942]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:43 np0005548789.localdomain systemd-rc-local-generator[154937]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: Started User Manager for UID 0.
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: Started ovn_controller container.
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: Started Session c12 of User root.
Dec 06 09:35:43 np0005548789.localdomain sudo[154791]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: INFO:__main__:Validating config file
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: INFO:__main__:Writing out command to execute
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: session-c12.scope: Deactivated successfully.
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: ++ cat /run_command
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: + ARGS=
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: + sudo kolla_copy_cacerts
Dec 06 09:35:43 np0005548789.localdomain sshd[154794]: Received disconnect from 64.227.156.63 port 58990:11: Bye Bye [preauth]
Dec 06 09:35:43 np0005548789.localdomain sshd[154794]: Disconnected from authenticating user root 64.227.156.63 port 58990 [preauth]
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: Started Session c13 of User root.
Dec 06 09:35:43 np0005548789.localdomain systemd[1]: session-c13.scope: Deactivated successfully.
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: + [[ ! -n '' ]]
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: + . kolla_extend_start
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\'''
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: + umask 0022
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:43Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 06 09:35:43 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:43Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00013|main|INFO|OVS feature set changed, force recompute.
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00017|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00018|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00020|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00021|main|INFO|OVS feature set changed, force recompute.
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00022|ovn_bfd|INFO|Disabled BFD on interface ovn-bd2a75-0
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-ca3c1f-0
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-afa07b-0
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00025|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00026|binding|INFO|Claiming lport 86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b for this chassis.
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00027|binding|INFO|86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b: Claiming fa:16:3e:64:77:f3 192.168.0.162
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00028|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00029|binding|INFO|Removing lport 86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b ovn-installed in OVS
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00030|ovn_bfd|INFO|Enabled BFD on interface ovn-bd2a75-0
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-ca3c1f-0
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-afa07b-0
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00033|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00034|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00035|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00036|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00037|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 09:35:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:44Z|00038|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 09:35:44 np0005548789.localdomain sshd[154993]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:35:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52518 DF PROTO=TCP SPT=57100 DPT=9101 SEQ=4215202201 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D0E4600000000001030307) 
Dec 06 09:35:44 np0005548789.localdomain sudo[155053]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlvugatfljwajyqcreafdfziduntvuui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013744.2411485-1767-247642986407328/AnsiballZ_command.py
Dec 06 09:35:44 np0005548789.localdomain sudo[155053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:44 np0005548789.localdomain python3.9[155055]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:44 np0005548789.localdomain ovs-vsctl[155056]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 06 09:35:44 np0005548789.localdomain sudo[155053]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:45 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:45Z|00039|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 09:35:45 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:45Z|00040|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 09:35:45 np0005548789.localdomain sudo[155146]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iukakdldduuaowykwvxpoibeodujpfhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013744.9130383-1791-87570133191262/AnsiballZ_command.py
Dec 06 09:35:45 np0005548789.localdomain sudo[155146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:45 np0005548789.localdomain python3.9[155148]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:45 np0005548789.localdomain ovs-vsctl[155150]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 06 09:35:45 np0005548789.localdomain sudo[155146]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:45 np0005548789.localdomain sudo[155166]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:35:45 np0005548789.localdomain sudo[155166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:35:45 np0005548789.localdomain sudo[155166]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:45 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:45Z|00041|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 09:35:45 np0005548789.localdomain sudo[155181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:35:45 np0005548789.localdomain sudo[155181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:35:46 np0005548789.localdomain sudo[155289]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogeaqyhqcbvsphzsclpdddnqelmqlnda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013746.0477552-1833-89446903628497/AnsiballZ_command.py
Dec 06 09:35:46 np0005548789.localdomain sudo[155289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:46 np0005548789.localdomain sudo[155181]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:46 np0005548789.localdomain sshd[154993]: Connection closed by 45.78.222.162 port 49394 [preauth]
Dec 06 09:35:46 np0005548789.localdomain python3.9[155293]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:46 np0005548789.localdomain ovs-vsctl[155306]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 06 09:35:46 np0005548789.localdomain sudo[155289]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:47 np0005548789.localdomain sshd[148163]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:35:47 np0005548789.localdomain systemd[1]: session-50.scope: Deactivated successfully.
Dec 06 09:35:47 np0005548789.localdomain systemd[1]: session-50.scope: Consumed 39.693s CPU time.
Dec 06 09:35:47 np0005548789.localdomain systemd-logind[766]: Session 50 logged out. Waiting for processes to exit.
Dec 06 09:35:47 np0005548789.localdomain systemd-logind[766]: Removed session 50.
Dec 06 09:35:47 np0005548789.localdomain sudo[155321]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:35:47 np0005548789.localdomain sudo[155321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:35:47 np0005548789.localdomain sudo[155321]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52520 DF PROTO=TCP SPT=57100 DPT=9101 SEQ=4215202201 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D0F06F0000000001030307) 
Dec 06 09:35:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15410 DF PROTO=TCP SPT=45386 DPT=9105 SEQ=486432858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D0F9AF0000000001030307) 
Dec 06 09:35:52 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:52Z|00042|binding|INFO|Setting lport 86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b ovn-installed in OVS
Dec 06 09:35:52 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:35:52Z|00043|binding|INFO|Setting lport 86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b up in Southbound
Dec 06 09:35:52 np0005548789.localdomain sshd[155336]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:35:52 np0005548789.localdomain sshd[155336]: Accepted publickey for zuul from 192.168.122.30 port 44858 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:35:52 np0005548789.localdomain systemd-logind[766]: New session 52 of user zuul.
Dec 06 09:35:52 np0005548789.localdomain systemd[1]: Started Session 52 of User zuul.
Dec 06 09:35:52 np0005548789.localdomain sshd[155336]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:35:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15411 DF PROTO=TCP SPT=45386 DPT=9105 SEQ=486432858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D109700000000001030307) 
Dec 06 09:35:53 np0005548789.localdomain python3.9[155429]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:35:54 np0005548789.localdomain systemd[1]: Stopping User Manager for UID 0...
Dec 06 09:35:54 np0005548789.localdomain systemd[154885]: Activating special unit Exit the Session...
Dec 06 09:35:54 np0005548789.localdomain systemd[154885]: Stopped target Main User Target.
Dec 06 09:35:54 np0005548789.localdomain systemd[154885]: Stopped target Basic System.
Dec 06 09:35:54 np0005548789.localdomain systemd[154885]: Stopped target Paths.
Dec 06 09:35:54 np0005548789.localdomain systemd[154885]: Stopped target Sockets.
Dec 06 09:35:54 np0005548789.localdomain systemd[154885]: Stopped target Timers.
Dec 06 09:35:54 np0005548789.localdomain systemd[154885]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 09:35:54 np0005548789.localdomain systemd[154885]: Closed D-Bus User Message Bus Socket.
Dec 06 09:35:54 np0005548789.localdomain systemd[154885]: Stopped Create User's Volatile Files and Directories.
Dec 06 09:35:54 np0005548789.localdomain systemd[154885]: Removed slice User Application Slice.
Dec 06 09:35:54 np0005548789.localdomain systemd[154885]: Reached target Shutdown.
Dec 06 09:35:54 np0005548789.localdomain systemd[154885]: Finished Exit the Session.
Dec 06 09:35:54 np0005548789.localdomain systemd[154885]: Reached target Exit the Session.
Dec 06 09:35:54 np0005548789.localdomain systemd[1]: user@0.service: Deactivated successfully.
Dec 06 09:35:54 np0005548789.localdomain systemd[1]: Stopped User Manager for UID 0.
Dec 06 09:35:54 np0005548789.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 06 09:35:54 np0005548789.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 06 09:35:54 np0005548789.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 06 09:35:54 np0005548789.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 06 09:35:54 np0005548789.localdomain systemd[1]: Removed slice User Slice of UID 0.
Dec 06 09:35:54 np0005548789.localdomain sshd[155449]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:35:54 np0005548789.localdomain sudo[155526]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ziebxdqbfnxtebsmewloggibiwezwbrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013754.442289-63-104263449967290/AnsiballZ_file.py
Dec 06 09:35:54 np0005548789.localdomain sudo[155526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:54 np0005548789.localdomain sshd[155449]: Received disconnect from 12.156.67.18 port 36496:11: Bye Bye [preauth]
Dec 06 09:35:54 np0005548789.localdomain sshd[155449]: Disconnected from authenticating user root 12.156.67.18 port 36496 [preauth]
Dec 06 09:35:55 np0005548789.localdomain python3.9[155528]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:55 np0005548789.localdomain sudo[155526]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:55 np0005548789.localdomain sudo[155618]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmpjrwedwxadjykvpfyaoirkyjztselv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013755.2116542-63-255092334762501/AnsiballZ_file.py
Dec 06 09:35:55 np0005548789.localdomain sudo[155618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:55 np0005548789.localdomain python3.9[155620]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:55 np0005548789.localdomain sudo[155618]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2004 DF PROTO=TCP SPT=40512 DPT=9882 SEQ=469531224 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D111EF0000000001030307) 
Dec 06 09:35:56 np0005548789.localdomain sudo[155710]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iscbigykgczczticasfqeoqkpywiuyyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013755.8689828-63-94782599547644/AnsiballZ_file.py
Dec 06 09:35:56 np0005548789.localdomain sudo[155710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:56 np0005548789.localdomain python3.9[155712]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:56 np0005548789.localdomain sudo[155710]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:56 np0005548789.localdomain sudo[155802]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pahazfmouzaumhqmyhtzjqywlcbphawk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013756.492048-63-105883089322893/AnsiballZ_file.py
Dec 06 09:35:56 np0005548789.localdomain sudo[155802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:56 np0005548789.localdomain sshd[155805]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:35:56 np0005548789.localdomain python3.9[155804]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:56 np0005548789.localdomain sudo[155802]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:57 np0005548789.localdomain sudo[155896]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixgwdeeoxdhcbbypuhcfzijkkmvtysrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013757.0699182-63-149939618936677/AnsiballZ_file.py
Dec 06 09:35:57 np0005548789.localdomain sudo[155896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:57 np0005548789.localdomain python3.9[155898]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:35:57 np0005548789.localdomain sudo[155896]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:58 np0005548789.localdomain python3.9[155988]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:35:58 np0005548789.localdomain sshd[155805]: Received disconnect from 103.157.25.60 port 47974:11: Bye Bye [preauth]
Dec 06 09:35:58 np0005548789.localdomain sshd[155805]: Disconnected from authenticating user root 103.157.25.60 port 47974 [preauth]
Dec 06 09:35:58 np0005548789.localdomain sudo[156078]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxjsdhurjjftkumjowyzxhzrvoguhiai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013758.4942727-195-273866266706675/AnsiballZ_seboolean.py
Dec 06 09:35:58 np0005548789.localdomain sudo[156078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:59 np0005548789.localdomain python3.9[156080]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 06 09:35:59 np0005548789.localdomain sudo[156078]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52522 DF PROTO=TCP SPT=57100 DPT=9101 SEQ=4215202201 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D11FEF0000000001030307) 
Dec 06 09:36:00 np0005548789.localdomain python3.9[156170]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:00 np0005548789.localdomain python3.9[156243]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013759.4515567-219-15692034506086/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:01 np0005548789.localdomain python3.9[156333]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15412 DF PROTO=TCP SPT=45386 DPT=9105 SEQ=486432858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D129EF0000000001030307) 
Dec 06 09:36:02 np0005548789.localdomain python3.9[156407]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013760.890937-264-80620613485043/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:03 np0005548789.localdomain sudo[156497]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-raxbdckkqzbbwsztfcftxapnxiwtgqmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013762.8887904-315-109112151385138/AnsiballZ_setup.py
Dec 06 09:36:03 np0005548789.localdomain sudo[156497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:03 np0005548789.localdomain python3.9[156499]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:36:03 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:36:03Z|00044|memory|INFO|17148 kB peak resident set size after 19.6 seconds
Dec 06 09:36:03 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:36:03Z|00045|memory|INFO|idl-cells-OVN_Southbound:4028 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:76 lflow-cache-entries-cache-matches:195 lflow-cache-size-KB:289 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:154 ofctrl_installed_flow_usage-KB:111 ofctrl_sb_flow_ref_usage-KB:67
Dec 06 09:36:03 np0005548789.localdomain sudo[156497]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:04 np0005548789.localdomain sudo[156551]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkrtgpfxudhvdhqihntjisidymzjbhss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013762.8887904-315-109112151385138/AnsiballZ_dnf.py
Dec 06 09:36:04 np0005548789.localdomain sudo[156551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:04 np0005548789.localdomain sshd[156554]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:36:04 np0005548789.localdomain python3.9[156553]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:36:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62977 DF PROTO=TCP SPT=55758 DPT=9102 SEQ=1114809180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D134300000000001030307) 
Dec 06 09:36:07 np0005548789.localdomain sudo[156551]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17637 DF PROTO=TCP SPT=60550 DPT=9102 SEQ=3977837278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D13FEF0000000001030307) 
Dec 06 09:36:08 np0005548789.localdomain sudo[156647]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rktrylplehcprnxttgkqcdqhojjfdxor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013767.7570083-351-128594005252960/AnsiballZ_systemd.py
Dec 06 09:36:08 np0005548789.localdomain sudo[156647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:08 np0005548789.localdomain sshd[156554]: Received disconnect from 179.33.210.213 port 60906:11: Bye Bye [preauth]
Dec 06 09:36:08 np0005548789.localdomain sshd[156554]: Disconnected from authenticating user root 179.33.210.213 port 60906 [preauth]
Dec 06 09:36:08 np0005548789.localdomain python3.9[156649]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:36:08 np0005548789.localdomain sudo[156647]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:09 np0005548789.localdomain python3.9[156742]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:09 np0005548789.localdomain python3.9[156813]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013768.8289354-375-253776304143022/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:10 np0005548789.localdomain python3.9[156903]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:10 np0005548789.localdomain python3.9[156974]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013769.83361-375-181523447643519/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62979 DF PROTO=TCP SPT=55758 DPT=9102 SEQ=1114809180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D14BF00000000001030307) 
Dec 06 09:36:12 np0005548789.localdomain python3.9[157064]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:12 np0005548789.localdomain python3.9[157135]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013771.6072345-507-171255576961655/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:13 np0005548789.localdomain python3.9[157225]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:36:13 np0005548789.localdomain systemd[1]: tmp-crun.O4f3va.mount: Deactivated successfully.
Dec 06 09:36:13 np0005548789.localdomain podman[157297]: 2025-12-06 09:36:13.931720814 +0000 UTC m=+0.083535455 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:36:13 np0005548789.localdomain podman[157297]: 2025-12-06 09:36:13.998398188 +0000 UTC m=+0.150212829 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Dec 06 09:36:14 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:36:14 np0005548789.localdomain python3.9[157296]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013772.5851123-507-272127474319406/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50508 DF PROTO=TCP SPT=33158 DPT=9101 SEQ=2459792528 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D159900000000001030307) 
Dec 06 09:36:14 np0005548789.localdomain python3.9[157411]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:36:15 np0005548789.localdomain sudo[157503]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndlfeqejtfztqqyohxdvtthhottjbnrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013775.7362819-621-87005851624692/AnsiballZ_file.py
Dec 06 09:36:15 np0005548789.localdomain sudo[157503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:16 np0005548789.localdomain python3.9[157505]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:16 np0005548789.localdomain sudo[157503]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:16 np0005548789.localdomain sudo[157595]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swqbdcpmhrgpibbghvaruqophgiweuat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013776.3367379-645-97729198054842/AnsiballZ_stat.py
Dec 06 09:36:16 np0005548789.localdomain sudo[157595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:16 np0005548789.localdomain python3.9[157597]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:16 np0005548789.localdomain sudo[157595]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:16 np0005548789.localdomain sudo[157643]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpoopjkuqpkgmxsppxxuqlzojkwcdxyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013776.3367379-645-97729198054842/AnsiballZ_file.py
Dec 06 09:36:16 np0005548789.localdomain sudo[157643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:17 np0005548789.localdomain python3.9[157645]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:17 np0005548789.localdomain sudo[157643]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50510 DF PROTO=TCP SPT=33158 DPT=9101 SEQ=2459792528 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D165AF0000000001030307) 
Dec 06 09:36:17 np0005548789.localdomain sudo[157735]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uchpueagdikuthzfsstrjuasxduopzii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013777.2307448-645-120623166050464/AnsiballZ_stat.py
Dec 06 09:36:17 np0005548789.localdomain sudo[157735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:17 np0005548789.localdomain python3.9[157737]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:17 np0005548789.localdomain sudo[157735]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:17 np0005548789.localdomain sudo[157783]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-moewqaorvkkjhhaqcgkatdrzukbnheqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013777.2307448-645-120623166050464/AnsiballZ_file.py
Dec 06 09:36:17 np0005548789.localdomain sudo[157783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:18 np0005548789.localdomain python3.9[157785]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:18 np0005548789.localdomain sudo[157783]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:18 np0005548789.localdomain sudo[157875]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nglvpknwlqlfofggvifkvvwnxobnldhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013778.4989078-714-7650645590016/AnsiballZ_file.py
Dec 06 09:36:18 np0005548789.localdomain sudo[157875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:18 np0005548789.localdomain python3.9[157877]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:18 np0005548789.localdomain sudo[157875]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:19 np0005548789.localdomain sudo[157967]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvedejzvrumcplzzaxnbbbwvlzbslvkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013779.1166012-739-271628311651275/AnsiballZ_stat.py
Dec 06 09:36:19 np0005548789.localdomain sudo[157967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:19 np0005548789.localdomain python3.9[157969]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:19 np0005548789.localdomain sudo[157967]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14800 DF PROTO=TCP SPT=40724 DPT=9105 SEQ=2893121207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D16EAF0000000001030307) 
Dec 06 09:36:19 np0005548789.localdomain sudo[158015]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukxhabwivsrkjradbapgyajvtowxbpsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013779.1166012-739-271628311651275/AnsiballZ_file.py
Dec 06 09:36:19 np0005548789.localdomain sudo[158015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:20 np0005548789.localdomain python3.9[158017]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:20 np0005548789.localdomain sudo[158015]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:20 np0005548789.localdomain sudo[158107]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrzezfxuzchvfgwmjuxpzxzsdjtborup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013780.257388-774-22498248265854/AnsiballZ_stat.py
Dec 06 09:36:20 np0005548789.localdomain sudo[158107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:20 np0005548789.localdomain python3.9[158109]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:20 np0005548789.localdomain sudo[158107]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:20 np0005548789.localdomain sudo[158155]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhqvkepptoyvlwlaxymlbdouecyraeos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013780.257388-774-22498248265854/AnsiballZ_file.py
Dec 06 09:36:20 np0005548789.localdomain sudo[158155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:21 np0005548789.localdomain python3.9[158157]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:21 np0005548789.localdomain sudo[158155]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:21 np0005548789.localdomain sudo[158247]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-suhotmzqmwrzbyryyyxtxnmkwcxtgqbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013781.3930297-810-210730043950363/AnsiballZ_systemd.py
Dec 06 09:36:21 np0005548789.localdomain sudo[158247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:22 np0005548789.localdomain python3.9[158249]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:36:22 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:36:22 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:36:22Z|00046|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Dec 06 09:36:22 np0005548789.localdomain systemd-sysv-generator[158273]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:36:22 np0005548789.localdomain systemd-rc-local-generator[158270]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:36:22 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:36:22 np0005548789.localdomain sudo[158247]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:22 np0005548789.localdomain sudo[158376]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywunesdwgrlvtpgmsbeubgvoihtfbcec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013782.5097861-834-241517541920991/AnsiballZ_stat.py
Dec 06 09:36:22 np0005548789.localdomain sudo[158376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:23 np0005548789.localdomain python3.9[158378]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:23 np0005548789.localdomain sudo[158376]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:23 np0005548789.localdomain sudo[158424]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwndiitdyohybbxrkgjbmmilqnvotzvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013782.5097861-834-241517541920991/AnsiballZ_file.py
Dec 06 09:36:23 np0005548789.localdomain sudo[158424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:23 np0005548789.localdomain python3.9[158426]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:23 np0005548789.localdomain sudo[158424]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14801 DF PROTO=TCP SPT=40724 DPT=9105 SEQ=2893121207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D17E700000000001030307) 
Dec 06 09:36:23 np0005548789.localdomain sudo[158516]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lblsdxwykttayzypwtevfpcongmcyfrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013783.671328-870-129953431721316/AnsiballZ_stat.py
Dec 06 09:36:23 np0005548789.localdomain sudo[158516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:24 np0005548789.localdomain python3.9[158518]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:24 np0005548789.localdomain sudo[158516]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:24 np0005548789.localdomain sudo[158564]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndivyzjguaohtlstskjekhqoxldeygjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013783.671328-870-129953431721316/AnsiballZ_file.py
Dec 06 09:36:24 np0005548789.localdomain sudo[158564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:24 np0005548789.localdomain python3.9[158566]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:24 np0005548789.localdomain sudo[158564]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:24 np0005548789.localdomain sudo[158656]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnfvqxkakgouwufaktgsdiozvtmtscto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013784.770934-906-115932914306318/AnsiballZ_systemd.py
Dec 06 09:36:24 np0005548789.localdomain sudo[158656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:25 np0005548789.localdomain python3.9[158658]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:36:25 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:36:25 np0005548789.localdomain systemd-sysv-generator[158683]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:36:25 np0005548789.localdomain systemd-rc-local-generator[158679]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:36:25 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:36:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=375 DF PROTO=TCP SPT=42346 DPT=9882 SEQ=4209804336 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D185EF0000000001030307) 
Dec 06 09:36:25 np0005548789.localdomain systemd[1]: Starting Create netns directory...
Dec 06 09:36:25 np0005548789.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:36:25 np0005548789.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:36:25 np0005548789.localdomain systemd[1]: Finished Create netns directory.
Dec 06 09:36:25 np0005548789.localdomain sudo[158656]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:26 np0005548789.localdomain sshd[158716]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:36:26 np0005548789.localdomain sudo[158792]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvlndzeildzxttvxtxvzesbqichupwcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013786.7005606-936-139936627922309/AnsiballZ_file.py
Dec 06 09:36:26 np0005548789.localdomain sudo[158792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:27 np0005548789.localdomain sshd[158795]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:36:27 np0005548789.localdomain python3.9[158794]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:27 np0005548789.localdomain sudo[158792]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:27 np0005548789.localdomain sudo[158886]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxecjhyrrjtellhebynqngqhinhcljqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013787.351369-960-86414425805267/AnsiballZ_stat.py
Dec 06 09:36:27 np0005548789.localdomain sudo[158886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:27 np0005548789.localdomain python3.9[158888]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:27 np0005548789.localdomain sudo[158886]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:28 np0005548789.localdomain sudo[158959]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnykizzgiymtqtaigmvwpsvfwtikdpyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013787.351369-960-86414425805267/AnsiballZ_copy.py
Dec 06 09:36:28 np0005548789.localdomain sudo[158959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:28 np0005548789.localdomain python3.9[158961]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013787.351369-960-86414425805267/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:28 np0005548789.localdomain sudo[158959]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:28 np0005548789.localdomain sshd[158795]: Received disconnect from 118.193.38.207 port 57176:11: Bye Bye [preauth]
Dec 06 09:36:28 np0005548789.localdomain sshd[158795]: Disconnected from authenticating user root 118.193.38.207 port 57176 [preauth]
Dec 06 09:36:28 np0005548789.localdomain sshd[158963]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:36:29 np0005548789.localdomain sshd[158963]: Received disconnect from 81.192.46.35 port 42794:11: Bye Bye [preauth]
Dec 06 09:36:29 np0005548789.localdomain sshd[158963]: Disconnected from authenticating user root 81.192.46.35 port 42794 [preauth]
Dec 06 09:36:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50512 DF PROTO=TCP SPT=33158 DPT=9101 SEQ=2459792528 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D195EF0000000001030307) 
Dec 06 09:36:29 np0005548789.localdomain sudo[159054]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utfgxxcefimvhhhtdrjbdvqneclkipaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013789.5571947-1011-176159965701226/AnsiballZ_file.py
Dec 06 09:36:29 np0005548789.localdomain sudo[159054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:30 np0005548789.localdomain python3.9[159056]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:36:30 np0005548789.localdomain sudo[159054]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:30 np0005548789.localdomain sshd[158716]: Received disconnect from 103.192.152.59 port 50866:11: Bye Bye [preauth]
Dec 06 09:36:30 np0005548789.localdomain sshd[158716]: Disconnected from authenticating user root 103.192.152.59 port 50866 [preauth]
Dec 06 09:36:30 np0005548789.localdomain sudo[159146]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kidirkryjylgubciagmqsugqcgvvpyvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013790.2731814-1035-196142432322869/AnsiballZ_stat.py
Dec 06 09:36:30 np0005548789.localdomain sudo[159146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:30 np0005548789.localdomain python3.9[159148]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:36:30 np0005548789.localdomain sudo[159146]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:31 np0005548789.localdomain sudo[159221]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkoelvzrnqehzyeqoicshydphfybpwkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013790.2731814-1035-196142432322869/AnsiballZ_copy.py
Dec 06 09:36:31 np0005548789.localdomain sudo[159221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:31 np0005548789.localdomain python3.9[159223]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013790.2731814-1035-196142432322869/.source.json _original_basename=.zkyi8gbk follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:31 np0005548789.localdomain sudo[159221]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:31 np0005548789.localdomain sudo[159313]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dcmhwddnxstmexgxqnbysltpvmjqrrxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013791.4640183-1080-12720861471347/AnsiballZ_file.py
Dec 06 09:36:31 np0005548789.localdomain sudo[159313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14802 DF PROTO=TCP SPT=40724 DPT=9105 SEQ=2893121207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D19DEF0000000001030307) 
Dec 06 09:36:31 np0005548789.localdomain python3.9[159315]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:31 np0005548789.localdomain sudo[159313]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:32 np0005548789.localdomain sudo[159405]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrewbnaheyvkueevbwkbbatdjuqolkew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013792.186444-1104-23610675425164/AnsiballZ_stat.py
Dec 06 09:36:32 np0005548789.localdomain sudo[159405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:32 np0005548789.localdomain sudo[159405]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:32 np0005548789.localdomain sudo[159478]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axjhgmcycxmzzfnoxdhbuvdaskzvkujj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013792.186444-1104-23610675425164/AnsiballZ_copy.py
Dec 06 09:36:32 np0005548789.localdomain sudo[159478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:33 np0005548789.localdomain sudo[159478]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:34 np0005548789.localdomain sudo[159570]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqcgsjtqorbagyuzsammyolryddcuufa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013793.627966-1155-166605931688474/AnsiballZ_container_config_data.py
Dec 06 09:36:34 np0005548789.localdomain sudo[159570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:34 np0005548789.localdomain python3.9[159572]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 06 09:36:34 np0005548789.localdomain sudo[159570]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31750 DF PROTO=TCP SPT=34236 DPT=9102 SEQ=1521893675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D1A96F0000000001030307) 
Dec 06 09:36:34 np0005548789.localdomain sudo[159662]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzixekxbmmxynsmszmsetxnphcprodmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013794.4851441-1182-108659501793475/AnsiballZ_container_config_hash.py
Dec 06 09:36:34 np0005548789.localdomain sudo[159662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:35 np0005548789.localdomain python3.9[159664]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:36:35 np0005548789.localdomain sudo[159662]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:35 np0005548789.localdomain sudo[159754]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agfnamurktbygvregqeavhagkfvkdypc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013795.406317-1209-217115040710646/AnsiballZ_podman_container_info.py
Dec 06 09:36:35 np0005548789.localdomain sudo[159754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:36 np0005548789.localdomain python3.9[159756]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:36:36 np0005548789.localdomain sudo[159754]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17809 DF PROTO=TCP SPT=45458 DPT=9102 SEQ=3201948792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D1B5EF0000000001030307) 
Dec 06 09:36:39 np0005548789.localdomain sudo[159872]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-auufolofumgtcifvjobxbdwjixqhbbtl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765013799.4138715-1248-222380878439663/AnsiballZ_edpm_container_manage.py
Dec 06 09:36:39 np0005548789.localdomain sudo[159872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:40 np0005548789.localdomain python3[159874]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:36:40 np0005548789.localdomain python3[159874]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9",
                                                                    "Digest": "sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:29:20.327314945Z",
                                                                    "Config": {
                                                                         "User": "neutron",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 784141054,
                                                                    "VirtualSize": 784141054,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53/diff:/var/lib/containers/storage/overlay/2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:75abaaa40a93c0e2bba524b6f8d4eb5f1c4c9a33db70c892c7582ec5b0827e5e",
                                                                              "sha256:01f43f620d1ea2a9e584abe0cc14c336bedcf55765127c000d743f536dd36f25",
                                                                              "sha256:0bf5bd378602f28be423f5e84abddff3b103396fae3c167031b6e3fcfcf6f120"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "neutron",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:15:50.18897737Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:15:50.762138914Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:13.720608935Z",
                                                                              "created_by": "/bin/sh -c dnf -y install iputils net-tools openstack-neutron openstack-neutron-rpc-server openstack-neutron-ml2 openvswitch python3-networking-baremetal python3-openvswitch python3-unbound && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:27.636630318Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/neutron-base/neutron_sudoers /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:40.546186661Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:52.875291445Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:27:22.608862134Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:28:35.764559413Z",
                                                                              "created_by": "/bin/sh -c dnf -y install libseccomp podman && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:28:40.983506098Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:28:44.803537768Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-agent-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:20.324920691Z",
                                                                              "created_by": "/bin/sh -c dnf -y install python3-networking-ovn-metadata-agent && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:20.324983383Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:24.215761584Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 06 09:36:40 np0005548789.localdomain podman[159923]: 2025-12-06 09:36:40.495160274 +0000 UTC m=+0.084974068 container remove 87083982c89f1c3d9da67a4aa7fe7496e6609bad80280d2bae3a3e4e828efe54 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '270cf6e6b67cba1ef197c7fa89d5bb20'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:36:40 np0005548789.localdomain python3[159874]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent
Dec 06 09:36:40 np0005548789.localdomain podman[159937]: 
Dec 06 09:36:40 np0005548789.localdomain podman[159937]: 2025-12-06 09:36:40.600868441 +0000 UTC m=+0.088418625 container create 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 06 09:36:40 np0005548789.localdomain podman[159937]: 2025-12-06 09:36:40.557558037 +0000 UTC m=+0.045108231 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 06 09:36:40 np0005548789.localdomain python3[159874]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume 
/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 06 09:36:40 np0005548789.localdomain sudo[159872]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31752 DF PROTO=TCP SPT=34236 DPT=9102 SEQ=1521893675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D1C12F0000000001030307) 
Dec 06 09:36:41 np0005548789.localdomain sudo[160065]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbgpduulziufwaozizodncnixgxbojfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013801.2669847-1272-169206679963051/AnsiballZ_stat.py
Dec 06 09:36:41 np0005548789.localdomain sudo[160065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:41 np0005548789.localdomain python3.9[160067]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:36:41 np0005548789.localdomain sudo[160065]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:42 np0005548789.localdomain sudo[160159]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcrbqmcfyuphrauhwdibrsbxdzjgibit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013802.019959-1299-201237917293333/AnsiballZ_file.py
Dec 06 09:36:42 np0005548789.localdomain sudo[160159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:42 np0005548789.localdomain python3.9[160161]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:42 np0005548789.localdomain sudo[160159]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:42 np0005548789.localdomain sudo[160205]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbtzftudoerncbgfvtyoqioihxmeghsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013802.019959-1299-201237917293333/AnsiballZ_stat.py
Dec 06 09:36:42 np0005548789.localdomain sudo[160205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:42 np0005548789.localdomain python3.9[160207]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:36:42 np0005548789.localdomain sudo[160205]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:43 np0005548789.localdomain sudo[160296]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpngjcxlqdxyzwhhllccyrgcmxtzhnyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013802.9471653-1299-237582816213639/AnsiballZ_copy.py
Dec 06 09:36:43 np0005548789.localdomain sudo[160296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:43 np0005548789.localdomain python3.9[160298]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765013802.9471653-1299-237582816213639/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:43 np0005548789.localdomain sudo[160296]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:43 np0005548789.localdomain sudo[160342]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzchnvyhywypgtezgehmemimzwlnwxkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013802.9471653-1299-237582816213639/AnsiballZ_systemd.py
Dec 06 09:36:43 np0005548789.localdomain sudo[160342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:44 np0005548789.localdomain python3.9[160344]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:36:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:36:44 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:36:44 np0005548789.localdomain podman[160346]: 2025-12-06 09:36:44.178011986 +0000 UTC m=+0.083978288 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Dec 06 09:36:44 np0005548789.localdomain systemd-rc-local-generator[160387]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:36:44 np0005548789.localdomain systemd-sysv-generator[160390]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:36:44 np0005548789.localdomain podman[160346]: 2025-12-06 09:36:44.223670403 +0000 UTC m=+0.129636725 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Dec 06 09:36:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:36:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59073 DF PROTO=TCP SPT=40368 DPT=9101 SEQ=3006578736 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D1CEC00000000001030307) 
Dec 06 09:36:44 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:36:44 np0005548789.localdomain sudo[160342]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:44 np0005548789.localdomain sudo[160448]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmmpjedrpbalfyybxpetualasmalbsoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013802.9471653-1299-237582816213639/AnsiballZ_systemd.py
Dec 06 09:36:44 np0005548789.localdomain sudo[160448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:45 np0005548789.localdomain python3.9[160450]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:36:45 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:36:45 np0005548789.localdomain systemd-sysv-generator[160480]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:36:45 np0005548789.localdomain systemd-rc-local-generator[160474]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:36:45 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:36:45 np0005548789.localdomain systemd[1]: Starting ovn_metadata_agent container...
Dec 06 09:36:45 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:36:45 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec60694536734bdc4f05abf8c315b77759f80d4c5e7f43137384cbac97f56aea/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 06 09:36:45 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec60694536734bdc4f05abf8c315b77759f80d4c5e7f43137384cbac97f56aea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 09:36:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:36:45 np0005548789.localdomain podman[160491]: 2025-12-06 09:36:45.529283373 +0000 UTC m=+0.126740005 container init 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: + sudo -E kolla_set_configs
Dec 06 09:36:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:36:45 np0005548789.localdomain podman[160491]: 2025-12-06 09:36:45.565046325 +0000 UTC m=+0.162502927 container start 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 09:36:45 np0005548789.localdomain edpm-start-podman-container[160491]: ovn_metadata_agent
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Validating config file
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Copying service configuration files
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Writing out command to execute
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/652b6bdc-40ce-45b7-8aa5-3bca79987993.pid.haproxy
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/652b6bdc-40ce-45b7-8aa5-3bca79987993.conf
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: ++ cat /run_command
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: + CMD=neutron-ovn-metadata-agent
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: + ARGS=
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: + sudo kolla_copy_cacerts
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: + [[ ! -n '' ]]
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: + . kolla_extend_start
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: Running command: 'neutron-ovn-metadata-agent'
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: + umask 0022
Dec 06 09:36:45 np0005548789.localdomain ovn_metadata_agent[160504]: + exec neutron-ovn-metadata-agent
Dec 06 09:36:45 np0005548789.localdomain podman[160512]: 2025-12-06 09:36:45.636409224 +0000 UTC m=+0.068187813 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:36:45 np0005548789.localdomain edpm-start-podman-container[160490]: Creating additional drop-in dependency for "ovn_metadata_agent" (5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999)
Dec 06 09:36:45 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:36:45 np0005548789.localdomain podman[160512]: 2025-12-06 09:36:45.719207684 +0000 UTC m=+0.150986303 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 06 09:36:45 np0005548789.localdomain systemd-rc-local-generator[160575]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:36:45 np0005548789.localdomain systemd-sysv-generator[160578]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:36:45 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:36:45 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:36:45 np0005548789.localdomain systemd[1]: Started ovn_metadata_agent container.
Dec 06 09:36:46 np0005548789.localdomain sudo[160448]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.219 160509 INFO neutron.common.config [-] Logging enabled!
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.220 160509 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.220 160509 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.220 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.221 160509 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.221 160509 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.221 160509 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.221 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.221 160509 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.221 160509 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.221 160509 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.222 160509 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.222 160509 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.222 160509 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.222 160509 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.222 160509 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.222 160509 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.222 160509 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.223 160509 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.223 160509 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.223 160509 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.223 160509 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.223 160509 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.223 160509 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.223 160509 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.224 160509 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.224 160509 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.224 160509 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.224 160509 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.224 160509 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.224 160509 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.224 160509 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.225 160509 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.225 160509 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.225 160509 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.225 160509 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.225 160509 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.225 160509 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.226 160509 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = np0005548789.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.226 160509 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.226 160509 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.226 160509 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.226 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.226 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.226 160509 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.227 160509 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.227 160509 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.227 160509 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.227 160509 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.227 160509 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.227 160509 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.227 160509 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.228 160509 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.228 160509 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.228 160509 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.228 160509 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.228 160509 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.228 160509 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.228 160509 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.229 160509 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.229 160509 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.229 160509 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.229 160509 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.229 160509 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.229 160509 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.230 160509 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.230 160509 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.230 160509 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.230 160509 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.230 160509 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.230 160509 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.230 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.231 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.231 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.231 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.231 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.231 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.231 160509 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.232 160509 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.232 160509 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.232 160509 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.232 160509 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.232 160509 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.232 160509 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.233 160509 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.233 160509 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.233 160509 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.233 160509 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.233 160509 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.233 160509 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.234 160509 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.234 160509 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.234 160509 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.234 160509 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.234 160509 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.234 160509 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.234 160509 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.235 160509 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.235 160509 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.235 160509 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.235 160509 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.235 160509 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.235 160509 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.235 160509 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.235 160509 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.236 160509 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.236 160509 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.236 160509 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.236 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.236 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.236 160509 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.237 160509 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.237 160509 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.237 160509 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.237 160509 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.237 160509 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.237 160509 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.237 160509 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.238 160509 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.238 160509 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.238 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.238 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.238 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.238 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.239 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.239 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.239 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.239 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.239 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.239 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.239 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.240 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.240 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.240 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.240 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.241 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.241 160509 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.241 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.241 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.241 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.241 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.242 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.242 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.242 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.242 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.242 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.242 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.242 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.243 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.243 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.243 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.243 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.243 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.243 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.243 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.244 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.244 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.244 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain sudo[160607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.244 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.244 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.244 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.244 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.245 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.245 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.245 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.245 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.245 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.245 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.245 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.246 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.246 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.246 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.246 160509 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.246 160509 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.246 160509 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.247 160509 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.247 160509 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.247 160509 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.247 160509 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.247 160509 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.247 160509 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.247 160509 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain sudo[160607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.247 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.248 160509 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.248 160509 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.248 160509 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.248 160509 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.248 160509 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.248 160509 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.249 160509 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.249 160509 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.249 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.249 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.249 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.249 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.249 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain sudo[160607]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.250 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.250 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.250 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.250 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.250 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.250 160509 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.250 160509 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.251 160509 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.251 160509 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.251 160509 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.251 160509 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.251 160509 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.251 160509 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.251 160509 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.252 160509 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.252 160509 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.252 160509 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.252 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.252 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.252 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.253 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.253 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.253 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.253 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.253 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.253 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.253 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.254 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.254 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.254 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.254 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.254 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.254 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.254 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.255 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.255 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.255 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.255 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.255 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.255 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.255 160509 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.256 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.256 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.256 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.256 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.256 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.256 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.256 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.257 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.257 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.257 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.257 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.257 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.258 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.258 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.258 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.258 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.258 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.258 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.258 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.259 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.259 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.259 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.259 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.259 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.259 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.259 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.260 160509 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.260 160509 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.260 160509 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.260 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.260 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.260 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.261 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.261 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.261 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.261 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.261 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.261 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.261 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.262 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.262 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.262 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.262 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.262 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.262 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.263 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.263 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.263 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.263 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.263 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.263 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.263 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.264 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.264 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.264 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.264 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.264 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.264 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.264 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.265 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.265 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.265 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.265 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.265 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.265 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.265 160509 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.266 160509 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.277 160509 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.277 160509 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.277 160509 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.278 160509 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.278 160509 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.296 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name b142a5ef-fbed-4e92-aa78-e3ad080c6370 (UUID: b142a5ef-fbed-4e92-aa78-e3ad080c6370) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Dec 06 09:36:47 np0005548789.localdomain sudo[160622]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:36:47 np0005548789.localdomain sudo[160622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.316 160509 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.317 160509 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.317 160509 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.317 160509 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.319 160509 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.321 160509 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.329 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:77:f3 192.168.0.162'], port_security=['fa:16:3e:64:77:f3 192.168.0.162'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.162/24', 'neutron:device_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005548789.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-652b6bdc-40ce-45b7-8aa5-3bca79987993', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'neutron:revision_number': '7', 'neutron:security_group_ids': '65e67ecb-ffcf-41e6-8b8b-ed491f2580ec 7ce08e20-be94-4509-a371-aa5c036416af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7872d306-938e-4ee0-be61-57ba3983d747, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.330 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'b142a5ef-fbed-4e92-aa78-e3ad080c6370'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], external_ids={'neutron:ovn-metadata-id': 'ebeaa3f7-4a1f-5fad-955a-c95905ca8ce8', 'neutron:ovn-metadata-sb-cfg': '1'}, name=b142a5ef-fbed-4e92-aa78-e3ad080c6370, nb_cfg_timestamp=1765013752689, nb_cfg=4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.331 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b in datapath 652b6bdc-40ce-45b7-8aa5-3bca79987993 bound to our chassis on insert
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.331 160509 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fbac21abb50>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.332 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.332 160509 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.332 160509 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.333 160509 INFO oslo_service.service [-] Starting 1 workers
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.335 160509 DEBUG oslo_service.service [-] Started child 160637 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.337 160637 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-152989'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.337 160509 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 652b6bdc-40ce-45b7-8aa5-3bca79987993
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.339 160509 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpw60da7x0/privsep.sock']
Dec 06 09:36:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59075 DF PROTO=TCP SPT=40368 DPT=9101 SEQ=3006578736 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D1DAB00000000001030307) 
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.357 160637 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.357 160637 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.358 160637 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.360 160637 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.362 160637 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.371 160637 INFO eventlet.wsgi.server [-] (160637) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Dec 06 09:36:47 np0005548789.localdomain sudo[160622]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:47 np0005548789.localdomain sshd[155336]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:36:47 np0005548789.localdomain systemd[1]: session-52.scope: Deactivated successfully.
Dec 06 09:36:47 np0005548789.localdomain systemd[1]: session-52.scope: Consumed 30.495s CPU time.
Dec 06 09:36:47 np0005548789.localdomain systemd-logind[766]: Session 52 logged out. Waiting for processes to exit.
Dec 06 09:36:47 np0005548789.localdomain systemd-logind[766]: Removed session 52.
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.926 160509 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.927 160509 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpw60da7x0/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.839 160674 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.842 160674 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.845 160674 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.845 160674 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160674
Dec 06 09:36:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:47.931 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb2c9c0-38f2-4ab6-8c69-2e0c4b560f07]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:36:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:48.373 160674 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:36:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:48.373 160674 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:36:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:48.373 160674 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:36:48 np0005548789.localdomain sudo[160679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:36:48 np0005548789.localdomain sudo[160679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:36:48 np0005548789.localdomain sudo[160679]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:48.832 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[ca83f95c-ad91-44d2-9e3f-74d3093b6a5b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:36:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:48.834 160509 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp67br8jf_/privsep.sock']
Dec 06 09:36:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:49.381 160509 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 06 09:36:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:49.382 160509 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp67br8jf_/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 06 09:36:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:49.298 160700 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 09:36:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:49.301 160700 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 09:36:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:49.303 160700 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 06 09:36:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:49.304 160700 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160700
Dec 06 09:36:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:49.385 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad8ad8a-bcb7-43d4-8ea4-d160d5895ea6]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:36:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8259 DF PROTO=TCP SPT=54056 DPT=9105 SEQ=1085199004 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D1E3EF0000000001030307) 
Dec 06 09:36:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:49.822 160700 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:36:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:49.822 160700 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:36:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:49.822 160700 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:36:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.302 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[a6de2c49-8b0d-452d-bad1-c67929586341]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:36:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.305 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[72902aee-8fe7-4cab-aec8-27ff3ae1e0ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:36:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.327 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[8c18e598-90dc-40c3-b7d9-f847d966b2bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:36:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.345 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[a1f6f77a-cb3d-4d66-93d8-ba825139df2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap652b6bdc-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:b4:a7:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 69, 'rx_bytes': 8926, 'tx_bytes': 7209, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 69, 'rx_bytes': 8926, 'tx_bytes': 7209, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 710085, 'reachable_time': 41718, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 160710, 'error': None, 'target': 'ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:36:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.360 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5e1dafa5-946e-4d6c-8bcf-ab0f9a984f45]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap652b6bdc-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 710094, 'tstamp': 710094}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160711, 'error': None, 'target': 'ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tap652b6bdc-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 710097, 'tstamp': 710097}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160711, 'error': None, 'target': 'ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 710099, 'tstamp': 710099}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160711, 'error': None, 'target': 'ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb4:a70c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 710085, 'tstamp': 710085}], 
['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160711, 'error': None, 'target': 'ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:36:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.410 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[9ffcdce4-f6a7-4f49-916e-250ccfcd1f33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:36:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.411 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap652b6bdc-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:36:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.415 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap652b6bdc-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:36:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.416 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 09:36:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.417 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap652b6bdc-40, col_values=(('external_ids', {'iface-id': '4fb81ffd-e198-4628-9bd0-0c0f0c89c33a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:36:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.417 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 09:36:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.421 160509 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmppe3r5ww7/privsep.sock']
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.061 160509 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.062 160509 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmppe3r5ww7/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.976 160720 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.982 160720 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.985 160720 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:50.986 160720 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160720
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.065 160720 DEBUG oslo.privsep.daemon [-] privsep: reply[332fc036-0489-4972-97ad-d51c31f71629]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.481 160720 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.482 160720 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.482 160720 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.940 160720 DEBUG oslo.privsep.daemon [-] privsep: reply[f231409a-c5e6-4288-9792-fbc804639319]: (4, ['ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.943 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, column=external_ids, values=({'neutron:ovn-metadata-id': 'ebeaa3f7-4a1f-5fad-955a-c95905ca8ce8'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.944 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.945 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.958 160509 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.958 160509 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.959 160509 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.959 160509 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.959 160509 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.959 160509 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.960 160509 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.960 160509 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.960 160509 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.960 160509 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.961 160509 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.961 160509 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.961 160509 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.962 160509 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.962 160509 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.962 160509 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.962 160509 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.963 160509 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.963 160509 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.963 160509 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.963 160509 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.964 160509 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.964 160509 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.964 160509 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.964 160509 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.965 160509 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.965 160509 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.965 160509 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.965 160509 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.966 160509 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.966 160509 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.966 160509 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.966 160509 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.967 160509 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.967 160509 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.967 160509 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.967 160509 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.968 160509 DEBUG oslo_service.service [-] host                           = np0005548789.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.968 160509 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.968 160509 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.969 160509 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.969 160509 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.969 160509 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.970 160509 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.970 160509 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.970 160509 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.970 160509 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.971 160509 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.971 160509 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.971 160509 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.971 160509 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.972 160509 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.972 160509 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.972 160509 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.972 160509 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.973 160509 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.973 160509 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.973 160509 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.973 160509 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.974 160509 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.974 160509 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.974 160509 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.974 160509 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.975 160509 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.975 160509 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.975 160509 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.975 160509 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.976 160509 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.976 160509 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.976 160509 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.976 160509 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.977 160509 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.977 160509 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.977 160509 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.977 160509 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.978 160509 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.978 160509 DEBUG oslo_service.service [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.978 160509 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.978 160509 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.979 160509 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.979 160509 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.979 160509 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.979 160509 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.980 160509 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.980 160509 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.980 160509 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.980 160509 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.980 160509 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.981 160509 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.981 160509 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.981 160509 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.982 160509 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.982 160509 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.982 160509 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.982 160509 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.983 160509 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.983 160509 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.983 160509 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.983 160509 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.984 160509 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.984 160509 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.984 160509 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.984 160509 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.984 160509 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.985 160509 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.985 160509 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.985 160509 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.985 160509 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.985 160509 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.986 160509 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.986 160509 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.986 160509 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.986 160509 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.987 160509 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.987 160509 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.987 160509 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.987 160509 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.988 160509 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.988 160509 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.988 160509 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.988 160509 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.988 160509 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.989 160509 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.989 160509 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.989 160509 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.989 160509 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.990 160509 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.990 160509 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.990 160509 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.990 160509 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.991 160509 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.991 160509 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.991 160509 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.991 160509 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.992 160509 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.992 160509 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.992 160509 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.992 160509 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.993 160509 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.993 160509 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.993 160509 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.993 160509 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.993 160509 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.994 160509 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.994 160509 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.994 160509 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.995 160509 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.995 160509 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.995 160509 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.995 160509 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.996 160509 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.996 160509 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.996 160509 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.996 160509 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.996 160509 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.997 160509 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.997 160509 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.997 160509 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.997 160509 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.998 160509 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.998 160509 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.998 160509 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.998 160509 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.998 160509 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.999 160509 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.999 160509 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.999 160509 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:51.999 160509 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.000 160509 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.000 160509 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.000 160509 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.000 160509 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.000 160509 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.001 160509 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.001 160509 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.001 160509 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.001 160509 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.002 160509 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.002 160509 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.002 160509 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.002 160509 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.002 160509 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.002 160509 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.002 160509 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.003 160509 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.003 160509 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.003 160509 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.003 160509 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.003 160509 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.003 160509 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.004 160509 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.004 160509 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.004 160509 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.004 160509 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.004 160509 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.004 160509 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.004 160509 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.005 160509 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.005 160509 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.005 160509 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.005 160509 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.005 160509 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.005 160509 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.005 160509 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.006 160509 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.006 160509 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.006 160509 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.006 160509 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.006 160509 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.006 160509 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.006 160509 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.006 160509 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.007 160509 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.007 160509 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.007 160509 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.007 160509 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.007 160509 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.007 160509 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.007 160509 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.008 160509 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.008 160509 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.008 160509 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.008 160509 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.008 160509 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.008 160509 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.008 160509 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.009 160509 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.009 160509 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.009 160509 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.009 160509 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.009 160509 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.009 160509 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.009 160509 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.010 160509 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.010 160509 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.010 160509 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.010 160509 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.010 160509 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.010 160509 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.010 160509 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.011 160509 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.011 160509 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.011 160509 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.011 160509 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.011 160509 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.011 160509 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.011 160509 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.012 160509 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.012 160509 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.012 160509 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.012 160509 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.012 160509 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.012 160509 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.012 160509 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.013 160509 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.013 160509 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.013 160509 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.013 160509 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.013 160509 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.013 160509 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.013 160509 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.014 160509 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.014 160509 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.014 160509 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.014 160509 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.014 160509 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.014 160509 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.014 160509 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.015 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.015 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.015 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.015 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.015 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.015 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.015 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.016 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.016 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.016 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.016 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.016 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.016 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.016 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.017 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.017 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.017 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.017 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.017 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.017 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.018 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.018 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.018 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.018 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.018 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.019 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.019 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.019 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.019 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.019 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.019 160509 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.020 160509 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.020 160509 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.020 160509 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.020 160509 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:36:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:36:52.020 160509 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:36:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8260 DF PROTO=TCP SPT=54056 DPT=9105 SEQ=1085199004 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D1F3AF0000000001030307) 
Dec 06 09:36:53 np0005548789.localdomain sshd[160725]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:36:54 np0005548789.localdomain sshd[160725]: Accepted publickey for zuul from 192.168.122.30 port 51304 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:36:54 np0005548789.localdomain systemd-logind[766]: New session 53 of user zuul.
Dec 06 09:36:54 np0005548789.localdomain systemd[1]: Started Session 53 of User zuul.
Dec 06 09:36:54 np0005548789.localdomain sshd[160725]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:36:55 np0005548789.localdomain python3.9[160818]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:36:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47394 DF PROTO=TCP SPT=39376 DPT=9882 SEQ=2275117447 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D1FBEF0000000001030307) 
Dec 06 09:36:56 np0005548789.localdomain sudo[160912]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sffxcwetjrrlqdsmndyhlapellynqjkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013815.6429083-63-136982754901735/AnsiballZ_command.py
Dec 06 09:36:56 np0005548789.localdomain sudo[160912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:56 np0005548789.localdomain python3.9[160914]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:36:56 np0005548789.localdomain sudo[160912]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:56 np0005548789.localdomain sudo[161017]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxwjyrenxenvzbpmdajtaboxjvkhrytc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013816.5109847-87-181823401613438/AnsiballZ_command.py
Dec 06 09:36:56 np0005548789.localdomain sudo[161017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:57 np0005548789.localdomain python3.9[161019]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:36:57 np0005548789.localdomain systemd[1]: tmp-crun.VHTygU.mount: Deactivated successfully.
Dec 06 09:36:57 np0005548789.localdomain systemd[1]: libpod-5519229fe01370c92f47b0e8e46cddb6cb973cc807c8388a053612d951244964.scope: Deactivated successfully.
Dec 06 09:36:57 np0005548789.localdomain podman[161020]: 2025-12-06 09:36:57.102830882 +0000 UTC m=+0.085940579 container died 5519229fe01370c92f47b0e8e46cddb6cb973cc807c8388a053612d951244964 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 06 09:36:57 np0005548789.localdomain podman[161020]: 2025-12-06 09:36:57.136113868 +0000 UTC m=+0.119223525 container cleanup 5519229fe01370c92f47b0e8e46cddb6cb973cc807c8388a053612d951244964 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z)
Dec 06 09:36:57 np0005548789.localdomain sudo[161017]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:57 np0005548789.localdomain podman[161033]: 2025-12-06 09:36:57.198109667 +0000 UTC m=+0.086494545 container remove 5519229fe01370c92f47b0e8e46cddb6cb973cc807c8388a053612d951244964 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, release=1761123044, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Dec 06 09:36:57 np0005548789.localdomain systemd[1]: libpod-conmon-5519229fe01370c92f47b0e8e46cddb6cb973cc807c8388a053612d951244964.scope: Deactivated successfully.
Dec 06 09:36:57 np0005548789.localdomain sudo[161138]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afdjghtfhdjaxbklponczkskaxiukfdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013817.447026-117-237110767526986/AnsiballZ_systemd_service.py
Dec 06 09:36:57 np0005548789.localdomain sudo[161138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:58 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-8a1de60586d08fb5b92518366abbb713eb2f96c306cc36f6078e1ea96b940056-merged.mount: Deactivated successfully.
Dec 06 09:36:58 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5519229fe01370c92f47b0e8e46cddb6cb973cc807c8388a053612d951244964-userdata-shm.mount: Deactivated successfully.
Dec 06 09:36:58 np0005548789.localdomain python3.9[161140]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:36:58 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:36:58 np0005548789.localdomain systemd-rc-local-generator[161169]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:36:58 np0005548789.localdomain systemd-sysv-generator[161172]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:36:58 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:36:58 np0005548789.localdomain sudo[161138]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:59 np0005548789.localdomain sshd[161237]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:36:59 np0005548789.localdomain python3.9[161269]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:36:59 np0005548789.localdomain network[161286]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:36:59 np0005548789.localdomain network[161287]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:36:59 np0005548789.localdomain network[161288]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:36:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59077 DF PROTO=TCP SPT=40368 DPT=9101 SEQ=3006578736 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D209EF0000000001030307) 
Dec 06 09:36:59 np0005548789.localdomain sshd[161237]: Received disconnect from 12.156.67.18 port 36586:11: Bye Bye [preauth]
Dec 06 09:36:59 np0005548789.localdomain sshd[161237]: Disconnected from authenticating user root 12.156.67.18 port 36586 [preauth]
Dec 06 09:37:00 np0005548789.localdomain sshd[161296]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:37:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8261 DF PROTO=TCP SPT=54056 DPT=9105 SEQ=1085199004 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D213EF0000000001030307) 
Dec 06 09:37:02 np0005548789.localdomain sshd[161296]: Received disconnect from 103.234.151.178 port 44620:11: Bye Bye [preauth]
Dec 06 09:37:02 np0005548789.localdomain sshd[161296]: Disconnected from authenticating user root 103.234.151.178 port 44620 [preauth]
Dec 06 09:37:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:37:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16580 DF PROTO=TCP SPT=46134 DPT=9102 SEQ=2267506048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D21E6F0000000001030307) 
Dec 06 09:37:05 np0005548789.localdomain sudo[161489]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odumcascpbhonzukqxqxcatxwogupmat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013825.0278609-174-155067744827867/AnsiballZ_systemd_service.py
Dec 06 09:37:05 np0005548789.localdomain sudo[161489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:05 np0005548789.localdomain python3.9[161491]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:37:05 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:37:05 np0005548789.localdomain systemd-rc-local-generator[161515]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:37:05 np0005548789.localdomain systemd-sysv-generator[161519]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:37:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:37:06 np0005548789.localdomain systemd[1]: Stopped target tripleo_nova_libvirt.target.
Dec 06 09:37:06 np0005548789.localdomain sudo[161489]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:06 np0005548789.localdomain sudo[161621]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpqudvlnwamiutgduzdobyamidodnnvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013826.1588268-174-179200097681120/AnsiballZ_systemd_service.py
Dec 06 09:37:06 np0005548789.localdomain sudo[161621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:06 np0005548789.localdomain python3.9[161623]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:37:06 np0005548789.localdomain sudo[161621]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:07 np0005548789.localdomain sudo[161714]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsosxrbrsafbhjdpfdrktzjojjfyzovz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013826.8741043-174-56627778909161/AnsiballZ_systemd_service.py
Dec 06 09:37:07 np0005548789.localdomain sudo[161714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:07 np0005548789.localdomain python3.9[161716]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:37:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62982 DF PROTO=TCP SPT=55758 DPT=9102 SEQ=1114809180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D229F00000000001030307) 
Dec 06 09:37:08 np0005548789.localdomain sudo[161714]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:08 np0005548789.localdomain sudo[161807]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vltzzduioqcmdhlwrmytrabdbzwyfwle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013828.626003-174-55599542851529/AnsiballZ_systemd_service.py
Dec 06 09:37:08 np0005548789.localdomain sudo[161807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:09 np0005548789.localdomain python3.9[161809]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:37:09 np0005548789.localdomain sudo[161807]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:09 np0005548789.localdomain sudo[161900]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxjebkjtoycsjkysidgaaszjaerrurgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013829.3229232-174-53416101970923/AnsiballZ_systemd_service.py
Dec 06 09:37:09 np0005548789.localdomain sudo[161900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:09 np0005548789.localdomain python3.9[161902]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:37:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16582 DF PROTO=TCP SPT=46134 DPT=9102 SEQ=2267506048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D2362F0000000001030307) 
Dec 06 09:37:10 np0005548789.localdomain sudo[161900]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:11 np0005548789.localdomain sudo[161993]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqisudfukhgtiwznukbhtwqxtsfbtuge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013831.0197537-174-14950341355486/AnsiballZ_systemd_service.py
Dec 06 09:37:11 np0005548789.localdomain sudo[161993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:11 np0005548789.localdomain python3.9[161995]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:37:11 np0005548789.localdomain sudo[161993]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:11 np0005548789.localdomain sudo[162086]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fypzggrrjgxspeqjvzujmfnkhjbbezyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013831.728381-174-56673107627853/AnsiballZ_systemd_service.py
Dec 06 09:37:11 np0005548789.localdomain sudo[162086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:12 np0005548789.localdomain python3.9[162088]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:37:12 np0005548789.localdomain sudo[162086]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:13 np0005548789.localdomain sudo[162179]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swfrzrupjbkfeekehnugxpywwgtrsdpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013832.9649293-330-24652173733482/AnsiballZ_file.py
Dec 06 09:37:13 np0005548789.localdomain sudo[162179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:13 np0005548789.localdomain python3.9[162181]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:13 np0005548789.localdomain sudo[162179]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:14 np0005548789.localdomain sudo[162271]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnjwstvzrxbsweuhmmewwzmthhsbhygu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013833.9909701-330-119397493023524/AnsiballZ_file.py
Dec 06 09:37:14 np0005548789.localdomain sudo[162271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48154 DF PROTO=TCP SPT=34958 DPT=9101 SEQ=2103639703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D243EF0000000001030307) 
Dec 06 09:37:14 np0005548789.localdomain python3.9[162273]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:14 np0005548789.localdomain sudo[162271]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:14 np0005548789.localdomain sudo[162363]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dctgzqibrkdxanqjpqgxfortnyiigpwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013834.5228257-330-258483447019002/AnsiballZ_file.py
Dec 06 09:37:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:37:14 np0005548789.localdomain sudo[162363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:14 np0005548789.localdomain podman[162365]: 2025-12-06 09:37:14.870332949 +0000 UTC m=+0.091354486 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 06 09:37:14 np0005548789.localdomain podman[162365]: 2025-12-06 09:37:14.909876317 +0000 UTC m=+0.130897844 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:37:14 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:37:14 np0005548789.localdomain python3.9[162366]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:14 np0005548789.localdomain sudo[162363]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:15 np0005548789.localdomain sudo[162480]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nyuojbywwienpigktrtnjududqncnyfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013835.1067936-330-207296386727962/AnsiballZ_file.py
Dec 06 09:37:15 np0005548789.localdomain sudo[162480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:15 np0005548789.localdomain python3.9[162482]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:15 np0005548789.localdomain sudo[162480]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:16 np0005548789.localdomain sudo[162572]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojppxfjgmejrxjkbmcujqizpfvopsfma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013836.0983813-330-100406653119925/AnsiballZ_file.py
Dec 06 09:37:16 np0005548789.localdomain sudo[162572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:37:16 np0005548789.localdomain systemd[1]: tmp-crun.Fpjkgk.mount: Deactivated successfully.
Dec 06 09:37:16 np0005548789.localdomain podman[162575]: 2025-12-06 09:37:16.423007399 +0000 UTC m=+0.077875690 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:37:16 np0005548789.localdomain podman[162575]: 2025-12-06 09:37:16.433197534 +0000 UTC m=+0.088065835 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 09:37:16 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:37:16 np0005548789.localdomain sshd[162593]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:37:16 np0005548789.localdomain python3.9[162574]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:16 np0005548789.localdomain sudo[162572]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:16 np0005548789.localdomain sudo[162684]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmmaszofyfqvsvuerhwosyutkpcnfxjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013836.6597314-330-50002549297018/AnsiballZ_file.py
Dec 06 09:37:16 np0005548789.localdomain sudo[162684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:17 np0005548789.localdomain python3.9[162686]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:17 np0005548789.localdomain sudo[162684]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48156 DF PROTO=TCP SPT=34958 DPT=9101 SEQ=2103639703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D24FEF0000000001030307) 
Dec 06 09:37:17 np0005548789.localdomain sudo[162776]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yloqvyfzpamodisrlsimvnhdcezknono ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013837.205184-330-162992566145159/AnsiballZ_file.py
Dec 06 09:37:17 np0005548789.localdomain sudo[162776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:17 np0005548789.localdomain python3.9[162778]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:17 np0005548789.localdomain sudo[162776]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:17 np0005548789.localdomain sshd[162593]: Received disconnect from 64.227.156.63 port 54398:11: Bye Bye [preauth]
Dec 06 09:37:17 np0005548789.localdomain sshd[162593]: Disconnected from authenticating user root 64.227.156.63 port 54398 [preauth]
Dec 06 09:37:18 np0005548789.localdomain sudo[162868]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrwbciqsvymgnaiqpplybbjbebtbbacp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013838.3135462-480-137199859858055/AnsiballZ_file.py
Dec 06 09:37:18 np0005548789.localdomain sudo[162868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:18 np0005548789.localdomain python3.9[162870]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:18 np0005548789.localdomain sudo[162868]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:19 np0005548789.localdomain sudo[162960]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xihicrdxyqapmssggknueaaxrnrptwze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013838.9159875-480-269583574299934/AnsiballZ_file.py
Dec 06 09:37:19 np0005548789.localdomain sudo[162960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:19 np0005548789.localdomain python3.9[162962]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:19 np0005548789.localdomain sudo[162960]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64689 DF PROTO=TCP SPT=59664 DPT=9105 SEQ=313070873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D259300000000001030307) 
Dec 06 09:37:19 np0005548789.localdomain sudo[163052]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvbkitgmvlxbnextwolkjtzflyntoelv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013839.4984121-480-112362188027610/AnsiballZ_file.py
Dec 06 09:37:19 np0005548789.localdomain sudo[163052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:19 np0005548789.localdomain python3.9[163054]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:19 np0005548789.localdomain sudo[163052]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:20 np0005548789.localdomain sudo[163144]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jodycuhntppevppewcmuaprcivggkkat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013840.0854158-480-204771603001494/AnsiballZ_file.py
Dec 06 09:37:20 np0005548789.localdomain sudo[163144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:20 np0005548789.localdomain python3.9[163146]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:20 np0005548789.localdomain sudo[163144]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:20 np0005548789.localdomain sudo[163236]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnpvuyqyuwxphmdrbolpnpnevogkkpiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013840.6295326-480-10370741331399/AnsiballZ_file.py
Dec 06 09:37:20 np0005548789.localdomain sudo[163236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:21 np0005548789.localdomain python3.9[163238]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:21 np0005548789.localdomain sudo[163236]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:21 np0005548789.localdomain sshd[163298]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:37:21 np0005548789.localdomain sudo[163330]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-immhwddgqnqywgklsrdxaeyxdublivvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013841.1977766-480-201131158432814/AnsiballZ_file.py
Dec 06 09:37:21 np0005548789.localdomain sudo[163330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:21 np0005548789.localdomain python3.9[163332]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:21 np0005548789.localdomain sudo[163330]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:22 np0005548789.localdomain sudo[163422]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfafqsejetxsikapapcwolzhrcalmhgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013841.8168907-480-177006454110082/AnsiballZ_file.py
Dec 06 09:37:22 np0005548789.localdomain sudo[163422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:22 np0005548789.localdomain python3.9[163424]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:37:22 np0005548789.localdomain sudo[163422]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:22 np0005548789.localdomain sudo[163514]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuktetvgtltiyyadkphqfqfkbedevjru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013842.660579-634-120604504308397/AnsiballZ_command.py
Dec 06 09:37:22 np0005548789.localdomain sudo[163514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:22 np0005548789.localdomain sshd[163298]: Received disconnect from 103.157.25.60 port 49646:11: Bye Bye [preauth]
Dec 06 09:37:22 np0005548789.localdomain sshd[163298]: Disconnected from authenticating user root 103.157.25.60 port 49646 [preauth]
Dec 06 09:37:23 np0005548789.localdomain python3.9[163516]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:23 np0005548789.localdomain sudo[163514]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64690 DF PROTO=TCP SPT=59664 DPT=9105 SEQ=313070873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D268EF0000000001030307) 
Dec 06 09:37:23 np0005548789.localdomain python3.9[163608]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:37:24 np0005548789.localdomain sudo[163698]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aunfbteuwbxijpyxbbzzbgebsitunuwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013844.230842-687-207271624935428/AnsiballZ_systemd_service.py
Dec 06 09:37:24 np0005548789.localdomain sudo[163698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:24 np0005548789.localdomain python3.9[163700]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:37:24 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:37:24 np0005548789.localdomain systemd-rc-local-generator[163728]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:37:24 np0005548789.localdomain systemd-sysv-generator[163732]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:37:24 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:37:25 np0005548789.localdomain sudo[163698]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:26 np0005548789.localdomain sudo[163826]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azdnrranzwpbvsathzuwswbwbrpovsun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013845.8413515-711-263706702970457/AnsiballZ_command.py
Dec 06 09:37:26 np0005548789.localdomain sudo[163826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:26 np0005548789.localdomain python3.9[163828]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:26 np0005548789.localdomain sudo[163826]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:26 np0005548789.localdomain sudo[163919]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhkxwvyzzhdzhierwnbtomtrxlkswdmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013846.4937856-711-174594039215074/AnsiballZ_command.py
Dec 06 09:37:26 np0005548789.localdomain sudo[163919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:26 np0005548789.localdomain python3.9[163921]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:26 np0005548789.localdomain sudo[163919]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:27 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21093 DF PROTO=TCP SPT=53914 DPT=9882 SEQ=907222399 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D275AF0000000001030307) 
Dec 06 09:37:27 np0005548789.localdomain sudo[164012]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbovwujrngotiuyidllkyagnrufwamec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013847.058863-711-96979259485249/AnsiballZ_command.py
Dec 06 09:37:27 np0005548789.localdomain sudo[164012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:27 np0005548789.localdomain python3.9[164014]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:28 np0005548789.localdomain sudo[164012]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:28 np0005548789.localdomain sudo[164105]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shvyaalxgvqmlzegubshcjydxndcdisj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013848.6050766-711-5702782448811/AnsiballZ_command.py
Dec 06 09:37:28 np0005548789.localdomain sudo[164105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:29 np0005548789.localdomain python3.9[164107]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:29 np0005548789.localdomain sudo[164105]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:29 np0005548789.localdomain sudo[164198]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhbosoiywslbrzybxfckrqjwpajxitei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013849.1553283-711-18737227592946/AnsiballZ_command.py
Dec 06 09:37:29 np0005548789.localdomain sudo[164198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:29 np0005548789.localdomain python3.9[164200]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:29 np0005548789.localdomain sudo[164198]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48158 DF PROTO=TCP SPT=34958 DPT=9101 SEQ=2103639703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D27FF00000000001030307) 
Dec 06 09:37:29 np0005548789.localdomain sudo[164291]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzecktgxkzzmhllnhsivhztogmoxlorc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013849.7275991-711-14675935596740/AnsiballZ_command.py
Dec 06 09:37:29 np0005548789.localdomain sudo[164291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:30 np0005548789.localdomain python3.9[164293]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:30 np0005548789.localdomain sudo[164291]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:30 np0005548789.localdomain sudo[164384]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlwimsgmocjzktebxcpwyhbhxbdwsmti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013850.2967083-711-120760797414728/AnsiballZ_command.py
Dec 06 09:37:30 np0005548789.localdomain sudo[164384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:30 np0005548789.localdomain python3.9[164386]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:37:30 np0005548789.localdomain sudo[164384]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:31 np0005548789.localdomain sudo[164477]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eymtewrwujckhufixkhonoysoddmhkjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013851.3211105-873-74954407588394/AnsiballZ_getent.py
Dec 06 09:37:31 np0005548789.localdomain sudo[164477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:31 np0005548789.localdomain python3.9[164479]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 06 09:37:31 np0005548789.localdomain sudo[164477]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64691 DF PROTO=TCP SPT=59664 DPT=9105 SEQ=313070873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D289F00000000001030307) 
Dec 06 09:37:32 np0005548789.localdomain sudo[164570]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jogwbyjbsdrrrjksqngeynacmnofqsud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013852.1528027-897-271407420643666/AnsiballZ_group.py
Dec 06 09:37:32 np0005548789.localdomain sudo[164570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:32 np0005548789.localdomain python3.9[164572]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 06 09:37:32 np0005548789.localdomain groupadd[164573]: group added to /etc/group: name=libvirt, GID=42473
Dec 06 09:37:32 np0005548789.localdomain groupadd[164573]: group added to /etc/gshadow: name=libvirt
Dec 06 09:37:32 np0005548789.localdomain groupadd[164573]: new group: name=libvirt, GID=42473
Dec 06 09:37:32 np0005548789.localdomain sudo[164570]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:33 np0005548789.localdomain sudo[164668]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvxdexnbzpbgzrgbeuaucvryoiauguxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013853.1828244-921-247154403707782/AnsiballZ_user.py
Dec 06 09:37:33 np0005548789.localdomain sudo[164668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:33 np0005548789.localdomain python3.9[164670]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548789.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 06 09:37:33 np0005548789.localdomain useradd[164672]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/1
Dec 06 09:37:34 np0005548789.localdomain sudo[164668]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:34 np0005548789.localdomain sudo[164768]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvtvtxanjufzektutgtrfekqvpgrotnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013854.428638-954-65362302991502/AnsiballZ_setup.py
Dec 06 09:37:34 np0005548789.localdomain sudo[164768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23207 DF PROTO=TCP SPT=56162 DPT=9102 SEQ=3353457552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D293B00000000001030307) 
Dec 06 09:37:34 np0005548789.localdomain python3.9[164770]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:37:35 np0005548789.localdomain sudo[164768]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:35 np0005548789.localdomain sudo[164822]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iybpiqjahroouhbofqygxhhebynurbgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013854.428638-954-65362302991502/AnsiballZ_dnf.py
Dec 06 09:37:35 np0005548789.localdomain sudo[164822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:37:35 np0005548789.localdomain python3.9[164824]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:37:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31755 DF PROTO=TCP SPT=34236 DPT=9102 SEQ=1521893675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D29FF00000000001030307) 
Dec 06 09:37:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23209 DF PROTO=TCP SPT=56162 DPT=9102 SEQ=3353457552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D2AB6F0000000001030307) 
Dec 06 09:37:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19439 DF PROTO=TCP SPT=44680 DPT=9101 SEQ=343523345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D2B9210000000001030307) 
Dec 06 09:37:44 np0005548789.localdomain sshd[164896]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:37:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:37:45 np0005548789.localdomain podman[164898]: 2025-12-06 09:37:45.930389365 +0000 UTC m=+0.085342022 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:37:45 np0005548789.localdomain systemd[1]: tmp-crun.tOUMtJ.mount: Deactivated successfully.
Dec 06 09:37:46 np0005548789.localdomain sshd[164896]: Received disconnect from 118.193.38.207 port 36356:11: Bye Bye [preauth]
Dec 06 09:37:46 np0005548789.localdomain sshd[164896]: Disconnected from authenticating user root 118.193.38.207 port 36356 [preauth]
Dec 06 09:37:46 np0005548789.localdomain podman[164898]: 2025-12-06 09:37:46.012392329 +0000 UTC m=+0.167344936 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec 06 09:37:46 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:37:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:37:46 np0005548789.localdomain podman[164922]: 2025-12-06 09:37:46.917532343 +0000 UTC m=+0.078269627 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 09:37:46 np0005548789.localdomain podman[164922]: 2025-12-06 09:37:46.920869089 +0000 UTC m=+0.081606413 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:37:46 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:37:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:37:47.267 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:37:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:37:47.267 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:37:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:37:47.269 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:37:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19441 DF PROTO=TCP SPT=44680 DPT=9101 SEQ=343523345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D2C52F0000000001030307) 
Dec 06 09:37:48 np0005548789.localdomain sudo[164940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:37:48 np0005548789.localdomain sudo[164940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:37:48 np0005548789.localdomain sudo[164940]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:48 np0005548789.localdomain sudo[164958]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:37:48 np0005548789.localdomain sudo[164958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:37:49 np0005548789.localdomain sudo[164958]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21272 DF PROTO=TCP SPT=45938 DPT=9105 SEQ=4088601586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D2CE300000000001030307) 
Dec 06 09:37:50 np0005548789.localdomain sudo[165009]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:37:50 np0005548789.localdomain sudo[165009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:37:50 np0005548789.localdomain sudo[165009]: pam_unix(sudo:session): session closed for user root
Dec 06 09:37:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21273 DF PROTO=TCP SPT=45938 DPT=9105 SEQ=4088601586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D2DDEF0000000001030307) 
Dec 06 09:37:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21096 DF PROTO=TCP SPT=53914 DPT=9882 SEQ=907222399 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D2E5EF0000000001030307) 
Dec 06 09:37:55 np0005548789.localdomain sshd[165046]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:37:57 np0005548789.localdomain sshd[165046]: Received disconnect from 103.192.152.59 port 59578:11: Bye Bye [preauth]
Dec 06 09:37:57 np0005548789.localdomain sshd[165046]: Disconnected from authenticating user root 103.192.152.59 port 59578 [preauth]
Dec 06 09:37:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19443 DF PROTO=TCP SPT=44680 DPT=9101 SEQ=343523345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D2F5EF0000000001030307) 
Dec 06 09:38:01 np0005548789.localdomain kernel: SELinux:  Converting 2759 SID table entries...
Dec 06 09:38:01 np0005548789.localdomain kernel: SELinux:  Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped).
Dec 06 09:38:01 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:38:01 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:38:01 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:38:01 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:38:01 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:38:01 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:38:01 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:38:01 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21274 DF PROTO=TCP SPT=45938 DPT=9105 SEQ=4088601586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D2FDEF0000000001030307) 
Dec 06 09:38:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33932 DF PROTO=TCP SPT=58314 DPT=9102 SEQ=3077742811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D308EF0000000001030307) 
Dec 06 09:38:05 np0005548789.localdomain sshd[166074]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:38:07 np0005548789.localdomain sshd[166074]: Received disconnect from 45.78.222.162 port 49596:11: Bye Bye [preauth]
Dec 06 09:38:07 np0005548789.localdomain sshd[166074]: Disconnected from authenticating user root 45.78.222.162 port 49596 [preauth]
Dec 06 09:38:09 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64469 DF PROTO=TCP SPT=33104 DPT=9882 SEQ=1567678979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D319F00000000001030307) 
Dec 06 09:38:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33934 DF PROTO=TCP SPT=58314 DPT=9102 SEQ=3077742811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D320B00000000001030307) 
Dec 06 09:38:12 np0005548789.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Dec 06 09:38:12 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:38:12 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:38:12 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:38:12 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:38:12 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:38:12 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:38:12 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:38:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26469 DF PROTO=TCP SPT=35112 DPT=9101 SEQ=2682104277 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D32E500000000001030307) 
Dec 06 09:38:16 np0005548789.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=20 res=1
Dec 06 09:38:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:38:16 np0005548789.localdomain podman[166084]: 2025-12-06 09:38:16.936298685 +0000 UTC m=+0.089097530 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 09:38:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:38:16 np0005548789.localdomain podman[166084]: 2025-12-06 09:38:16.99219916 +0000 UTC m=+0.144998005 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.license=GPLv2)
Dec 06 09:38:17 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:38:17 np0005548789.localdomain podman[166109]: 2025-12-06 09:38:17.065700278 +0000 UTC m=+0.075147456 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 09:38:17 np0005548789.localdomain podman[166109]: 2025-12-06 09:38:17.074518154 +0000 UTC m=+0.083965322 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:38:17 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:38:17 np0005548789.localdomain sshd[166127]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:38:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26471 DF PROTO=TCP SPT=35112 DPT=9101 SEQ=2682104277 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D33A700000000001030307) 
Dec 06 09:38:17 np0005548789.localdomain sshd[166127]: error: kex_exchange_identification: Connection closed by remote host
Dec 06 09:38:17 np0005548789.localdomain sshd[166127]: Connection closed by 43.163.93.82 port 58704
Dec 06 09:38:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55750 DF PROTO=TCP SPT=39902 DPT=9105 SEQ=3210868471 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D3436F0000000001030307) 
Dec 06 09:38:20 np0005548789.localdomain sshd[166129]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:38:20 np0005548789.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Dec 06 09:38:20 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:38:20 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:38:20 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:38:20 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:38:20 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:38:20 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:38:20 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:38:21 np0005548789.localdomain sshd[166129]: Received disconnect from 103.234.151.178 port 7216:11: Bye Bye [preauth]
Dec 06 09:38:21 np0005548789.localdomain sshd[166129]: Disconnected from authenticating user root 103.234.151.178 port 7216 [preauth]
Dec 06 09:38:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55751 DF PROTO=TCP SPT=39902 DPT=9105 SEQ=3210868471 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D353300000000001030307) 
Dec 06 09:38:26 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17110 DF PROTO=TCP SPT=54638 DPT=9882 SEQ=1117969624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D35FEF0000000001030307) 
Dec 06 09:38:28 np0005548789.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Dec 06 09:38:28 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:38:28 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:38:28 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:38:28 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:38:28 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:38:28 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:38:28 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:38:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26473 DF PROTO=TCP SPT=35112 DPT=9101 SEQ=2682104277 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D369EF0000000001030307) 
Dec 06 09:38:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55752 DF PROTO=TCP SPT=39902 DPT=9105 SEQ=3210868471 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D373F00000000001030307) 
Dec 06 09:38:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23225 DF PROTO=TCP SPT=48930 DPT=9102 SEQ=3124958974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D37E2F0000000001030307) 
Dec 06 09:38:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23212 DF PROTO=TCP SPT=56162 DPT=9102 SEQ=3353457552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D389EF0000000001030307) 
Dec 06 09:38:38 np0005548789.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Dec 06 09:38:38 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:38:38 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:38:38 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:38:38 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:38:38 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:38:38 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:38:38 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:38:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23227 DF PROTO=TCP SPT=48930 DPT=9102 SEQ=3124958974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D395F00000000001030307) 
Dec 06 09:38:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23203 DF PROTO=TCP SPT=32928 DPT=9101 SEQ=3537228615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D3A3800000000001030307) 
Dec 06 09:38:47 np0005548789.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Dec 06 09:38:47 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:38:47 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:38:47 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:38:47 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:38:47 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:38:47 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:38:47 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:38:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:38:47.268 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:38:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:38:47.270 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:38:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:38:47.271 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:38:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23205 DF PROTO=TCP SPT=32928 DPT=9101 SEQ=3537228615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D3AF6F0000000001030307) 
Dec 06 09:38:47 np0005548789.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=24 res=1
Dec 06 09:38:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:38:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:38:47 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:38:47 np0005548789.localdomain podman[166176]: 2025-12-06 09:38:47.818794454 +0000 UTC m=+0.100096932 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec 06 09:38:47 np0005548789.localdomain systemd-sysv-generator[166230]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:38:47 np0005548789.localdomain systemd-rc-local-generator[166224]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:38:47 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:38:47 np0005548789.localdomain podman[166176]: 2025-12-06 09:38:47.893066326 +0000 UTC m=+0.174368794 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 09:38:47 np0005548789.localdomain podman[166177]: 2025-12-06 09:38:47.906394714 +0000 UTC m=+0.186526066 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:38:47 np0005548789.localdomain podman[166177]: 2025-12-06 09:38:47.91510867 +0000 UTC m=+0.195239972 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:38:47 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:38:47 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:38:48 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:38:48 np0005548789.localdomain systemd-sysv-generator[166280]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:38:48 np0005548789.localdomain systemd-rc-local-generator[166275]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:38:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:38:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29804 DF PROTO=TCP SPT=48604 DPT=9105 SEQ=983361846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D3B8B00000000001030307) 
Dec 06 09:38:50 np0005548789.localdomain sudo[166297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:38:50 np0005548789.localdomain sudo[166297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:38:50 np0005548789.localdomain sudo[166297]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:50 np0005548789.localdomain sudo[166315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:38:50 np0005548789.localdomain sudo[166315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:38:50 np0005548789.localdomain sudo[166315]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:51 np0005548789.localdomain sshd[166365]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:38:51 np0005548789.localdomain sudo[166367]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:38:51 np0005548789.localdomain sudo[166367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:38:51 np0005548789.localdomain sudo[166367]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:52 np0005548789.localdomain sshd[166365]: Received disconnect from 103.157.25.60 port 51320:11: Bye Bye [preauth]
Dec 06 09:38:52 np0005548789.localdomain sshd[166365]: Disconnected from authenticating user root 103.157.25.60 port 51320 [preauth]
Dec 06 09:38:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29805 DF PROTO=TCP SPT=48604 DPT=9105 SEQ=983361846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D3C86F0000000001030307) 
Dec 06 09:38:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17113 DF PROTO=TCP SPT=54638 DPT=9882 SEQ=1117969624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D3CFEF0000000001030307) 
Dec 06 09:38:55 np0005548789.localdomain sshd[166385]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:38:56 np0005548789.localdomain sshd[166387]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:38:57 np0005548789.localdomain kernel: SELinux:  Converting 2763 SID table entries...
Dec 06 09:38:57 np0005548789.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:38:57 np0005548789.localdomain kernel: SELinux:  policy capability open_perms=1
Dec 06 09:38:57 np0005548789.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:38:57 np0005548789.localdomain kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:38:57 np0005548789.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:38:57 np0005548789.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:38:57 np0005548789.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:38:57 np0005548789.localdomain sshd[166387]: Received disconnect from 64.227.156.63 port 36040:11: Bye Bye [preauth]
Dec 06 09:38:57 np0005548789.localdomain sshd[166387]: Disconnected from authenticating user root 64.227.156.63 port 36040 [preauth]
Dec 06 09:38:57 np0005548789.localdomain groupadd[166398]: group added to /etc/group: name=clevis, GID=985
Dec 06 09:38:57 np0005548789.localdomain groupadd[166398]: group added to /etc/gshadow: name=clevis
Dec 06 09:38:57 np0005548789.localdomain groupadd[166398]: new group: name=clevis, GID=985
Dec 06 09:38:58 np0005548789.localdomain useradd[166405]: new user: name=clevis, UID=985, GID=985, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Dec 06 09:38:58 np0005548789.localdomain usermod[166415]: add 'clevis' to group 'tss'
Dec 06 09:38:58 np0005548789.localdomain usermod[166415]: add 'clevis' to shadow group 'tss'
Dec 06 09:38:59 np0005548789.localdomain sshd[166385]: Received disconnect from 179.33.210.213 port 35488:11: Bye Bye [preauth]
Dec 06 09:38:59 np0005548789.localdomain sshd[166385]: Disconnected from authenticating user root 179.33.210.213 port 35488 [preauth]
Dec 06 09:38:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23207 DF PROTO=TCP SPT=32928 DPT=9101 SEQ=3537228615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D3DFEF0000000001030307) 
Dec 06 09:39:01 np0005548789.localdomain groupadd[166437]: group added to /etc/group: name=dnsmasq, GID=984
Dec 06 09:39:01 np0005548789.localdomain groupadd[166437]: group added to /etc/gshadow: name=dnsmasq
Dec 06 09:39:01 np0005548789.localdomain groupadd[166437]: new group: name=dnsmasq, GID=984
Dec 06 09:39:01 np0005548789.localdomain useradd[166444]: new user: name=dnsmasq, UID=984, GID=984, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Dec 06 09:39:01 np0005548789.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Dec 06 09:39:01 np0005548789.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=25 res=1
Dec 06 09:39:01 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29806 DF PROTO=TCP SPT=48604 DPT=9105 SEQ=983361846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D3E7F00000000001030307) 
Dec 06 09:39:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45655 DF PROTO=TCP SPT=38490 DPT=9102 SEQ=4244599726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D3F32F0000000001030307) 
Dec 06 09:39:04 np0005548789.localdomain sshd[166458]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:39:06 np0005548789.localdomain sshd[166458]: Received disconnect from 118.193.38.207 port 40874:11: Bye Bye [preauth]
Dec 06 09:39:06 np0005548789.localdomain sshd[166458]: Disconnected from authenticating user root 118.193.38.207 port 40874 [preauth]
Dec 06 09:39:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33937 DF PROTO=TCP SPT=58314 DPT=9102 SEQ=3077742811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D3FFEF0000000001030307) 
Dec 06 09:39:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45657 DF PROTO=TCP SPT=38490 DPT=9102 SEQ=4244599726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D40AEF0000000001030307) 
Dec 06 09:39:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4511 DF PROTO=TCP SPT=58514 DPT=9101 SEQ=826127641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D418B00000000001030307) 
Dec 06 09:39:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4513 DF PROTO=TCP SPT=58514 DPT=9101 SEQ=826127641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D424AF0000000001030307) 
Dec 06 09:39:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:39:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:39:18 np0005548789.localdomain podman[169320]: 2025-12-06 09:39:18.948958921 +0000 UTC m=+0.100492564 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Dec 06 09:39:18 np0005548789.localdomain podman[169320]: 2025-12-06 09:39:18.98097224 +0000 UTC m=+0.132505883 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 09:39:18 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:39:19 np0005548789.localdomain podman[169316]: 2025-12-06 09:39:19.032132865 +0000 UTC m=+0.183773341 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 06 09:39:19 np0005548789.localdomain podman[169316]: 2025-12-06 09:39:19.070715195 +0000 UTC m=+0.222355671 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller)
Dec 06 09:39:19 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:39:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56496 DF PROTO=TCP SPT=51976 DPT=9105 SEQ=3791762787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D42DEF0000000001030307) 
Dec 06 09:39:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56497 DF PROTO=TCP SPT=51976 DPT=9105 SEQ=3791762787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D43DAF0000000001030307) 
Dec 06 09:39:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39788 DF PROTO=TCP SPT=34382 DPT=9882 SEQ=1781381716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D445EF0000000001030307) 
Dec 06 09:39:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4515 DF PROTO=TCP SPT=58514 DPT=9101 SEQ=826127641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D453EF0000000001030307) 
Dec 06 09:39:29 np0005548789.localdomain sshd[177364]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:39:31 np0005548789.localdomain sshd[177364]: Connection reset by authenticating user root 91.202.233.33 port 47844 [preauth]
Dec 06 09:39:31 np0005548789.localdomain sshd[178678]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:39:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56498 DF PROTO=TCP SPT=51976 DPT=9105 SEQ=3791762787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D45DEF0000000001030307) 
Dec 06 09:39:33 np0005548789.localdomain sshd[178678]: Connection reset by authenticating user root 91.202.233.33 port 33782 [preauth]
Dec 06 09:39:33 np0005548789.localdomain sshd[180375]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:39:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56639 DF PROTO=TCP SPT=36000 DPT=9102 SEQ=1210804314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D4686F0000000001030307) 
Dec 06 09:39:35 np0005548789.localdomain sshd[180375]: Connection reset by authenticating user root 91.202.233.33 port 33794 [preauth]
Dec 06 09:39:35 np0005548789.localdomain sshd[181673]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:39:37 np0005548789.localdomain sshd[181673]: Connection reset by authenticating user root 91.202.233.33 port 33804 [preauth]
Dec 06 09:39:37 np0005548789.localdomain sshd[183296]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:39:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23230 DF PROTO=TCP SPT=48930 DPT=9102 SEQ=3124958974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D473F00000000001030307) 
Dec 06 09:39:38 np0005548789.localdomain polkitd[1032]: Reloading rules
Dec 06 09:39:38 np0005548789.localdomain polkitd[1032]: Collecting garbage unconditionally...
Dec 06 09:39:38 np0005548789.localdomain polkitd[1032]: Loading rules from directory /etc/polkit-1/rules.d
Dec 06 09:39:38 np0005548789.localdomain polkitd[1032]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 06 09:39:38 np0005548789.localdomain polkitd[1032]: Finished loading, compiling and executing 5 rules
Dec 06 09:39:38 np0005548789.localdomain polkitd[1032]: Reloading rules
Dec 06 09:39:38 np0005548789.localdomain polkitd[1032]: Collecting garbage unconditionally...
Dec 06 09:39:38 np0005548789.localdomain polkitd[1032]: Loading rules from directory /etc/polkit-1/rules.d
Dec 06 09:39:38 np0005548789.localdomain polkitd[1032]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 06 09:39:38 np0005548789.localdomain polkitd[1032]: Finished loading, compiling and executing 5 rules
Dec 06 09:39:39 np0005548789.localdomain sshd[183296]: Connection reset by authenticating user root 91.202.233.33 port 33820 [preauth]
Dec 06 09:39:40 np0005548789.localdomain groupadd[183580]: group added to /etc/group: name=ceph, GID=167
Dec 06 09:39:40 np0005548789.localdomain groupadd[183580]: group added to /etc/gshadow: name=ceph
Dec 06 09:39:40 np0005548789.localdomain groupadd[183580]: new group: name=ceph, GID=167
Dec 06 09:39:40 np0005548789.localdomain useradd[183586]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Dec 06 09:39:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56641 DF PROTO=TCP SPT=36000 DPT=9102 SEQ=1210804314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D4802F0000000001030307) 
Dec 06 09:39:42 np0005548789.localdomain sshd[183690]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:39:43 np0005548789.localdomain sshd[119889]: Received signal 15; terminating.
Dec 06 09:39:43 np0005548789.localdomain systemd[1]: Stopping OpenSSH server daemon...
Dec 06 09:39:43 np0005548789.localdomain systemd[1]: sshd.service: Deactivated successfully.
Dec 06 09:39:43 np0005548789.localdomain systemd[1]: sshd.service: Unit process 183690 (sshd) remains running after unit stopped.
Dec 06 09:39:43 np0005548789.localdomain systemd[1]: sshd.service: Unit process 183701 (sshd) remains running after unit stopped.
Dec 06 09:39:43 np0005548789.localdomain systemd[1]: Stopped OpenSSH server daemon.
Dec 06 09:39:43 np0005548789.localdomain systemd[1]: sshd.service: Consumed 8.154s CPU time, read 32.0K from disk, written 92.0K to disk.
Dec 06 09:39:43 np0005548789.localdomain systemd[1]: Stopped target sshd-keygen.target.
Dec 06 09:39:43 np0005548789.localdomain systemd[1]: Stopping sshd-keygen.target...
Dec 06 09:39:43 np0005548789.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:39:43 np0005548789.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:39:43 np0005548789.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:39:43 np0005548789.localdomain systemd[1]: Reached target sshd-keygen.target.
Dec 06 09:39:43 np0005548789.localdomain systemd[1]: Starting OpenSSH server daemon...
Dec 06 09:39:43 np0005548789.localdomain sshd[184255]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:39:43 np0005548789.localdomain sshd[184255]: Server listening on 0.0.0.0 port 22.
Dec 06 09:39:43 np0005548789.localdomain sshd[184255]: Server listening on :: port 22.
Dec 06 09:39:43 np0005548789.localdomain systemd[1]: Started OpenSSH server daemon.
Dec 06 09:39:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49590 DF PROTO=TCP SPT=55096 DPT=9101 SEQ=578874230 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D48DE00000000001030307) 
Dec 06 09:39:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:44 np0005548789.localdomain sshd[183690]: Received disconnect from 103.192.152.59 port 42988:11: Bye Bye [preauth]
Dec 06 09:39:44 np0005548789.localdomain sshd[183690]: Disconnected from authenticating user root 103.192.152.59 port 42988 [preauth]
Dec 06 09:39:44 np0005548789.localdomain sshd[184362]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:39:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:45 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:45 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:45 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:45 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:39:46 np0005548789.localdomain systemd-sysv-generator[184498]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:39:46 np0005548789.localdomain systemd-rc-local-generator[184494]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:39:46 np0005548789.localdomain sshd[184362]: Received disconnect from 103.234.151.178 port 33356:11: Bye Bye [preauth]
Dec 06 09:39:46 np0005548789.localdomain sshd[184362]: Disconnected from authenticating user root 103.234.151.178 port 33356 [preauth]
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 09:39:46 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:39:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:39:47.269 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:39:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:39:47.270 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:39:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:39:47.271 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:39:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49592 DF PROTO=TCP SPT=55096 DPT=9101 SEQ=578874230 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D499EF0000000001030307) 
Dec 06 09:39:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:39:49 np0005548789.localdomain sudo[164822]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:39:49 np0005548789.localdomain podman[188969]: 2025-12-06 09:39:49.178934202 +0000 UTC m=+0.086720285 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 09:39:49 np0005548789.localdomain podman[188969]: 2025-12-06 09:39:49.20885367 +0000 UTC m=+0.116639683 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:39:49 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:39:49 np0005548789.localdomain podman[189087]: 2025-12-06 09:39:49.296248534 +0000 UTC m=+0.116786618 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 09:39:49 np0005548789.localdomain podman[189087]: 2025-12-06 09:39:49.330261484 +0000 UTC m=+0.150799588 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 09:39:49 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:39:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4181 DF PROTO=TCP SPT=40810 DPT=9105 SEQ=2956413822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D4A2EF0000000001030307) 
Dec 06 09:39:51 np0005548789.localdomain sudo[190761]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:39:51 np0005548789.localdomain sudo[190761]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:39:51 np0005548789.localdomain sudo[190761]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:51 np0005548789.localdomain sudo[190830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:39:51 np0005548789.localdomain sudo[190830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:39:52 np0005548789.localdomain sudo[190830]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:53 np0005548789.localdomain sudo[191412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:39:53 np0005548789.localdomain sudo[191412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:39:53 np0005548789.localdomain sudo[191412]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4182 DF PROTO=TCP SPT=40810 DPT=9105 SEQ=2956413822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D4B2AF0000000001030307) 
Dec 06 09:39:56 np0005548789.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 09:39:56 np0005548789.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 09:39:56 np0005548789.localdomain systemd[1]: man-db-cache-update.service: Consumed 12.693s CPU time.
Dec 06 09:39:56 np0005548789.localdomain systemd[1]: run-r58d9d66a043744a1868bfd422b592b96.service: Deactivated successfully.
Dec 06 09:39:56 np0005548789.localdomain systemd[1]: run-r5671576059fd4d47ade689acb4dab74f.service: Deactivated successfully.
Dec 06 09:39:56 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41395 DF PROTO=TCP SPT=56308 DPT=9882 SEQ=2296920758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D4BF6F0000000001030307) 
Dec 06 09:39:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49594 DF PROTO=TCP SPT=55096 DPT=9101 SEQ=578874230 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D4C9EF0000000001030307) 
Dec 06 09:40:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4183 DF PROTO=TCP SPT=40810 DPT=9105 SEQ=2956413822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D4D3EF0000000001030307) 
Dec 06 09:40:04 np0005548789.localdomain sudo[193203]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxrryeixnypdkycqzkmstiqqmitmjyoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014003.965299-990-64152343105286/AnsiballZ_systemd.py
Dec 06 09:40:04 np0005548789.localdomain sudo[193203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30788 DF PROTO=TCP SPT=40024 DPT=9102 SEQ=2904579337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D4DDAF0000000001030307) 
Dec 06 09:40:04 np0005548789.localdomain python3.9[193205]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:40:04 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:40:05 np0005548789.localdomain systemd-rc-local-generator[193228]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:05 np0005548789.localdomain systemd-sysv-generator[193233]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:05 np0005548789.localdomain sudo[193203]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:05 np0005548789.localdomain sudo[193352]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtlqvdqzwmuembwytpktqkspbpobcdud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014005.4442084-990-246533775786847/AnsiballZ_systemd.py
Dec 06 09:40:05 np0005548789.localdomain sudo[193352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:06 np0005548789.localdomain python3.9[193354]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:40:06 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:40:06 np0005548789.localdomain systemd-rc-local-generator[193381]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:06 np0005548789.localdomain systemd-sysv-generator[193387]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:06 np0005548789.localdomain sudo[193352]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:06 np0005548789.localdomain sudo[193501]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkbutwsobablnyvuoupaotwlrypzagmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014006.4990308-990-97932239729169/AnsiballZ_systemd.py
Dec 06 09:40:06 np0005548789.localdomain sudo[193501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:07 np0005548789.localdomain python3.9[193503]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:40:07 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:40:07 np0005548789.localdomain systemd-rc-local-generator[193532]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:07 np0005548789.localdomain systemd-sysv-generator[193537]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:07 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:07 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:07 np0005548789.localdomain sudo[193501]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45660 DF PROTO=TCP SPT=38490 DPT=9102 SEQ=4244599726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D4E9EF0000000001030307) 
Dec 06 09:40:08 np0005548789.localdomain sudo[193650]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acpwghodykxegrhjzuxzidqvwkazbcvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014008.189233-990-77695758148642/AnsiballZ_systemd.py
Dec 06 09:40:08 np0005548789.localdomain sudo[193650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:08 np0005548789.localdomain python3.9[193652]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:40:08 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:40:08 np0005548789.localdomain systemd-rc-local-generator[193677]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:08 np0005548789.localdomain systemd-sysv-generator[193682]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:09 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:09 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:40:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.1 total, 600.0 interval
                                                          Cumulative writes: 5761 writes, 25K keys, 5761 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5761 writes, 760 syncs, 7.58 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b7610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b7610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b7610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55d1180b62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 09:40:09 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:09 np0005548789.localdomain sudo[193650]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:09 np0005548789.localdomain sudo[193799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyignsxrpjoxpencgbanmfqhjmrggyxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014009.404592-1077-83491163076363/AnsiballZ_systemd.py
Dec 06 09:40:09 np0005548789.localdomain sudo[193799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:09 np0005548789.localdomain python3.9[193801]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:10 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:40:10 np0005548789.localdomain systemd-sysv-generator[193830]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:10 np0005548789.localdomain systemd-rc-local-generator[193827]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:10 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:10 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:10 np0005548789.localdomain sudo[193799]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30790 DF PROTO=TCP SPT=40024 DPT=9102 SEQ=2904579337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D4F56F0000000001030307) 
Dec 06 09:40:10 np0005548789.localdomain sudo[193948]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-albgplrfmsaodcgpthzcvtscppdhktmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014010.5522602-1077-172953861534738/AnsiballZ_systemd.py
Dec 06 09:40:10 np0005548789.localdomain sudo[193948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:11 np0005548789.localdomain python3.9[193950]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:11 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:40:11 np0005548789.localdomain systemd-rc-local-generator[193979]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:11 np0005548789.localdomain systemd-sysv-generator[193983]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:11 np0005548789.localdomain sudo[193948]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:11 np0005548789.localdomain sudo[194096]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqtpxoguoirxaiocywnbwrbvdoonromz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014011.6593404-1077-2392118322240/AnsiballZ_systemd.py
Dec 06 09:40:11 np0005548789.localdomain sudo[194096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:12 np0005548789.localdomain python3.9[194098]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:12 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:40:12 np0005548789.localdomain systemd-rc-local-generator[194127]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:12 np0005548789.localdomain systemd-sysv-generator[194131]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:12 np0005548789.localdomain sudo[194096]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:40:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.2 total, 600.0 interval
                                                          Cumulative writes: 4879 writes, 21K keys, 4879 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4879 writes, 669 syncs, 7.29 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.023       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468eb610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468eb610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.030       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.03              0.00         1    0.030       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.03              0.00         1    0.030       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468eb610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.005       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.2 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55e1468ea2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Dec 06 09:40:12 np0005548789.localdomain sudo[194244]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqyjqdhtyjlukwmmzajnuenzvzvvkant ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014012.7413461-1077-106527557957633/AnsiballZ_systemd.py
Dec 06 09:40:12 np0005548789.localdomain sudo[194244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:13 np0005548789.localdomain python3.9[194246]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:13 np0005548789.localdomain sudo[194244]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:13 np0005548789.localdomain sudo[194357]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nawcbsdkdipltoimsnvedhjgueyfdsaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014013.475953-1077-139531868114993/AnsiballZ_systemd.py
Dec 06 09:40:13 np0005548789.localdomain sudo[194357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:14 np0005548789.localdomain python3.9[194359]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:14 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:40:14 np0005548789.localdomain systemd-rc-local-generator[194391]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:14 np0005548789.localdomain systemd-sysv-generator[194394]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22094 DF PROTO=TCP SPT=54726 DPT=9101 SEQ=1750144204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D503100000000001030307) 
Dec 06 09:40:14 np0005548789.localdomain sudo[194357]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22096 DF PROTO=TCP SPT=54726 DPT=9101 SEQ=1750144204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D50F2F0000000001030307) 
Dec 06 09:40:17 np0005548789.localdomain sudo[194505]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffsvfslifsiztixzrcgqquxgeyzpcvmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014017.5399532-1185-159332138222466/AnsiballZ_systemd.py
Dec 06 09:40:17 np0005548789.localdomain sudo[194505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:18 np0005548789.localdomain python3.9[194507]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:40:18 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:40:18 np0005548789.localdomain systemd-sysv-generator[194540]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:18 np0005548789.localdomain systemd-rc-local-generator[194537]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:40:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:40:18 np0005548789.localdomain sudo[194505]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32175 DF PROTO=TCP SPT=45532 DPT=9105 SEQ=4017140069 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D518300000000001030307) 
Dec 06 09:40:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:40:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:40:19 np0005548789.localdomain podman[194613]: 2025-12-06 09:40:19.917653922 +0000 UTC m=+0.072830403 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 06 09:40:19 np0005548789.localdomain podman[194613]: 2025-12-06 09:40:19.956200395 +0000 UTC m=+0.111376836 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:40:19 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:40:20 np0005548789.localdomain sudo[194698]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uiqttckhqxdycxqmzdzldnhmmsudyumm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014019.7002683-1209-38613681559234/AnsiballZ_systemd.py
Dec 06 09:40:20 np0005548789.localdomain sudo[194698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:20 np0005548789.localdomain podman[194617]: 2025-12-06 09:40:19.966157461 +0000 UTC m=+0.119309403 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent)
Dec 06 09:40:20 np0005548789.localdomain podman[194617]: 2025-12-06 09:40:20.05001307 +0000 UTC m=+0.203165012 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec 06 09:40:20 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:40:20 np0005548789.localdomain python3.9[194700]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:20 np0005548789.localdomain sudo[194698]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:20 np0005548789.localdomain sudo[194811]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-caepojvugsjegalcqbhglvcyfnniikqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014020.5011563-1209-163605768108414/AnsiballZ_systemd.py
Dec 06 09:40:20 np0005548789.localdomain sudo[194811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:21 np0005548789.localdomain python3.9[194813]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:21 np0005548789.localdomain sudo[194811]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:21 np0005548789.localdomain sudo[194924]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyoeumqhnahvvypmygtzrzvbchchhhwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014021.6130376-1209-51817359700675/AnsiballZ_systemd.py
Dec 06 09:40:21 np0005548789.localdomain sudo[194924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:22 np0005548789.localdomain sshd[194927]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:40:22 np0005548789.localdomain python3.9[194926]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:23 np0005548789.localdomain sudo[194924]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:23 np0005548789.localdomain sudo[195039]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pslayszarohofxscpxxovnvzfjrgnoau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014023.3245368-1209-104365499489673/AnsiballZ_systemd.py
Dec 06 09:40:23 np0005548789.localdomain sudo[195039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32176 DF PROTO=TCP SPT=45532 DPT=9105 SEQ=4017140069 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D527EF0000000001030307) 
Dec 06 09:40:23 np0005548789.localdomain python3.9[195041]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:24 np0005548789.localdomain sshd[194927]: Received disconnect from 103.157.25.60 port 53006:11: Bye Bye [preauth]
Dec 06 09:40:24 np0005548789.localdomain sshd[194927]: Disconnected from authenticating user root 103.157.25.60 port 53006 [preauth]
Dec 06 09:40:24 np0005548789.localdomain sudo[195039]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:25 np0005548789.localdomain sudo[195152]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bafdnhyhubpnjuzcshpbaknhbmijqpim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014025.085418-1209-36438456020377/AnsiballZ_systemd.py
Dec 06 09:40:25 np0005548789.localdomain sudo[195152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:25 np0005548789.localdomain python3.9[195154]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:25 np0005548789.localdomain sudo[195152]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41398 DF PROTO=TCP SPT=56308 DPT=9882 SEQ=2296920758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D52FEF0000000001030307) 
Dec 06 09:40:26 np0005548789.localdomain sudo[195265]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csffkqvgnpceoznqdfswzcdkjubbngob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014025.833557-1209-273629658989626/AnsiballZ_systemd.py
Dec 06 09:40:26 np0005548789.localdomain sudo[195265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:26 np0005548789.localdomain sshd[195268]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:40:26 np0005548789.localdomain python3.9[195267]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:27 np0005548789.localdomain sudo[195265]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:27 np0005548789.localdomain sshd[195268]: Received disconnect from 118.193.38.207 port 37620:11: Bye Bye [preauth]
Dec 06 09:40:27 np0005548789.localdomain sshd[195268]: Disconnected from authenticating user root 118.193.38.207 port 37620 [preauth]
Dec 06 09:40:27 np0005548789.localdomain sudo[195380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdmgswgegahyxftjfteipxkuyeurphmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014027.6236632-1209-67638260131492/AnsiballZ_systemd.py
Dec 06 09:40:27 np0005548789.localdomain sudo[195380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:28 np0005548789.localdomain python3.9[195382]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:28 np0005548789.localdomain sudo[195380]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:28 np0005548789.localdomain sudo[195493]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tejvfupongolymygstfntkemxhxoyovh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014028.374816-1209-25538972865489/AnsiballZ_systemd.py
Dec 06 09:40:28 np0005548789.localdomain sudo[195493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:29 np0005548789.localdomain python3.9[195495]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:29 np0005548789.localdomain sudo[195493]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:29 np0005548789.localdomain sudo[195606]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxrtfjiwrbcecxlytkwoquxqhxrpqnts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014029.1949222-1209-280500870532862/AnsiballZ_systemd.py
Dec 06 09:40:29 np0005548789.localdomain sudo[195606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:29 np0005548789.localdomain python3.9[195608]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:29 np0005548789.localdomain sudo[195606]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22098 DF PROTO=TCP SPT=54726 DPT=9101 SEQ=1750144204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D53FF00000000001030307) 
Dec 06 09:40:30 np0005548789.localdomain sudo[195719]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzmhojqppxrhdogruknvvsolqprgbgbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014029.97919-1209-213885563493969/AnsiballZ_systemd.py
Dec 06 09:40:30 np0005548789.localdomain sudo[195719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:30 np0005548789.localdomain python3.9[195721]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:30 np0005548789.localdomain sudo[195719]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:31 np0005548789.localdomain sudo[195832]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzglljvkmrbwubvjbtlyifwlyfyirjuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014031.499242-1209-268943276434123/AnsiballZ_systemd.py
Dec 06 09:40:31 np0005548789.localdomain sudo[195832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32177 DF PROTO=TCP SPT=45532 DPT=9105 SEQ=4017140069 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D547EF0000000001030307) 
Dec 06 09:40:32 np0005548789.localdomain python3.9[195834]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:32 np0005548789.localdomain sudo[195832]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:32 np0005548789.localdomain sudo[195945]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfxgzcqvwmxbtrtgzieovrekqpcswqxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014032.2578962-1209-149429327850969/AnsiballZ_systemd.py
Dec 06 09:40:32 np0005548789.localdomain sudo[195945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:32 np0005548789.localdomain python3.9[195947]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:32 np0005548789.localdomain sudo[195945]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:33 np0005548789.localdomain sudo[196058]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xaybpeolzmuatbmtarduraomllrmfolq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014033.082697-1209-94145202085792/AnsiballZ_systemd.py
Dec 06 09:40:33 np0005548789.localdomain sudo[196058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:33 np0005548789.localdomain python3.9[196060]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:33 np0005548789.localdomain sudo[196058]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:34 np0005548789.localdomain sudo[196171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijfpknubajzafxvqldfmwdnxjchxbiwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014034.3314204-1209-11493335819186/AnsiballZ_systemd.py
Dec 06 09:40:34 np0005548789.localdomain sudo[196171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=179 DF PROTO=TCP SPT=45314 DPT=9102 SEQ=2710989866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D552EF0000000001030307) 
Dec 06 09:40:34 np0005548789.localdomain python3.9[196173]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:40:34 np0005548789.localdomain sudo[196171]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:35 np0005548789.localdomain sudo[196284]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyksltnkrnisgnhyjclkdzroqchghkba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014035.5040038-1515-141063771431211/AnsiballZ_file.py
Dec 06 09:40:35 np0005548789.localdomain sudo[196284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:35 np0005548789.localdomain python3.9[196286]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:40:35 np0005548789.localdomain sudo[196284]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:36 np0005548789.localdomain sudo[196394]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aezykssbwviujcptayvzlplbcjmdkqjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014036.0971665-1515-215934314473552/AnsiballZ_file.py
Dec 06 09:40:36 np0005548789.localdomain sudo[196394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:36 np0005548789.localdomain sshd[196397]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:40:36 np0005548789.localdomain python3.9[196396]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:40:36 np0005548789.localdomain sudo[196394]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:36 np0005548789.localdomain sudo[196506]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zeobhvkwmljubjsywvsdtrtwsemgxylk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014036.7129068-1515-173230782984768/AnsiballZ_file.py
Dec 06 09:40:36 np0005548789.localdomain sudo[196506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:37 np0005548789.localdomain python3.9[196508]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:40:37 np0005548789.localdomain sudo[196506]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:37 np0005548789.localdomain sudo[196616]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oralpkkxuuefxjejsnaootzyvyairjhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014037.3502352-1515-93014204128938/AnsiballZ_file.py
Dec 06 09:40:37 np0005548789.localdomain sudo[196616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:37 np0005548789.localdomain python3.9[196618]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:40:37 np0005548789.localdomain sudo[196616]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:37 np0005548789.localdomain sshd[196397]: Received disconnect from 64.227.156.63 port 54046:11: Bye Bye [preauth]
Dec 06 09:40:37 np0005548789.localdomain sshd[196397]: Disconnected from authenticating user root 64.227.156.63 port 54046 [preauth]
Dec 06 09:40:38 np0005548789.localdomain sudo[196726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uswfmmzhesogvkltwxlyzpndfqfhrvbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014037.983122-1515-266962561898451/AnsiballZ_file.py
Dec 06 09:40:38 np0005548789.localdomain sudo[196726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:38 np0005548789.localdomain python3.9[196728]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:40:38 np0005548789.localdomain sudo[196726]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:38 np0005548789.localdomain sudo[196836]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryzijaucztdguhcyvpviyrrsiccpglgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014038.4883232-1515-81397594514676/AnsiballZ_file.py
Dec 06 09:40:38 np0005548789.localdomain sudo[196836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:38 np0005548789.localdomain python3.9[196838]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:40:38 np0005548789.localdomain sudo[196836]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:39 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42484 DF PROTO=TCP SPT=59730 DPT=9882 SEQ=179938619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D563EF0000000001030307) 
Dec 06 09:40:39 np0005548789.localdomain sudo[196946]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iewmlewzgodinismkssraukkjiuvwxpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014039.3518567-1644-243531893126623/AnsiballZ_stat.py
Dec 06 09:40:39 np0005548789.localdomain sudo[196946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:39 np0005548789.localdomain python3.9[196948]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:40 np0005548789.localdomain sudo[196946]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:40 np0005548789.localdomain sudo[197036]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umakvdkccagkmnxhhiyfhtpfufrchnho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014039.3518567-1644-243531893126623/AnsiballZ_copy.py
Dec 06 09:40:40 np0005548789.localdomain sudo[197036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:40 np0005548789.localdomain python3.9[197038]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014039.3518567-1644-243531893126623/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:40 np0005548789.localdomain sudo[197036]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=181 DF PROTO=TCP SPT=45314 DPT=9102 SEQ=2710989866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D56AB00000000001030307) 
Dec 06 09:40:41 np0005548789.localdomain sudo[197147]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpxbpwlmvunaycxkmqbebmxetmfoiobk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014040.800713-1644-215500618630509/AnsiballZ_stat.py
Dec 06 09:40:41 np0005548789.localdomain sudo[197147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:41 np0005548789.localdomain python3.9[197149]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:41 np0005548789.localdomain sudo[197147]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:41 np0005548789.localdomain sudo[197237]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjfpvcrnnuhaeftipsxmclnnmlkfpxzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014040.800713-1644-215500618630509/AnsiballZ_copy.py
Dec 06 09:40:41 np0005548789.localdomain sudo[197237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:41 np0005548789.localdomain python3.9[197239]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014040.800713-1644-215500618630509/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:41 np0005548789.localdomain sudo[197237]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:42 np0005548789.localdomain sudo[197347]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqwljrkenyjxqxzairimwxjpletzubvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014041.8903732-1644-47980772668363/AnsiballZ_stat.py
Dec 06 09:40:42 np0005548789.localdomain sudo[197347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:42 np0005548789.localdomain python3.9[197349]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:42 np0005548789.localdomain sudo[197347]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:42 np0005548789.localdomain sudo[197437]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhepqtokqvkncngginhghwbydrfncvyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014041.8903732-1644-47980772668363/AnsiballZ_copy.py
Dec 06 09:40:42 np0005548789.localdomain sudo[197437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:42 np0005548789.localdomain python3.9[197439]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014041.8903732-1644-47980772668363/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:42 np0005548789.localdomain sudo[197437]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:43 np0005548789.localdomain sudo[197547]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nyuqpsouzgnyimeqvppomjfjazfanphj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014043.0292294-1644-272244161885259/AnsiballZ_stat.py
Dec 06 09:40:43 np0005548789.localdomain sudo[197547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:43 np0005548789.localdomain python3.9[197549]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:43 np0005548789.localdomain sudo[197547]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:43 np0005548789.localdomain sudo[197637]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxbnxykkmyrstvqikatyhazidigaseow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014043.0292294-1644-272244161885259/AnsiballZ_copy.py
Dec 06 09:40:43 np0005548789.localdomain sudo[197637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:44 np0005548789.localdomain python3.9[197639]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014043.0292294-1644-272244161885259/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:44 np0005548789.localdomain sudo[197637]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45037 DF PROTO=TCP SPT=47250 DPT=9101 SEQ=4230461905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D578410000000001030307) 
Dec 06 09:40:44 np0005548789.localdomain sudo[197747]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzzdvgzidxviuehwppkimiobriyxzzkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014044.169851-1644-249129172537316/AnsiballZ_stat.py
Dec 06 09:40:44 np0005548789.localdomain sudo[197747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:44 np0005548789.localdomain python3.9[197749]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:44 np0005548789.localdomain sudo[197747]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:45 np0005548789.localdomain sudo[197837]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyalusjouscsvveipmngqnbdlbpebwna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014044.169851-1644-249129172537316/AnsiballZ_copy.py
Dec 06 09:40:45 np0005548789.localdomain sudo[197837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:45 np0005548789.localdomain python3.9[197839]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014044.169851-1644-249129172537316/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:45 np0005548789.localdomain sudo[197837]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:45 np0005548789.localdomain sudo[197947]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkdaoluuvlakkpqncvjswskqchcvzwpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014045.3556237-1644-109866090232975/AnsiballZ_stat.py
Dec 06 09:40:45 np0005548789.localdomain sudo[197947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:45 np0005548789.localdomain python3.9[197949]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:45 np0005548789.localdomain sudo[197947]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:46 np0005548789.localdomain sudo[198037]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dajustotgyrykkhijxbqbohbcepfsnyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014045.3556237-1644-109866090232975/AnsiballZ_copy.py
Dec 06 09:40:46 np0005548789.localdomain sudo[198037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:46 np0005548789.localdomain python3.9[198039]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014045.3556237-1644-109866090232975/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:46 np0005548789.localdomain sudo[198037]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:46 np0005548789.localdomain sudo[198147]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-minoxncainokpgwvuvzyqpoptyoftubq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014046.500331-1644-144119703668803/AnsiballZ_stat.py
Dec 06 09:40:46 np0005548789.localdomain sudo[198147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:46 np0005548789.localdomain python3.9[198149]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:46 np0005548789.localdomain sudo[198147]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:47 np0005548789.localdomain sudo[198235]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvtwwlzldjcpifpxbthznrgoorybbucr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014046.500331-1644-144119703668803/AnsiballZ_copy.py
Dec 06 09:40:47 np0005548789.localdomain sudo[198235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:40:47.270 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:40:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:40:47.271 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:40:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:40:47.272 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:40:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45039 DF PROTO=TCP SPT=47250 DPT=9101 SEQ=4230461905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D5842F0000000001030307) 
Dec 06 09:40:47 np0005548789.localdomain python3.9[198237]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014046.500331-1644-144119703668803/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:47 np0005548789.localdomain sudo[198235]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:47 np0005548789.localdomain sudo[198345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-quoenrtbnwjonigwklmxhgwwzgbwcqvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014047.5667634-1644-162433883724825/AnsiballZ_stat.py
Dec 06 09:40:47 np0005548789.localdomain sudo[198345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:48 np0005548789.localdomain python3.9[198347]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:48 np0005548789.localdomain sudo[198345]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:48 np0005548789.localdomain sudo[198435]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsrfaauhqrfxtvdahjpprjztmuquuted ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014047.5667634-1644-162433883724825/AnsiballZ_copy.py
Dec 06 09:40:48 np0005548789.localdomain sudo[198435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:48 np0005548789.localdomain python3.9[198437]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014047.5667634-1644-162433883724825/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:48 np0005548789.localdomain sudo[198435]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:49 np0005548789.localdomain sudo[198545]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jusgzxpiixuhrrqwcxejmauqlotpbyjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014049.2191782-1986-118769325520106/AnsiballZ_file.py
Dec 06 09:40:49 np0005548789.localdomain sudo[198545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:49 np0005548789.localdomain python3.9[198547]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:49 np0005548789.localdomain sudo[198545]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62226 DF PROTO=TCP SPT=60330 DPT=9105 SEQ=1203010640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D58D700000000001030307) 
Dec 06 09:40:50 np0005548789.localdomain sudo[198655]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dllzfxlwuqfnpcjigwjjerhyztzlbgpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014049.8672323-2010-138592587534942/AnsiballZ_file.py
Dec 06 09:40:50 np0005548789.localdomain sudo[198655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:40:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:40:50 np0005548789.localdomain podman[198659]: 2025-12-06 09:40:50.236566766 +0000 UTC m=+0.079256505 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 09:40:50 np0005548789.localdomain podman[198659]: 2025-12-06 09:40:50.272157211 +0000 UTC m=+0.114846950 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 09:40:50 np0005548789.localdomain podman[198657]: 2025-12-06 09:40:50.282191017 +0000 UTC m=+0.125043911 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 09:40:50 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:40:50 np0005548789.localdomain podman[198657]: 2025-12-06 09:40:50.321109842 +0000 UTC m=+0.163962796 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:40:50 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:40:50 np0005548789.localdomain python3.9[198658]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:50 np0005548789.localdomain sudo[198655]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:50 np0005548789.localdomain sudo[198806]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkofdfeolwycanyhhqojqhplrmfjsyrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014050.471646-2010-164435876497110/AnsiballZ_file.py
Dec 06 09:40:50 np0005548789.localdomain sudo[198806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:51 np0005548789.localdomain python3.9[198808]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:51 np0005548789.localdomain sudo[198806]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:51 np0005548789.localdomain sudo[198916]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqiqsfemhhrybkwsdgbdddxrjplwlpvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014051.1526392-2010-196561184707710/AnsiballZ_file.py
Dec 06 09:40:51 np0005548789.localdomain sudo[198916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:51 np0005548789.localdomain python3.9[198918]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:51 np0005548789.localdomain sudo[198916]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:52 np0005548789.localdomain sudo[199026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-doicghlizeaumpeiqzpnowflqaypqhwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014051.8011417-2010-117809892502238/AnsiballZ_file.py
Dec 06 09:40:52 np0005548789.localdomain sudo[199026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:52 np0005548789.localdomain python3.9[199028]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:52 np0005548789.localdomain sudo[199026]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:52 np0005548789.localdomain sudo[199136]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjfabewoffvknrhdammjfboaggorobqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014052.4082959-2010-50114431494287/AnsiballZ_file.py
Dec 06 09:40:52 np0005548789.localdomain sudo[199136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:52 np0005548789.localdomain python3.9[199138]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:52 np0005548789.localdomain sudo[199136]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:53 np0005548789.localdomain sudo[199246]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amenmjapkziceajqagrscovhtretvicc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014053.069786-2010-98892199331692/AnsiballZ_file.py
Dec 06 09:40:53 np0005548789.localdomain sudo[199246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:53 np0005548789.localdomain sudo[199247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:40:53 np0005548789.localdomain sudo[199247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:40:53 np0005548789.localdomain sudo[199247]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:53 np0005548789.localdomain sudo[199267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:40:53 np0005548789.localdomain sudo[199267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:40:53 np0005548789.localdomain python3.9[199264]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:53 np0005548789.localdomain sudo[199246]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62227 DF PROTO=TCP SPT=60330 DPT=9105 SEQ=1203010640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D59D2F0000000001030307) 
Dec 06 09:40:53 np0005548789.localdomain sudo[199410]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-livodioidkjgrwgxhufawdmlaihavlrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014053.6702082-2010-240360274014139/AnsiballZ_file.py
Dec 06 09:40:53 np0005548789.localdomain sudo[199410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:54 np0005548789.localdomain sudo[199267]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:54 np0005548789.localdomain python3.9[199412]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:54 np0005548789.localdomain sudo[199410]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:54 np0005548789.localdomain sudo[199534]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgsgjthkrensaakuepbwrkiskhtvoqoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014054.284459-2010-108904087847726/AnsiballZ_file.py
Dec 06 09:40:54 np0005548789.localdomain sudo[199534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:54 np0005548789.localdomain python3.9[199536]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:54 np0005548789.localdomain sudo[199537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:40:54 np0005548789.localdomain sudo[199537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:40:54 np0005548789.localdomain sudo[199534]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:54 np0005548789.localdomain sudo[199537]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:55 np0005548789.localdomain sudo[199662]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtqysfjllofxtpkiufodeairrumgwbal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014054.8810952-2010-43235541038806/AnsiballZ_file.py
Dec 06 09:40:55 np0005548789.localdomain sudo[199662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:55 np0005548789.localdomain python3.9[199664]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:55 np0005548789.localdomain sudo[199662]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:55 np0005548789.localdomain sudo[199772]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymdoldqgjzhejtbzktahzblcbpdaganj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014055.469272-2010-244957733509354/AnsiballZ_file.py
Dec 06 09:40:55 np0005548789.localdomain sudo[199772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:56 np0005548789.localdomain python3.9[199774]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:56 np0005548789.localdomain sudo[199772]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:56 np0005548789.localdomain sudo[199882]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-saijtaaxafqbrfpgnocjyaagdqzrwysi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014056.1966648-2010-98817962138688/AnsiballZ_file.py
Dec 06 09:40:56 np0005548789.localdomain sudo[199882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:56 np0005548789.localdomain python3.9[199884]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:56 np0005548789.localdomain sudo[199882]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:57 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27768 DF PROTO=TCP SPT=56404 DPT=9882 SEQ=532615157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D5A9EF0000000001030307) 
Dec 06 09:40:57 np0005548789.localdomain sudo[199992]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpfjxobvgfitgfjejxvdadtsrtgoyvhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014056.8442082-2010-208284455046762/AnsiballZ_file.py
Dec 06 09:40:57 np0005548789.localdomain sudo[199992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:57 np0005548789.localdomain python3.9[199994]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:57 np0005548789.localdomain sudo[199992]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:57 np0005548789.localdomain sudo[200102]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wilnwxzzzosjzopykvgqzvtenweaeheq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014057.4704545-2010-54407706961248/AnsiballZ_file.py
Dec 06 09:40:57 np0005548789.localdomain sudo[200102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:57 np0005548789.localdomain python3.9[200104]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:57 np0005548789.localdomain sudo[200102]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:58 np0005548789.localdomain sudo[200212]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdooxqquwyevwyhjiwglutzwmwsavudi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014058.081082-2010-66624843667190/AnsiballZ_file.py
Dec 06 09:40:58 np0005548789.localdomain sudo[200212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:59 np0005548789.localdomain python3.9[200214]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:40:59 np0005548789.localdomain sudo[200212]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45041 DF PROTO=TCP SPT=47250 DPT=9101 SEQ=4230461905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D5B3EF0000000001030307) 
Dec 06 09:40:59 np0005548789.localdomain sudo[200322]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpkytykxxsrklnpsfchjyxfeonjkbamj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014059.3268197-2307-260790214299972/AnsiballZ_stat.py
Dec 06 09:40:59 np0005548789.localdomain sudo[200322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:59 np0005548789.localdomain python3.9[200324]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:40:59 np0005548789.localdomain sudo[200322]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62228 DF PROTO=TCP SPT=60330 DPT=9105 SEQ=1203010640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D5BDF00000000001030307) 
Dec 06 09:41:02 np0005548789.localdomain sudo[200410]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjkhisdwovyniiktyvlrrewbecuqnypm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014059.3268197-2307-260790214299972/AnsiballZ_copy.py
Dec 06 09:41:02 np0005548789.localdomain sudo[200410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:02 np0005548789.localdomain python3.9[200412]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014059.3268197-2307-260790214299972/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:02 np0005548789.localdomain sudo[200410]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:03 np0005548789.localdomain sudo[200520]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbqrnmbqnlavataeueknmdnvvfzzqrii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014062.7543066-2307-279831913583817/AnsiballZ_stat.py
Dec 06 09:41:03 np0005548789.localdomain sudo[200520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:03 np0005548789.localdomain python3.9[200522]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:03 np0005548789.localdomain sudo[200520]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:03 np0005548789.localdomain sudo[200608]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evbogntpxoujwdmfsjlxdqlhnigjduln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014062.7543066-2307-279831913583817/AnsiballZ_copy.py
Dec 06 09:41:03 np0005548789.localdomain sudo[200608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:03 np0005548789.localdomain python3.9[200610]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014062.7543066-2307-279831913583817/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:03 np0005548789.localdomain sudo[200608]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:04 np0005548789.localdomain sudo[200718]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlexhhneyaewvxxwptqrngvmydhatkqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014063.9284487-2307-91829065378216/AnsiballZ_stat.py
Dec 06 09:41:04 np0005548789.localdomain sudo[200718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:04 np0005548789.localdomain python3.9[200720]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:04 np0005548789.localdomain sudo[200718]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56345 DF PROTO=TCP SPT=41422 DPT=9102 SEQ=3406258533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D5C8300000000001030307) 
Dec 06 09:41:04 np0005548789.localdomain sudo[200806]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnamysysoxhkguoahfdvabzrxljxnueo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014063.9284487-2307-91829065378216/AnsiballZ_copy.py
Dec 06 09:41:04 np0005548789.localdomain sudo[200806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:04 np0005548789.localdomain python3.9[200808]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014063.9284487-2307-91829065378216/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:05 np0005548789.localdomain sudo[200806]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:05 np0005548789.localdomain sudo[200916]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjqcbshdbknnwcxqbhjyveduackfsnys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014065.1403632-2307-145135552790145/AnsiballZ_stat.py
Dec 06 09:41:05 np0005548789.localdomain sudo[200916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:05 np0005548789.localdomain python3.9[200918]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:05 np0005548789.localdomain sudo[200916]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:06 np0005548789.localdomain sudo[201004]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wokeqzjomdavlsfccuhzjplvmibpcoux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014065.1403632-2307-145135552790145/AnsiballZ_copy.py
Dec 06 09:41:06 np0005548789.localdomain sudo[201004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:06 np0005548789.localdomain python3.9[201006]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014065.1403632-2307-145135552790145/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:06 np0005548789.localdomain sudo[201004]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:06 np0005548789.localdomain sudo[201114]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xeimeimyjzemimuwpwdgwrkbgxwoximv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014066.3170414-2307-152918529932135/AnsiballZ_stat.py
Dec 06 09:41:06 np0005548789.localdomain sudo[201114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:06 np0005548789.localdomain python3.9[201116]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:06 np0005548789.localdomain sudo[201114]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:07 np0005548789.localdomain sudo[201202]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmudwaqoygnzdnwibtrcldyazrjabbkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014066.3170414-2307-152918529932135/AnsiballZ_copy.py
Dec 06 09:41:07 np0005548789.localdomain sudo[201202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:07 np0005548789.localdomain python3.9[201204]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014066.3170414-2307-152918529932135/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:07 np0005548789.localdomain sudo[201202]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:07 np0005548789.localdomain sudo[201312]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-crbhpxphqhhulkwmwndwbawcxefndlvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014067.408739-2307-265050007237985/AnsiballZ_stat.py
Dec 06 09:41:07 np0005548789.localdomain sudo[201312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30793 DF PROTO=TCP SPT=40024 DPT=9102 SEQ=2904579337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D5D3EF0000000001030307) 
Dec 06 09:41:07 np0005548789.localdomain python3.9[201314]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:07 np0005548789.localdomain sudo[201312]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:08 np0005548789.localdomain sudo[201400]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azourznilvnmsvuqybciuegcbowysfge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014067.408739-2307-265050007237985/AnsiballZ_copy.py
Dec 06 09:41:08 np0005548789.localdomain sudo[201400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:08 np0005548789.localdomain python3.9[201402]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014067.408739-2307-265050007237985/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:08 np0005548789.localdomain sudo[201400]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:08 np0005548789.localdomain sudo[201510]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvvnawhntplgopvljxqurvjorfabsujm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014068.547441-2307-254833164115308/AnsiballZ_stat.py
Dec 06 09:41:08 np0005548789.localdomain sudo[201510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:08 np0005548789.localdomain python3.9[201512]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:08 np0005548789.localdomain sudo[201510]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:09 np0005548789.localdomain sudo[201598]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbyaoacqlwjoffvadowsidjayhtgopmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014068.547441-2307-254833164115308/AnsiballZ_copy.py
Dec 06 09:41:09 np0005548789.localdomain sudo[201598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:09 np0005548789.localdomain python3.9[201600]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014068.547441-2307-254833164115308/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:09 np0005548789.localdomain sudo[201598]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:09 np0005548789.localdomain sudo[201708]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzalvlwbocvhhstqvtzjguorbuxcdzcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014069.665544-2307-257104182139347/AnsiballZ_stat.py
Dec 06 09:41:09 np0005548789.localdomain sudo[201708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:10 np0005548789.localdomain python3.9[201710]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:10 np0005548789.localdomain sudo[201708]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:10 np0005548789.localdomain sudo[201796]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfixbeqbnushuskqepzbvykochsynheu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014069.665544-2307-257104182139347/AnsiballZ_copy.py
Dec 06 09:41:10 np0005548789.localdomain sudo[201796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:10 np0005548789.localdomain python3.9[201798]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014069.665544-2307-257104182139347/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:10 np0005548789.localdomain sudo[201796]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56347 DF PROTO=TCP SPT=41422 DPT=9102 SEQ=3406258533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D5DFEF0000000001030307) 
Dec 06 09:41:10 np0005548789.localdomain sudo[201906]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orjvtqccfvmhtojnhsiwgponstjochzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014070.751107-2307-225721217987632/AnsiballZ_stat.py
Dec 06 09:41:10 np0005548789.localdomain sudo[201906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:11 np0005548789.localdomain python3.9[201908]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:11 np0005548789.localdomain sudo[201906]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:11 np0005548789.localdomain sudo[201994]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emzvinzbtpilzrdqxhtcsbgrzzdoefcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014070.751107-2307-225721217987632/AnsiballZ_copy.py
Dec 06 09:41:11 np0005548789.localdomain sudo[201994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:11 np0005548789.localdomain python3.9[201996]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014070.751107-2307-225721217987632/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:11 np0005548789.localdomain sudo[201994]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:12 np0005548789.localdomain sudo[202104]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sypgzvgdtgycwwxvfvbjvncbedfevvoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014071.8027453-2307-51517089730812/AnsiballZ_stat.py
Dec 06 09:41:12 np0005548789.localdomain sudo[202104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:12 np0005548789.localdomain python3.9[202106]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:12 np0005548789.localdomain sudo[202104]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:12 np0005548789.localdomain sudo[202192]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xaaaunptlabtiqcfhqbzddltxddlqkob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014071.8027453-2307-51517089730812/AnsiballZ_copy.py
Dec 06 09:41:12 np0005548789.localdomain sudo[202192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:12 np0005548789.localdomain python3.9[202194]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014071.8027453-2307-51517089730812/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:12 np0005548789.localdomain sshd[202195]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:41:12 np0005548789.localdomain sudo[202192]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:13 np0005548789.localdomain sudo[202304]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jyvagyjrkgwtxcpnfkodjraglwbjbzch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014072.8748844-2307-15507973832661/AnsiballZ_stat.py
Dec 06 09:41:13 np0005548789.localdomain sudo[202304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:13 np0005548789.localdomain python3.9[202306]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:13 np0005548789.localdomain sudo[202304]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:13 np0005548789.localdomain sudo[202392]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gflxserqjcchxnbufmmcmwvvtuxrzhdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014072.8748844-2307-15507973832661/AnsiballZ_copy.py
Dec 06 09:41:13 np0005548789.localdomain sudo[202392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:13 np0005548789.localdomain python3.9[202394]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014072.8748844-2307-15507973832661/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:13 np0005548789.localdomain sudo[202392]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:14 np0005548789.localdomain sudo[202502]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cebdbbiqcgixdfktrdnpqivrrdmyjuev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014073.9188316-2307-209534536075623/AnsiballZ_stat.py
Dec 06 09:41:14 np0005548789.localdomain sudo[202502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:14 np0005548789.localdomain sshd[202195]: Received disconnect from 103.234.151.178 port 59484:11: Bye Bye [preauth]
Dec 06 09:41:14 np0005548789.localdomain sshd[202195]: Disconnected from authenticating user root 103.234.151.178 port 59484 [preauth]
Dec 06 09:41:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16008 DF PROTO=TCP SPT=48766 DPT=9101 SEQ=2535142758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D5ED700000000001030307) 
Dec 06 09:41:14 np0005548789.localdomain python3.9[202504]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:14 np0005548789.localdomain sudo[202502]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:14 np0005548789.localdomain sudo[202590]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dazzrjahsqrathobajsokfqwhkrlxwip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014073.9188316-2307-209534536075623/AnsiballZ_copy.py
Dec 06 09:41:14 np0005548789.localdomain sudo[202590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:14 np0005548789.localdomain python3.9[202592]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014073.9188316-2307-209534536075623/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:14 np0005548789.localdomain sudo[202590]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 np0005548789.localdomain sudo[202700]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-daecmtzjuyjeyijclugedjhvlhgzrvpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014075.0138829-2307-267883138721481/AnsiballZ_stat.py
Dec 06 09:41:15 np0005548789.localdomain sudo[202700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:15 np0005548789.localdomain python3.9[202702]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:15 np0005548789.localdomain sudo[202700]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 np0005548789.localdomain sudo[202788]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-znwmutjwzgbblvjounexdoaoqepazvjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014075.0138829-2307-267883138721481/AnsiballZ_copy.py
Dec 06 09:41:15 np0005548789.localdomain sudo[202788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:16 np0005548789.localdomain python3.9[202790]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014075.0138829-2307-267883138721481/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:16 np0005548789.localdomain sudo[202788]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:16 np0005548789.localdomain sudo[202898]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubvelqqjqdjjwhjdefmooszgnfichnmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014076.3405855-2307-204485640268918/AnsiballZ_stat.py
Dec 06 09:41:16 np0005548789.localdomain sudo[202898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:16 np0005548789.localdomain python3.9[202900]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:16 np0005548789.localdomain sudo[202898]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:17 np0005548789.localdomain sudo[202986]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rncjdhdgheulnvvyjsubotaqzogktzbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014076.3405855-2307-204485640268918/AnsiballZ_copy.py
Dec 06 09:41:17 np0005548789.localdomain sudo[202986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:17 np0005548789.localdomain python3.9[202988]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014076.3405855-2307-204485640268918/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:17 np0005548789.localdomain sudo[202986]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16010 DF PROTO=TCP SPT=48766 DPT=9101 SEQ=2535142758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D5F96F0000000001030307) 
Dec 06 09:41:18 np0005548789.localdomain python3.9[203096]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:41:19 np0005548789.localdomain sudo[203207]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfxzxkgarjuaypqiqxiuqxiyiyezjlsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014079.143878-2925-231200296595292/AnsiballZ_seboolean.py
Dec 06 09:41:19 np0005548789.localdomain sudo[203207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50792 DF PROTO=TCP SPT=56146 DPT=9105 SEQ=2037736614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D602AF0000000001030307) 
Dec 06 09:41:19 np0005548789.localdomain python3.9[203209]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 06 09:41:19 np0005548789.localdomain sudo[203207]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:41:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:41:20 np0005548789.localdomain podman[203228]: 2025-12-06 09:41:20.949578013 +0000 UTC m=+0.106627260 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:41:20 np0005548789.localdomain podman[203227]: 2025-12-06 09:41:20.991031766 +0000 UTC m=+0.147244348 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller)
Dec 06 09:41:21 np0005548789.localdomain podman[203227]: 2025-12-06 09:41:21.023118073 +0000 UTC m=+0.179330665 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 06 09:41:21 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:41:21 np0005548789.localdomain podman[203228]: 2025-12-06 09:41:21.092179568 +0000 UTC m=+0.249228795 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:41:21 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:41:21 np0005548789.localdomain sudo[203359]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hghnmbdqbsokcedjfvecnqwmdozqbyud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014081.199904-2956-32727655742772/AnsiballZ_systemd.py
Dec 06 09:41:21 np0005548789.localdomain sudo[203359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:21 np0005548789.localdomain python3.9[203361]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:41:21 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:41:21 np0005548789.localdomain systemd-sysv-generator[203389]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:21 np0005548789.localdomain systemd-rc-local-generator[203383]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:21 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:21 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:21 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:21 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:22 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:41:22 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:22 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:22 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:22 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:22 np0005548789.localdomain systemd[1]: Starting libvirt logging daemon socket...
Dec 06 09:41:22 np0005548789.localdomain systemd[1]: Listening on libvirt logging daemon socket.
Dec 06 09:41:22 np0005548789.localdomain systemd[1]: Starting libvirt logging daemon admin socket...
Dec 06 09:41:22 np0005548789.localdomain systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 06 09:41:22 np0005548789.localdomain systemd[1]: Starting libvirt logging daemon...
Dec 06 09:41:22 np0005548789.localdomain systemd[1]: Started libvirt logging daemon.
Dec 06 09:41:22 np0005548789.localdomain sudo[203359]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:22 np0005548789.localdomain sshd[203420]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:41:22 np0005548789.localdomain sudo[203512]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfljiolvymrrbagviqwlgkihrxccvtrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014082.4475873-2956-213960783978576/AnsiballZ_systemd.py
Dec 06 09:41:22 np0005548789.localdomain sudo[203512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:23 np0005548789.localdomain python3.9[203514]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:41:23 np0005548789.localdomain systemd-rc-local-generator[203541]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:23 np0005548789.localdomain systemd-sysv-generator[203544]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: Starting libvirt nodedev daemon socket...
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 06 09:41:23 np0005548789.localdomain systemd[1]: Started libvirt nodedev daemon.
Dec 06 09:41:23 np0005548789.localdomain sudo[203512]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50793 DF PROTO=TCP SPT=56146 DPT=9105 SEQ=2037736614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D612700000000001030307) 
Dec 06 09:41:23 np0005548789.localdomain sudo[203686]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgabidqdruftqgtfjjgedvdkwyvsjgra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014083.591283-2956-249667244035639/AnsiballZ_systemd.py
Dec 06 09:41:23 np0005548789.localdomain sudo[203686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:24 np0005548789.localdomain python3.9[203688]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:41:24 np0005548789.localdomain systemd-rc-local-generator[203712]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:24 np0005548789.localdomain systemd-sysv-generator[203717]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:24 np0005548789.localdomain sshd[203420]: Received disconnect from 103.192.152.59 port 35776:11: Bye Bye [preauth]
Dec 06 09:41:24 np0005548789.localdomain sshd[203420]: Disconnected from authenticating user root 103.192.152.59 port 35776 [preauth]
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: Starting libvirt proxy daemon socket...
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: Listening on libvirt proxy daemon socket.
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: Started libvirt proxy daemon.
Dec 06 09:41:24 np0005548789.localdomain sudo[203686]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 06 09:41:24 np0005548789.localdomain setroubleshoot[203725]: Deleting alert 58e2bb45-d8cf-42a0-b321-404a4f96b4c3, it is allowed in current policy
Dec 06 09:41:24 np0005548789.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service.
Dec 06 09:41:24 np0005548789.localdomain sudo[203864]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvepvcxpijulyzloqqdcvgdtzqxvlzuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014084.638951-2956-156606512903677/AnsiballZ_systemd.py
Dec 06 09:41:24 np0005548789.localdomain sudo[203864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:25 np0005548789.localdomain python3.9[203866]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:41:25 np0005548789.localdomain systemd-rc-local-generator[203893]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:25 np0005548789.localdomain systemd-sysv-generator[203899]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: Listening on libvirt locking daemon socket.
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: Starting libvirt QEMU daemon socket...
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 06 09:41:25 np0005548789.localdomain systemd[1]: Started libvirt QEMU daemon.
Dec 06 09:41:25 np0005548789.localdomain sudo[203864]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27771 DF PROTO=TCP SPT=56404 DPT=9882 SEQ=532615157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D619EF0000000001030307) 
Dec 06 09:41:25 np0005548789.localdomain setroubleshoot[203725]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l c13484d1-2fe1-4721-ae6c-59ffaec2470f
Dec 06 09:41:25 np0005548789.localdomain setroubleshoot[203725]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                                 
                                                                 *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                                 
                                                                 If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                                 Then turn on full auditing to get path information about the offending file and generate the error again.
                                                                 Do
                                                                 
                                                                 Turn on full auditing
                                                                 # auditctl -w /etc/shadow -p w
                                                                 Try to recreate AVC. Then execute
                                                                 # ausearch -m avc -ts recent
                                                                 If you see PATH record check ownership/permissions on file, and fix it,
                                                                 otherwise report as a bugzilla.
                                                                 
                                                                 *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                                 
                                                                 If you believe that virtlogd should have the dac_read_search capability by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                                 # semodule -X 300 -i my-virtlogd.pp
                                                                 
Dec 06 09:41:25 np0005548789.localdomain sudo[204049]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhlovmhyvyyjclpzacmyuauhptjyeutm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014085.7639146-2956-148684744809850/AnsiballZ_systemd.py
Dec 06 09:41:25 np0005548789.localdomain sudo[204049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:26 np0005548789.localdomain python3.9[204051]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:41:26 np0005548789.localdomain systemd-rc-local-generator[204085]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:26 np0005548789.localdomain systemd-sysv-generator[204090]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: Starting libvirt secret daemon socket...
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: Listening on libvirt secret daemon socket.
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: Starting libvirt secret daemon admin socket...
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 06 09:41:26 np0005548789.localdomain systemd[1]: Started libvirt secret daemon.
Dec 06 09:41:26 np0005548789.localdomain sudo[204049]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:27 np0005548789.localdomain sudo[204232]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wikjwmfrfrgqjlmvgcderlnpsdgqtjxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014087.2307148-3066-241476312494000/AnsiballZ_file.py
Dec 06 09:41:27 np0005548789.localdomain sudo[204232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:27 np0005548789.localdomain python3.9[204234]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:27 np0005548789.localdomain sudo[204232]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:28 np0005548789.localdomain sudo[204342]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itoubqnpyzcrubqignccyeircgjqhlni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014087.926875-3090-280471728754153/AnsiballZ_find.py
Dec 06 09:41:28 np0005548789.localdomain sudo[204342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:28 np0005548789.localdomain python3.9[204344]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:41:28 np0005548789.localdomain sudo[204342]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:28 np0005548789.localdomain sudo[204452]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlwbpmrafcevhidzpmthvpquyygejjmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014088.6399298-3114-107477009257644/AnsiballZ_command.py
Dec 06 09:41:28 np0005548789.localdomain sudo[204452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:29 np0005548789.localdomain python3.9[204454]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                                            echo ceph
                                                            awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:41:29 np0005548789.localdomain sudo[204452]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16012 DF PROTO=TCP SPT=48766 DPT=9101 SEQ=2535142758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D629EF0000000001030307) 
Dec 06 09:41:29 np0005548789.localdomain python3.9[204566]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:41:30 np0005548789.localdomain python3.9[204674]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:31 np0005548789.localdomain python3.9[204760]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014090.3217163-3171-164138750994441/.source.xml follow=False _original_basename=secret.xml.j2 checksum=9621e6cf70c8e0de93f1c73ff2a387c8c3ac4910 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50794 DF PROTO=TCP SPT=56146 DPT=9105 SEQ=2037736614 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D631EF0000000001030307) 
Dec 06 09:41:32 np0005548789.localdomain sudo[204868]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egjhjsoheujackvmxviexqllqpztghih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014091.416654-3216-119529982417815/AnsiballZ_command.py
Dec 06 09:41:32 np0005548789.localdomain sudo[204868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:32 np0005548789.localdomain python3.9[204870]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 1939e851-b10c-5c3b-9bb7-8e7f380233e8
                                                            virsh secret-define --file /tmp/secret.xml
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:41:32 np0005548789.localdomain polkitd[1032]: Registered Authentication Agent for unix-process:204872:1051090 (system bus name :1.2848 [pkttyagent --process 204872 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Dec 06 09:41:32 np0005548789.localdomain polkitd[1032]: Unregistered Authentication Agent for unix-process:204872:1051090 (system bus name :1.2848, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Dec 06 09:41:32 np0005548789.localdomain polkitd[1032]: Registered Authentication Agent for unix-process:204871:1051089 (system bus name :1.2849 [pkttyagent --process 204871 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Dec 06 09:41:32 np0005548789.localdomain polkitd[1032]: Unregistered Authentication Agent for unix-process:204871:1051089 (system bus name :1.2849, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Dec 06 09:41:32 np0005548789.localdomain sudo[204868]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:33 np0005548789.localdomain python3.9[204990]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:34 np0005548789.localdomain sudo[205098]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrfukdtzkfdvanriotllxwswgqqclpts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014093.7495618-3264-176165877796793/AnsiballZ_command.py
Dec 06 09:41:34 np0005548789.localdomain sudo[205098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:34 np0005548789.localdomain sshd[205101]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:41:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1415 DF PROTO=TCP SPT=37706 DPT=9102 SEQ=949871166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D63D2F0000000001030307) 
Dec 06 09:41:34 np0005548789.localdomain sudo[205098]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:35 np0005548789.localdomain sudo[205211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwnvmmwbmbcvsznnikubrqhmzbummpfu ; FSID=1939e851-b10c-5c3b-9bb7-8e7f380233e8 KEY=AQC14jNpAAAAABAAVDrRWQiDxWIwal0FbWGWhA== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014094.9931629-3288-5493361449416/AnsiballZ_command.py
Dec 06 09:41:35 np0005548789.localdomain sudo[205211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:35 np0005548789.localdomain polkitd[1032]: Registered Authentication Agent for unix-process:205214:1051378 (system bus name :1.2852 [pkttyagent --process 205214 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Dec 06 09:41:35 np0005548789.localdomain polkitd[1032]: Unregistered Authentication Agent for unix-process:205214:1051378 (system bus name :1.2852, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Dec 06 09:41:35 np0005548789.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully.
Dec 06 09:41:35 np0005548789.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 06 09:41:36 np0005548789.localdomain sudo[205211]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:36 np0005548789.localdomain sshd[205101]: Connection reset by authenticating user root 45.140.17.124 port 32172 [preauth]
Dec 06 09:41:37 np0005548789.localdomain sudo[205329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tffoktgjfyddxolhplmclfkbjxvuhypz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014096.8063555-3312-138669946373333/AnsiballZ_copy.py
Dec 06 09:41:37 np0005548789.localdomain sudo[205329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:37 np0005548789.localdomain sshd[205332]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:41:37 np0005548789.localdomain python3.9[205331]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:37 np0005548789.localdomain sudo[205329]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:37 np0005548789.localdomain sudo[205441]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgtgqmiyumagjzsamextyaiofxwtbemv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014097.4925287-3337-25413590663113/AnsiballZ_stat.py
Dec 06 09:41:37 np0005548789.localdomain sudo[205441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:37 np0005548789.localdomain python3.9[205443]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=184 DF PROTO=TCP SPT=45314 DPT=9102 SEQ=2710989866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D649F00000000001030307) 
Dec 06 09:41:37 np0005548789.localdomain sudo[205441]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:38 np0005548789.localdomain sudo[205529]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzfsoxwgfwuyeipxdrmmkjxttwagdmls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014097.4925287-3337-25413590663113/AnsiballZ_copy.py
Dec 06 09:41:38 np0005548789.localdomain sudo[205529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:38 np0005548789.localdomain python3.9[205531]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014097.4925287-3337-25413590663113/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:38 np0005548789.localdomain sudo[205529]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:38 np0005548789.localdomain sshd[205332]: Connection reset by authenticating user root 45.140.17.124 port 32192 [preauth]
Dec 06 09:41:39 np0005548789.localdomain sshd[205620]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:41:39 np0005548789.localdomain sshd[205642]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:41:39 np0005548789.localdomain sudo[205641]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxaqvygbaxyjbvlwupjyvpyrhwmvagft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014098.8640893-3384-78823466301623/AnsiballZ_file.py
Dec 06 09:41:39 np0005548789.localdomain sudo[205641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:39 np0005548789.localdomain python3.9[205644]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:39 np0005548789.localdomain sudo[205641]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:39 np0005548789.localdomain sudo[205753]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdrxsedpdledpmmarzpruqheclsdkejh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014099.5677617-3408-174079458033870/AnsiballZ_stat.py
Dec 06 09:41:39 np0005548789.localdomain sudo[205753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:40 np0005548789.localdomain python3.9[205755]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:40 np0005548789.localdomain sudo[205753]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:40 np0005548789.localdomain sudo[205810]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxnszqhzkvkovpjedldesmhiisjnbbpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014099.5677617-3408-174079458033870/AnsiballZ_file.py
Dec 06 09:41:40 np0005548789.localdomain sudo[205810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:40 np0005548789.localdomain python3.9[205812]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:40 np0005548789.localdomain sudo[205810]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1417 DF PROTO=TCP SPT=37706 DPT=9102 SEQ=949871166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D654EF0000000001030307) 
Dec 06 09:41:40 np0005548789.localdomain sshd[205642]: Connection reset by authenticating user root 45.140.17.124 port 32208 [preauth]
Dec 06 09:41:41 np0005548789.localdomain sshd[205888]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:41:41 np0005548789.localdomain sudo[205921]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyttmwpypntxvopoxqgvaddmkixattog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014100.8168073-3444-94955469030261/AnsiballZ_stat.py
Dec 06 09:41:41 np0005548789.localdomain sudo[205921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:41 np0005548789.localdomain python3.9[205923]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:41 np0005548789.localdomain sudo[205921]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:41 np0005548789.localdomain sudo[205979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgzubdfwclqqrmrahcnvjkwrjqoppinb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014100.8168073-3444-94955469030261/AnsiballZ_file.py
Dec 06 09:41:41 np0005548789.localdomain sudo[205979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:41 np0005548789.localdomain python3.9[205981]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.if4akpn7 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:41 np0005548789.localdomain sudo[205979]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:42 np0005548789.localdomain sudo[206089]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-keztgyzncvqoihefchefhkyzrmhiqrqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014101.984606-3481-65210389219257/AnsiballZ_stat.py
Dec 06 09:41:42 np0005548789.localdomain sudo[206089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:42 np0005548789.localdomain python3.9[206091]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:42 np0005548789.localdomain sudo[206089]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:42 np0005548789.localdomain sudo[206146]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-criydczmbonqwvygysevwyjxhrufcnxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014101.984606-3481-65210389219257/AnsiballZ_file.py
Dec 06 09:41:42 np0005548789.localdomain sudo[206146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:42 np0005548789.localdomain python3.9[206148]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:42 np0005548789.localdomain sudo[206146]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:42 np0005548789.localdomain sshd[205620]: Received disconnect from 179.33.210.213 port 35712:11: Bye Bye [preauth]
Dec 06 09:41:42 np0005548789.localdomain sshd[205620]: Disconnected from authenticating user root 179.33.210.213 port 35712 [preauth]
Dec 06 09:41:43 np0005548789.localdomain sshd[205888]: Connection reset by authenticating user root 45.140.17.124 port 32234 [preauth]
Dec 06 09:41:43 np0005548789.localdomain sudo[206256]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alycxjspklkjjaajueuxgnnrzurqysjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014103.1616817-3519-187723219512882/AnsiballZ_command.py
Dec 06 09:41:43 np0005548789.localdomain sudo[206256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:43 np0005548789.localdomain sshd[206259]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:41:43 np0005548789.localdomain python3.9[206258]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:41:43 np0005548789.localdomain sudo[206256]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:44 np0005548789.localdomain sudo[206369]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nonjzoevkbjebfwzhpqdrgnrbibkbqhj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014103.930196-3544-44821642736280/AnsiballZ_edpm_nftables_from_files.py
Dec 06 09:41:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33004 DF PROTO=TCP SPT=49220 DPT=9101 SEQ=1657222880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D662A00000000001030307) 
Dec 06 09:41:44 np0005548789.localdomain sudo[206369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:44 np0005548789.localdomain python3[206371]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 06 09:41:44 np0005548789.localdomain sudo[206369]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:45 np0005548789.localdomain sudo[206479]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkcstqquvegaenhdmgwudddqywrxeuvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014104.751204-3567-178382467292606/AnsiballZ_stat.py
Dec 06 09:41:45 np0005548789.localdomain sudo[206479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:45 np0005548789.localdomain python3.9[206481]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:45 np0005548789.localdomain sudo[206479]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:45 np0005548789.localdomain sshd[206259]: Connection reset by authenticating user root 45.140.17.124 port 32950 [preauth]
Dec 06 09:41:45 np0005548789.localdomain sudo[206536]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lchmnjpqbstchmzcqebtkjggqgcaovkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014104.751204-3567-178382467292606/AnsiballZ_file.py
Dec 06 09:41:45 np0005548789.localdomain sudo[206536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:45 np0005548789.localdomain python3.9[206538]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:45 np0005548789.localdomain sudo[206536]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:46 np0005548789.localdomain sshd[206556]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:41:46 np0005548789.localdomain sudo[206648]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggxrngvsufuxdqkibmamfncthwqlvwiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014106.7215326-3603-37268128732506/AnsiballZ_stat.py
Dec 06 09:41:46 np0005548789.localdomain sudo[206648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:47 np0005548789.localdomain python3.9[206650]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:47 np0005548789.localdomain sudo[206648]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:41:47.271 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:41:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:41:47.272 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:41:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:41:47.273 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:41:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33006 DF PROTO=TCP SPT=49220 DPT=9101 SEQ=1657222880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D66EAF0000000001030307) 
Dec 06 09:41:47 np0005548789.localdomain sudo[206705]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eznqzbcroqqhsestnlagtmgqxlzotpwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014106.7215326-3603-37268128732506/AnsiballZ_file.py
Dec 06 09:41:47 np0005548789.localdomain sudo[206705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:47 np0005548789.localdomain python3.9[206707]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:47 np0005548789.localdomain sudo[206705]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:48 np0005548789.localdomain sshd[206556]: Received disconnect from 118.193.38.207 port 48652:11: Bye Bye [preauth]
Dec 06 09:41:48 np0005548789.localdomain sshd[206556]: Disconnected from authenticating user root 118.193.38.207 port 48652 [preauth]
Dec 06 09:41:48 np0005548789.localdomain sudo[206815]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-matwhgcpxdwiemflmufhkyebjrzyzlcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014107.8943233-3639-217374823098852/AnsiballZ_stat.py
Dec 06 09:41:48 np0005548789.localdomain sudo[206815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:48 np0005548789.localdomain python3.9[206817]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:48 np0005548789.localdomain sudo[206815]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:49 np0005548789.localdomain sudo[206872]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmkqemfspkgyndnctkvsxuuukjdbqpco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014107.8943233-3639-217374823098852/AnsiballZ_file.py
Dec 06 09:41:49 np0005548789.localdomain sudo[206872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:49 np0005548789.localdomain python3.9[206874]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:49 np0005548789.localdomain sudo[206872]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44721 DF PROTO=TCP SPT=35312 DPT=9105 SEQ=125070986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D677AF0000000001030307) 
Dec 06 09:41:49 np0005548789.localdomain sudo[206982]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdennclmfxmhnvocaepkegxwlxyhnssh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014109.506715-3675-16964904208067/AnsiballZ_stat.py
Dec 06 09:41:49 np0005548789.localdomain sudo[206982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:50 np0005548789.localdomain python3.9[206984]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:50 np0005548789.localdomain sudo[206982]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:50 np0005548789.localdomain sudo[207039]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osuxfbgesfpptapmmtvhijhktfixudiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014109.506715-3675-16964904208067/AnsiballZ_file.py
Dec 06 09:41:50 np0005548789.localdomain sudo[207039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:50 np0005548789.localdomain python3.9[207041]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:50 np0005548789.localdomain sudo[207039]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:51 np0005548789.localdomain sudo[207149]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixbuizaiafnkgeaghrzgwdgpnwwleccz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014110.8487735-3712-137168062748546/AnsiballZ_stat.py
Dec 06 09:41:51 np0005548789.localdomain sudo[207149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:41:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:41:51 np0005548789.localdomain systemd[1]: tmp-crun.5x0S7M.mount: Deactivated successfully.
Dec 06 09:41:51 np0005548789.localdomain podman[207152]: 2025-12-06 09:41:51.340245432 +0000 UTC m=+0.087269784 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Dec 06 09:41:51 np0005548789.localdomain podman[207153]: 2025-12-06 09:41:51.394023727 +0000 UTC m=+0.139285565 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 09:41:51 np0005548789.localdomain systemd[1]: tmp-crun.plWUsp.mount: Deactivated successfully.
Dec 06 09:41:51 np0005548789.localdomain podman[207152]: 2025-12-06 09:41:51.407188543 +0000 UTC m=+0.154212855 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:41:51 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:41:51 np0005548789.localdomain podman[207153]: 2025-12-06 09:41:51.427126592 +0000 UTC m=+0.172388440 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec 06 09:41:51 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:41:51 np0005548789.localdomain python3.9[207151]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:51 np0005548789.localdomain sudo[207149]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:51 np0005548789.localdomain sudo[207283]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygzvboaymjmbacbwximjbiplsusbpoau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014110.8487735-3712-137168062748546/AnsiballZ_copy.py
Dec 06 09:41:51 np0005548789.localdomain sudo[207283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:52 np0005548789.localdomain python3.9[207285]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014110.8487735-3712-137168062748546/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:52 np0005548789.localdomain sudo[207283]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:52 np0005548789.localdomain sudo[207393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcvmkjeriegiptnmfvsbnxvyleyrntyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014112.2596145-3756-110083302728710/AnsiballZ_file.py
Dec 06 09:41:52 np0005548789.localdomain sudo[207393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:52 np0005548789.localdomain python3.9[207395]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:52 np0005548789.localdomain sudo[207393]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:53 np0005548789.localdomain sudo[207503]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzkkvwxnvmylikegeukvrohwgoaohmmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014112.9229574-3780-112039484683863/AnsiballZ_command.py
Dec 06 09:41:53 np0005548789.localdomain sudo[207503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:53 np0005548789.localdomain python3.9[207505]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:41:53 np0005548789.localdomain sudo[207503]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44722 DF PROTO=TCP SPT=35312 DPT=9105 SEQ=125070986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D687700000000001030307) 
Dec 06 09:41:54 np0005548789.localdomain sudo[207616]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yuyghnbcfuvsxwhruzigzenncalxnmtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014113.6369207-3804-10375217254998/AnsiballZ_blockinfile.py
Dec 06 09:41:54 np0005548789.localdomain sudo[207616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:54 np0005548789.localdomain sshd[207619]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:41:54 np0005548789.localdomain python3.9[207618]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:54 np0005548789.localdomain sudo[207616]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:54 np0005548789.localdomain sudo[207692]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:41:54 np0005548789.localdomain sudo[207692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:41:54 np0005548789.localdomain sudo[207692]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:54 np0005548789.localdomain sudo[207727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:41:55 np0005548789.localdomain sudo[207763]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpcyfgvkzleiibamaryhsrzajqncheoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014114.7101367-3831-179591529107876/AnsiballZ_command.py
Dec 06 09:41:55 np0005548789.localdomain sudo[207727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:41:55 np0005548789.localdomain sudo[207763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:55 np0005548789.localdomain python3.9[207766]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:41:55 np0005548789.localdomain sudo[207763]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:55 np0005548789.localdomain sudo[207727]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:55 np0005548789.localdomain sudo[207908]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hoilddmjlwydcqydhbppobhwmhndigxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014115.446724-3855-212918803832236/AnsiballZ_stat.py
Dec 06 09:41:55 np0005548789.localdomain sudo[207908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26940 DF PROTO=TCP SPT=42488 DPT=9882 SEQ=458478708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D68FEF0000000001030307) 
Dec 06 09:41:56 np0005548789.localdomain sshd[207619]: Received disconnect from 103.157.25.60 port 54682:11: Bye Bye [preauth]
Dec 06 09:41:56 np0005548789.localdomain sshd[207619]: Disconnected from authenticating user root 103.157.25.60 port 54682 [preauth]
Dec 06 09:41:56 np0005548789.localdomain python3.9[207910]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:41:56 np0005548789.localdomain sudo[207908]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:56 np0005548789.localdomain sudo[208020]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcqmcoqpkxbsyqnpnqoadlasvtirsyhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014116.3459153-3880-97342121837996/AnsiballZ_command.py
Dec 06 09:41:56 np0005548789.localdomain sudo[208020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:56 np0005548789.localdomain python3.9[208022]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:41:56 np0005548789.localdomain sudo[208020]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:57 np0005548789.localdomain sudo[208133]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czdaywrynwffoegpaqidzhxyvyotlbhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014117.0982423-3904-10992810022405/AnsiballZ_file.py
Dec 06 09:41:57 np0005548789.localdomain sudo[208133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:57 np0005548789.localdomain python3.9[208135]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:57 np0005548789.localdomain sudo[208133]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:58 np0005548789.localdomain sudo[208243]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlavcfdnuewyanazbrttazorqupfldqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014117.8059533-3927-110576517304984/AnsiballZ_stat.py
Dec 06 09:41:58 np0005548789.localdomain sudo[208243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:58 np0005548789.localdomain python3.9[208245]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:58 np0005548789.localdomain sudo[208243]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:58 np0005548789.localdomain sudo[208246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:41:58 np0005548789.localdomain sudo[208246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:41:58 np0005548789.localdomain sudo[208246]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:58 np0005548789.localdomain sudo[208349]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijwbyloiwtojzkwhspkppgtlbxdtfcdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014117.8059533-3927-110576517304984/AnsiballZ_copy.py
Dec 06 09:41:58 np0005548789.localdomain sudo[208349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:58 np0005548789.localdomain python3.9[208351]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014117.8059533-3927-110576517304984/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:41:58 np0005548789.localdomain sudo[208349]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:59 np0005548789.localdomain sudo[208459]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtphfvqpxtwjzgbbukgqqkzxkrwutyvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014119.0941432-3972-268370244555265/AnsiballZ_stat.py
Dec 06 09:41:59 np0005548789.localdomain sudo[208459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:41:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33008 DF PROTO=TCP SPT=49220 DPT=9101 SEQ=1657222880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D69DEF0000000001030307) 
Dec 06 09:41:59 np0005548789.localdomain python3.9[208461]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:41:59 np0005548789.localdomain sudo[208459]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:59 np0005548789.localdomain sudo[208547]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhvxcbnaksjaqgwmivyiegemqkldvufu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014119.0941432-3972-268370244555265/AnsiballZ_copy.py
Dec 06 09:41:59 np0005548789.localdomain sudo[208547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:00 np0005548789.localdomain python3.9[208549]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014119.0941432-3972-268370244555265/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:00 np0005548789.localdomain sudo[208547]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:00 np0005548789.localdomain sudo[208657]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymdznanqyfuzbqpisaapbitovvtbomhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014120.3321948-4017-219940237060375/AnsiballZ_stat.py
Dec 06 09:42:00 np0005548789.localdomain sudo[208657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:00 np0005548789.localdomain python3.9[208659]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:42:00 np0005548789.localdomain sudo[208657]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:01 np0005548789.localdomain sudo[208745]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prerkdynmbcdbqdyxiwcnrdkojkqtyji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014120.3321948-4017-219940237060375/AnsiballZ_copy.py
Dec 06 09:42:01 np0005548789.localdomain sudo[208745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:01 np0005548789.localdomain python3.9[208747]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014120.3321948-4017-219940237060375/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:01 np0005548789.localdomain sudo[208745]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:01 np0005548789.localdomain sudo[208855]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nskozuomazytlbvdkqysbxcelujfnbry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014121.6946347-4062-111495892961479/AnsiballZ_systemd.py
Dec 06 09:42:01 np0005548789.localdomain sudo[208855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44723 DF PROTO=TCP SPT=35312 DPT=9105 SEQ=125070986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D6A7EF0000000001030307) 
Dec 06 09:42:02 np0005548789.localdomain python3.9[208857]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:42:02 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:42:02 np0005548789.localdomain systemd-sysv-generator[208882]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:42:02 np0005548789.localdomain systemd-rc-local-generator[208878]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:42:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:42:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:02 np0005548789.localdomain systemd[1]: Reached target edpm_libvirt.target.
Dec 06 09:42:02 np0005548789.localdomain sudo[208855]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:03 np0005548789.localdomain sudo[209005]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gaszyaajjvypefrmvpteefyruvjbmvhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014122.9333167-4086-53939546820146/AnsiballZ_systemd.py
Dec 06 09:42:03 np0005548789.localdomain sudo[209005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:03 np0005548789.localdomain python3.9[209007]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 09:42:03 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:42:03 np0005548789.localdomain systemd-rc-local-generator[209032]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:42:03 np0005548789.localdomain systemd-sysv-generator[209036]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:42:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:42:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:03 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:42:03 np0005548789.localdomain systemd-sysv-generator[209070]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:42:04 np0005548789.localdomain systemd-rc-local-generator[209067]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:42:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:42:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:04 np0005548789.localdomain sudo[209005]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37729 DF PROTO=TCP SPT=48238 DPT=9102 SEQ=4120414039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D6B26F0000000001030307) 
Dec 06 09:42:04 np0005548789.localdomain sshd[160725]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:42:04 np0005548789.localdomain systemd[1]: session-53.scope: Deactivated successfully.
Dec 06 09:42:04 np0005548789.localdomain systemd[1]: session-53.scope: Consumed 3min 34.839s CPU time.
Dec 06 09:42:04 np0005548789.localdomain systemd-logind[766]: Session 53 logged out. Waiting for processes to exit.
Dec 06 09:42:04 np0005548789.localdomain systemd-logind[766]: Removed session 53.
Dec 06 09:42:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56350 DF PROTO=TCP SPT=41422 DPT=9102 SEQ=3406258533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D6BDEF0000000001030307) 
Dec 06 09:42:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37731 DF PROTO=TCP SPT=48238 DPT=9102 SEQ=4120414039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D6CA2F0000000001030307) 
Dec 06 09:42:11 np0005548789.localdomain sshd[209098]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:42:11 np0005548789.localdomain sshd[209098]: Accepted publickey for zuul from 192.168.122.30 port 44540 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:42:11 np0005548789.localdomain systemd-logind[766]: New session 54 of user zuul.
Dec 06 09:42:11 np0005548789.localdomain systemd[1]: Started Session 54 of User zuul.
Dec 06 09:42:11 np0005548789.localdomain sshd[209098]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:42:12 np0005548789.localdomain python3.9[209209]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:42:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59301 DF PROTO=TCP SPT=42230 DPT=9101 SEQ=3380191360 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D6D7D20000000001030307) 
Dec 06 09:42:14 np0005548789.localdomain python3.9[209321]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:42:14 np0005548789.localdomain network[209338]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:42:14 np0005548789.localdomain network[209339]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:42:14 np0005548789.localdomain network[209340]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:42:15 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:42:16 np0005548789.localdomain sshd[209419]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:42:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59303 DF PROTO=TCP SPT=42230 DPT=9101 SEQ=3380191360 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D6E3EF0000000001030307) 
Dec 06 09:42:17 np0005548789.localdomain sshd[209419]: Received disconnect from 64.227.156.63 port 58176:11: Bye Bye [preauth]
Dec 06 09:42:17 np0005548789.localdomain sshd[209419]: Disconnected from authenticating user root 64.227.156.63 port 58176 [preauth]
Dec 06 09:42:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27424 DF PROTO=TCP SPT=60166 DPT=9105 SEQ=2454835699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D6ECEF0000000001030307) 
Dec 06 09:42:20 np0005548789.localdomain sudo[209572]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycuhwonaljzpkqopunzcshycjktzhkhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014139.8331869-102-69226328199753/AnsiballZ_setup.py
Dec 06 09:42:20 np0005548789.localdomain sudo[209572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:21 np0005548789.localdomain python3.9[209574]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:42:21 np0005548789.localdomain sudo[209572]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:21 np0005548789.localdomain sudo[209635]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owvdeorlfepqssjszqfqxftuiubulafn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014139.8331869-102-69226328199753/AnsiballZ_dnf.py
Dec 06 09:42:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:42:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:42:21 np0005548789.localdomain sudo[209635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:21 np0005548789.localdomain podman[209638]: 2025-12-06 09:42:21.912723313 +0000 UTC m=+0.063887911 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:42:21 np0005548789.localdomain podman[209638]: 2025-12-06 09:42:21.954803597 +0000 UTC m=+0.105968275 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 09:42:21 np0005548789.localdomain systemd[1]: tmp-crun.EMgViU.mount: Deactivated successfully.
Dec 06 09:42:21 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:42:21 np0005548789.localdomain podman[209637]: 2025-12-06 09:42:21.973228801 +0000 UTC m=+0.125149412 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_controller)
Dec 06 09:42:22 np0005548789.localdomain podman[209637]: 2025-12-06 09:42:21.999975495 +0000 UTC m=+0.151896096 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:42:22 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:42:22 np0005548789.localdomain python3.9[209649]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:42:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27425 DF PROTO=TCP SPT=60166 DPT=9105 SEQ=2454835699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D6FCAF0000000001030307) 
Dec 06 09:42:26 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60540 DF PROTO=TCP SPT=38562 DPT=9882 SEQ=4256135414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D7096F0000000001030307) 
Dec 06 09:42:29 np0005548789.localdomain sudo[209635]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59305 DF PROTO=TCP SPT=42230 DPT=9101 SEQ=3380191360 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D713F00000000001030307) 
Dec 06 09:42:30 np0005548789.localdomain sudo[209790]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtvigsicxdluntkwcfkqjtqjtknwutgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014149.862659-138-222501761518688/AnsiballZ_stat.py
Dec 06 09:42:30 np0005548789.localdomain sudo[209790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:30 np0005548789.localdomain python3.9[209792]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:42:30 np0005548789.localdomain sudo[209790]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:31 np0005548789.localdomain sudo[209902]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-diwiywiwwrtjuqxgklvcldknlnpmszcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014150.7057493-162-133357409751748/AnsiballZ_copy.py
Dec 06 09:42:31 np0005548789.localdomain sudo[209902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:31 np0005548789.localdomain python3.9[209904]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:31 np0005548789.localdomain sudo[209902]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:31 np0005548789.localdomain sudo[210012]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-koxekhfbxibimlulmlduopwpfoweubyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014151.545811-186-143890162207741/AnsiballZ_command.py
Dec 06 09:42:31 np0005548789.localdomain sudo[210012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:32 np0005548789.localdomain python3.9[210014]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:42:32 np0005548789.localdomain sudo[210012]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27426 DF PROTO=TCP SPT=60166 DPT=9105 SEQ=2454835699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D71DF00000000001030307) 
Dec 06 09:42:33 np0005548789.localdomain sudo[210123]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avngqzdyprmnicddvljkiwdivmmuddsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014152.4510577-210-265373012845206/AnsiballZ_command.py
Dec 06 09:42:33 np0005548789.localdomain sudo[210123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:33 np0005548789.localdomain python3.9[210125]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:42:33 np0005548789.localdomain sudo[210123]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:34 np0005548789.localdomain sudo[210234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlnexmjlgphimidxzsmheryxhbhbvkgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014153.8311353-234-136061188509483/AnsiballZ_command.py
Dec 06 09:42:34 np0005548789.localdomain sudo[210234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:34 np0005548789.localdomain python3.9[210236]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:42:34 np0005548789.localdomain sudo[210234]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26719 DF PROTO=TCP SPT=42314 DPT=9102 SEQ=1157572135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D727B00000000001030307) 
Dec 06 09:42:34 np0005548789.localdomain sudo[210345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubbwvbpcqfusksbumvvotpkvkcyuhxzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014154.6702523-261-256028324537129/AnsiballZ_stat.py
Dec 06 09:42:34 np0005548789.localdomain sudo[210345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:35 np0005548789.localdomain python3.9[210347]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:42:35 np0005548789.localdomain sudo[210345]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:36 np0005548789.localdomain sudo[210457]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmheexnfbndcicutvrybrbvzhszgwgat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014156.080088-294-39824215319033/AnsiballZ_lineinfile.py
Dec 06 09:42:36 np0005548789.localdomain sudo[210457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:36 np0005548789.localdomain sshd[210460]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:42:36 np0005548789.localdomain python3.9[210459]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:36 np0005548789.localdomain sudo[210457]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:37 np0005548789.localdomain sudo[210569]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqidnteqqgoevmlgfwytyktepofgnwzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014157.0972626-321-27448876360008/AnsiballZ_systemd_service.py
Dec 06 09:42:37 np0005548789.localdomain sudo[210569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1420 DF PROTO=TCP SPT=37706 DPT=9102 SEQ=949871166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D733EF0000000001030307) 
Dec 06 09:42:37 np0005548789.localdomain python3.9[210571]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:42:38 np0005548789.localdomain systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 06 09:42:38 np0005548789.localdomain sudo[210569]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:38 np0005548789.localdomain sshd[210460]: Received disconnect from 103.234.151.178 port 22078:11: Bye Bye [preauth]
Dec 06 09:42:38 np0005548789.localdomain sshd[210460]: Disconnected from authenticating user root 103.234.151.178 port 22078 [preauth]
Dec 06 09:42:38 np0005548789.localdomain sudo[210683]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-teyystcaicgqevnerrffuhomwxkeotyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014158.4497879-346-154026240373869/AnsiballZ_systemd_service.py
Dec 06 09:42:38 np0005548789.localdomain sudo[210683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:39 np0005548789.localdomain python3.9[210685]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:42:39 np0005548789.localdomain systemd-sysv-generator[210712]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:42:39 np0005548789.localdomain systemd-rc-local-generator[210709]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: Starting Open-iSCSI...
Dec 06 09:42:39 np0005548789.localdomain iscsid[210726]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi
Dec 06 09:42:39 np0005548789.localdomain iscsid[210726]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a sting with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier].
Dec 06 09:42:39 np0005548789.localdomain iscsid[210726]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6.
Dec 06 09:42:39 np0005548789.localdomain iscsid[210726]: If using hardware iscsi like qla4xxx this message can be ignored.
Dec 06 09:42:39 np0005548789.localdomain iscsid[210726]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi
Dec 06 09:42:39 np0005548789.localdomain iscsid[210726]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf
Dec 06 09:42:39 np0005548789.localdomain iscsid[210726]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: Started Open-iSCSI.
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec 06 09:42:39 np0005548789.localdomain systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec 06 09:42:39 np0005548789.localdomain sudo[210683]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26721 DF PROTO=TCP SPT=42314 DPT=9102 SEQ=1157572135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D73F6F0000000001030307) 
Dec 06 09:42:41 np0005548789.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 06 09:42:41 np0005548789.localdomain sudo[210836]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgnrpdqvpwhzfxdhimjzkzaccwvuzcxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014161.0427895-378-253042237336014/AnsiballZ_service_facts.py
Dec 06 09:42:41 np0005548789.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 06 09:42:41 np0005548789.localdomain sudo[210836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:41 np0005548789.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service.
Dec 06 09:42:41 np0005548789.localdomain python3.9[210838]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:42:41 np0005548789.localdomain network[210868]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:42:41 np0005548789.localdomain network[210869]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:42:41 np0005548789.localdomain network[210870]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:42:42 np0005548789.localdomain setroubleshoot[210761]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 43e61b33-f78c-4af8-8283-58c8652417d8
Dec 06 09:42:42 np0005548789.localdomain setroubleshoot[210761]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do allow this access for now by executing:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
Dec 06 09:42:42 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:42:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59714 DF PROTO=TCP SPT=51474 DPT=9101 SEQ=1479808194 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D74D030000000001030307) 
Dec 06 09:42:45 np0005548789.localdomain sudo[210836]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:46 np0005548789.localdomain sudo[211102]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvjifeyknbldtepmwbwyiizaqxpknoba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014166.14169-408-135077031579025/AnsiballZ_file.py
Dec 06 09:42:46 np0005548789.localdomain sudo[211102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:46 np0005548789.localdomain python3.9[211104]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 09:42:46 np0005548789.localdomain sudo[211102]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:42:47.272 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:42:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:42:47.273 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:42:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:42:47.275 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:42:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59716 DF PROTO=TCP SPT=51474 DPT=9101 SEQ=1479808194 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D758EF0000000001030307) 
Dec 06 09:42:47 np0005548789.localdomain sudo[211212]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qppwodmwfgtlyvndnqpfrezrncbfmeqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014167.0967386-432-70297973213314/AnsiballZ_modprobe.py
Dec 06 09:42:47 np0005548789.localdomain sudo[211212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:47 np0005548789.localdomain python3.9[211214]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 06 09:42:47 np0005548789.localdomain sudo[211212]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:48 np0005548789.localdomain sudo[211326]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmbyrjcntctbtdfocnpfpteyfgadgyhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014168.0053067-456-67675844310832/AnsiballZ_stat.py
Dec 06 09:42:48 np0005548789.localdomain sudo[211326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:48 np0005548789.localdomain python3.9[211328]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:42:48 np0005548789.localdomain sudo[211326]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:48 np0005548789.localdomain sudo[211414]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgpmoqsdjvabvncqqewgmrsyfalcwmij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014168.0053067-456-67675844310832/AnsiballZ_copy.py
Dec 06 09:42:48 np0005548789.localdomain sudo[211414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:49 np0005548789.localdomain python3.9[211416]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014168.0053067-456-67675844310832/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:49 np0005548789.localdomain sudo[211414]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61308 DF PROTO=TCP SPT=48238 DPT=9100 SEQ=4031396398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D761EF0000000001030307) 
Dec 06 09:42:49 np0005548789.localdomain sudo[211524]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owubtkdvbzbawtoxyleizqhczriiznls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014169.3986847-504-189177362361555/AnsiballZ_lineinfile.py
Dec 06 09:42:49 np0005548789.localdomain sudo[211524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:49 np0005548789.localdomain python3.9[211526]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:49 np0005548789.localdomain sudo[211524]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:50 np0005548789.localdomain sudo[211634]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcqtvcizqxpikyaywnkugrcumcpeofxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014170.071649-528-238337247725907/AnsiballZ_systemd.py
Dec 06 09:42:50 np0005548789.localdomain sudo[211634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:50 np0005548789.localdomain python3.9[211636]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:42:50 np0005548789.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 06 09:42:50 np0005548789.localdomain systemd[1]: Stopped Load Kernel Modules.
Dec 06 09:42:50 np0005548789.localdomain systemd[1]: Stopping Load Kernel Modules...
Dec 06 09:42:50 np0005548789.localdomain systemd[1]: Starting Load Kernel Modules...
Dec 06 09:42:51 np0005548789.localdomain systemd-modules-load[211640]: Module 'msr' is built in
Dec 06 09:42:51 np0005548789.localdomain systemd[1]: Finished Load Kernel Modules.
Dec 06 09:42:51 np0005548789.localdomain sudo[211634]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:51 np0005548789.localdomain sshd[211729]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:42:51 np0005548789.localdomain sudo[211750]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkltejvqrzlzfonezwzaaxfaqntdgrbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014171.2917347-552-276950076800916/AnsiballZ_file.py
Dec 06 09:42:51 np0005548789.localdomain sudo[211750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:51 np0005548789.localdomain python3.9[211752]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:42:51 np0005548789.localdomain sudo[211750]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:52 np0005548789.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Deactivated successfully.
Dec 06 09:42:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:42:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:42:52 np0005548789.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 06 09:42:52 np0005548789.localdomain podman[211770]: 2025-12-06 09:42:52.559699562 +0000 UTC m=+0.064591468 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:42:52 np0005548789.localdomain podman[211771]: 2025-12-06 09:42:52.614931993 +0000 UTC m=+0.119900182 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:42:52 np0005548789.localdomain podman[211770]: 2025-12-06 09:42:52.641731714 +0000 UTC m=+0.146623640 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 09:42:52 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:42:52 np0005548789.localdomain podman[211771]: 2025-12-06 09:42:52.695778928 +0000 UTC m=+0.200747117 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 09:42:52 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:42:53 np0005548789.localdomain sudo[211903]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ednxwfxfxtmbrdbjbkomcbrsnkjotgqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014172.791931-579-230724815898985/AnsiballZ_stat.py
Dec 06 09:42:53 np0005548789.localdomain sudo[211903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:53 np0005548789.localdomain python3.9[211905]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:42:53 np0005548789.localdomain sudo[211903]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41787 DF PROTO=TCP SPT=40548 DPT=9105 SEQ=3123358156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D771F00000000001030307) 
Dec 06 09:42:53 np0005548789.localdomain sudo[212013]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-veqhnsctmowwqmtkxqbkyqodyetzpwbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014173.5931084-606-73483838836619/AnsiballZ_stat.py
Dec 06 09:42:53 np0005548789.localdomain sudo[212013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:53 np0005548789.localdomain python3.9[212015]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:42:54 np0005548789.localdomain sshd[211729]: Received disconnect from 103.192.152.59 port 36868:11: Bye Bye [preauth]
Dec 06 09:42:54 np0005548789.localdomain sshd[211729]: Disconnected from authenticating user root 103.192.152.59 port 36868 [preauth]
Dec 06 09:42:54 np0005548789.localdomain sudo[212013]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:55 np0005548789.localdomain sudo[212123]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdwjkdlevzpgaaefhtylydkotxgvdzyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014174.881062-630-98071389199335/AnsiballZ_stat.py
Dec 06 09:42:55 np0005548789.localdomain sudo[212123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:55 np0005548789.localdomain python3.9[212125]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:42:55 np0005548789.localdomain sudo[212123]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:55 np0005548789.localdomain sudo[212211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhlpvpmxeaahoeuzbtipgalppwfklazy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014174.881062-630-98071389199335/AnsiballZ_copy.py
Dec 06 09:42:55 np0005548789.localdomain sudo[212211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60543 DF PROTO=TCP SPT=38562 DPT=9882 SEQ=4256135414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D779F00000000001030307) 
Dec 06 09:42:55 np0005548789.localdomain python3.9[212213]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014174.881062-630-98071389199335/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:55 np0005548789.localdomain sudo[212211]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:56 np0005548789.localdomain sshd[212231]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:42:56 np0005548789.localdomain sudo[212323]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpxfaluguonfnmzahmkrwzktddisjzrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014176.2047255-675-279268506513649/AnsiballZ_command.py
Dec 06 09:42:56 np0005548789.localdomain sudo[212323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:56 np0005548789.localdomain python3.9[212325]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:42:56 np0005548789.localdomain sudo[212323]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:57 np0005548789.localdomain sudo[212434]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqohltbnhzsgdrqeytsyrzznubbaumms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014176.8845603-699-252753976083036/AnsiballZ_lineinfile.py
Dec 06 09:42:57 np0005548789.localdomain sudo[212434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:57 np0005548789.localdomain python3.9[212436]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:57 np0005548789.localdomain sudo[212434]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:58 np0005548789.localdomain sudo[212544]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etgzltvhzcvcukjzztztffvibpofwpfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014177.7586353-723-103790776656409/AnsiballZ_replace.py
Dec 06 09:42:58 np0005548789.localdomain sudo[212544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:58 np0005548789.localdomain python3.9[212546]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:58 np0005548789.localdomain sudo[212544]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:58 np0005548789.localdomain sudo[212564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:42:58 np0005548789.localdomain sudo[212564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:42:58 np0005548789.localdomain sudo[212564]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:58 np0005548789.localdomain sudo[212582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 09:42:58 np0005548789.localdomain sudo[212582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:42:58 np0005548789.localdomain sudo[212690]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tyizobvqfnmjbricejiehiezqiibfgiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014178.6232457-747-86185914791369/AnsiballZ_replace.py
Dec 06 09:42:58 np0005548789.localdomain sudo[212690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:59 np0005548789.localdomain sudo[212582]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:59 np0005548789.localdomain python3.9[212692]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:59 np0005548789.localdomain sudo[212690]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:59 np0005548789.localdomain sudo[212730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:42:59 np0005548789.localdomain sudo[212730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:42:59 np0005548789.localdomain sudo[212730]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:59 np0005548789.localdomain sudo[212756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:42:59 np0005548789.localdomain sudo[212756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:42:59 np0005548789.localdomain sudo[212856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vunibaoaplcmhgumknypejdlgxaujrgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014179.3669894-774-55249019583223/AnsiballZ_lineinfile.py
Dec 06 09:42:59 np0005548789.localdomain sudo[212856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:42:59 np0005548789.localdomain python3.9[212858]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:42:59 np0005548789.localdomain sudo[212856]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59718 DF PROTO=TCP SPT=51474 DPT=9101 SEQ=1479808194 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D789F00000000001030307) 
Dec 06 09:42:59 np0005548789.localdomain sudo[212756]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:00 np0005548789.localdomain sudo[212999]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiawhaklkyziuupknbggfytectpbckdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014179.9747953-774-64573090295382/AnsiballZ_lineinfile.py
Dec 06 09:43:00 np0005548789.localdomain sudo[212999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:00 np0005548789.localdomain python3.9[213001]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:00 np0005548789.localdomain sudo[212999]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:00 np0005548789.localdomain sudo[213023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:43:00 np0005548789.localdomain sudo[213023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:43:00 np0005548789.localdomain sudo[213023]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:00 np0005548789.localdomain sudo[213127]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xinvoadtwcxfqombtmjhyfklmblgpses ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014180.5629218-774-250551545818086/AnsiballZ_lineinfile.py
Dec 06 09:43:00 np0005548789.localdomain sudo[213127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:01 np0005548789.localdomain python3.9[213129]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:01 np0005548789.localdomain sudo[213127]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:01 np0005548789.localdomain sudo[213237]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eydkyjnjjtzqajsvzjqsczyfxigtzhje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014181.2192168-774-162058789106010/AnsiballZ_lineinfile.py
Dec 06 09:43:01 np0005548789.localdomain sudo[213237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:01 np0005548789.localdomain python3.9[213239]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:01 np0005548789.localdomain sudo[213237]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:01 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41788 DF PROTO=TCP SPT=40548 DPT=9105 SEQ=3123358156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D791F00000000001030307) 
Dec 06 09:43:02 np0005548789.localdomain sudo[213347]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gylaqzrjmfswkuabapirvhfexyzrhuyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014181.9913647-861-235279603447568/AnsiballZ_stat.py
Dec 06 09:43:02 np0005548789.localdomain sudo[213347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:02 np0005548789.localdomain python3.9[213349]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:43:02 np0005548789.localdomain sudo[213347]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:02 np0005548789.localdomain sudo[213459]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mybehhjffspanjptlylplistzeburnqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014182.6657708-885-162556106091460/AnsiballZ_file.py
Dec 06 09:43:02 np0005548789.localdomain sudo[213459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:03 np0005548789.localdomain python3.9[213461]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:03 np0005548789.localdomain sudo[213459]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:03 np0005548789.localdomain sudo[213569]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbqyubxjatdojophqzdwfbfcshgrbspy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014183.5108728-912-117568742703537/AnsiballZ_file.py
Dec 06 09:43:03 np0005548789.localdomain sudo[213569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:04 np0005548789.localdomain python3.9[213571]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:43:04 np0005548789.localdomain sudo[213569]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:04 np0005548789.localdomain sshd[213572]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:43:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6895 DF PROTO=TCP SPT=56978 DPT=9102 SEQ=1819866283 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D79CF00000000001030307) 
Dec 06 09:43:05 np0005548789.localdomain sudo[213681]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klbuwsbqfnfnxxlgaafnomgzsvwelfuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014184.8591015-937-134952750372852/AnsiballZ_stat.py
Dec 06 09:43:05 np0005548789.localdomain sudo[213681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:05 np0005548789.localdomain python3.9[213683]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:05 np0005548789.localdomain sudo[213681]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:05 np0005548789.localdomain sshd[213572]: Received disconnect from 118.193.38.207 port 43000:11: Bye Bye [preauth]
Dec 06 09:43:05 np0005548789.localdomain sshd[213572]: Disconnected from authenticating user root 118.193.38.207 port 43000 [preauth]
Dec 06 09:43:05 np0005548789.localdomain sudo[213738]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhqimrhybjrbesylrnvisoqzqujpttrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014184.8591015-937-134952750372852/AnsiballZ_file.py
Dec 06 09:43:05 np0005548789.localdomain sudo[213738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:05 np0005548789.localdomain python3.9[213740]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:43:05 np0005548789.localdomain sudo[213738]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:06 np0005548789.localdomain sudo[213848]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sabbxmseqyiesmdthgxsqzwzmrmowlem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014185.9378755-937-135781271327637/AnsiballZ_stat.py
Dec 06 09:43:06 np0005548789.localdomain sudo[213848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:07 np0005548789.localdomain python3.9[213850]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:07 np0005548789.localdomain sudo[213848]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:07 np0005548789.localdomain sudo[213905]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srutzdezlmfctekxsxpkxcevydmcitqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014185.9378755-937-135781271327637/AnsiballZ_file.py
Dec 06 09:43:07 np0005548789.localdomain sudo[213905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:07 np0005548789.localdomain python3.9[213907]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:43:07 np0005548789.localdomain sudo[213905]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:08 np0005548789.localdomain sudo[214015]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emjtenneojkxjpxifgbhdqmygshnvote ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014187.8604224-1005-41628018912279/AnsiballZ_file.py
Dec 06 09:43:08 np0005548789.localdomain sudo[214015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:08 np0005548789.localdomain python3.9[214017]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:08 np0005548789.localdomain sudo[214015]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:08 np0005548789.localdomain sudo[214125]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbmhvhgmmlqojmzigryyeiewvglmahoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014188.5536616-1029-76363583416189/AnsiballZ_stat.py
Dec 06 09:43:08 np0005548789.localdomain sudo[214125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:09 np0005548789.localdomain python3.9[214127]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:09 np0005548789.localdomain sudo[214125]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:09 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35844 DF PROTO=TCP SPT=33376 DPT=9882 SEQ=1522697681 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D7ADEF0000000001030307) 
Dec 06 09:43:09 np0005548789.localdomain sudo[214182]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcwdusjytnlywivxdkdiogyiqixgtuzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014188.5536616-1029-76363583416189/AnsiballZ_file.py
Dec 06 09:43:09 np0005548789.localdomain sudo[214182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:09 np0005548789.localdomain python3.9[214184]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:09 np0005548789.localdomain sudo[214182]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:09 np0005548789.localdomain sudo[214292]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlhnevozpuzmmzkweaeuohfxsvtsyoek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014189.7370803-1065-44844592887056/AnsiballZ_stat.py
Dec 06 09:43:09 np0005548789.localdomain sudo[214292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:10 np0005548789.localdomain python3.9[214294]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:10 np0005548789.localdomain sudo[214292]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:10 np0005548789.localdomain sudo[214349]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzxmcrrayueavehgpxlzsxmfxrmaxwda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014189.7370803-1065-44844592887056/AnsiballZ_file.py
Dec 06 09:43:10 np0005548789.localdomain sudo[214349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:10 np0005548789.localdomain python3.9[214351]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:10 np0005548789.localdomain sudo[214349]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6897 DF PROTO=TCP SPT=56978 DPT=9102 SEQ=1819866283 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D7B4AF0000000001030307) 
Dec 06 09:43:11 np0005548789.localdomain sudo[214459]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ietmkxapyfunocyvvaklixsxhbhglpwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014190.9196477-1101-114623439752911/AnsiballZ_systemd.py
Dec 06 09:43:11 np0005548789.localdomain sudo[214459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:11 np0005548789.localdomain python3.9[214461]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:43:11 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:43:11 np0005548789.localdomain systemd-rc-local-generator[214484]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:43:11 np0005548789.localdomain systemd-sysv-generator[214488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:43:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:43:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:11 np0005548789.localdomain sudo[214459]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:12 np0005548789.localdomain sudo[214607]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxyyaxtsxhvshsvmpgouruoagbfdybtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014192.219786-1125-68922403419793/AnsiballZ_stat.py
Dec 06 09:43:12 np0005548789.localdomain sudo[214607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:12 np0005548789.localdomain python3.9[214609]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:12 np0005548789.localdomain sudo[214607]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:12 np0005548789.localdomain sudo[214664]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjcfhwqsjeyrfgkvpxmdrvxwqwlgfxdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014192.219786-1125-68922403419793/AnsiballZ_file.py
Dec 06 09:43:12 np0005548789.localdomain sudo[214664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:13 np0005548789.localdomain python3.9[214666]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:13 np0005548789.localdomain sudo[214664]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:13 np0005548789.localdomain sudo[214774]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzjjdrnhssidupgahwyfiaplosqeccof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014193.409094-1161-115632698460530/AnsiballZ_stat.py
Dec 06 09:43:13 np0005548789.localdomain sudo[214774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:13 np0005548789.localdomain python3.9[214776]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:13 np0005548789.localdomain sudo[214774]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:14 np0005548789.localdomain sudo[214831]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcqyyqqghfaxwcqabnsdvdknmblfqnie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014193.409094-1161-115632698460530/AnsiballZ_file.py
Dec 06 09:43:14 np0005548789.localdomain sudo[214831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:14 np0005548789.localdomain python3.9[214833]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60980 DF PROTO=TCP SPT=41308 DPT=9101 SEQ=302165480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D7C2310000000001030307) 
Dec 06 09:43:14 np0005548789.localdomain sudo[214831]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:15 np0005548789.localdomain sudo[214941]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oosksbvbcdqcvuomsmtzwejrnkykaghz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014194.617242-1197-155630824275315/AnsiballZ_systemd.py
Dec 06 09:43:15 np0005548789.localdomain sudo[214941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:15 np0005548789.localdomain python3.9[214943]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:43:15 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:43:15 np0005548789.localdomain systemd-sysv-generator[214975]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:43:15 np0005548789.localdomain systemd-rc-local-generator[214970]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:43:15 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:43:15 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:15 np0005548789.localdomain systemd[1]: Starting Create netns directory...
Dec 06 09:43:15 np0005548789.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:43:15 np0005548789.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:43:15 np0005548789.localdomain systemd[1]: Finished Create netns directory.
Dec 06 09:43:15 np0005548789.localdomain sudo[214941]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:16 np0005548789.localdomain sudo[215094]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lesuwwuqpozcnthcgpkexrkxipqrbevt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014196.1805222-1227-230816475647224/AnsiballZ_file.py
Dec 06 09:43:16 np0005548789.localdomain sudo[215094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:16 np0005548789.localdomain python3.9[215096]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:43:16 np0005548789.localdomain sudo[215094]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:17 np0005548789.localdomain sudo[215204]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwmczgvwnmpunngapkexorhtgujrmkyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014196.9030704-1251-139307617380406/AnsiballZ_stat.py
Dec 06 09:43:17 np0005548789.localdomain sudo[215204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60982 DF PROTO=TCP SPT=41308 DPT=9101 SEQ=302165480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D7CE2F0000000001030307) 
Dec 06 09:43:17 np0005548789.localdomain python3.9[215206]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:17 np0005548789.localdomain sudo[215204]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:17 np0005548789.localdomain sudo[215292]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbtorbzaqeyalgltlizjxjzxgvvtcxxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014196.9030704-1251-139307617380406/AnsiballZ_copy.py
Dec 06 09:43:17 np0005548789.localdomain sudo[215292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:17 np0005548789.localdomain python3.9[215294]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014196.9030704-1251-139307617380406/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:43:17 np0005548789.localdomain sudo[215292]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:18 np0005548789.localdomain sudo[215402]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-laucmqztmyucafpjinjhogksuuzswbdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014198.5178092-1302-47725686984781/AnsiballZ_file.py
Dec 06 09:43:18 np0005548789.localdomain sudo[215402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:19 np0005548789.localdomain python3.9[215404]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:43:19 np0005548789.localdomain sudo[215402]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:19 np0005548789.localdomain sudo[215512]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrvjjlvjcpehhltwuloqotbmvrxfgjvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014199.2339156-1326-242869659582514/AnsiballZ_stat.py
Dec 06 09:43:19 np0005548789.localdomain sudo[215512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:19 np0005548789.localdomain python3.9[215514]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:19 np0005548789.localdomain sudo[215512]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15154 DF PROTO=TCP SPT=49014 DPT=9105 SEQ=189848899 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D7D76F0000000001030307) 
Dec 06 09:43:19 np0005548789.localdomain sshd[215575]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:43:19 np0005548789.localdomain sudo[215602]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwafvaavzhuvvemhgyoujlrzwzqsqrwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014199.2339156-1326-242869659582514/AnsiballZ_copy.py
Dec 06 09:43:19 np0005548789.localdomain sudo[215602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:20 np0005548789.localdomain python3.9[215604]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014199.2339156-1326-242869659582514/.source.json _original_basename=.fffio8w1 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:20 np0005548789.localdomain sudo[215602]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:20 np0005548789.localdomain sudo[215712]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrfuavnukpmajzvnafnwhgyznxcpjzpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014200.5405931-1371-18980714001177/AnsiballZ_file.py
Dec 06 09:43:20 np0005548789.localdomain sudo[215712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:20 np0005548789.localdomain python3.9[215714]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:20 np0005548789.localdomain sudo[215712]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:21 np0005548789.localdomain sudo[215822]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcloidpjsqheffryybgaavxbnuybjmtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014201.2619185-1395-43830603080699/AnsiballZ_stat.py
Dec 06 09:43:21 np0005548789.localdomain sudo[215822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:21 np0005548789.localdomain sshd[215575]: Received disconnect from 103.157.25.60 port 56354:11: Bye Bye [preauth]
Dec 06 09:43:21 np0005548789.localdomain sshd[215575]: Disconnected from authenticating user root 103.157.25.60 port 56354 [preauth]
Dec 06 09:43:21 np0005548789.localdomain sudo[215822]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:22 np0005548789.localdomain sudo[215910]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwwfyvjtrvijqaeiaelzxfqddvbwlxfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014201.2619185-1395-43830603080699/AnsiballZ_copy.py
Dec 06 09:43:22 np0005548789.localdomain sudo[215910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:22 np0005548789.localdomain sudo[215910]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:43:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:43:22 np0005548789.localdomain podman[215968]: 2025-12-06 09:43:22.927079526 +0000 UTC m=+0.086384075 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Dec 06 09:43:22 np0005548789.localdomain podman[215969]: 2025-12-06 09:43:22.979209581 +0000 UTC m=+0.137117648 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Dec 06 09:43:22 np0005548789.localdomain podman[215968]: 2025-12-06 09:43:22.993315853 +0000 UTC m=+0.152620422 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 09:43:23 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:43:23 np0005548789.localdomain podman[215969]: 2025-12-06 09:43:23.009752246 +0000 UTC m=+0.167660323 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 09:43:23 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:43:23 np0005548789.localdomain sudo[216063]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsuenphxramlvwiwgfnfjzfonpehonbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014202.7572672-1446-107861621209291/AnsiballZ_container_config_data.py
Dec 06 09:43:23 np0005548789.localdomain sudo[216063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:23 np0005548789.localdomain python3.9[216065]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 06 09:43:23 np0005548789.localdomain sudo[216063]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:23 np0005548789.localdomain systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 06 09:43:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15155 DF PROTO=TCP SPT=49014 DPT=9105 SEQ=189848899 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D7E72F0000000001030307) 
Dec 06 09:43:24 np0005548789.localdomain sudo[216174]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-recwlwsyynmwzlicyarjyxymjyexrgsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014203.765832-1473-260408990936137/AnsiballZ_container_config_hash.py
Dec 06 09:43:24 np0005548789.localdomain sudo[216174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:24 np0005548789.localdomain python3.9[216176]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:43:24 np0005548789.localdomain sudo[216174]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:24 np0005548789.localdomain systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 06 09:43:25 np0005548789.localdomain sudo[216285]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdbvitlzovcmdssynvginrxhimksezjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014204.8309488-1500-278938441450266/AnsiballZ_podman_container_info.py
Dec 06 09:43:25 np0005548789.localdomain sudo[216285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:25 np0005548789.localdomain python3.9[216287]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:43:25 np0005548789.localdomain sudo[216285]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:27 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52920 DF PROTO=TCP SPT=56466 DPT=9882 SEQ=1807279947 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D7F3EF0000000001030307) 
Dec 06 09:43:29 np0005548789.localdomain sudo[216422]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-docuwzmnhtbyqinyhynfihjmhqpyteiv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014209.116937-1539-240764698132605/AnsiballZ_edpm_container_manage.py
Dec 06 09:43:29 np0005548789.localdomain sudo[216422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60984 DF PROTO=TCP SPT=41308 DPT=9101 SEQ=302165480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D7FDEF0000000001030307) 
Dec 06 09:43:29 np0005548789.localdomain python3[216424]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:43:31 np0005548789.localdomain podman[216438]: 2025-12-06 09:43:29.957711791 +0000 UTC m=+0.031573557 image pull  quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 06 09:43:31 np0005548789.localdomain podman[216485]: 
Dec 06 09:43:31 np0005548789.localdomain podman[216485]: 2025-12-06 09:43:31.875802779 +0000 UTC m=+0.089606674 container create 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:43:31 np0005548789.localdomain podman[216485]: 2025-12-06 09:43:31.831977907 +0000 UTC m=+0.045781842 image pull  quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 06 09:43:31 np0005548789.localdomain python3[216424]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 06 09:43:32 np0005548789.localdomain sudo[216422]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15156 DF PROTO=TCP SPT=49014 DPT=9105 SEQ=189848899 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D807EF0000000001030307) 
Dec 06 09:43:33 np0005548789.localdomain sudo[216630]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfjzgprpavsxldxivenikxnbuteqvlgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014212.907073-1563-194209724904525/AnsiballZ_stat.py
Dec 06 09:43:33 np0005548789.localdomain sudo[216630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:33 np0005548789.localdomain python3.9[216632]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:43:33 np0005548789.localdomain sudo[216630]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:34 np0005548789.localdomain sudo[216742]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glkzjpmulsuwonjwepcucmigsuoosong ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014213.8952959-1590-134665983715723/AnsiballZ_file.py
Dec 06 09:43:34 np0005548789.localdomain sudo[216742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:34 np0005548789.localdomain python3.9[216744]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:34 np0005548789.localdomain sudo[216742]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45753 DF PROTO=TCP SPT=55426 DPT=9102 SEQ=3589475987 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D811EF0000000001030307) 
Dec 06 09:43:35 np0005548789.localdomain sudo[216797]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgmeilaisvygjpaqzolwigtmmhhopuoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014213.8952959-1590-134665983715723/AnsiballZ_stat.py
Dec 06 09:43:35 np0005548789.localdomain sudo[216797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:35 np0005548789.localdomain python3.9[216799]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:43:35 np0005548789.localdomain sudo[216797]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:35 np0005548789.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 06 09:43:36 np0005548789.localdomain sudo[216907]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewaxqpharrzjmkuncmhosvaxpfczhzdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014215.5963624-1590-86490635546436/AnsiballZ_copy.py
Dec 06 09:43:36 np0005548789.localdomain sudo[216907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:36 np0005548789.localdomain python3.9[216909]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014215.5963624-1590-86490635546436/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:36 np0005548789.localdomain sudo[216907]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:36 np0005548789.localdomain sudo[216962]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnhayukqsehmvckwhuxavvxettnogaea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014215.5963624-1590-86490635546436/AnsiballZ_systemd.py
Dec 06 09:43:36 np0005548789.localdomain sudo[216962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:36 np0005548789.localdomain python3.9[216964]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:43:36 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:43:36 np0005548789.localdomain systemd-sysv-generator[216991]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:43:36 np0005548789.localdomain systemd-rc-local-generator[216987]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:43:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:43:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:37 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:37 np0005548789.localdomain sudo[216962]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:37 np0005548789.localdomain sudo[217053]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyrnuhxmisehscdpzapyftucyseawljx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014215.5963624-1590-86490635546436/AnsiballZ_systemd.py
Dec 06 09:43:37 np0005548789.localdomain sudo[217053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26724 DF PROTO=TCP SPT=42314 DPT=9102 SEQ=1157572135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D81DEF0000000001030307) 
Dec 06 09:43:37 np0005548789.localdomain python3.9[217055]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:43:38 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:43:38 np0005548789.localdomain systemd-rc-local-generator[217080]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:43:38 np0005548789.localdomain systemd-sysv-generator[217087]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: Starting multipathd container...
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:43:39 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/998945b2ed3c4b11e3b9ed62edc3a70e401b7792f1cfdc1d9e9e385864b9cbe5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 06 09:43:39 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/998945b2ed3c4b11e3b9ed62edc3a70e401b7792f1cfdc1d9e9e385864b9cbe5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:43:39 np0005548789.localdomain podman[217096]: 2025-12-06 09:43:39.372265836 +0000 UTC m=+0.157640597 container init 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: + sudo -E kolla_set_configs
Dec 06 09:43:39 np0005548789.localdomain sudo[217117]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 09:43:39 np0005548789.localdomain sudo[217117]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:43:39 np0005548789.localdomain sudo[217117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:43:39 np0005548789.localdomain podman[217096]: 2025-12-06 09:43:39.420567354 +0000 UTC m=+0.205942085 container start 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 09:43:39 np0005548789.localdomain podman[217096]: multipathd
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: Started multipathd container.
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:43:39 np0005548789.localdomain sudo[217053]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: INFO:__main__:Validating config file
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: INFO:__main__:Writing out command to execute
Dec 06 09:43:39 np0005548789.localdomain sudo[217117]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: ++ cat /run_command
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: + CMD='/usr/sbin/multipathd -d'
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: + ARGS=
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: + sudo kolla_copy_cacerts
Dec 06 09:43:39 np0005548789.localdomain sudo[217132]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 06 09:43:39 np0005548789.localdomain sudo[217132]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:43:39 np0005548789.localdomain sudo[217132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 09:43:39 np0005548789.localdomain sudo[217132]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: + [[ ! -n '' ]]
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: + . kolla_extend_start
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: Running command: '/usr/sbin/multipathd -d'
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: + umask 0022
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: + exec /usr/sbin/multipathd -d
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: 10637.748465 | --------start up--------
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: 10637.748484 | read /etc/multipath.conf
Dec 06 09:43:39 np0005548789.localdomain multipathd[217111]: 10637.752458 | path checkers start up
Dec 06 09:43:39 np0005548789.localdomain podman[217120]: 2025-12-06 09:43:39.519030448 +0000 UTC m=+0.092569785 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:43:39 np0005548789.localdomain podman[217120]: 2025-12-06 09:43:39.533206762 +0000 UTC m=+0.106746099 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 09:43:39 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:43:40 np0005548789.localdomain python3.9[217258]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:43:40 np0005548789.localdomain systemd[1]: tmp-crun.D9LIbm.mount: Deactivated successfully.
Dec 06 09:43:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45755 DF PROTO=TCP SPT=55426 DPT=9102 SEQ=3589475987 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D829B00000000001030307) 
Dec 06 09:43:41 np0005548789.localdomain sudo[217368]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovyjwwxxzdbprstcbdxpauzkywolenne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014221.4000876-1698-218799905777599/AnsiballZ_command.py
Dec 06 09:43:41 np0005548789.localdomain sudo[217368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:41 np0005548789.localdomain python3.9[217370]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:43:42 np0005548789.localdomain sudo[217368]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:42 np0005548789.localdomain sudo[217491]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjlfizhoixvuhkbspezbaepobubzhywf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014222.1879683-1722-246934866053961/AnsiballZ_systemd.py
Dec 06 09:43:42 np0005548789.localdomain sudo[217491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:42 np0005548789.localdomain python3.9[217493]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:43:42 np0005548789.localdomain systemd[1]: Stopping multipathd container...
Dec 06 09:43:42 np0005548789.localdomain multipathd[217111]: 10641.158282 | exit (signal)
Dec 06 09:43:42 np0005548789.localdomain multipathd[217111]: 10641.158377 | --------shut down-------
Dec 06 09:43:42 np0005548789.localdomain systemd[1]: libpod-44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.scope: Deactivated successfully.
Dec 06 09:43:42 np0005548789.localdomain podman[217497]: 2025-12-06 09:43:42.940173058 +0000 UTC m=+0.097429753 container died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:43:42 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.timer: Deactivated successfully.
Dec 06 09:43:42 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:43:42 np0005548789.localdomain systemd[1]: tmp-crun.GjYfYO.mount: Deactivated successfully.
Dec 06 09:43:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6-userdata-shm.mount: Deactivated successfully.
Dec 06 09:43:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-998945b2ed3c4b11e3b9ed62edc3a70e401b7792f1cfdc1d9e9e385864b9cbe5-merged.mount: Deactivated successfully.
Dec 06 09:43:43 np0005548789.localdomain podman[217497]: 2025-12-06 09:43:43.103499468 +0000 UTC m=+0.260756123 container cleanup 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 09:43:43 np0005548789.localdomain podman[217497]: multipathd
Dec 06 09:43:43 np0005548789.localdomain podman[217524]: 2025-12-06 09:43:43.197899607 +0000 UTC m=+0.065604469 container cleanup 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3)
Dec 06 09:43:43 np0005548789.localdomain podman[217524]: multipathd
Dec 06 09:43:43 np0005548789.localdomain systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 06 09:43:43 np0005548789.localdomain systemd[1]: Stopped multipathd container.
Dec 06 09:43:43 np0005548789.localdomain systemd[1]: Starting multipathd container...
Dec 06 09:43:43 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:43:43 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/998945b2ed3c4b11e3b9ed62edc3a70e401b7792f1cfdc1d9e9e385864b9cbe5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 06 09:43:43 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/998945b2ed3c4b11e3b9ed62edc3a70e401b7792f1cfdc1d9e9e385864b9cbe5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 09:43:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:43:43 np0005548789.localdomain podman[217537]: 2025-12-06 09:43:43.363974982 +0000 UTC m=+0.136942443 container init 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: + sudo -E kolla_set_configs
Dec 06 09:43:43 np0005548789.localdomain sudo[217557]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 09:43:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:43:43 np0005548789.localdomain sudo[217557]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:43:43 np0005548789.localdomain sudo[217557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 09:43:43 np0005548789.localdomain podman[217537]: 2025-12-06 09:43:43.407065951 +0000 UTC m=+0.180033372 container start 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:43:43 np0005548789.localdomain podman[217537]: multipathd
Dec 06 09:43:43 np0005548789.localdomain systemd[1]: Started multipathd container.
Dec 06 09:43:43 np0005548789.localdomain sudo[217491]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: INFO:__main__:Validating config file
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: INFO:__main__:Writing out command to execute
Dec 06 09:43:43 np0005548789.localdomain sudo[217557]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: ++ cat /run_command
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: + CMD='/usr/sbin/multipathd -d'
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: + ARGS=
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: + sudo kolla_copy_cacerts
Dec 06 09:43:43 np0005548789.localdomain sudo[217578]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 06 09:43:43 np0005548789.localdomain sudo[217578]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:43:43 np0005548789.localdomain sudo[217578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 09:43:43 np0005548789.localdomain sudo[217578]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: + [[ ! -n '' ]]
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: + . kolla_extend_start
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: Running command: '/usr/sbin/multipathd -d'
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: + umask 0022
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: + exec /usr/sbin/multipathd -d
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: 10641.742103 | --------start up--------
Dec 06 09:43:43 np0005548789.localdomain podman[217559]: 2025-12-06 09:43:43.49226618 +0000 UTC m=+0.087992116 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: 10641.742124 | read /etc/multipath.conf
Dec 06 09:43:43 np0005548789.localdomain multipathd[217551]: 10641.745786 | path checkers start up
Dec 06 09:43:43 np0005548789.localdomain podman[217559]: 2025-12-06 09:43:43.505063611 +0000 UTC m=+0.100789577 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:43:43 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:43:44 np0005548789.localdomain sudo[217697]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fobogfrjwvpygchyjbryqzsqdwhqewms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014223.84706-1746-77172058373102/AnsiballZ_file.py
Dec 06 09:43:44 np0005548789.localdomain sudo[217697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28925 DF PROTO=TCP SPT=48408 DPT=9101 SEQ=2497040735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D837610000000001030307) 
Dec 06 09:43:44 np0005548789.localdomain python3.9[217699]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:44 np0005548789.localdomain sudo[217697]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:45 np0005548789.localdomain sudo[217807]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vskzkvwvyqdhhcpieyzkguruknlunoxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014225.3783562-1782-205728943343425/AnsiballZ_file.py
Dec 06 09:43:45 np0005548789.localdomain sudo[217807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:45 np0005548789.localdomain python3.9[217809]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 09:43:45 np0005548789.localdomain sudo[217807]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:46 np0005548789.localdomain sudo[217917]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhwsuolpnanuccscjoppviqswclfeanh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014226.109202-1806-88191120048566/AnsiballZ_modprobe.py
Dec 06 09:43:46 np0005548789.localdomain sudo[217917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:46 np0005548789.localdomain python3.9[217919]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 06 09:43:46 np0005548789.localdomain sudo[217917]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:43:47.273 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:43:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:43:47.273 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:43:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:43:47.274 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:43:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28927 DF PROTO=TCP SPT=48408 DPT=9101 SEQ=2497040735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D8436F0000000001030307) 
Dec 06 09:43:47 np0005548789.localdomain sudo[218036]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qeifzqdnjxdqstbgifnrhtbakgnqxlwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014227.5146396-1830-144475602242227/AnsiballZ_stat.py
Dec 06 09:43:47 np0005548789.localdomain sudo[218036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:48 np0005548789.localdomain python3.9[218038]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:43:48 np0005548789.localdomain sudo[218036]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:48 np0005548789.localdomain sudo[218124]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmagjilwmwxvvuqnxtgssfducxketemv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014227.5146396-1830-144475602242227/AnsiballZ_copy.py
Dec 06 09:43:48 np0005548789.localdomain sudo[218124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:48 np0005548789.localdomain python3.9[218126]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014227.5146396-1830-144475602242227/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:48 np0005548789.localdomain sudo[218124]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:49 np0005548789.localdomain sudo[218234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgavpoisgtshpknaxlhheedfshmisjts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014229.1588671-1878-103691813380640/AnsiballZ_lineinfile.py
Dec 06 09:43:49 np0005548789.localdomain sudo[218234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:49 np0005548789.localdomain python3.9[218236]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:49 np0005548789.localdomain sudo[218234]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23163 DF PROTO=TCP SPT=50212 DPT=9105 SEQ=2414219979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D84C6F0000000001030307) 
Dec 06 09:43:50 np0005548789.localdomain sudo[218344]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rztciywetfhxkoxwfupoqvkczzyoetqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014229.8655345-1902-222546117843835/AnsiballZ_systemd.py
Dec 06 09:43:50 np0005548789.localdomain sudo[218344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:50 np0005548789.localdomain python3.9[218346]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:43:50 np0005548789.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 06 09:43:50 np0005548789.localdomain systemd[1]: Stopped Load Kernel Modules.
Dec 06 09:43:50 np0005548789.localdomain systemd[1]: Stopping Load Kernel Modules...
Dec 06 09:43:50 np0005548789.localdomain systemd[1]: Starting Load Kernel Modules...
Dec 06 09:43:50 np0005548789.localdomain systemd-modules-load[218350]: Module 'msr' is built in
Dec 06 09:43:50 np0005548789.localdomain systemd[1]: Finished Load Kernel Modules.
Dec 06 09:43:50 np0005548789.localdomain sudo[218344]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:51 np0005548789.localdomain sudo[218458]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtlfkcnpehmsisophgdonfuuqtqgggcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014231.0849485-1926-110080095360590/AnsiballZ_dnf.py
Dec 06 09:43:51 np0005548789.localdomain sudo[218458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:51 np0005548789.localdomain python3.9[218460]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:43:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23164 DF PROTO=TCP SPT=50212 DPT=9105 SEQ=2414219979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D85C2F0000000001030307) 
Dec 06 09:43:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:43:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:43:53 np0005548789.localdomain podman[218463]: 2025-12-06 09:43:53.917869871 +0000 UTC m=+0.081332415 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 09:43:54 np0005548789.localdomain systemd[1]: tmp-crun.J3FzmN.mount: Deactivated successfully.
Dec 06 09:43:54 np0005548789.localdomain podman[218464]: 2025-12-06 09:43:54.015933254 +0000 UTC m=+0.176574730 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 09:43:54 np0005548789.localdomain podman[218464]: 2025-12-06 09:43:54.022055814 +0000 UTC m=+0.182697270 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 09:43:54 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:43:54 np0005548789.localdomain podman[218463]: 2025-12-06 09:43:54.043264873 +0000 UTC m=+0.206727467 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 09:43:54 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:43:54 np0005548789.localdomain sshd[218505]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:43:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52923 DF PROTO=TCP SPT=56466 DPT=9882 SEQ=1807279947 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D863F00000000001030307) 
Dec 06 09:43:55 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:43:55 np0005548789.localdomain systemd-rc-local-generator[218541]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:43:55 np0005548789.localdomain systemd-sysv-generator[218544]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:43:55 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:55 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:55 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:55 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:55 np0005548789.localdomain sshd[218505]: Received disconnect from 64.227.156.63 port 40534:11: Bye Bye [preauth]
Dec 06 09:43:55 np0005548789.localdomain sshd[218505]: Disconnected from authenticating user root 64.227.156.63 port 40534 [preauth]
Dec 06 09:43:55 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:43:56 np0005548789.localdomain systemd-rc-local-generator[218574]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:43:56 np0005548789.localdomain systemd-sysv-generator[218579]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd-logind[766]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 06 09:43:56 np0005548789.localdomain systemd-logind[766]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 06 09:43:56 np0005548789.localdomain lvm[218626]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 09:43:56 np0005548789.localdomain lvm[218626]: VG ceph_vg0 finished
Dec 06 09:43:56 np0005548789.localdomain lvm[218627]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 06 09:43:56 np0005548789.localdomain lvm[218627]: VG ceph_vg1 finished
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: Starting man-db-cache-update.service...
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:43:56 np0005548789.localdomain systemd-sysv-generator[218679]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:43:56 np0005548789.localdomain systemd-rc-local-generator[218674]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:56 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:57 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:57 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:43:57 np0005548789.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 09:43:57 np0005548789.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 09:43:57 np0005548789.localdomain systemd[1]: Finished man-db-cache-update.service.
Dec 06 09:43:57 np0005548789.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.247s CPU time.
Dec 06 09:43:57 np0005548789.localdomain systemd[1]: run-r63cb5f11d6da489d9f9b9756a15b8a69.service: Deactivated successfully.
Dec 06 09:43:58 np0005548789.localdomain sudo[218458]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:59 np0005548789.localdomain python3.9[219921]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:43:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28929 DF PROTO=TCP SPT=48408 DPT=9101 SEQ=2497040735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D873EF0000000001030307) 
Dec 06 09:44:00 np0005548789.localdomain sudo[220033]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvwupsxzeiiguparaxfbcfvbubmlniwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014240.3714314-1978-207566281937144/AnsiballZ_file.py
Dec 06 09:44:00 np0005548789.localdomain sudo[220033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:00 np0005548789.localdomain sudo[220036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:44:00 np0005548789.localdomain sudo[220036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:44:00 np0005548789.localdomain sudo[220036]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:00 np0005548789.localdomain sudo[220054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:44:00 np0005548789.localdomain sudo[220054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:44:00 np0005548789.localdomain python3.9[220035]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:00 np0005548789.localdomain sudo[220033]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:01 np0005548789.localdomain sudo[220054]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:01 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23165 DF PROTO=TCP SPT=50212 DPT=9105 SEQ=2414219979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D87BEF0000000001030307) 
Dec 06 09:44:01 np0005548789.localdomain sudo[220211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfulffutbmhqmomzghsrvzkahxljehuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014241.7030988-2011-28266996591477/AnsiballZ_systemd_service.py
Dec 06 09:44:01 np0005548789.localdomain sudo[220211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:02 np0005548789.localdomain sudo[220214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:44:02 np0005548789.localdomain sudo[220214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:44:02 np0005548789.localdomain sudo[220214]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:02 np0005548789.localdomain python3.9[220213]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:44:02 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:44:02 np0005548789.localdomain sshd[220233]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:44:02 np0005548789.localdomain systemd-sysv-generator[220261]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:44:02 np0005548789.localdomain systemd-rc-local-generator[220258]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:44:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:44:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:02 np0005548789.localdomain sudo[220211]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:03 np0005548789.localdomain python3.9[220377]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:44:03 np0005548789.localdomain network[220394]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:44:03 np0005548789.localdomain network[220395]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:44:03 np0005548789.localdomain network[220396]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:44:03 np0005548789.localdomain sshd[220233]: Received disconnect from 103.234.151.178 port 48210:11: Bye Bye [preauth]
Dec 06 09:44:03 np0005548789.localdomain sshd[220233]: Disconnected from authenticating user root 103.234.151.178 port 48210 [preauth]
Dec 06 09:44:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:44:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24775 DF PROTO=TCP SPT=44052 DPT=9102 SEQ=2062190782 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D8872F0000000001030307) 
Dec 06 09:44:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6900 DF PROTO=TCP SPT=56978 DPT=9102 SEQ=1819866283 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D893EF0000000001030307) 
Dec 06 09:44:08 np0005548789.localdomain sudo[220629]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbtaivalcsoxchxpxhomsnsouqomqsbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014248.112785-2068-223606719913956/AnsiballZ_systemd_service.py
Dec 06 09:44:08 np0005548789.localdomain sudo[220629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:08 np0005548789.localdomain python3.9[220631]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:08 np0005548789.localdomain sudo[220629]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:09 np0005548789.localdomain sudo[220740]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpciwfwbkjirwkdmgckysmbzxwgpdmaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014248.8288348-2068-245726825235883/AnsiballZ_systemd_service.py
Dec 06 09:44:09 np0005548789.localdomain sudo[220740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:10 np0005548789.localdomain python3.9[220742]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:10 np0005548789.localdomain sudo[220740]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:10 np0005548789.localdomain sudo[220851]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwombwreggekkphsgywtflpqryrapihm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014250.2862403-2068-202848309749521/AnsiballZ_systemd_service.py
Dec 06 09:44:10 np0005548789.localdomain sudo[220851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24777 DF PROTO=TCP SPT=44052 DPT=9102 SEQ=2062190782 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D89EEF0000000001030307) 
Dec 06 09:44:10 np0005548789.localdomain python3.9[220853]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:10 np0005548789.localdomain sudo[220851]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:11 np0005548789.localdomain sudo[220962]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpmgpiuakuzhurvdqmokoidckljosnki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014251.2191527-2068-240316719432010/AnsiballZ_systemd_service.py
Dec 06 09:44:11 np0005548789.localdomain sudo[220962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:11 np0005548789.localdomain python3.9[220964]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:11 np0005548789.localdomain sudo[220962]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:12 np0005548789.localdomain sudo[221073]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pobwjicrzjyfmaefjrdrwbjauxmpbhro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014252.0208807-2068-44446837569613/AnsiballZ_systemd_service.py
Dec 06 09:44:12 np0005548789.localdomain sudo[221073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:12 np0005548789.localdomain python3.9[221075]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:12 np0005548789.localdomain sudo[221073]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:13 np0005548789.localdomain sudo[221184]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpdhrzvkpazadfpfxivvfocpxsdnttko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014252.7526195-2068-17820100232509/AnsiballZ_systemd_service.py
Dec 06 09:44:13 np0005548789.localdomain sudo[221184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:13 np0005548789.localdomain python3.9[221186]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:13 np0005548789.localdomain sudo[221184]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:13 np0005548789.localdomain sudo[221295]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzfmamhzizbbeolyukyiamgwktoxoiyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014253.4946074-2068-119294580516473/AnsiballZ_systemd_service.py
Dec 06 09:44:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:44:13 np0005548789.localdomain sudo[221295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:13 np0005548789.localdomain podman[221297]: 2025-12-06 09:44:13.887685338 +0000 UTC m=+0.085119253 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:44:13 np0005548789.localdomain podman[221297]: 2025-12-06 09:44:13.900694831 +0000 UTC m=+0.098128756 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 09:44:13 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:44:14 np0005548789.localdomain python3.9[221298]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:14 np0005548789.localdomain sudo[221295]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6285 DF PROTO=TCP SPT=51992 DPT=9101 SEQ=2735779511 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D8AC900000000001030307) 
Dec 06 09:44:14 np0005548789.localdomain sudo[221427]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmlwzbymrkhymnczoripfnztpysesxwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014254.2742972-2068-254764923504123/AnsiballZ_systemd_service.py
Dec 06 09:44:14 np0005548789.localdomain sudo[221427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:14 np0005548789.localdomain python3.9[221429]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:44:14 np0005548789.localdomain sudo[221427]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6287 DF PROTO=TCP SPT=51992 DPT=9101 SEQ=2735779511 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D8B8AF0000000001030307) 
Dec 06 09:44:19 np0005548789.localdomain sudo[221538]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmfkmtptvtiaraqaqgqafzqygmplpsaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014258.7721703-2245-46089688822397/AnsiballZ_file.py
Dec 06 09:44:19 np0005548789.localdomain sudo[221538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:19 np0005548789.localdomain python3.9[221540]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:19 np0005548789.localdomain sudo[221538]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:19 np0005548789.localdomain sudo[221648]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afrhvdxggjnudjdyqfyguljobvzpmuke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014259.377007-2245-82959443441745/AnsiballZ_file.py
Dec 06 09:44:19 np0005548789.localdomain sudo[221648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56655 DF PROTO=TCP SPT=58192 DPT=9105 SEQ=3867780288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D8C1AF0000000001030307) 
Dec 06 09:44:19 np0005548789.localdomain python3.9[221650]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:19 np0005548789.localdomain sudo[221648]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:20 np0005548789.localdomain sudo[221758]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evcsfcaewfgfpegprdjsmadtnbfvordu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014260.0195913-2245-257332015216164/AnsiballZ_file.py
Dec 06 09:44:20 np0005548789.localdomain sudo[221758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:20 np0005548789.localdomain python3.9[221760]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:20 np0005548789.localdomain sudo[221758]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:20 np0005548789.localdomain sudo[221868]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgsvpaubartwmykudpxrxnrlbhhpgafd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014260.6549938-2245-16340631204879/AnsiballZ_file.py
Dec 06 09:44:20 np0005548789.localdomain sudo[221868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:21 np0005548789.localdomain python3.9[221870]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:21 np0005548789.localdomain sudo[221868]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:21 np0005548789.localdomain sudo[221978]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orivykmrokickymrearhbhgsknloemci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014261.3249602-2245-224384121300543/AnsiballZ_file.py
Dec 06 09:44:21 np0005548789.localdomain sudo[221978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:21 np0005548789.localdomain python3.9[221980]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:21 np0005548789.localdomain sudo[221978]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:22 np0005548789.localdomain sshd[222069]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:44:22 np0005548789.localdomain sudo[222090]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owjurrtjhrjshyrehroghiehncoydqil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014262.2127786-2245-52111766532890/AnsiballZ_file.py
Dec 06 09:44:22 np0005548789.localdomain sudo[222090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:22 np0005548789.localdomain python3.9[222092]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:22 np0005548789.localdomain sudo[222090]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:23 np0005548789.localdomain sudo[222200]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmngscifrtknmvzfpesigoewhkvuycie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014262.843759-2245-261677423856628/AnsiballZ_file.py
Dec 06 09:44:23 np0005548789.localdomain sudo[222200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:23 np0005548789.localdomain python3.9[222202]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:23 np0005548789.localdomain sudo[222200]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56656 DF PROTO=TCP SPT=58192 DPT=9105 SEQ=3867780288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D8D16F0000000001030307) 
Dec 06 09:44:23 np0005548789.localdomain sudo[222310]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbajowgylpclwctnepjruuenmbtvfhkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014263.4585612-2245-12042795325260/AnsiballZ_file.py
Dec 06 09:44:23 np0005548789.localdomain sudo[222310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:23 np0005548789.localdomain python3.9[222312]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:23 np0005548789.localdomain sudo[222310]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:24 np0005548789.localdomain sshd[222069]: Received disconnect from 103.192.152.59 port 34942:11: Bye Bye [preauth]
Dec 06 09:44:24 np0005548789.localdomain sshd[222069]: Disconnected from authenticating user root 103.192.152.59 port 34942 [preauth]
Dec 06 09:44:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:44:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:44:24 np0005548789.localdomain systemd[1]: tmp-crun.Shilia.mount: Deactivated successfully.
Dec 06 09:44:24 np0005548789.localdomain podman[222330]: 2025-12-06 09:44:24.379449456 +0000 UTC m=+0.094071081 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 09:44:24 np0005548789.localdomain podman[222330]: 2025-12-06 09:44:24.429896802 +0000 UTC m=+0.144518477 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:44:24 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:44:24 np0005548789.localdomain podman[222331]: 2025-12-06 09:44:24.432126621 +0000 UTC m=+0.144791065 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 09:44:24 np0005548789.localdomain podman[222331]: 2025-12-06 09:44:24.511951758 +0000 UTC m=+0.224616192 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:44:24 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:44:25 np0005548789.localdomain systemd[1]: tmp-crun.wyUxBE.mount: Deactivated successfully.
Dec 06 09:44:25 np0005548789.localdomain sudo[222464]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmrapfertslowbtewywhouunfyqqyfoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014265.4865513-2416-120314478983413/AnsiballZ_file.py
Dec 06 09:44:25 np0005548789.localdomain sudo[222464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8334 DF PROTO=TCP SPT=44462 DPT=9882 SEQ=3116503452 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D8D9EF0000000001030307) 
Dec 06 09:44:25 np0005548789.localdomain python3.9[222466]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:25 np0005548789.localdomain sudo[222464]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:26 np0005548789.localdomain sudo[222574]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etuwldslfqgwymwxoaazcurxcabzjqte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014266.0997272-2416-44984181715952/AnsiballZ_file.py
Dec 06 09:44:26 np0005548789.localdomain sudo[222574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:26 np0005548789.localdomain python3.9[222576]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:26 np0005548789.localdomain sudo[222574]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:27 np0005548789.localdomain sudo[222684]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uttcuykbdijroxytougtukxvisxpnsen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014266.7602162-2416-121493638477568/AnsiballZ_file.py
Dec 06 09:44:27 np0005548789.localdomain sudo[222684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:27 np0005548789.localdomain python3.9[222686]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:27 np0005548789.localdomain sudo[222684]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:27 np0005548789.localdomain sudo[222794]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpdslqxgytmdjnrmmoxfyobmirrjnwzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014267.3641713-2416-238251957978467/AnsiballZ_file.py
Dec 06 09:44:27 np0005548789.localdomain sudo[222794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:27 np0005548789.localdomain sshd[222797]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:44:28 np0005548789.localdomain python3.9[222796]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:28 np0005548789.localdomain sudo[222794]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:28 np0005548789.localdomain sudo[222906]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-varxzdnmfjnbomuwsyewhgypebnzllpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014268.1869235-2416-267150971651738/AnsiballZ_file.py
Dec 06 09:44:28 np0005548789.localdomain sudo[222906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:28 np0005548789.localdomain python3.9[222908]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:28 np0005548789.localdomain sudo[222906]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:29 np0005548789.localdomain sudo[223016]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntemrrdrzhwpsgyddsvcinanamuqxdgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014268.7623427-2416-54638680203140/AnsiballZ_file.py
Dec 06 09:44:29 np0005548789.localdomain sudo[223016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:29 np0005548789.localdomain python3.9[223018]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:29 np0005548789.localdomain sudo[223016]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6289 DF PROTO=TCP SPT=51992 DPT=9101 SEQ=2735779511 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D8E7EF0000000001030307) 
Dec 06 09:44:29 np0005548789.localdomain sudo[223126]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwjsmnnobnfpbrvcqeawatdcswpprtih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014269.3652682-2416-125860614750934/AnsiballZ_file.py
Dec 06 09:44:29 np0005548789.localdomain sudo[223126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:29 np0005548789.localdomain python3.9[223128]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:29 np0005548789.localdomain sudo[223126]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:30 np0005548789.localdomain sudo[223236]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvruloahmmgmleldxfpjbjgyhhwymbsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014269.981945-2416-1141236587060/AnsiballZ_file.py
Dec 06 09:44:30 np0005548789.localdomain sudo[223236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:30 np0005548789.localdomain python3.9[223238]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:30 np0005548789.localdomain sudo[223236]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:31 np0005548789.localdomain sudo[223346]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcfuxveyhgskgqflyhtyqhmssnobrtdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014271.1288369-2590-112758801761960/AnsiballZ_command.py
Dec 06 09:44:31 np0005548789.localdomain sudo[223346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:31 np0005548789.localdomain python3.9[223348]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:31 np0005548789.localdomain sudo[223346]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:31 np0005548789.localdomain sshd[222797]: Received disconnect from 179.33.210.213 port 38396:11: Bye Bye [preauth]
Dec 06 09:44:31 np0005548789.localdomain sshd[222797]: Disconnected from authenticating user root 179.33.210.213 port 38396 [preauth]
Dec 06 09:44:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56657 DF PROTO=TCP SPT=58192 DPT=9105 SEQ=3867780288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D8F1EF0000000001030307) 
Dec 06 09:44:32 np0005548789.localdomain python3.9[223458]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:44:33 np0005548789.localdomain sudo[223566]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gayxjedpvhydoiadfpqyekzzgztwxtuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014272.804418-2644-35355102419568/AnsiballZ_systemd_service.py
Dec 06 09:44:33 np0005548789.localdomain sudo[223566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:33 np0005548789.localdomain python3.9[223568]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:44:33 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:44:33 np0005548789.localdomain systemd-sysv-generator[223595]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:44:33 np0005548789.localdomain systemd-rc-local-generator[223592]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:44:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:44:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:44:33 np0005548789.localdomain sudo[223566]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3810 DF PROTO=TCP SPT=44092 DPT=9102 SEQ=1703384252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D8FC6F0000000001030307) 
Dec 06 09:44:35 np0005548789.localdomain sudo[223712]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdlxjorpywmfmurfkyvsryzdkszebodq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014273.9965177-2668-221792740094545/AnsiballZ_command.py
Dec 06 09:44:35 np0005548789.localdomain sudo[223712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:35 np0005548789.localdomain python3.9[223714]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:35 np0005548789.localdomain sudo[223712]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:35 np0005548789.localdomain sudo[223823]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hceawicrhhlexgxebjnurezwdlgheomt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014275.661438-2668-42328297296888/AnsiballZ_command.py
Dec 06 09:44:35 np0005548789.localdomain sudo[223823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:36 np0005548789.localdomain python3.9[223825]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:36 np0005548789.localdomain sudo[223823]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:36 np0005548789.localdomain sudo[223934]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrrrumsskmkohwlfgdpxtbiraouajhqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014276.3095033-2668-2416195909248/AnsiballZ_command.py
Dec 06 09:44:36 np0005548789.localdomain sudo[223934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:36 np0005548789.localdomain python3.9[223936]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:36 np0005548789.localdomain sudo[223934]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45758 DF PROTO=TCP SPT=55426 DPT=9102 SEQ=3589475987 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D907F00000000001030307) 
Dec 06 09:44:37 np0005548789.localdomain sudo[224045]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eduscvuxxmajxgacccewvggtvsqvzxmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014276.9039109-2668-97845336338653/AnsiballZ_command.py
Dec 06 09:44:37 np0005548789.localdomain sudo[224045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:38 np0005548789.localdomain python3.9[224047]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:38 np0005548789.localdomain sudo[224045]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:38 np0005548789.localdomain sudo[224156]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orehqhzwavugsgtjgkoiihfufzqtybhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014278.1681762-2668-248497879015064/AnsiballZ_command.py
Dec 06 09:44:38 np0005548789.localdomain sudo[224156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:38 np0005548789.localdomain python3.9[224158]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:38 np0005548789.localdomain sudo[224156]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:39 np0005548789.localdomain sudo[224267]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnaucwdviboxndkwnlssnwkfjmaxqhtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014278.7565098-2668-143073017219388/AnsiballZ_command.py
Dec 06 09:44:39 np0005548789.localdomain sudo[224267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:39 np0005548789.localdomain python3.9[224269]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:39 np0005548789.localdomain sudo[224267]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:39 np0005548789.localdomain sudo[224378]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atejwsnmqvisgbbqypbclabvpiobbjos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014279.3757076-2668-134060456870897/AnsiballZ_command.py
Dec 06 09:44:39 np0005548789.localdomain sudo[224378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:39 np0005548789.localdomain python3.9[224380]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:39 np0005548789.localdomain sudo[224378]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:40 np0005548789.localdomain sudo[224489]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpkmzicghjfwksnwyytxfylxiwugqitx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014279.9653072-2668-226940818808125/AnsiballZ_command.py
Dec 06 09:44:40 np0005548789.localdomain sudo[224489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:40 np0005548789.localdomain python3.9[224491]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:40 np0005548789.localdomain sudo[224489]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3812 DF PROTO=TCP SPT=44092 DPT=9102 SEQ=1703384252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D9142F0000000001030307) 
Dec 06 09:44:42 np0005548789.localdomain sudo[224600]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyegpwjbmtkfcojgwzdhcugwhvtkatxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014282.3243363-2875-271781747841198/AnsiballZ_file.py
Dec 06 09:44:42 np0005548789.localdomain sudo[224600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:42 np0005548789.localdomain python3.9[224602]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:42 np0005548789.localdomain sudo[224600]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:43 np0005548789.localdomain sudo[224710]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofmiazypjvthwladickbwpjqpramnuvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014282.9583616-2875-162411206640162/AnsiballZ_file.py
Dec 06 09:44:43 np0005548789.localdomain sudo[224710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:43 np0005548789.localdomain python3.9[224712]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:43 np0005548789.localdomain sudo[224710]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:43 np0005548789.localdomain sudo[224820]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpdnxjcstovkfkqxnhbkjvwbmrorwffw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014283.6132123-2875-257246125190665/AnsiballZ_file.py
Dec 06 09:44:43 np0005548789.localdomain sudo[224820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:44 np0005548789.localdomain python3.9[224822]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:44 np0005548789.localdomain sudo[224820]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37487 DF PROTO=TCP SPT=48462 DPT=9101 SEQ=1261408538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D921C00000000001030307) 
Dec 06 09:44:44 np0005548789.localdomain sudo[224930]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myubsuxsofmembuvgyfogdgxglvgnquv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014284.3261096-2941-145546572140662/AnsiballZ_file.py
Dec 06 09:44:44 np0005548789.localdomain sudo[224930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:44:44 np0005548789.localdomain sshd[224944]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:44:44 np0005548789.localdomain systemd[1]: tmp-crun.qj1j5H.mount: Deactivated successfully.
Dec 06 09:44:44 np0005548789.localdomain podman[224933]: 2025-12-06 09:44:44.79319465 +0000 UTC m=+0.095000439 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3)
Dec 06 09:44:44 np0005548789.localdomain podman[224933]: 2025-12-06 09:44:44.802945122 +0000 UTC m=+0.104750911 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 09:44:44 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:44:44 np0005548789.localdomain python3.9[224932]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:44 np0005548789.localdomain sudo[224930]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:45 np0005548789.localdomain sudo[225061]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdanwufwjljqiyyomaagofcekaxohtsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014285.0400105-2941-36870037316816/AnsiballZ_file.py
Dec 06 09:44:45 np0005548789.localdomain sudo[225061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:45 np0005548789.localdomain python3.9[225063]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:45 np0005548789.localdomain sudo[225061]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:45 np0005548789.localdomain sudo[225171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvojklpiuouomvlvyefwtwylelvudoze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014285.692828-2941-24442556294294/AnsiballZ_file.py
Dec 06 09:44:45 np0005548789.localdomain sudo[225171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:46 np0005548789.localdomain python3.9[225173]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:46 np0005548789.localdomain sudo[225171]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:46 np0005548789.localdomain sshd[224944]: Received disconnect from 103.157.25.60 port 58024:11: Bye Bye [preauth]
Dec 06 09:44:46 np0005548789.localdomain sshd[224944]: Disconnected from authenticating user root 103.157.25.60 port 58024 [preauth]
Dec 06 09:44:46 np0005548789.localdomain sudo[225281]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udutfcplsjvhuooidehmjnwvyjwofncj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014286.3471184-2941-160133709048094/AnsiballZ_file.py
Dec 06 09:44:46 np0005548789.localdomain sudo[225281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:46 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46888 DF PROTO=TCP SPT=51098 DPT=9105 SEQ=1871906699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D92AFD0000000001030307) 
Dec 06 09:44:46 np0005548789.localdomain python3.9[225283]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:46 np0005548789.localdomain sudo[225281]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:47 np0005548789.localdomain sudo[225391]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fefkjzlydzilfwqrqxmwzeclfyqxecaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014286.9279912-2941-68516339706797/AnsiballZ_file.py
Dec 06 09:44:47 np0005548789.localdomain sudo[225391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:44:47.274 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:44:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:44:47.275 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:44:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:44:47.276 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:44:47 np0005548789.localdomain python3.9[225393]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:47 np0005548789.localdomain sudo[225391]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:48 np0005548789.localdomain sudo[225501]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omimwagoxyfixpwwggipbzoojgujuzve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014287.5445874-2941-151528393737359/AnsiballZ_file.py
Dec 06 09:44:48 np0005548789.localdomain sudo[225501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:49 np0005548789.localdomain python3.9[225503]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:49 np0005548789.localdomain sudo[225501]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:49 np0005548789.localdomain sudo[225611]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqadvuavdvmvuvlzqfkuqisjbhxurxqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014289.2023697-2941-43502021378728/AnsiballZ_file.py
Dec 06 09:44:49 np0005548789.localdomain sudo[225611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46890 DF PROTO=TCP SPT=51098 DPT=9105 SEQ=1871906699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D936EF0000000001030307) 
Dec 06 09:44:49 np0005548789.localdomain python3.9[225613]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:49 np0005548789.localdomain sudo[225611]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46891 DF PROTO=TCP SPT=51098 DPT=9105 SEQ=1871906699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D946AF0000000001030307) 
Dec 06 09:44:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:44:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:44:54 np0005548789.localdomain systemd[1]: tmp-crun.TJrHiV.mount: Deactivated successfully.
Dec 06 09:44:54 np0005548789.localdomain podman[225632]: 2025-12-06 09:44:54.93068448 +0000 UTC m=+0.088131834 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 09:44:54 np0005548789.localdomain podman[225632]: 2025-12-06 09:44:54.965202589 +0000 UTC m=+0.122649893 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:44:54 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:44:54 np0005548789.localdomain podman[225631]: 2025-12-06 09:44:54.978082853 +0000 UTC m=+0.138755900 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 06 09:44:55 np0005548789.localdomain podman[225631]: 2025-12-06 09:44:55.045538792 +0000 UTC m=+0.206211789 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 09:44:55 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:44:56 np0005548789.localdomain sshd[212231]: fatal: Timeout before authentication for 45.78.222.162 port 58910
Dec 06 09:44:56 np0005548789.localdomain sudo[225765]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ommptsbbwmehjubyscguetnmfuwhljky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014295.9806583-3266-2389915889795/AnsiballZ_getent.py
Dec 06 09:44:56 np0005548789.localdomain sudo[225765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:56 np0005548789.localdomain python3.9[225767]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 06 09:44:56 np0005548789.localdomain sudo[225765]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:57 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50295 DF PROTO=TCP SPT=60554 DPT=9882 SEQ=1031214754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D9536F0000000001030307) 
Dec 06 09:44:57 np0005548789.localdomain sudo[225876]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efmbmamqgzpaaggopqxehrimlnifxzgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014296.854134-3290-259320248066291/AnsiballZ_group.py
Dec 06 09:44:57 np0005548789.localdomain sudo[225876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:57 np0005548789.localdomain python3.9[225878]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 06 09:44:57 np0005548789.localdomain groupadd[225879]: group added to /etc/group: name=nova, GID=42436
Dec 06 09:44:57 np0005548789.localdomain groupadd[225879]: group added to /etc/gshadow: name=nova
Dec 06 09:44:57 np0005548789.localdomain groupadd[225879]: new group: name=nova, GID=42436
Dec 06 09:44:57 np0005548789.localdomain sudo[225876]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:58 np0005548789.localdomain sudo[225992]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-senotyxhqefwpcrngnmmzrrqzkgntgib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014297.8565679-3314-215028950294951/AnsiballZ_user.py
Dec 06 09:44:58 np0005548789.localdomain sudo[225992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:58 np0005548789.localdomain python3.9[225994]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548789.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 06 09:44:58 np0005548789.localdomain useradd[225996]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/1
Dec 06 09:44:58 np0005548789.localdomain useradd[225996]: add 'nova' to group 'libvirt'
Dec 06 09:44:58 np0005548789.localdomain useradd[225996]: add 'nova' to shadow group 'libvirt'
Dec 06 09:44:58 np0005548789.localdomain sudo[225992]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:59 np0005548789.localdomain sshd[226020]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:44:59 np0005548789.localdomain sshd[226020]: Accepted publickey for zuul from 192.168.122.30 port 37324 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:44:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37491 DF PROTO=TCP SPT=48462 DPT=9101 SEQ=1261408538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D95DF00000000001030307) 
Dec 06 09:44:59 np0005548789.localdomain systemd-logind[766]: New session 55 of user zuul.
Dec 06 09:44:59 np0005548789.localdomain systemd[1]: Started Session 55 of User zuul.
Dec 06 09:44:59 np0005548789.localdomain sshd[226020]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:44:59 np0005548789.localdomain sshd[226023]: Received disconnect from 192.168.122.30 port 37324:11: disconnected by user
Dec 06 09:44:59 np0005548789.localdomain sshd[226023]: Disconnected from user zuul 192.168.122.30 port 37324
Dec 06 09:44:59 np0005548789.localdomain sshd[226020]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:44:59 np0005548789.localdomain systemd[1]: session-55.scope: Deactivated successfully.
Dec 06 09:44:59 np0005548789.localdomain systemd-logind[766]: Session 55 logged out. Waiting for processes to exit.
Dec 06 09:44:59 np0005548789.localdomain systemd-logind[766]: Removed session 55.
Dec 06 09:45:00 np0005548789.localdomain python3.9[226131]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:01 np0005548789.localdomain python3.9[226217]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014300.128928-3389-239748245121034/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46892 DF PROTO=TCP SPT=51098 DPT=9105 SEQ=1871906699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D967F00000000001030307) 
Dec 06 09:45:02 np0005548789.localdomain sudo[226276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:45:02 np0005548789.localdomain sudo[226276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:45:02 np0005548789.localdomain sudo[226276]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:02 np0005548789.localdomain sudo[226313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 09:45:02 np0005548789.localdomain sudo[226313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:45:02 np0005548789.localdomain python3.9[226361]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:03 np0005548789.localdomain python3.9[226439]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:03 np0005548789.localdomain podman[226504]: 2025-12-06 09:45:03.221429453 +0000 UTC m=+0.082208183 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.component=rhceph-container, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, io.buildah.version=1.41.4, version=7)
Dec 06 09:45:03 np0005548789.localdomain podman[226504]: 2025-12-06 09:45:03.321461288 +0000 UTC m=+0.182240008 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, GIT_CLEAN=True, GIT_BRANCH=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, version=7, io.buildah.version=1.41.4, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.openshift.expose-services=)
Dec 06 09:45:03 np0005548789.localdomain sudo[226313]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:03 np0005548789.localdomain sudo[226571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:45:03 np0005548789.localdomain sudo[226571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:45:03 np0005548789.localdomain sudo[226571]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:03 np0005548789.localdomain sudo[226589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:45:03 np0005548789.localdomain sudo[226589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:45:04 np0005548789.localdomain python3.9[226711]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:04 np0005548789.localdomain sudo[226589]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24998 DF PROTO=TCP SPT=58212 DPT=9102 SEQ=2104232652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D971B00000000001030307) 
Dec 06 09:45:04 np0005548789.localdomain python3.9[226816]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014303.8460283-3389-193099518521270/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:05 np0005548789.localdomain sudo[226905]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:45:05 np0005548789.localdomain sudo[226905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:45:05 np0005548789.localdomain sudo[226905]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:05 np0005548789.localdomain python3.9[226941]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:05 np0005548789.localdomain python3.9[227028]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014304.958637-3389-88115518599741/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=84cd402761cf817a5c030b63eb0a858a413df311 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:06 np0005548789.localdomain python3.9[227136]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:07 np0005548789.localdomain python3.9[227222]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014306.1094482-3389-222959011683758/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:07 np0005548789.localdomain python3.9[227330]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24780 DF PROTO=TCP SPT=44052 DPT=9102 SEQ=2062190782 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D97DEF0000000001030307) 
Dec 06 09:45:08 np0005548789.localdomain python3.9[227416]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014307.2097983-3389-89954502211531/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:09 np0005548789.localdomain sudo[227524]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igsnriaycbhjoweyyrshzvwdxgqkxwug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014308.931783-3639-273820108913395/AnsiballZ_file.py
Dec 06 09:45:09 np0005548789.localdomain sudo[227524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:09 np0005548789.localdomain python3.9[227526]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:45:09 np0005548789.localdomain sudo[227524]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:09 np0005548789.localdomain sudo[227634]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hupicjismwvbhjyfkvlwvyhrbfxcnqxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014309.6601024-3662-165660210747511/AnsiballZ_copy.py
Dec 06 09:45:09 np0005548789.localdomain sudo[227634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:10 np0005548789.localdomain python3.9[227636]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:45:10 np0005548789.localdomain sudo[227634]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:10 np0005548789.localdomain sudo[227744]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezbqnbvkzpqnbwcqxgqefhuzwahomikm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014310.3217046-3687-241922720726611/AnsiballZ_stat.py
Dec 06 09:45:10 np0005548789.localdomain sudo[227744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:10 np0005548789.localdomain python3.9[227746]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:10 np0005548789.localdomain sudo[227744]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25000 DF PROTO=TCP SPT=58212 DPT=9102 SEQ=2104232652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D989700000000001030307) 
Dec 06 09:45:11 np0005548789.localdomain sudo[227856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tobacyolmpxaqyqahfssxenfayscbsgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014311.1217127-3713-119824242489980/AnsiballZ_file.py
Dec 06 09:45:11 np0005548789.localdomain sudo[227856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:11 np0005548789.localdomain python3.9[227858]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:45:11 np0005548789.localdomain sudo[227856]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:12 np0005548789.localdomain python3.9[227966]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:13 np0005548789.localdomain python3.9[228076]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:13 np0005548789.localdomain python3.9[228162]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014312.5535722-3765-177223898333608/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:14 np0005548789.localdomain python3.9[228270]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19451 DF PROTO=TCP SPT=36780 DPT=9101 SEQ=3422919803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D996F00000000001030307) 
Dec 06 09:45:14 np0005548789.localdomain python3.9[228356]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014313.7258832-3809-30460689960347/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:15 np0005548789.localdomain sudo[228464]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afdnfqgvvunfjjnojdggqgnppdtmbdkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014315.2536325-3860-125784632446390/AnsiballZ_container_config_data.py
Dec 06 09:45:15 np0005548789.localdomain sudo[228464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:45:15 np0005548789.localdomain podman[228467]: 2025-12-06 09:45:15.684963323 +0000 UTC m=+0.081469819 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 09:45:15 np0005548789.localdomain podman[228467]: 2025-12-06 09:45:15.697099433 +0000 UTC m=+0.093605929 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 09:45:15 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:45:15 np0005548789.localdomain python3.9[228466]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 06 09:45:15 np0005548789.localdomain sudo[228464]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:16 np0005548789.localdomain sudo[228591]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flwfkupvedxthaxbsmsafnjldmfcbcon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014316.1236951-3888-256691654372203/AnsiballZ_container_config_hash.py
Dec 06 09:45:16 np0005548789.localdomain sudo[228591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:16 np0005548789.localdomain python3.9[228593]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:45:16 np0005548789.localdomain sudo[228591]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19453 DF PROTO=TCP SPT=36780 DPT=9101 SEQ=3422919803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D9A2EF0000000001030307) 
Dec 06 09:45:17 np0005548789.localdomain sudo[228701]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zielnjhjjuuoygulshniydzsyeqcvzhg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014317.090024-3917-88227394314292/AnsiballZ_edpm_container_manage.py
Dec 06 09:45:17 np0005548789.localdomain sudo[228701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:17 np0005548789.localdomain python3[228703]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:45:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23944 DF PROTO=TCP SPT=38432 DPT=9105 SEQ=1457126476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D9AC2F0000000001030307) 
Dec 06 09:45:20 np0005548789.localdomain sshd[228730]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:45:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23945 DF PROTO=TCP SPT=38432 DPT=9105 SEQ=1457126476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D9BBEF0000000001030307) 
Dec 06 09:45:24 np0005548789.localdomain sshd[228730]: Received disconnect from 45.78.222.162 port 58632:11: Bye Bye [preauth]
Dec 06 09:45:24 np0005548789.localdomain sshd[228730]: Disconnected from authenticating user root 45.78.222.162 port 58632 [preauth]
Dec 06 09:45:25 np0005548789.localdomain sshd[228758]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:45:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50298 DF PROTO=TCP SPT=60554 DPT=9882 SEQ=1031214754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D9C3F00000000001030307) 
Dec 06 09:45:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:45:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:45:26 np0005548789.localdomain sshd[228758]: Received disconnect from 103.234.151.178 port 10804:11: Bye Bye [preauth]
Dec 06 09:45:26 np0005548789.localdomain sshd[228758]: Disconnected from authenticating user root 103.234.151.178 port 10804 [preauth]
Dec 06 09:45:27 np0005548789.localdomain podman[228760]: 2025-12-06 09:45:27.669911172 +0000 UTC m=+1.830018975 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller)
Dec 06 09:45:27 np0005548789.localdomain podman[228760]: 2025-12-06 09:45:27.693995126 +0000 UTC m=+1.854102919 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:45:27 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:45:27 np0005548789.localdomain podman[228717]: 2025-12-06 09:45:17.807641773 +0000 UTC m=+0.045823724 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 06 09:45:27 np0005548789.localdomain podman[228761]: 2025-12-06 09:45:27.772435827 +0000 UTC m=+1.927701115 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:45:27 np0005548789.localdomain podman[228761]: 2025-12-06 09:45:27.803055101 +0000 UTC m=+1.958320389 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:45:27 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:45:27 np0005548789.localdomain podman[228827]: 
Dec 06 09:45:27 np0005548789.localdomain podman[228827]: 2025-12-06 09:45:27.909723029 +0000 UTC m=+0.073781962 container create a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 09:45:27 np0005548789.localdomain podman[228827]: 2025-12-06 09:45:27.870036644 +0000 UTC m=+0.034095617 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 06 09:45:27 np0005548789.localdomain python3[228703]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 06 09:45:28 np0005548789.localdomain sudo[228701]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19455 DF PROTO=TCP SPT=36780 DPT=9101 SEQ=3422919803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D9D3EF0000000001030307) 
Dec 06 09:45:29 np0005548789.localdomain sudo[228973]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmtkwowsfnavoumqluxrjpoobmldxjtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014329.645456-3941-140027595326095/AnsiballZ_stat.py
Dec 06 09:45:29 np0005548789.localdomain sudo[228973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:30 np0005548789.localdomain python3.9[228975]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:30 np0005548789.localdomain sudo[228973]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:31 np0005548789.localdomain sudo[229085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgxpexvakbgfhbwdgjxxdwkbigaonfxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014330.9576998-3977-160711828814931/AnsiballZ_container_config_data.py
Dec 06 09:45:31 np0005548789.localdomain sudo[229085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:31 np0005548789.localdomain python3.9[229087]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 06 09:45:31 np0005548789.localdomain sudo[229085]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23946 DF PROTO=TCP SPT=38432 DPT=9105 SEQ=1457126476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D9DBEF0000000001030307) 
Dec 06 09:45:32 np0005548789.localdomain sudo[229195]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxtovhqbmqhsfzrgtqjjjivsxewqwvdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014331.848066-4004-62436847046751/AnsiballZ_container_config_hash.py
Dec 06 09:45:32 np0005548789.localdomain sudo[229195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:32 np0005548789.localdomain python3.9[229197]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:45:32 np0005548789.localdomain sudo[229195]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:32 np0005548789.localdomain sshd[229253]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:45:33 np0005548789.localdomain sudo[229307]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umiumoepanywadhywflkewbtpaunwvgz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014332.8211749-4034-133655062023742/AnsiballZ_edpm_container_manage.py
Dec 06 09:45:33 np0005548789.localdomain sudo[229307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:33 np0005548789.localdomain python3[229309]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:45:33 np0005548789.localdomain python3[229309]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3",
                                                                    "Digest": "sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:31:10.62653219Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211779450,
                                                                    "VirtualSize": 1211779450,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:baa8e0bc73d6b505f07c40d4f69a464312cc41ae2045c7975dd4759c27721a22",
                                                                              "sha256:d0cde44181262e43c105085c32a5af158b232f2e2ce4fe4b50530d7cdc5126cd"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:53.072482982Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:02.761216507Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:03.785234187Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:17.194997182Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:24.14458279Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:30.048641643Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:09.707360362Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.208898452Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624465805Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624514176Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:18.661822382Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 06 09:45:33 np0005548789.localdomain podman[229359]: 2025-12-06 09:45:33.853419821 +0000 UTC m=+0.138638117 container remove 41cf0c2eef405ae219098f5959c425c7035944a28046cea63417ef9b175b7007 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '18576754feb36b85b5c8742ad9b5643d-179caa3982511c1fd3314b961771f96c'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, managed_by=tripleo_ansible)
Dec 06 09:45:33 np0005548789.localdomain python3[229309]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute
Dec 06 09:45:33 np0005548789.localdomain podman[229372]: 
Dec 06 09:45:33 np0005548789.localdomain podman[229372]: 2025-12-06 09:45:33.953947781 +0000 UTC m=+0.081804170 container create 6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 06 09:45:33 np0005548789.localdomain podman[229372]: 2025-12-06 09:45:33.917845172 +0000 UTC m=+0.045701481 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 06 09:45:33 np0005548789.localdomain python3[229309]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume 
/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec 06 09:45:34 np0005548789.localdomain sudo[229307]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:34 np0005548789.localdomain sshd[229253]: Received disconnect from 64.227.156.63 port 52426:11: Bye Bye [preauth]
Dec 06 09:45:34 np0005548789.localdomain sshd[229253]: Disconnected from authenticating user root 64.227.156.63 port 52426 [preauth]
Dec 06 09:45:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=507 DF PROTO=TCP SPT=40508 DPT=9102 SEQ=222705752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D9E6AF0000000001030307) 
Dec 06 09:45:35 np0005548789.localdomain sudo[229518]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thfotgkoqmxeunelaratzwzdupwscire ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014335.2770321-4058-257868049891377/AnsiballZ_stat.py
Dec 06 09:45:35 np0005548789.localdomain sudo[229518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:35 np0005548789.localdomain python3.9[229520]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:35 np0005548789.localdomain sudo[229518]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:36 np0005548789.localdomain sudo[229630]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxljctskkimflgomkvwkiwfxajemzjnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014336.2054238-4085-26712439861146/AnsiballZ_file.py
Dec 06 09:45:36 np0005548789.localdomain sudo[229630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:36 np0005548789.localdomain python3.9[229632]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:45:36 np0005548789.localdomain sudo[229630]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:37 np0005548789.localdomain sudo[229739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rydxndkazqkougtvsuxliesuhgfeozmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014336.761537-4085-13462632909343/AnsiballZ_copy.py
Dec 06 09:45:37 np0005548789.localdomain sudo[229739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:37 np0005548789.localdomain python3.9[229741]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014336.761537-4085-13462632909343/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:45:37 np0005548789.localdomain sudo[229739]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:37 np0005548789.localdomain sudo[229794]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iidqnhulqxkyupzptuxuuzecxfjdjwei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014336.761537-4085-13462632909343/AnsiballZ_systemd.py
Dec 06 09:45:37 np0005548789.localdomain sudo[229794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:37 np0005548789.localdomain python3.9[229796]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:45:37 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:45:37 np0005548789.localdomain systemd-rc-local-generator[229822]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:45:37 np0005548789.localdomain systemd-sysv-generator[229825]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:45:38 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:45:38 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:38 np0005548789.localdomain sudo[229794]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:38 np0005548789.localdomain sudo[229884]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-segqtwtyfhjzqfpiqalmcfomdybarvxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014336.761537-4085-13462632909343/AnsiballZ_systemd.py
Dec 06 09:45:38 np0005548789.localdomain sudo[229884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:38 np0005548789.localdomain python3.9[229886]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:45:38 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:45:39 np0005548789.localdomain systemd-rc-local-generator[229909]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:45:39 np0005548789.localdomain systemd-sysv-generator[229913]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:45:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:45:39 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47735 DF PROTO=TCP SPT=46962 DPT=9882 SEQ=1071838655 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D9F7EF0000000001030307) 
Dec 06 09:45:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:45:39 np0005548789.localdomain systemd[1]: Starting nova_compute container...
Dec 06 09:45:39 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:45:39 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:39 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:39 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:39 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:39 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:39 np0005548789.localdomain podman[229927]: 2025-12-06 09:45:39.39188973 +0000 UTC m=+0.119081208 container init 6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125)
Dec 06 09:45:39 np0005548789.localdomain podman[229927]: 2025-12-06 09:45:39.401454027 +0000 UTC m=+0.128645505 container start 6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:45:39 np0005548789.localdomain podman[229927]: nova_compute
Dec 06 09:45:39 np0005548789.localdomain systemd[1]: tmp-crun.8RpS1f.mount: Deactivated successfully.
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: + sudo -E kolla_set_configs
Dec 06 09:45:39 np0005548789.localdomain systemd[1]: Started nova_compute container.
Dec 06 09:45:39 np0005548789.localdomain sudo[229884]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Validating config file
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Copying service configuration files
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Deleting /etc/ceph
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Creating directory /etc/ceph
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /etc/ceph
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Writing out command to execute
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: ++ cat /run_command
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: + CMD=nova-compute
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: + ARGS=
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: + sudo kolla_copy_cacerts
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: + [[ ! -n '' ]]
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: + . kolla_extend_start
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: Running command: 'nova-compute'
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: + echo 'Running command: '\''nova-compute'\'''
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: + umask 0022
Dec 06 09:45:39 np0005548789.localdomain nova_compute[229942]: + exec nova-compute
Dec 06 09:45:40 np0005548789.localdomain python3.9[230062]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=509 DF PROTO=TCP SPT=40508 DPT=9102 SEQ=222705752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52D9FE6F0000000001030307) 
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.183 229946 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.183 229946 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.183 229946 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.183 229946 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.301 229946 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.322 229946 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.323 229946 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.788 229946 INFO nova.virt.driver [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.906 229946 INFO nova.compute.provider_config [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.918 229946 WARNING nova.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.918 229946 DEBUG oslo_concurrency.lockutils [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.918 229946 DEBUG oslo_concurrency.lockutils [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.919 229946 DEBUG oslo_concurrency.lockutils [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.919 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.919 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.919 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.919 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.920 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.920 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.920 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.920 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.920 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.920 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.920 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.920 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.921 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.921 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.921 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.921 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.921 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.921 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.921 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.922 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] console_host                   = np0005548789.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.922 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.922 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.922 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.922 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.922 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.922 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.923 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.923 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.923 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.923 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.923 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.923 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.923 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.924 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.924 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.924 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.924 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.924 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.924 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] host                           = np0005548789.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.924 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.925 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.925 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.925 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.925 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.925 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.925 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.925 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.926 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.926 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.926 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.926 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.926 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.926 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.926 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.927 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.927 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.927 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.927 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.927 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.927 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.927 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.927 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.928 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.928 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.928 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.928 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.928 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.928 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.928 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.929 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.929 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.929 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.929 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.929 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.929 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.929 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.929 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.930 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.930 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.930 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.930 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] my_block_storage_ip            = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.930 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] my_ip                          = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.930 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.930 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.931 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.931 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.931 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.931 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.931 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.931 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.931 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.932 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.932 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.932 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.932 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.932 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.932 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.932 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.932 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.933 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.933 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.933 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.933 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.933 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.933 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.933 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.933 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.934 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.934 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.934 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.934 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.934 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.934 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.934 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.935 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.935 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.935 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.935 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.935 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.935 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.935 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.936 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.936 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.936 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.936 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.936 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.936 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.936 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.937 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.937 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.937 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.937 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.937 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.937 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.937 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.938 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.938 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.938 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.938 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.938 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.938 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.938 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.938 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.939 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.939 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.939 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.939 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.939 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.939 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.939 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.940 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.940 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.940 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.940 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.940 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.940 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.940 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.941 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.941 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.941 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.941 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.941 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.941 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.941 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.942 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.942 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.942 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.942 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.942 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.942 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.942 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.943 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.943 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.943 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.943 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.943 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.943 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.943 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.944 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.944 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.944 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.944 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.944 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.944 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.944 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.945 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.945 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.945 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.945 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.945 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.945 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.945 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.946 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.946 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.946 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.946 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.946 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.946 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.946 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.947 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.947 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.947 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.947 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.947 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.947 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.947 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.948 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.948 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.948 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.948 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.948 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.948 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.948 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.949 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.949 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.949 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.949 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.949 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.949 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.949 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.950 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.950 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.950 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.950 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.950 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.950 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.950 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.951 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.951 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.951 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.951 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.951 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.951 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.951 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.952 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.952 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.952 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.952 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.952 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.952 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.952 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.953 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.953 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.953 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.953 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.953 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.953 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.953 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.954 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.954 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.954 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.954 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.954 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.954 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.954 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.954 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.955 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.955 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.955 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.955 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.955 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.955 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.955 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.956 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.956 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.956 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.956 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.956 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.956 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.956 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.957 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.957 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.957 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.957 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.957 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.957 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.957 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.958 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.958 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.958 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.958 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.958 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.958 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.959 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.959 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.959 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.959 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.959 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.959 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.959 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.960 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.960 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.960 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.960 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.960 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.960 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.960 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.961 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.961 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.961 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.961 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.961 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.961 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.961 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.961 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.962 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.962 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.962 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.962 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.962 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.962 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.962 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.963 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.963 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.963 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.963 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.963 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.963 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.963 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.964 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.964 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.964 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.964 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.964 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.964 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.964 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.965 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.965 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.965 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.965 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.965 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.965 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.965 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.965 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.966 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.966 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.966 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.966 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.966 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.966 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.966 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.967 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.967 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.967 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.967 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.967 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.967 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.967 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.968 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.968 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.968 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.968 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.968 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.968 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.968 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.969 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.969 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.969 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.969 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.969 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.969 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.970 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.970 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.970 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.970 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.970 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.970 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.970 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.971 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.971 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.971 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.971 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.971 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.971 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.971 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.972 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.972 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.972 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.972 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.972 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.972 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.972 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.973 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.973 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.973 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.973 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.973 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.973 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.973 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.974 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.974 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.974 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.974 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.974 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.974 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.974 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.975 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.975 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.975 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.975 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.975 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.975 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.975 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.976 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.976 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.976 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.976 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.976 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.976 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.976 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.977 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.977 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.977 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.977 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.977 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.977 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.977 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.977 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.978 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.978 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.978 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.978 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.978 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.978 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.978 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.979 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.979 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.979 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.979 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.979 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.979 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.979 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.980 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.980 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.980 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.980 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.980 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.980 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.980 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.981 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.981 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.981 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.981 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.981 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.981 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.981 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.982 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.982 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.982 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.982 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.982 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.982 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.982 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.983 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.983 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.983 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.983 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.983 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.983 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.983 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.984 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.984 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.984 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.984 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.984 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.984 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.984 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.985 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.985 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.985 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.985 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.985 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.985 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.985 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.986 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.986 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.986 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.986 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.986 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.986 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.986 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.986 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.987 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.987 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.987 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.987 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.987 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.987 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.987 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.988 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.988 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.988 229946 WARNING oslo_config.cfg [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: and ``live_migration_inbound_addr`` respectively.
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: ).  Its value may be silently ignored in the future.
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.988 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.988 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.988 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.989 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.989 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.989 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.989 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.989 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.989 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.989 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.990 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.990 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.990 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.990 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.990 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.990 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.990 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.991 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.991 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.rbd_secret_uuid        = 1939e851-b10c-5c3b-9bb7-8e7f380233e8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.991 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.991 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.991 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.991 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.991 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.992 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.992 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.992 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.992 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.992 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.992 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.992 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.993 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.993 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.993 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.993 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.993 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.993 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.993 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.994 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.994 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.994 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.994 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.994 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.994 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.995 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.995 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.995 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.995 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.995 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.995 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.995 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.996 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.996 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.996 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.996 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.996 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.996 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.996 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.997 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.997 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.997 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.997 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.997 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.997 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.997 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.997 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.998 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.998 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.998 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.998 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.998 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.998 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.998 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.999 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.999 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.999 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.999 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.999 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:41 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.999 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:41.999 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.000 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.000 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.000 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.000 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.000 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.000 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.000 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.001 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.001 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.001 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.001 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.001 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.001 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.002 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.002 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.002 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.002 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.002 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.002 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.002 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.002 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.003 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.003 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.003 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.003 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.003 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.003 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.003 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.004 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.004 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.004 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.004 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.004 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.004 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.004 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.005 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.005 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.005 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.005 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.005 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.005 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.005 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.005 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.006 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.006 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.006 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.006 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.006 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.006 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.006 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.007 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.007 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.007 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.007 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.007 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.007 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.007 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.008 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.008 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.008 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.008 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.008 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.009 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.009 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.009 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.009 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.009 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.009 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.009 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.010 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.010 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.010 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.010 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.010 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.010 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.010 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.011 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.011 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.011 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.011 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.011 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.011 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.011 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.012 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.012 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.012 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.012 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.012 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.012 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.012 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.012 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.013 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.013 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.013 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.013 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.013 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.013 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.013 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.014 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.014 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.014 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.014 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.014 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.014 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.014 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.015 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.015 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.015 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.015 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.015 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.015 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.016 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.016 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.016 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.016 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.016 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.016 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.017 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.017 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.017 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.017 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.017 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.018 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.018 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.018 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.018 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.018 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.018 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.019 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.019 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.019 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.019 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.019 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.019 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.019 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.020 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.020 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.020 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.020 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.020 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.020 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.020 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.020 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.021 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.021 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.021 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.021 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.021 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.021 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.021 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.022 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.022 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.022 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.022 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.022 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.023 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.023 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.023 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.023 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.023 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.024 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.024 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.024 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.024 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.024 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.025 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.025 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.025 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.025 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.026 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.026 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.026 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.026 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.027 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.027 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.027 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.027 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.027 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.028 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.028 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.028 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.028 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.028 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.029 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.029 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.029 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.029 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.029 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.030 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.030 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.030 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.030 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.030 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.031 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.031 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.031 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.031 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.032 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.032 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.032 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.032 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.032 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.033 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.033 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.033 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.033 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.033 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.034 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.034 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.034 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.034 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.034 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.035 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.035 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.035 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.035 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.036 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.036 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.036 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.036 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.036 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.036 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.037 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.037 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.037 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.037 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.038 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.038 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.038 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.038 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.038 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.039 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.039 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.039 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.039 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.039 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.040 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.040 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.040 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.040 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.040 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.041 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.041 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.041 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.041 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.041 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.041 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.042 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.042 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.042 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.042 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.042 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.043 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.043 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.043 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.043 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.044 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.044 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.044 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.044 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.044 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.045 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.045 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.045 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.045 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.045 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.046 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.046 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.046 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.046 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.046 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.047 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.047 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.047 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.047 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.047 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.048 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.048 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.048 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.048 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.048 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.049 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.049 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.049 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.049 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.049 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.049 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.050 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.050 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.050 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.050 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.050 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.051 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.051 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.051 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.051 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.051 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.051 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.052 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.052 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.052 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.052 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.052 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.052 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.053 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.053 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.053 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.053 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.053 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.054 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.054 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.054 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.054 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.054 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.054 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.055 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.055 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.055 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.055 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.055 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.055 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.056 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.056 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.056 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.056 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.056 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.056 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.057 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.057 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.057 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.057 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.057 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.057 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.057 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.058 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.058 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.058 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.058 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.058 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.058 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.058 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.059 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.059 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.059 229946 DEBUG oslo_service.service [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.060 229946 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.072 229946 INFO nova.virt.node [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Determined node identity 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from /var/lib/nova/compute_id
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.072 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.073 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.073 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.073 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.085 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fa37c2d8af0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.088 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fa37c2d8af0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.088 229946 INFO nova.virt.libvirt.driver [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Connection event '1' reason 'None'
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.097 229946 DEBUG nova.virt.libvirt.volume.mount [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.111 229946 INFO nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Libvirt host capabilities <capabilities>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <host>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <uuid>0b20d7bd-1341-4912-afa7-eec4e2b0c648</uuid>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <cpu>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <arch>x86_64</arch>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model>EPYC-Rome-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <vendor>AMD</vendor>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <microcode version='16777317'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <signature family='23' model='49' stepping='0'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='x2apic'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='tsc-deadline'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='osxsave'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='hypervisor'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='tsc_adjust'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='spec-ctrl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='stibp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='arch-capabilities'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='ssbd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='cmp_legacy'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='topoext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='virt-ssbd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='lbrv'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='tsc-scale'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='vmcb-clean'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='pause-filter'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='pfthreshold'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='svme-addr-chk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='rdctl-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='skip-l1dfl-vmentry'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='mds-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature name='pschange-mc-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <pages unit='KiB' size='4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <pages unit='KiB' size='2048'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <pages unit='KiB' size='1048576'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </cpu>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <power_management>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <suspend_mem/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <suspend_disk/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <suspend_hybrid/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </power_management>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <iommu support='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <migration_features>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <live/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <uri_transports>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <uri_transport>tcp</uri_transport>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <uri_transport>rdma</uri_transport>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </uri_transports>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </migration_features>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <topology>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <cells num='1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <cell id='0'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:           <memory unit='KiB'>16116612</memory>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:           <pages unit='KiB' size='4'>4029153</pages>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:           <pages unit='KiB' size='2048'>0</pages>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:           <distances>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:             <sibling id='0' value='10'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:           </distances>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:           <cpus num='8'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:           </cpus>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         </cell>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </cells>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </topology>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <cache>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </cache>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <secmodel>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model>selinux</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <doi>0</doi>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </secmodel>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <secmodel>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model>dac</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <doi>0</doi>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </secmodel>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </host>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <guest>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <os_type>hvm</os_type>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <arch name='i686'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <wordsize>32</wordsize>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <domain type='qemu'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <domain type='kvm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </arch>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <features>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <pae/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <nonpae/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <acpi default='on' toggle='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <apic default='on' toggle='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <cpuselection/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <deviceboot/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <disksnapshot default='on' toggle='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <externalSnapshot/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </features>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </guest>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <guest>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <os_type>hvm</os_type>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <arch name='x86_64'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <wordsize>64</wordsize>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <domain type='qemu'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <domain type='kvm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </arch>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <features>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <acpi default='on' toggle='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <apic default='on' toggle='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <cpuselection/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <deviceboot/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <disksnapshot default='on' toggle='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <externalSnapshot/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </features>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </guest>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: </capabilities>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.116 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.131 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: <domainCapabilities>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <domain>kvm</domain>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <arch>i686</arch>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <vcpu max='240'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <iothreads supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <os supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <enum name='firmware'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <loader supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>rom</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pflash</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='readonly'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>yes</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>no</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='secure'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>no</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </loader>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </os>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <cpu>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>on</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>off</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='maximumMigratable'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>on</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>off</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <vendor>AMD</vendor>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='succor'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='custom' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cooperlake'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amd-psfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='auto-ibrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='stibp-always-on'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amd-psfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='auto-ibrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='stibp-always-on'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amd-psfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='stibp-always-on'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='GraniteRapids'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='prefetchiti'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='prefetchiti'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10-128'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10-256'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10-512'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='prefetchiti'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='KnightsMill'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512er'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512pf'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512er'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512pf'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tbm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tbm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SierraForest'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cmpccxadd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cmpccxadd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='athlon'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='athlon-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='core2duo'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='core2duo-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='coreduo'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='coreduo-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='n270'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='n270-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='phenom'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='phenom-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </cpu>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <memoryBacking supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <enum name='sourceType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>file</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>anonymous</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>memfd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </memoryBacking>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <devices>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <disk supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='diskDevice'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>disk</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>cdrom</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>floppy</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>lun</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='bus'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>ide</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>fdc</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>scsi</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>usb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>sata</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-non-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </disk>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <graphics supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vnc</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>egl-headless</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>dbus</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </graphics>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <video supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='modelType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vga</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>cirrus</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>none</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>bochs</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>ramfb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </video>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <hostdev supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='mode'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>subsystem</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='startupPolicy'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>default</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>mandatory</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>requisite</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>optional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='subsysType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>usb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pci</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>scsi</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='capsType'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='pciBackend'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </hostdev>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <rng supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-non-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendModel'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>random</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>egd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>builtin</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </rng>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <filesystem supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='driverType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>path</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>handle</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtiofs</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </filesystem>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <tpm supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tpm-tis</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tpm-crb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendModel'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>emulator</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>external</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendVersion'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>2.0</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </tpm>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <redirdev supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='bus'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>usb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </redirdev>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <channel supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pty</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>unix</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </channel>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <crypto supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>qemu</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendModel'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>builtin</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </crypto>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <interface supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>default</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>passt</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </interface>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <panic supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>isa</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>hyperv</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </panic>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <console supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>null</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vc</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pty</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>dev</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>file</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pipe</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>stdio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>udp</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tcp</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>unix</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>qemu-vdagent</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>dbus</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </console>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </devices>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <features>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <gic supported='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <genid supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <backup supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <async-teardown supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <ps2 supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <sev supported='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <sgx supported='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <hyperv supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='features'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>relaxed</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vapic</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>spinlocks</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vpindex</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>runtime</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>synic</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>stimer</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>reset</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vendor_id</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>frequencies</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>reenlightenment</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tlbflush</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>ipi</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>avic</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>emsr_bitmap</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>xmm_input</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <defaults>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </defaults>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </hyperv>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <launchSecurity supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='sectype'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tdx</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </launchSecurity>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </features>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: </domainCapabilities>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.137 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: <domainCapabilities>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <domain>kvm</domain>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <arch>i686</arch>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <vcpu max='1024'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <iothreads supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <os supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <enum name='firmware'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <loader supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>rom</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pflash</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='readonly'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>yes</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>no</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='secure'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>no</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </loader>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </os>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <cpu>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>on</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>off</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='maximumMigratable'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>on</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>off</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <vendor>AMD</vendor>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='succor'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='custom' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cooperlake'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amd-psfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='auto-ibrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='stibp-always-on'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amd-psfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='auto-ibrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='stibp-always-on'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amd-psfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='stibp-always-on'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='GraniteRapids'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='prefetchiti'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='prefetchiti'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10-128'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10-256'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10-512'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='prefetchiti'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='KnightsMill'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512er'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512pf'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512er'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512pf'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tbm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tbm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SierraForest'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cmpccxadd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cmpccxadd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='athlon'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='athlon-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='core2duo'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='core2duo-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='coreduo'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='coreduo-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='n270'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='n270-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='phenom'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='phenom-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </cpu>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <memoryBacking supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <enum name='sourceType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>file</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>anonymous</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>memfd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </memoryBacking>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <devices>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <disk supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='diskDevice'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>disk</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>cdrom</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>floppy</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>lun</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='bus'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>fdc</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>scsi</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>usb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>sata</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-non-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </disk>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <graphics supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vnc</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>egl-headless</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>dbus</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </graphics>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <video supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='modelType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vga</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>cirrus</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>none</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>bochs</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>ramfb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </video>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <hostdev supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='mode'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>subsystem</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='startupPolicy'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>default</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>mandatory</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>requisite</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>optional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='subsysType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>usb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pci</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>scsi</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='capsType'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='pciBackend'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </hostdev>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <rng supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-non-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendModel'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>random</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>egd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>builtin</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </rng>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <filesystem supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='driverType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>path</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>handle</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtiofs</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </filesystem>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <tpm supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tpm-tis</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tpm-crb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendModel'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>emulator</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>external</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendVersion'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>2.0</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </tpm>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <redirdev supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='bus'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>usb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </redirdev>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <channel supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pty</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>unix</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </channel>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <crypto supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>qemu</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendModel'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>builtin</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </crypto>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <interface supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>default</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>passt</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </interface>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <panic supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>isa</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>hyperv</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </panic>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <console supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>null</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vc</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pty</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>dev</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>file</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pipe</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>stdio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>udp</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tcp</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>unix</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>qemu-vdagent</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>dbus</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </console>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </devices>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <features>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <gic supported='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <genid supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <backup supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <async-teardown supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <ps2 supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <sev supported='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <sgx supported='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <hyperv supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='features'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>relaxed</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vapic</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>spinlocks</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vpindex</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>runtime</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>synic</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>stimer</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>reset</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vendor_id</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>frequencies</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>reenlightenment</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tlbflush</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>ipi</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>avic</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>emsr_bitmap</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>xmm_input</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <defaults>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </defaults>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </hyperv>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <launchSecurity supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='sectype'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tdx</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </launchSecurity>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </features>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: </domainCapabilities>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.161 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.167 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: <domainCapabilities>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <domain>kvm</domain>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <arch>x86_64</arch>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <vcpu max='240'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <iothreads supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <os supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <enum name='firmware'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <loader supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>rom</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pflash</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='readonly'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>yes</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>no</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='secure'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>no</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </loader>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </os>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <cpu>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>on</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>off</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='maximumMigratable'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>on</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>off</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <vendor>AMD</vendor>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='succor'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='custom' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cooperlake'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amd-psfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='auto-ibrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='stibp-always-on'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amd-psfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='auto-ibrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='stibp-always-on'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amd-psfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='stibp-always-on'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='GraniteRapids'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='prefetchiti'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='prefetchiti'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10-128'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10-256'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10-512'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='prefetchiti'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='KnightsMill'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512er'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512pf'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512er'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512pf'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tbm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tbm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SierraForest'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cmpccxadd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cmpccxadd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='athlon'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='athlon-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='core2duo'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='core2duo-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='coreduo'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='coreduo-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='n270'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='n270-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='phenom'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='phenom-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </cpu>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <memoryBacking supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <enum name='sourceType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>file</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>anonymous</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>memfd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </memoryBacking>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <devices>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <disk supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='diskDevice'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>disk</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>cdrom</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>floppy</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>lun</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='bus'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>ide</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>fdc</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>scsi</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>usb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>sata</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-non-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </disk>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <graphics supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vnc</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>egl-headless</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>dbus</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </graphics>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <video supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='modelType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vga</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>cirrus</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>none</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>bochs</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>ramfb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </video>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <hostdev supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='mode'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>subsystem</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='startupPolicy'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>default</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>mandatory</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>requisite</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>optional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='subsysType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>usb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pci</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>scsi</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='capsType'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='pciBackend'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </hostdev>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <rng supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-non-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendModel'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>random</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>egd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>builtin</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </rng>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <filesystem supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='driverType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>path</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>handle</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtiofs</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </filesystem>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <tpm supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tpm-tis</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tpm-crb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendModel'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>emulator</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>external</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendVersion'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>2.0</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </tpm>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <redirdev supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='bus'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>usb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </redirdev>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <channel supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pty</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>unix</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </channel>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <crypto supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>qemu</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendModel'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>builtin</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </crypto>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <interface supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>default</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>passt</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </interface>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <panic supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>isa</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>hyperv</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </panic>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <console supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>null</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vc</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pty</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>dev</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>file</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pipe</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>stdio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>udp</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tcp</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>unix</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>qemu-vdagent</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>dbus</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </console>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </devices>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <features>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <gic supported='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <genid supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <backup supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <async-teardown supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <ps2 supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <sev supported='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <sgx supported='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <hyperv supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='features'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>relaxed</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vapic</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>spinlocks</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vpindex</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>runtime</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>synic</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>stimer</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>reset</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vendor_id</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>frequencies</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>reenlightenment</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tlbflush</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>ipi</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>avic</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>emsr_bitmap</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>xmm_input</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <defaults>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </defaults>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </hyperv>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <launchSecurity supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='sectype'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tdx</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </launchSecurity>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </features>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: </domainCapabilities>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.219 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: <domainCapabilities>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <domain>kvm</domain>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <arch>x86_64</arch>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <vcpu max='1024'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <iothreads supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <os supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <enum name='firmware'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>efi</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <loader supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>rom</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pflash</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='readonly'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>yes</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>no</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='secure'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>yes</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>no</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </loader>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </os>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <cpu>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>on</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>off</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='maximumMigratable'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>on</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>off</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <vendor>AMD</vendor>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='succor'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <mode name='custom' supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cooperlake'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Denverton-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amd-psfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='auto-ibrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='stibp-always-on'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amd-psfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='auto-ibrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='stibp-always-on'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amd-psfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='stibp-always-on'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='EPYC-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='GraniteRapids'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='prefetchiti'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='prefetchiti'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10-128'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10-256'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx10-512'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='prefetchiti'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Haswell-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='KnightsMill'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512er'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512pf'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512er'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512pf'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tbm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fma4'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tbm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xop'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='amx-tile'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-bf16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-fp16'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bitalg'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrc'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fzrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='la57'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='taa-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xfd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SierraForest'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cmpccxadd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ifma'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cmpccxadd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fbsdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='fsrs'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ibrs-all'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mcdt-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pbrsb-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='psdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='serialize'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vaes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='hle'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='rtm'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512bw'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512cd'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512dq'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512f'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='avx512vl'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='invpcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pcid'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='pku'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='mpx'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='core-capability'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='split-lock-detect'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='cldemote'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='erms'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='gfni'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdir64b'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='movdiri'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='xsaves'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='athlon'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='athlon-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='core2duo'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='core2duo-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='coreduo'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='coreduo-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='n270'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='n270-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='ss'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='phenom'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <blockers model='phenom-v1'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnow'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <feature name='3dnowext'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </blockers>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </mode>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </cpu>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <memoryBacking supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <enum name='sourceType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>file</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>anonymous</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <value>memfd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </memoryBacking>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <devices>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <disk supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='diskDevice'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>disk</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>cdrom</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>floppy</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>lun</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='bus'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>fdc</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>scsi</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>usb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>sata</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-non-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </disk>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <graphics supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vnc</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>egl-headless</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>dbus</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </graphics>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <video supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='modelType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vga</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>cirrus</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>none</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>bochs</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>ramfb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </video>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <hostdev supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='mode'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>subsystem</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='startupPolicy'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>default</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>mandatory</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>requisite</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>optional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='subsysType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>usb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pci</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>scsi</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='capsType'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='pciBackend'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </hostdev>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <rng supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtio-non-transitional</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendModel'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>random</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>egd</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>builtin</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </rng>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <filesystem supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='driverType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>path</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>handle</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>virtiofs</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </filesystem>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <tpm supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tpm-tis</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tpm-crb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendModel'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>emulator</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>external</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendVersion'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>2.0</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </tpm>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <redirdev supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='bus'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>usb</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </redirdev>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <channel supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pty</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>unix</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </channel>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <crypto supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>qemu</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendModel'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>builtin</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </crypto>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <interface supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='backendType'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>default</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>passt</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </interface>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <panic supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='model'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>isa</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>hyperv</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </panic>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <console supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='type'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>null</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vc</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pty</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>dev</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>file</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>pipe</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>stdio</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>udp</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tcp</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>unix</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>qemu-vdagent</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>dbus</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </console>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </devices>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   <features>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <gic supported='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <genid supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <backup supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <async-teardown supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <ps2 supported='yes'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <sev supported='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <sgx supported='no'/>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <hyperv supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='features'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>relaxed</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vapic</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>spinlocks</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vpindex</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>runtime</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>synic</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>stimer</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>reset</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>vendor_id</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>frequencies</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>reenlightenment</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tlbflush</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>ipi</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>avic</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>emsr_bitmap</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>xmm_input</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <defaults>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </defaults>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </hyperv>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     <launchSecurity supported='yes'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       <enum name='sectype'>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:         <value>tdx</value>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:       </enum>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:     </launchSecurity>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:   </features>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: </domainCapabilities>
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.272 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.272 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.272 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.273 229946 INFO nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Secure Boot support detected
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.275 229946 INFO nova.virt.libvirt.driver [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.275 229946 INFO nova.virt.libvirt.driver [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.289 229946 DEBUG nova.virt.libvirt.driver [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.368 229946 INFO nova.virt.node [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Determined node identity 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from /var/lib/nova/compute_id
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.393 229946 DEBUG nova.compute.manager [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Verified node 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad matches my host np0005548789.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.436 229946 DEBUG nova.compute.manager [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.440 229946 DEBUG nova.virt.libvirt.vif [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T08:44:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005548789.localdomain',hostname='test',id=2,image_ref='e0d06706-da90-478a-9829-34b75a3ce049',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-06T08:44:43Z,launched_on='np0005548789.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005548789.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3d603431c0bb4967bafc7a0aa6108bfe',ramdisk_id='',reservation_id='r-02dpupig',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T08:44:43Z,user_data=None,user_id='ff0049f3313348bdb67886d170c1c765',uuid=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.440 229946 DEBUG nova.network.os_vif_util [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Converting VIF {"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.441 229946 DEBUG nova.network.os_vif_util [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.442 229946 DEBUG os_vif [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.489 229946 DEBUG ovsdbapp.backend.ovs_idl [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.490 229946 DEBUG ovsdbapp.backend.ovs_idl [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.490 229946 DEBUG ovsdbapp.backend.ovs_idl [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.491 229946 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.491 229946 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.492 229946 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.492 229946 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.494 229946 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.496 229946 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.516 229946 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.516 229946 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.516 229946 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 09:45:42 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.517 229946 INFO oslo.privsep.daemon [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpujlz3pxe/privsep.sock']
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.069 229946 INFO oslo.privsep.daemon [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Spawned new privsep daemon via rootwrap
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.961 230338 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.966 230338 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.970 230338 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:42.970 230338 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230338
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.327 229946 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.328 229946 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap86fc0b7a-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.329 229946 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap86fc0b7a-fb, col_values=(('external_ids', {'iface-id': '86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:64:77:f3', 'vm-uuid': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.329 229946 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.330 229946 INFO os_vif [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb')
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.330 229946 DEBUG nova.compute.manager [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.334 229946 DEBUG nova.compute.manager [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.334 229946 INFO nova.compute.manager [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.831 229946 INFO nova.service [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Updating service version for nova-compute on np0005548789.localdomain from 57 to 66
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.872 229946 DEBUG oslo_concurrency.lockutils [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.872 229946 DEBUG oslo_concurrency.lockutils [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.873 229946 DEBUG oslo_concurrency.lockutils [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.873 229946 DEBUG nova.compute.resource_tracker [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:45:43 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:43.874 229946 DEBUG oslo_concurrency.processutils [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:45:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=757 DF PROTO=TCP SPT=49776 DPT=9101 SEQ=4200883413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DA0C200000000001030307) 
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.322 229946 DEBUG oslo_concurrency.processutils [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.401 229946 DEBUG nova.virt.libvirt.driver [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.401 229946 DEBUG nova.virt.libvirt.driver [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:45:44 np0005548789.localdomain systemd[1]: Started libvirt nodedev daemon.
Dec 06 09:45:44 np0005548789.localdomain python3.9[230308]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.634 229946 WARNING nova.virt.libvirt.driver [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.635 229946 DEBUG nova.compute.resource_tracker [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=12922MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.635 229946 DEBUG oslo_concurrency.lockutils [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.635 229946 DEBUG oslo_concurrency.lockutils [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.841 229946 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.857 229946 DEBUG nova.compute.resource_tracker [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.857 229946 DEBUG nova.compute.resource_tracker [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.857 229946 DEBUG nova.compute.resource_tracker [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.910 229946 DEBUG nova.scheduler.client.report [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Refreshing inventories for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.970 229946 DEBUG nova.scheduler.client.report [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Updating ProviderTree inventory for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.971 229946 DEBUG nova.compute.provider_tree [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Updating inventory in ProviderTree for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 09:45:44 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:44.996 229946 DEBUG nova.scheduler.client.report [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Refreshing aggregate associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.022 229946 DEBUG nova.scheduler.client.report [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Refreshing trait associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, traits: HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_BMI2,HW_CPU_X86_BMI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SHA,COMPUTE_ACCELERATORS,HW_CPU_X86_ABM,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,HW_CPU_X86_SSSE3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.070 229946 DEBUG oslo_concurrency.processutils [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.487 229946 DEBUG oslo_concurrency.processutils [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.492 229946 DEBUG nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.492 229946 INFO nova.virt.libvirt.host [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] kernel doesn't support AMD SEV
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.493 229946 DEBUG nova.compute.provider_tree [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Updating inventory in ProviderTree for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.494 229946 DEBUG nova.virt.libvirt.driver [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.544 229946 DEBUG nova.scheduler.client.report [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Updated inventory for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.545 229946 DEBUG nova.compute.provider_tree [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Updating resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.545 229946 DEBUG nova.compute.provider_tree [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Updating inventory in ProviderTree for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 09:45:45 np0005548789.localdomain python3.9[230554]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.640 229946 DEBUG nova.compute.provider_tree [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Updating resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.684 229946 DEBUG nova.compute.resource_tracker [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.685 229946 DEBUG oslo_concurrency.lockutils [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.685 229946 DEBUG nova.service [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.764 229946 DEBUG nova.service [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 06 09:45:45 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:45.765 229946 DEBUG nova.servicegroup.drivers.db [None req-100b2f1f-6f50-4f10-8014-cfdc1355b0b2 - - - - - -] DB_Driver: join new ServiceGroup member np0005548789.localdomain to the compute group, service = <Service: host=np0005548789.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 06 09:45:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:45:45 np0005548789.localdomain podman[230574]: 2025-12-06 09:45:45.913906868 +0000 UTC m=+0.078555285 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:45:45 np0005548789.localdomain podman[230574]: 2025-12-06 09:45:45.948030225 +0000 UTC m=+0.112678602 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 09:45:45 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:45:46 np0005548789.localdomain sudo[230684]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csuaqscbijesidawhrzehfrirjxwmfkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014345.9295828-4265-32825569590415/AnsiballZ_podman_container.py
Dec 06 09:45:46 np0005548789.localdomain sudo[230684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:46 np0005548789.localdomain python3.9[230686]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 06 09:45:46 np0005548789.localdomain sudo[230684]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:46 np0005548789.localdomain systemd-journald[47810]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 121.3 (404 of 333 items), suggesting rotation.
Dec 06 09:45:46 np0005548789.localdomain systemd-journald[47810]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 09:45:46 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:45:46 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:45:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:45:47.275 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:45:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:45:47.276 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:45:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:45:47.277 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:45:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=759 DF PROTO=TCP SPT=49776 DPT=9101 SEQ=4200883413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DA182F0000000001030307) 
Dec 06 09:45:47 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:47.497 229946 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:48 np0005548789.localdomain sudo[230818]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncvgcasizpqatjrxozxxsopqhjjjgckt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014347.7414608-4289-3907987803018/AnsiballZ_systemd.py
Dec 06 09:45:48 np0005548789.localdomain sudo[230818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:48 np0005548789.localdomain python3.9[230820]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:45:48 np0005548789.localdomain systemd[1]: Stopping nova_compute container...
Dec 06 09:45:48 np0005548789.localdomain systemd[1]: tmp-crun.JEj9ic.mount: Deactivated successfully.
Dec 06 09:45:48 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:48.510 229946 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Dec 06 09:45:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38133 DF PROTO=TCP SPT=58278 DPT=9105 SEQ=1660505716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DA21300000000001030307) 
Dec 06 09:45:49 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:49.878 229946 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:49 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:49.882 229946 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Dec 06 09:45:49 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:49.884 229946 DEBUG oslo_concurrency.lockutils [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:45:49 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:49.885 229946 DEBUG oslo_concurrency.lockutils [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:45:49 np0005548789.localdomain nova_compute[229942]: 2025-12-06 09:45:49.885 229946 DEBUG oslo_concurrency.lockutils [None req-8b348d35-d06d-4f73-9d7d-edbcdbc5087d - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:45:50 np0005548789.localdomain virtqemud[203911]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 06 09:45:50 np0005548789.localdomain virtqemud[203911]: hostname: np0005548789.localdomain
Dec 06 09:45:50 np0005548789.localdomain virtqemud[203911]: End of file while reading data: Input/output error
Dec 06 09:45:50 np0005548789.localdomain systemd[1]: libpod-6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8.scope: Deactivated successfully.
Dec 06 09:45:50 np0005548789.localdomain systemd[1]: libpod-6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8.scope: Consumed 4.712s CPU time.
Dec 06 09:45:50 np0005548789.localdomain podman[230824]: 2025-12-06 09:45:50.267923939 +0000 UTC m=+1.820139827 container died 6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 09:45:50 np0005548789.localdomain systemd[1]: tmp-crun.eHqPm3.mount: Deactivated successfully.
Dec 06 09:45:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8-userdata-shm.mount: Deactivated successfully.
Dec 06 09:45:50 np0005548789.localdomain podman[230824]: 2025-12-06 09:45:50.326175172 +0000 UTC m=+1.878390990 container cleanup 6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, container_name=nova_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm)
Dec 06 09:45:50 np0005548789.localdomain podman[230824]: nova_compute
Dec 06 09:45:50 np0005548789.localdomain podman[230868]: error opening file `/run/crun/6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8/status`: No such file or directory
Dec 06 09:45:50 np0005548789.localdomain podman[230855]: 2025-12-06 09:45:50.422550789 +0000 UTC m=+0.068912096 container cleanup 6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:45:50 np0005548789.localdomain podman[230855]: nova_compute
Dec 06 09:45:50 np0005548789.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 06 09:45:50 np0005548789.localdomain systemd[1]: Stopped nova_compute container.
Dec 06 09:45:50 np0005548789.localdomain systemd[1]: Starting nova_compute container...
Dec 06 09:45:50 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:45:50 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:50 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:50 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:50 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:50 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:50 np0005548789.localdomain podman[230870]: 2025-12-06 09:45:50.549928703 +0000 UTC m=+0.091140021 container init 6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:45:50 np0005548789.localdomain podman[230870]: 2025-12-06 09:45:50.560523373 +0000 UTC m=+0.101734691 container start 6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Dec 06 09:45:50 np0005548789.localdomain podman[230870]: nova_compute
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: + sudo -E kolla_set_configs
Dec 06 09:45:50 np0005548789.localdomain systemd[1]: Started nova_compute container.
Dec 06 09:45:50 np0005548789.localdomain sudo[230818]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Validating config file
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Copying service configuration files
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Deleting /etc/ceph
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Creating directory /etc/ceph
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /etc/ceph
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Writing out command to execute
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: ++ cat /run_command
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: + CMD=nova-compute
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: + ARGS=
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: + sudo kolla_copy_cacerts
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: + [[ ! -n '' ]]
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: + . kolla_extend_start
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: Running command: 'nova-compute'
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: + echo 'Running command: '\''nova-compute'\'''
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: + umask 0022
Dec 06 09:45:50 np0005548789.localdomain nova_compute[230884]: + exec nova-compute
Dec 06 09:45:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:52.354 230888 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:45:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:52.355 230888 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:45:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:52.355 230888 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:45:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:52.355 230888 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 06 09:45:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:52.472 230888 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:45:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:52.492 230888 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:45:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:52.492 230888 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 06 09:45:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:52.891 230888 INFO nova.virt.driver [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.010 230888 INFO nova.compute.provider_config [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.018 230888 WARNING nova.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.018 230888 DEBUG oslo_concurrency.lockutils [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.018 230888 DEBUG oslo_concurrency.lockutils [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.018 230888 DEBUG oslo_concurrency.lockutils [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.019 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.019 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.019 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.019 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.019 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.020 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.020 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.020 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.020 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.020 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.020 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.021 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.021 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.021 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.021 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.021 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.022 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.022 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.022 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.022 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] console_host                   = np0005548789.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.022 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.022 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.022 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.023 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.023 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.023 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.023 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.023 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.024 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.024 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.024 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.024 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.024 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.024 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.025 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.025 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.025 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.025 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.025 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] host                           = np0005548789.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.026 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.026 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.026 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.026 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.026 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.026 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.027 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.027 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.027 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.027 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.027 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.027 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.028 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.028 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.028 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.028 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.028 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.029 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.029 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.029 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.029 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.029 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.029 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.030 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.030 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.030 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.030 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.030 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.030 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.031 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.031 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.031 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.031 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.031 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.032 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.032 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.032 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.032 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.032 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.032 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.033 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.033 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] my_block_storage_ip            = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.033 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] my_ip                          = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.033 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.033 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.034 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.034 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.034 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.034 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.034 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.035 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.035 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.035 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.035 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.035 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.035 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.036 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.036 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.036 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.036 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.036 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.036 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.037 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.037 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.037 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.037 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.037 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.038 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.038 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.038 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.038 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.038 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.038 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.039 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.039 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.039 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.039 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.039 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.039 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.040 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.040 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.040 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.040 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.040 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.041 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.041 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.041 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.041 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.041 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.041 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.042 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.042 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.042 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.042 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.042 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.043 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.043 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.043 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.043 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.043 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.043 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.044 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.044 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.044 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.044 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.044 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.044 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.045 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.045 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.045 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.045 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.045 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.046 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.046 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.046 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.046 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.046 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.046 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.047 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.047 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.047 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.047 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.047 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.048 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.048 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.048 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.048 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.048 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.048 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.049 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.049 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.049 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.049 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.049 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.050 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.050 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.050 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.050 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.050 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.050 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.051 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.051 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.051 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.051 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.051 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.052 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.052 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.052 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.052 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.052 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.052 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.053 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.053 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.053 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.053 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.053 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.053 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.054 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.054 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.054 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.054 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.054 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.055 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.055 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.055 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.055 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.055 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.055 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.056 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.056 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.056 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.056 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.056 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.056 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.057 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.057 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.057 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.057 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.057 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.058 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.058 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.058 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.058 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.058 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.058 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.059 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.059 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.059 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.059 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.059 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.060 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.060 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.060 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.060 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.060 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.061 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.061 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.061 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.061 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.061 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.061 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.062 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.062 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.062 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.062 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.062 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.063 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.063 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.063 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.063 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.063 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.063 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.064 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.064 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.064 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.064 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.064 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.065 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.065 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.065 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.065 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.065 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.065 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.066 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.066 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.066 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.066 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.066 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.067 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.067 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.067 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.067 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.067 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.067 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.068 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.068 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.068 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.068 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.068 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.069 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.069 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.069 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.069 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.069 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.069 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.070 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.070 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.070 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.070 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.070 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.071 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.071 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.071 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.071 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.071 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.071 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.072 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.072 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.072 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.072 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.072 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.073 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.073 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.073 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.073 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.073 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.073 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.074 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.074 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.074 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.074 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.074 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.075 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.075 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.075 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.075 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.075 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.075 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.076 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.076 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.076 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.076 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.076 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.077 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.077 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.077 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.077 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.077 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.077 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.078 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.078 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.078 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.078 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.078 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.079 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.079 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.079 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.079 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.079 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.079 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.080 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.080 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.080 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.080 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.080 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.081 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.081 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.081 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.081 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.081 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.082 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.082 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.082 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.082 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.082 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.082 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.083 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.083 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.083 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.083 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.084 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.084 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.084 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.084 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.084 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.084 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.085 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.085 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.085 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.085 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.085 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.086 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.086 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.086 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.086 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.086 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.086 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.087 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.087 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.087 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.087 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.087 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.087 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.088 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.088 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.088 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.088 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.088 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.089 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.089 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.089 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.089 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.089 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.089 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.090 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.090 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.090 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.090 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.090 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.091 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.091 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.091 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.091 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.091 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.091 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.092 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.092 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.092 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.092 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.092 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.093 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.093 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.093 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.093 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.093 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.094 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.094 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.094 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.094 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.094 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.094 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.095 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.095 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.095 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.095 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.095 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.095 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.096 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.096 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.096 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.096 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.096 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.097 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.097 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.097 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.097 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.097 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.097 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.098 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.098 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.098 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.098 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.098 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.099 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.099 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.099 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.099 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.099 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.099 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.100 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.100 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.100 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.100 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.100 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.101 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.101 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.101 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.101 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.101 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.101 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.102 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.102 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.102 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.102 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.102 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.103 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.103 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.103 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.103 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.103 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.103 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.104 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.104 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.104 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.104 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.104 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.105 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.105 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.105 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.105 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.105 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.105 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.106 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.106 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.106 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.106 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.106 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.107 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.107 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.107 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.107 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.107 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.107 230888 WARNING oslo_config.cfg [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: and ``live_migration_inbound_addr`` respectively.
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: ).  Its value may be silently ignored in the future.
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.108 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.108 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.108 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.108 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.109 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.109 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.109 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.109 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.110 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.110 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.110 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.110 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.110 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.111 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.111 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.111 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.111 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.111 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.111 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.rbd_secret_uuid        = 1939e851-b10c-5c3b-9bb7-8e7f380233e8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.112 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.112 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.112 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.112 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.112 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.113 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.113 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.113 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.113 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.113 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.113 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.114 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.114 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.114 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.114 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.114 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.115 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.115 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.115 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.115 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.116 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.116 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.116 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.116 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.116 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.116 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.117 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.117 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.117 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.117 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.117 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.118 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.118 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.118 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.118 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.118 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.118 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.119 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.119 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.119 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.119 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.119 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.119 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.120 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.120 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.120 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.120 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.120 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.121 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.121 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.121 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.121 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.121 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.122 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.122 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.122 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.122 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.122 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.122 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.123 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.123 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.123 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.123 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.123 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.123 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.124 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.124 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.124 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.124 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.124 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.125 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.125 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.125 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.125 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.125 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.126 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.126 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.126 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.126 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.126 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.126 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.127 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.127 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.127 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.127 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.127 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.127 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.128 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.128 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.128 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.128 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.128 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.129 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.129 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.129 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.129 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.129 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.130 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.130 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.130 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.130 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.130 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.130 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.131 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.131 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.131 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.131 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.131 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.132 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.132 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.132 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.132 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.132 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.132 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.133 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.133 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.133 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.133 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.133 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.134 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.134 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.134 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.134 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.134 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.135 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.135 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.135 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.135 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.135 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.135 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.136 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.136 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.136 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.136 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.136 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.137 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.137 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.137 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.137 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.137 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.138 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.138 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.138 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.138 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.138 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.138 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.139 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.139 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.139 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.139 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.139 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.139 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.140 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.140 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.140 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.140 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.140 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.141 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.141 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.141 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.141 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.141 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.142 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.142 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.142 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.142 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.142 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.143 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.143 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.143 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.143 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.143 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.143 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.144 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.144 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.144 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.144 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.144 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.145 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.145 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.145 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.145 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.145 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.145 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.146 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.146 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.146 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.146 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.146 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.147 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.147 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.147 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.147 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.147 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.148 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.148 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.148 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.148 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.148 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.148 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.149 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.149 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.149 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.149 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.149 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.150 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.150 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.150 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.150 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.150 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.150 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.151 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.151 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.151 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.151 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.151 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.151 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.152 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.152 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.152 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.152 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.152 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.153 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.153 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.153 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.153 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.153 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.153 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.154 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.154 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.154 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.154 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.154 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.155 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.155 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.155 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.155 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.155 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.156 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.156 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.156 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.156 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.156 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.156 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.157 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.157 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.157 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.157 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.157 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.158 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.158 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.158 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.158 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.158 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.159 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.159 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.159 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.159 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.159 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.159 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.160 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.160 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.160 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.160 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.160 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.161 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.161 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.161 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.161 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.161 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.162 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.162 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.162 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.162 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.162 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.162 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.163 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.163 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.163 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.163 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.163 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.164 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.164 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.164 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.164 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.164 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.165 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.165 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.165 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.165 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.165 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.165 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.166 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.166 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.166 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.166 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.166 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.167 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.167 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.167 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.167 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.167 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.167 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.168 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.168 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.168 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.168 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.168 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.168 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.169 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.169 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.169 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.169 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.169 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.170 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.170 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.170 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.170 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.170 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.171 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.171 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.171 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.171 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.171 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.171 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.172 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.172 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.172 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.172 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.172 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.172 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.173 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.173 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.173 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.173 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.173 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.174 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.174 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.174 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.174 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.174 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.174 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.175 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.175 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.175 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.175 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.175 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.175 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.176 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.176 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.176 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.176 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.176 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.176 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.177 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.177 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.177 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.177 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.177 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.178 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.178 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.178 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.178 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.178 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.179 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.179 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.179 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.179 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.179 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.179 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.180 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.180 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.180 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.180 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.180 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.181 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.181 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.181 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.181 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.181 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.182 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.182 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.182 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.182 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.182 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.183 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.183 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.183 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.183 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.183 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.183 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.184 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.184 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.184 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.184 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.185 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.185 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.185 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.185 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.185 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.185 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.186 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.186 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.186 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.186 230888 DEBUG oslo_service.service [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.188 230888 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.199 230888 INFO nova.virt.node [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Determined node identity 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from /var/lib/nova/compute_id
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.200 230888 DEBUG nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.201 230888 DEBUG nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.201 230888 DEBUG nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.201 230888 DEBUG nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.212 230888 DEBUG nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f23dece83a0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.214 230888 DEBUG nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f23dece83a0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.215 230888 INFO nova.virt.libvirt.driver [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Connection event '1' reason 'None'
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.221 230888 INFO nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Libvirt host capabilities <capabilities>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <host>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <uuid>0b20d7bd-1341-4912-afa7-eec4e2b0c648</uuid>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <cpu>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <arch>x86_64</arch>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model>EPYC-Rome-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <vendor>AMD</vendor>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <microcode version='16777317'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <signature family='23' model='49' stepping='0'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='x2apic'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='tsc-deadline'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='osxsave'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='hypervisor'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='tsc_adjust'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='spec-ctrl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='stibp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='arch-capabilities'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='ssbd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='cmp_legacy'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='topoext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='virt-ssbd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='lbrv'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='tsc-scale'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='vmcb-clean'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='pause-filter'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='pfthreshold'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='svme-addr-chk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='rdctl-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='skip-l1dfl-vmentry'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='mds-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature name='pschange-mc-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <pages unit='KiB' size='4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <pages unit='KiB' size='2048'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <pages unit='KiB' size='1048576'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </cpu>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <power_management>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <suspend_mem/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <suspend_disk/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <suspend_hybrid/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </power_management>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <iommu support='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <migration_features>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <live/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <uri_transports>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <uri_transport>tcp</uri_transport>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <uri_transport>rdma</uri_transport>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </uri_transports>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </migration_features>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <topology>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <cells num='1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <cell id='0'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:           <memory unit='KiB'>16116612</memory>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:           <pages unit='KiB' size='4'>4029153</pages>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:           <pages unit='KiB' size='2048'>0</pages>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:           <distances>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:             <sibling id='0' value='10'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:           </distances>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:           <cpus num='8'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:           </cpus>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         </cell>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </cells>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </topology>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <cache>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </cache>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <secmodel>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model>selinux</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <doi>0</doi>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </secmodel>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <secmodel>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model>dac</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <doi>0</doi>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </secmodel>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </host>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <guest>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <os_type>hvm</os_type>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <arch name='i686'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <wordsize>32</wordsize>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <domain type='qemu'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <domain type='kvm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </arch>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <features>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <pae/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <nonpae/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <acpi default='on' toggle='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <apic default='on' toggle='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <cpuselection/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <deviceboot/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <disksnapshot default='on' toggle='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <externalSnapshot/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </features>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </guest>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <guest>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <os_type>hvm</os_type>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <arch name='x86_64'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <wordsize>64</wordsize>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <domain type='qemu'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <domain type='kvm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </arch>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <features>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <acpi default='on' toggle='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <apic default='on' toggle='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <cpuselection/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <deviceboot/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <disksnapshot default='on' toggle='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <externalSnapshot/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </features>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </guest>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: </capabilities>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.225 230888 DEBUG nova.virt.libvirt.volume.mount [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.227 230888 DEBUG nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.231 230888 DEBUG nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: <domainCapabilities>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <domain>kvm</domain>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <arch>i686</arch>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <vcpu max='240'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <iothreads supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <os supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <enum name='firmware'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <loader supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>rom</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pflash</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='readonly'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>yes</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>no</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='secure'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>no</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </loader>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </os>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <cpu>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>on</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>off</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='maximumMigratable'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>on</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>off</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <vendor>AMD</vendor>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='succor'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='custom' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cooperlake'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amd-psfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='auto-ibrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='stibp-always-on'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amd-psfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='auto-ibrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='stibp-always-on'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amd-psfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='stibp-always-on'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='GraniteRapids'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='prefetchiti'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='prefetchiti'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10-128'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10-256'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10-512'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='prefetchiti'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='KnightsMill'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512er'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512pf'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512er'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512pf'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tbm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tbm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SierraForest'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cmpccxadd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cmpccxadd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='athlon'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='athlon-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='core2duo'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='core2duo-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='coreduo'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='coreduo-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='n270'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='n270-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='phenom'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='phenom-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </cpu>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <memoryBacking supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <enum name='sourceType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>file</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>anonymous</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>memfd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </memoryBacking>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <devices>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <disk supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='diskDevice'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>disk</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>cdrom</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>floppy</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>lun</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='bus'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>ide</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>fdc</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>scsi</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>usb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>sata</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-non-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </disk>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <graphics supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vnc</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>egl-headless</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>dbus</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </graphics>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <video supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='modelType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vga</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>cirrus</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>none</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>bochs</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>ramfb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </video>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <hostdev supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='mode'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>subsystem</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='startupPolicy'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>default</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>mandatory</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>requisite</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>optional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='subsysType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>usb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pci</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>scsi</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='capsType'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='pciBackend'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </hostdev>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <rng supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-non-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendModel'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>random</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>egd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>builtin</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </rng>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <filesystem supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='driverType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>path</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>handle</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtiofs</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </filesystem>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <tpm supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tpm-tis</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tpm-crb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendModel'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>emulator</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>external</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendVersion'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>2.0</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </tpm>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <redirdev supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='bus'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>usb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </redirdev>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <channel supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pty</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>unix</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </channel>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <crypto supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>qemu</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendModel'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>builtin</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </crypto>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <interface supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>default</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>passt</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </interface>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <panic supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>isa</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>hyperv</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </panic>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <console supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>null</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vc</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pty</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>dev</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>file</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pipe</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>stdio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>udp</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tcp</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>unix</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>qemu-vdagent</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>dbus</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </console>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </devices>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <features>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <gic supported='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <genid supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <backup supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <async-teardown supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <ps2 supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <sev supported='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <sgx supported='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <hyperv supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='features'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>relaxed</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vapic</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>spinlocks</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vpindex</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>runtime</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>synic</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>stimer</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>reset</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vendor_id</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>frequencies</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>reenlightenment</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tlbflush</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>ipi</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>avic</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>emsr_bitmap</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>xmm_input</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <defaults>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </defaults>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </hyperv>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <launchSecurity supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='sectype'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tdx</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </launchSecurity>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </features>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: </domainCapabilities>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.240 230888 DEBUG nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: <domainCapabilities>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <domain>kvm</domain>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <arch>i686</arch>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <vcpu max='1024'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <iothreads supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <os supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <enum name='firmware'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <loader supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>rom</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pflash</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='readonly'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>yes</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>no</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='secure'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>no</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </loader>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </os>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <cpu>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>on</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>off</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='maximumMigratable'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>on</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>off</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <vendor>AMD</vendor>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='succor'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='custom' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cooperlake'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amd-psfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='auto-ibrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='stibp-always-on'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amd-psfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='auto-ibrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='stibp-always-on'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amd-psfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='stibp-always-on'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='GraniteRapids'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='prefetchiti'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='prefetchiti'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10-128'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10-256'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10-512'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='prefetchiti'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='KnightsMill'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512er'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512pf'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512er'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512pf'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tbm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tbm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SierraForest'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cmpccxadd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cmpccxadd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='athlon'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='athlon-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='core2duo'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='core2duo-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='coreduo'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='coreduo-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='n270'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='n270-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='phenom'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='phenom-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </cpu>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <memoryBacking supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <enum name='sourceType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>file</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>anonymous</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>memfd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </memoryBacking>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <devices>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <disk supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='diskDevice'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>disk</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>cdrom</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>floppy</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>lun</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='bus'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>fdc</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>scsi</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>usb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>sata</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-non-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </disk>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <graphics supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vnc</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>egl-headless</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>dbus</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </graphics>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <video supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='modelType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vga</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>cirrus</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>none</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>bochs</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>ramfb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </video>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <hostdev supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='mode'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>subsystem</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='startupPolicy'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>default</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>mandatory</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>requisite</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>optional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='subsysType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>usb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pci</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>scsi</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='capsType'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='pciBackend'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </hostdev>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <rng supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-non-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendModel'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>random</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>egd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>builtin</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </rng>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <filesystem supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='driverType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>path</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>handle</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtiofs</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </filesystem>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <tpm supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tpm-tis</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tpm-crb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendModel'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>emulator</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>external</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendVersion'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>2.0</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </tpm>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <redirdev supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='bus'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>usb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </redirdev>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <channel supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pty</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>unix</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </channel>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <crypto supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>qemu</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendModel'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>builtin</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </crypto>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <interface supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>default</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>passt</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </interface>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <panic supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>isa</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>hyperv</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </panic>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <console supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>null</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vc</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pty</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>dev</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>file</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pipe</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>stdio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>udp</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tcp</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>unix</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>qemu-vdagent</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>dbus</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </console>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </devices>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <features>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <gic supported='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <genid supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <backup supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <async-teardown supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <ps2 supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <sev supported='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <sgx supported='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <hyperv supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='features'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>relaxed</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vapic</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>spinlocks</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vpindex</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>runtime</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>synic</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>stimer</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>reset</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vendor_id</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>frequencies</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>reenlightenment</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tlbflush</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>ipi</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>avic</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>emsr_bitmap</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>xmm_input</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <defaults>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </defaults>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </hyperv>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <launchSecurity supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='sectype'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tdx</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </launchSecurity>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </features>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: </domainCapabilities>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.261 230888 DEBUG nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.265 230888 DEBUG nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: <domainCapabilities>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <domain>kvm</domain>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <arch>x86_64</arch>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <vcpu max='240'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <iothreads supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <os supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <enum name='firmware'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <loader supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>rom</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pflash</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='readonly'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>yes</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>no</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='secure'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>no</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </loader>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </os>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <cpu>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>on</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>off</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='maximumMigratable'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>on</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>off</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <vendor>AMD</vendor>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='succor'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='custom' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cooperlake'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amd-psfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='auto-ibrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='stibp-always-on'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amd-psfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='auto-ibrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='stibp-always-on'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amd-psfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='stibp-always-on'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='GraniteRapids'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='prefetchiti'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='prefetchiti'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10-128'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10-256'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10-512'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='prefetchiti'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='KnightsMill'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512er'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512pf'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512er'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512pf'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tbm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tbm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SierraForest'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cmpccxadd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cmpccxadd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='athlon'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='athlon-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='core2duo'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='core2duo-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='coreduo'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='coreduo-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='n270'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='n270-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='phenom'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='phenom-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </cpu>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <memoryBacking supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <enum name='sourceType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>file</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>anonymous</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>memfd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </memoryBacking>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <devices>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <disk supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='diskDevice'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>disk</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>cdrom</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>floppy</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>lun</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='bus'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>ide</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>fdc</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>scsi</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>usb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>sata</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-non-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </disk>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <graphics supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vnc</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>egl-headless</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>dbus</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </graphics>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <video supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='modelType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vga</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>cirrus</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>none</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>bochs</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>ramfb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </video>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <hostdev supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='mode'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>subsystem</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='startupPolicy'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>default</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>mandatory</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>requisite</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>optional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='subsysType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>usb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pci</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>scsi</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='capsType'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='pciBackend'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </hostdev>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <rng supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-non-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendModel'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>random</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>egd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>builtin</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </rng>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <filesystem supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='driverType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>path</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>handle</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtiofs</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </filesystem>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <tpm supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tpm-tis</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tpm-crb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendModel'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>emulator</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>external</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendVersion'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>2.0</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </tpm>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <redirdev supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='bus'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>usb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </redirdev>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <channel supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pty</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>unix</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </channel>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <crypto supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>qemu</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendModel'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>builtin</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </crypto>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <interface supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>default</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>passt</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </interface>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <panic supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>isa</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>hyperv</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </panic>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <console supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>null</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vc</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pty</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>dev</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>file</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pipe</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>stdio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>udp</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tcp</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>unix</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>qemu-vdagent</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>dbus</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </console>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </devices>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <features>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <gic supported='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <genid supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <backup supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <async-teardown supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <ps2 supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <sev supported='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <sgx supported='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <hyperv supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='features'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>relaxed</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vapic</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>spinlocks</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vpindex</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>runtime</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>synic</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>stimer</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>reset</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vendor_id</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>frequencies</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>reenlightenment</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tlbflush</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>ipi</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>avic</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>emsr_bitmap</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>xmm_input</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <defaults>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </defaults>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </hyperv>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <launchSecurity supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='sectype'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tdx</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </launchSecurity>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </features>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: </domainCapabilities>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.317 230888 DEBUG nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: <domainCapabilities>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <domain>kvm</domain>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <arch>x86_64</arch>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <vcpu max='1024'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <iothreads supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <os supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <enum name='firmware'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>efi</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <loader supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>rom</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pflash</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='readonly'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>yes</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>no</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='secure'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>yes</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>no</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </loader>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </os>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <cpu>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>on</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>off</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='maximum' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='maximumMigratable'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>on</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>off</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='host-model' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <vendor>AMD</vendor>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='x2apic'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='stibp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='ssbd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='succor'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='ibrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='lbrv'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <mode name='custom' supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Broadwell-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cooperlake'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cooperlake-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Cooperlake-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Denverton-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Dhyana-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Genoa'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amd-psfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='auto-ibrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='stibp-always-on'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amd-psfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='auto-ibrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='stibp-always-on'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Milan'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amd-psfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='no-nested-data-bp'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='null-sel-clr-base'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='stibp-always-on'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='EPYC-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='GraniteRapids'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='prefetchiti'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='prefetchiti'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10-128'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10-256'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx10-512'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='prefetchiti'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Haswell-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='IvyBridge-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='KnightsMill'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512er'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512pf'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='KnightsMill-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4fmaps'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-4vnniw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512er'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512pf'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tbm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fma4'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tbm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xop'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='amx-tile'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-bf16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-fp16'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bitalg'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vbmi2'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrc'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fzrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='la57'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='taa-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='tsx-ldtrk'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xfd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SierraForest'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cmpccxadd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='SierraForest-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ifma'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-ne-convert'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx-vnni-int8'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='bus-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cmpccxadd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fbsdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='fsrs'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ibrs-all'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mcdt-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pbrsb-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='psdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='serialize'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vaes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='vpclmulqdq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='hle'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='rtm'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512bw'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512cd'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512dq'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512f'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='avx512vl'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='invpcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pcid'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='pku'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='mpx'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v2'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v3'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='core-capability'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='split-lock-detect'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='Snowridge-v4'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='cldemote'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='erms'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='gfni'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdir64b'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='movdiri'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='xsaves'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='athlon'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='athlon-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='core2duo'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='core2duo-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='coreduo'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='coreduo-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='n270'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='n270-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='ss'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='phenom'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <blockers model='phenom-v1'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnow'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <feature name='3dnowext'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </blockers>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </mode>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </cpu>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <memoryBacking supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <enum name='sourceType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>file</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>anonymous</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <value>memfd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </memoryBacking>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <devices>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <disk supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='diskDevice'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>disk</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>cdrom</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>floppy</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>lun</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='bus'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>fdc</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>scsi</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>usb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>sata</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-non-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </disk>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <graphics supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vnc</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>egl-headless</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>dbus</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </graphics>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <video supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='modelType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vga</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>cirrus</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>none</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>bochs</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>ramfb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </video>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <hostdev supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='mode'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>subsystem</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='startupPolicy'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>default</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>mandatory</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>requisite</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>optional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='subsysType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>usb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pci</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>scsi</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='capsType'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='pciBackend'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </hostdev>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <rng supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtio-non-transitional</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendModel'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>random</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>egd</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>builtin</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </rng>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <filesystem supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='driverType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>path</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>handle</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>virtiofs</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </filesystem>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <tpm supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tpm-tis</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tpm-crb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendModel'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>emulator</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>external</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendVersion'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>2.0</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </tpm>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <redirdev supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='bus'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>usb</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </redirdev>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <channel supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pty</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>unix</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </channel>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <crypto supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>qemu</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendModel'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>builtin</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </crypto>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <interface supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='backendType'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>default</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>passt</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </interface>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <panic supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='model'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>isa</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>hyperv</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </panic>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <console supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='type'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>null</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vc</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pty</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>dev</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>file</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>pipe</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>stdio</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>udp</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tcp</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>unix</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>qemu-vdagent</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>dbus</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </console>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </devices>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   <features>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <gic supported='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <vmcoreinfo supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <genid supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <backingStoreInput supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <backup supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <async-teardown supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <ps2 supported='yes'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <sev supported='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <sgx supported='no'/>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <hyperv supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='features'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>relaxed</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vapic</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>spinlocks</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vpindex</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>runtime</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>synic</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>stimer</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>reset</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>vendor_id</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>frequencies</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>reenlightenment</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tlbflush</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>ipi</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>avic</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>emsr_bitmap</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>xmm_input</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <defaults>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <spinlocks>4095</spinlocks>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <stimer_direct>on</stimer_direct>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </defaults>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </hyperv>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     <launchSecurity supported='yes'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       <enum name='sectype'>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:         <value>tdx</value>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:       </enum>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:     </launchSecurity>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:   </features>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: </domainCapabilities>
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.366 230888 DEBUG nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.367 230888 INFO nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Secure Boot support detected
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.369 230888 INFO nova.virt.libvirt.driver [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.382 230888 DEBUG nova.virt.libvirt.driver [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.422 230888 INFO nova.virt.node [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Determined node identity 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from /var/lib/nova/compute_id
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.444 230888 DEBUG nova.compute.manager [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Verified node 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad matches my host np0005548789.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.485 230888 DEBUG nova.compute.manager [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.488 230888 DEBUG nova.virt.libvirt.vif [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T08:44:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005548789.localdomain',hostname='test',id=2,image_ref='e0d06706-da90-478a-9829-34b75a3ce049',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-06T08:44:43Z,launched_on='np0005548789.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005548789.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3d603431c0bb4967bafc7a0aa6108bfe',ramdisk_id='',reservation_id='r-02dpupig',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T08:44:43Z,user_data=None,user_id='ff0049f3313348bdb67886d170c1c765',uuid=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.488 230888 DEBUG nova.network.os_vif_util [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Converting VIF {"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.489 230888 DEBUG nova.network.os_vif_util [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.489 230888 DEBUG os_vif [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.556 230888 DEBUG ovsdbapp.backend.ovs_idl [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.557 230888 DEBUG ovsdbapp.backend.ovs_idl [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.557 230888 DEBUG ovsdbapp.backend.ovs_idl [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.557 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.557 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.557 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.558 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.559 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.561 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.571 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.571 230888 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.572 230888 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 09:45:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:53.572 230888 INFO oslo.privsep.daemon [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpu7bcbu8p/privsep.sock']
Dec 06 09:45:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38134 DF PROTO=TCP SPT=58278 DPT=9105 SEQ=1660505716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DA30EF0000000001030307) 
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.189 230888 INFO oslo.privsep.daemon [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Spawned new privsep daemon via rootwrap
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.079 230943 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.083 230943 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.087 230943 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.087 230943 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230943
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.467 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.468 230888 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap86fc0b7a-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.468 230888 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap86fc0b7a-fb, col_values=(('external_ids', {'iface-id': '86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:64:77:f3', 'vm-uuid': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.469 230888 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.470 230888 INFO os_vif [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb')
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.470 230888 DEBUG nova.compute.manager [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.474 230888 DEBUG nova.compute.manager [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.474 230888 INFO nova.compute.manager [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.609 230888 DEBUG oslo_concurrency.lockutils [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.610 230888 DEBUG oslo_concurrency.lockutils [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.610 230888 DEBUG oslo_concurrency.lockutils [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.611 230888 DEBUG nova.compute.resource_tracker [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.611 230888 DEBUG oslo_concurrency.processutils [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:45:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:54.927 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.102 230888 DEBUG oslo_concurrency.processutils [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.199 230888 DEBUG nova.virt.libvirt.driver [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.199 230888 DEBUG nova.virt.libvirt.driver [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.400 230888 WARNING nova.virt.libvirt.driver [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.401 230888 DEBUG nova.compute.resource_tracker [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=12920MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.401 230888 DEBUG oslo_concurrency.lockutils [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.401 230888 DEBUG oslo_concurrency.lockutils [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:45:55 np0005548789.localdomain sudo[231059]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffjjgnvfqibbwvikdkzecnqebgvnzckw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014355.2751763-4316-13970164253913/AnsiballZ_podman_container.py
Dec 06 09:45:55 np0005548789.localdomain sudo[231059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.556 230888 DEBUG nova.compute.resource_tracker [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.556 230888 DEBUG nova.compute.resource_tracker [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.557 230888 DEBUG nova.compute.resource_tracker [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.619 230888 DEBUG nova.scheduler.client.report [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Refreshing inventories for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.644 230888 DEBUG nova.scheduler.client.report [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Updating ProviderTree inventory for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.645 230888 DEBUG nova.compute.provider_tree [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Updating inventory in ProviderTree for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.672 230888 DEBUG nova.scheduler.client.report [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Refreshing aggregate associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.758 230888 DEBUG nova.scheduler.client.report [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Refreshing trait associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,HW_CPU_X86_AVX2,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,HW_CPU_X86_BMI,HW_CPU_X86_FMA3,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 09:45:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:55.801 230888 DEBUG oslo_concurrency.processutils [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:45:55 np0005548789.localdomain python3.9[231061]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 06 09:45:56 np0005548789.localdomain systemd[1]: Started libpod-conmon-a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b.scope.
Dec 06 09:45:56 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:45:56 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ceb10c7340fd1e23819ce0ca67ef154f878e8bf4b0dc6dca13387cdceeb0c7ee/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:56 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ceb10c7340fd1e23819ce0ca67ef154f878e8bf4b0dc6dca13387cdceeb0c7ee/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:56 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ceb10c7340fd1e23819ce0ca67ef154f878e8bf4b0dc6dca13387cdceeb0c7ee/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 06 09:45:56 np0005548789.localdomain podman[231101]: 2025-12-06 09:45:56.08157794 +0000 UTC m=+0.132867917 container init a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3)
Dec 06 09:45:56 np0005548789.localdomain podman[231101]: 2025-12-06 09:45:56.091928583 +0000 UTC m=+0.143218600 container start a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm)
Dec 06 09:45:56 np0005548789.localdomain python3.9[231061]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Applying nova statedir ownership
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/b7ed0a2e-9350-4933-9334-4e5e08d3e6aa already 42436:42436
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/b7ed0a2e-9350-4933-9334-4e5e08d3e6aa to system_u:object_r:container_file_t:s0
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/console.log
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/55d01870b6a0ce0995b6b5844cf47638cdf46fbf
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-55d01870b6a0ce0995b6b5844cf47638cdf46fbf
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/20273498b7380904530133bcb3f720bd45f4f00b810dc4597d81d23acd8f9673
Dec 06 09:45:56 np0005548789.localdomain nova_compute_init[231127]: INFO:nova_statedir:Nova statedir ownership complete
Dec 06 09:45:56 np0005548789.localdomain systemd[1]: libpod-a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b.scope: Deactivated successfully.
Dec 06 09:45:56 np0005548789.localdomain podman[231128]: 2025-12-06 09:45:56.164856004 +0000 UTC m=+0.053538058 container died a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, config_id=edpm, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:45:56 np0005548789.localdomain sudo[231059]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:56.275 230888 DEBUG oslo_concurrency.processutils [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:45:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:56.281 230888 DEBUG nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 06 09:45:56 np0005548789.localdomain nova_compute[230884]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 06 09:45:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:56.281 230888 INFO nova.virt.libvirt.host [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] kernel doesn't support AMD SEV
Dec 06 09:45:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:56.283 230888 DEBUG nova.compute.provider_tree [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:45:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:56.283 230888 DEBUG nova.virt.libvirt.driver [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 06 09:45:56 np0005548789.localdomain podman[231141]: 2025-12-06 09:45:56.284125558 +0000 UTC m=+0.117681996 container cleanup a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 09:45:56 np0005548789.localdomain systemd[1]: libpod-conmon-a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b.scope: Deactivated successfully.
Dec 06 09:45:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:56.302 230888 DEBUG nova.scheduler.client.report [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:45:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:56.342 230888 DEBUG nova.compute.resource_tracker [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:45:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:56.343 230888 DEBUG oslo_concurrency.lockutils [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.942s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:45:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:56.343 230888 DEBUG nova.service [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 06 09:45:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:56.375 230888 DEBUG nova.service [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 06 09:45:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:56.376 230888 DEBUG nova.servicegroup.drivers.db [None req-e1f88f64-31d2-42bc-8944-ab08e0aca269 - - - - - -] DB_Driver: join new ServiceGroup member np0005548789.localdomain to the compute group, service = <Service: host=np0005548789.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 06 09:45:56 np0005548789.localdomain sshd[209098]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:45:56 np0005548789.localdomain systemd[1]: session-54.scope: Deactivated successfully.
Dec 06 09:45:56 np0005548789.localdomain systemd[1]: session-54.scope: Consumed 2min 9.443s CPU time.
Dec 06 09:45:56 np0005548789.localdomain systemd-logind[766]: Session 54 logged out. Waiting for processes to exit.
Dec 06 09:45:56 np0005548789.localdomain systemd-logind[766]: Removed session 54.
Dec 06 09:45:56 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63508 DF PROTO=TCP SPT=37542 DPT=9882 SEQ=2811801932 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DA3DAF0000000001030307) 
Dec 06 09:45:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ceb10c7340fd1e23819ce0ca67ef154f878e8bf4b0dc6dca13387cdceeb0c7ee-merged.mount: Deactivated successfully.
Dec 06 09:45:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b-userdata-shm.mount: Deactivated successfully.
Dec 06 09:45:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:45:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:45:57 np0005548789.localdomain systemd[1]: tmp-crun.9W9dZd.mount: Deactivated successfully.
Dec 06 09:45:57 np0005548789.localdomain podman[231187]: 2025-12-06 09:45:57.91218789 +0000 UTC m=+0.071397364 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 06 09:45:57 np0005548789.localdomain podman[231186]: 2025-12-06 09:45:57.966853212 +0000 UTC m=+0.125067405 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 09:45:57 np0005548789.localdomain podman[231187]: 2025-12-06 09:45:57.99731021 +0000 UTC m=+0.156519684 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:45:58 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:45:58 np0005548789.localdomain podman[231186]: 2025-12-06 09:45:58.054644086 +0000 UTC m=+0.212858359 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 09:45:58 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:45:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:58.608 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:45:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=761 DF PROTO=TCP SPT=49776 DPT=9101 SEQ=4200883413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DA47EF0000000001030307) 
Dec 06 09:45:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:45:59.961 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:01 np0005548789.localdomain sshd[231230]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:46:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38135 DF PROTO=TCP SPT=58278 DPT=9105 SEQ=1660505716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DA51EF0000000001030307) 
Dec 06 09:46:02 np0005548789.localdomain sshd[231232]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:46:02 np0005548789.localdomain sshd[231232]: Accepted publickey for zuul from 192.168.122.30 port 34622 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:46:02 np0005548789.localdomain systemd-logind[766]: New session 56 of user zuul.
Dec 06 09:46:03 np0005548789.localdomain systemd[1]: Started Session 56 of User zuul.
Dec 06 09:46:03 np0005548789.localdomain sshd[231232]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:46:03 np0005548789.localdomain sshd[231230]: Received disconnect from 103.192.152.59 port 44448:11: Bye Bye [preauth]
Dec 06 09:46:03 np0005548789.localdomain sshd[231230]: Disconnected from authenticating user root 103.192.152.59 port 44448 [preauth]
Dec 06 09:46:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:03.668 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:04 np0005548789.localdomain python3.9[231343]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:46:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58226 DF PROTO=TCP SPT=46606 DPT=9102 SEQ=1202293018 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DA5BEF0000000001030307) 
Dec 06 09:46:05 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:05.000 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:05 np0005548789.localdomain sudo[231419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:46:05 np0005548789.localdomain sudo[231419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:46:05 np0005548789.localdomain sudo[231419]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:05 np0005548789.localdomain sudo[231473]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayfomayyxcirzaydowyaxwwuolnvxpcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014364.9108868-69-173149032978400/AnsiballZ_systemd_service.py
Dec 06 09:46:05 np0005548789.localdomain sudo[231473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:05 np0005548789.localdomain sudo[231474]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:46:05 np0005548789.localdomain sudo[231474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:46:05 np0005548789.localdomain python3.9[231491]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:46:05 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:46:05 np0005548789.localdomain systemd-rc-local-generator[231535]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:46:05 np0005548789.localdomain systemd-sysv-generator[231539]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:46:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:46:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:06 np0005548789.localdomain sudo[231474]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:06 np0005548789.localdomain sudo[231473]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:06 np0005548789.localdomain sudo[231641]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:46:06 np0005548789.localdomain sudo[231641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:46:06 np0005548789.localdomain sudo[231641]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:07 np0005548789.localdomain python3.9[231687]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:46:07 np0005548789.localdomain network[231704]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:46:07 np0005548789.localdomain network[231705]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:46:07 np0005548789.localdomain network[231706]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:46:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25003 DF PROTO=TCP SPT=58212 DPT=9102 SEQ=2104232652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DA67EF0000000001030307) 
Dec 06 09:46:08 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:46:08 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:08.712 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:10 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:10.051 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58228 DF PROTO=TCP SPT=46606 DPT=9102 SEQ=1202293018 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DA73AF0000000001030307) 
Dec 06 09:46:11 np0005548789.localdomain sshd[231849]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:46:12 np0005548789.localdomain sudo[231941]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcvkhbkcljwtatcmmgpatsctonrbpaai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014372.7346199-126-19834308873536/AnsiballZ_systemd_service.py
Dec 06 09:46:12 np0005548789.localdomain sudo[231941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:13 np0005548789.localdomain python3.9[231943]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:46:13 np0005548789.localdomain sudo[231941]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:13 np0005548789.localdomain sshd[231849]: Received disconnect from 103.157.25.60 port 59706:11: Bye Bye [preauth]
Dec 06 09:46:13 np0005548789.localdomain sshd[231849]: Disconnected from authenticating user root 103.157.25.60 port 59706 [preauth]
Dec 06 09:46:13 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:13.752 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61365 DF PROTO=TCP SPT=36184 DPT=9101 SEQ=1986252779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DA81500000000001030307) 
Dec 06 09:46:14 np0005548789.localdomain sudo[232052]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktjadmghaktncanmledudmgrtupenimn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014373.9387224-156-212720989708918/AnsiballZ_file.py
Dec 06 09:46:14 np0005548789.localdomain sudo[232052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:14 np0005548789.localdomain python3.9[232054]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:14 np0005548789.localdomain sudo[232052]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:14 np0005548789.localdomain systemd-journald[47810]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 76.3 (254 of 333 items), suggesting rotation.
Dec 06 09:46:14 np0005548789.localdomain systemd-journald[47810]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 09:46:14 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:46:14 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:46:15 np0005548789.localdomain sudo[232163]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ooxbnvkbrpqgvffnkgdmgyzwmwzjdxvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014374.7860358-180-23735074493090/AnsiballZ_file.py
Dec 06 09:46:15 np0005548789.localdomain sudo[232163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:15 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:15.105 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:15 np0005548789.localdomain python3.9[232165]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:15 np0005548789.localdomain sudo[232163]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:16 np0005548789.localdomain sudo[232273]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybdtnryqsthfyscfhchgzhpbstzhqjdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014375.663004-207-125373502786999/AnsiballZ_command.py
Dec 06 09:46:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:46:16 np0005548789.localdomain sudo[232273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:16 np0005548789.localdomain podman[232275]: 2025-12-06 09:46:16.200012877 +0000 UTC m=+0.086305308 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 09:46:16 np0005548789.localdomain podman[232275]: 2025-12-06 09:46:16.216011765 +0000 UTC m=+0.102304176 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 09:46:16 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:46:16 np0005548789.localdomain python3.9[232276]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:46:16 np0005548789.localdomain sudo[232273]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:17 np0005548789.localdomain python3.9[232405]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:46:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61367 DF PROTO=TCP SPT=36184 DPT=9101 SEQ=1986252779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DA8D6F0000000001030307) 
Dec 06 09:46:17 np0005548789.localdomain sudo[232513]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljyilfjobovcsgoxgnyptmkcquzzheya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014377.5421855-261-272656500581480/AnsiballZ_systemd_service.py
Dec 06 09:46:17 np0005548789.localdomain sudo[232513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:18 np0005548789.localdomain python3.9[232515]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:46:18 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:46:18 np0005548789.localdomain systemd-rc-local-generator[232538]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:46:18 np0005548789.localdomain systemd-sysv-generator[232541]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:46:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:46:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:18 np0005548789.localdomain sudo[232513]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:18 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:18.793 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:19 np0005548789.localdomain sudo[232659]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ziyepulvymgqiyhwrzuqmheojzvzuxin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014379.265586-285-198505514859383/AnsiballZ_command.py
Dec 06 09:46:19 np0005548789.localdomain sudo[232659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19008 DF PROTO=TCP SPT=39750 DPT=9105 SEQ=3707890099 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DA96700000000001030307) 
Dec 06 09:46:19 np0005548789.localdomain python3.9[232661]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:46:19 np0005548789.localdomain sudo[232659]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:20 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:20.130 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:20 np0005548789.localdomain sudo[232770]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imdirwqqbgfxhfojcjgmyffwmpestetm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014380.0870385-312-263879188642304/AnsiballZ_file.py
Dec 06 09:46:20 np0005548789.localdomain sudo[232770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:20 np0005548789.localdomain python3.9[232772]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:20 np0005548789.localdomain sudo[232770]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:21 np0005548789.localdomain python3.9[232880]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:46:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:22.378 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:22.413 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Triggering sync for uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 06 09:46:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:22.413 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:46:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:22.413 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:46:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:22.414 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:22.509 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:46:22 np0005548789.localdomain python3.9[232990]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:23 np0005548789.localdomain python3.9[233076]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014382.2838378-360-164839538130597/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=ff4e72663552f54a1c747481e1f73412f2607746 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19009 DF PROTO=TCP SPT=39750 DPT=9105 SEQ=3707890099 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DAA62F0000000001030307) 
Dec 06 09:46:23 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:23.797 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:23 np0005548789.localdomain sudo[233184]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkjtssflfbxkronqveaavrhqyawaodwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014383.593719-405-233456725572665/AnsiballZ_group.py
Dec 06 09:46:23 np0005548789.localdomain sudo[233184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:24 np0005548789.localdomain python3.9[233186]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec 06 09:46:24 np0005548789.localdomain sudo[233184]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:25 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:25.137 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:25 np0005548789.localdomain sudo[233294]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czwglhmsrakygyrgsvnymdmwfisiucfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014384.9730427-439-108016006654948/AnsiballZ_getent.py
Dec 06 09:46:25 np0005548789.localdomain sudo[233294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:25 np0005548789.localdomain python3.9[233296]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec 06 09:46:25 np0005548789.localdomain sudo[233294]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63511 DF PROTO=TCP SPT=37542 DPT=9882 SEQ=2811801932 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DAADEF0000000001030307) 
Dec 06 09:46:26 np0005548789.localdomain sudo[233405]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uglumvdtzazvanqxmaieizicmnevshgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014385.7899704-462-135044543013339/AnsiballZ_group.py
Dec 06 09:46:26 np0005548789.localdomain sudo[233405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:26 np0005548789.localdomain python3.9[233407]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 06 09:46:26 np0005548789.localdomain groupadd[233408]: group added to /etc/group: name=ceilometer, GID=42405
Dec 06 09:46:26 np0005548789.localdomain groupadd[233408]: group added to /etc/gshadow: name=ceilometer
Dec 06 09:46:26 np0005548789.localdomain groupadd[233408]: new group: name=ceilometer, GID=42405
Dec 06 09:46:26 np0005548789.localdomain sudo[233405]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:27 np0005548789.localdomain sudo[233521]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luqqzfkrbffotvedanwxanymezurifjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014386.6717408-486-262247131451604/AnsiballZ_user.py
Dec 06 09:46:27 np0005548789.localdomain sudo[233521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:27 np0005548789.localdomain python3.9[233523]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548789.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 06 09:46:27 np0005548789.localdomain useradd[233525]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/1
Dec 06 09:46:27 np0005548789.localdomain useradd[233525]: add 'ceilometer' to group 'libvirt'
Dec 06 09:46:27 np0005548789.localdomain useradd[233525]: add 'ceilometer' to shadow group 'libvirt'
Dec 06 09:46:27 np0005548789.localdomain sudo[233521]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:28 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:28.800 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:46:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:46:28 np0005548789.localdomain python3.9[233639]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:28 np0005548789.localdomain systemd[1]: tmp-crun.FvAixU.mount: Deactivated successfully.
Dec 06 09:46:28 np0005548789.localdomain podman[233640]: 2025-12-06 09:46:28.909322252 +0000 UTC m=+0.071232660 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 09:46:28 np0005548789.localdomain podman[233641]: 2025-12-06 09:46:28.92178984 +0000 UTC m=+0.081937932 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 09:46:28 np0005548789.localdomain podman[233640]: 2025-12-06 09:46:28.934365662 +0000 UTC m=+0.096276070 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Dec 06 09:46:28 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:46:28 np0005548789.localdomain podman[233641]: 2025-12-06 09:46:28.949986738 +0000 UTC m=+0.110134790 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 09:46:28 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:46:29 np0005548789.localdomain python3.9[233768]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765014388.4323149-564-97107424243592/.source.conf _original_basename=ceilometer.conf follow=False checksum=e90760659247c177dccfbe1ef7de974794985ce9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61369 DF PROTO=TCP SPT=36184 DPT=9101 SEQ=1986252779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DABDF00000000001030307) 
Dec 06 09:46:29 np0005548789.localdomain python3.9[233876]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:30 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:30.139 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:31 np0005548789.localdomain python3.9[233962]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765014389.5647614-564-214907326056160/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:31 np0005548789.localdomain python3.9[234070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:31 np0005548789.localdomain auditd[725]: Audit daemon rotating log files
Dec 06 09:46:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19010 DF PROTO=TCP SPT=39750 DPT=9105 SEQ=3707890099 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DAC5EF0000000001030307) 
Dec 06 09:46:32 np0005548789.localdomain python3.9[234156]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765014391.2911236-564-100204522657405/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:33 np0005548789.localdomain python3.9[234264]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:46:33 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:33.836 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:34 np0005548789.localdomain python3.9[234372]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:46:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19318 DF PROTO=TCP SPT=48568 DPT=9102 SEQ=2118299854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DAD12F0000000001030307) 
Dec 06 09:46:35 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:35.142 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:35 np0005548789.localdomain python3.9[234480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:35 np0005548789.localdomain sshd[234530]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:46:35 np0005548789.localdomain python3.9[234568]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014394.6638057-741-25553970523476/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:36 np0005548789.localdomain python3.9[234676]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:36 np0005548789.localdomain python3.9[234731]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:37 np0005548789.localdomain python3.9[234839]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:37 np0005548789.localdomain sshd[234530]: Received disconnect from 123.160.164.187 port 50192:11: Bye Bye [preauth]
Dec 06 09:46:37 np0005548789.localdomain sshd[234530]: Disconnected from authenticating user root 123.160.164.187 port 50192 [preauth]
Dec 06 09:46:37 np0005548789.localdomain python3.9[234925]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014396.8055358-741-273471521040312/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=d15068604cf730dd6e7b88a19d62f57d3a39f94f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:38 np0005548789.localdomain python3.9[235033]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:38 np0005548789.localdomain python3.9[235119]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014397.9168751-741-120671930251414/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:38 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:38.886 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:39 np0005548789.localdomain python3.9[235227]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:39 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41228 DF PROTO=TCP SPT=33374 DPT=9882 SEQ=2224775730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DAE3F00000000001030307) 
Dec 06 09:46:39 np0005548789.localdomain python3.9[235313]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014399.0052042-741-248476297696252/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:40 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:40.144 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:40 np0005548789.localdomain python3.9[235421]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19320 DF PROTO=TCP SPT=48568 DPT=9102 SEQ=2118299854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DAE8EF0000000001030307) 
Dec 06 09:46:41 np0005548789.localdomain python3.9[235507]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014400.1171174-741-54439433761440/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=7e5ab36b7368c1d4a00810e02af11a7f7d7c84e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:41 np0005548789.localdomain python3.9[235615]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:42 np0005548789.localdomain python3.9[235701]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014401.276177-741-6489650128920/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:42 np0005548789.localdomain python3.9[235809]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:43 np0005548789.localdomain python3.9[235895]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014402.3464491-741-12700034770971/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=0e4ea521b0035bea70b7a804346a5c89364dcbc3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:43 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:43.922 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63662 DF PROTO=TCP SPT=54692 DPT=9101 SEQ=1152780584 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DAF6830000000001030307) 
Dec 06 09:46:44 np0005548789.localdomain python3.9[236003]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:45 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:45.146 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:45 np0005548789.localdomain python3.9[236089]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014404.3147995-741-250966041018160/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=b056dcaaba7624b93826bb95ee9e82f81bde6c72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:45 np0005548789.localdomain python3.9[236197]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:46:46 np0005548789.localdomain systemd[1]: tmp-crun.Lysyqd.mount: Deactivated successfully.
Dec 06 09:46:46 np0005548789.localdomain podman[236247]: 2025-12-06 09:46:46.90239277 +0000 UTC m=+0.069527335 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 06 09:46:46 np0005548789.localdomain podman[236247]: 2025-12-06 09:46:46.918205633 +0000 UTC m=+0.085340178 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 09:46:46 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:46:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:46:47.276 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:46:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:46:47.277 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:46:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:46:47.278 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:46:47 np0005548789.localdomain python3.9[236302]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014405.503078-741-140054624375160/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=885ccc6f5edd8803cb385bdda5648d0b3017b4e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63664 DF PROTO=TCP SPT=54692 DPT=9101 SEQ=1152780584 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DB026F0000000001030307) 
Dec 06 09:46:47 np0005548789.localdomain python3.9[236410]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:48 np0005548789.localdomain python3.9[236496]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014407.4625413-741-17130614229957/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:48 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:48.965 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54002 DF PROTO=TCP SPT=33278 DPT=9105 SEQ=1469673539 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DB0BB00000000001030307) 
Dec 06 09:46:49 np0005548789.localdomain sudo[236604]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksexykatbdpblqyimtvrvkpjtogoffln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014409.7436767-1206-49302113665040/AnsiballZ_file.py
Dec 06 09:46:49 np0005548789.localdomain sudo[236604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:50 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:50.200 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:50 np0005548789.localdomain python3.9[236606]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:50 np0005548789.localdomain sudo[236604]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:50 np0005548789.localdomain sudo[236714]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txlflmtlaozmfoxwkjdqcpqdpeunritl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014410.476814-1230-41972215639304/AnsiballZ_systemd_service.py
Dec 06 09:46:50 np0005548789.localdomain sudo[236714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:51 np0005548789.localdomain python3.9[236716]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:46:51 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:46:51 np0005548789.localdomain systemd-rc-local-generator[236746]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:46:51 np0005548789.localdomain systemd-sysv-generator[236749]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:46:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:46:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:46:51 np0005548789.localdomain systemd[1]: Listening on Podman API Socket.
Dec 06 09:46:51 np0005548789.localdomain sudo[236714]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:52 np0005548789.localdomain sudo[236864]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twxfcyojbriocayyfrfnwjiqpbjbxaxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014411.9035506-1257-149379748432715/AnsiballZ_stat.py
Dec 06 09:46:52 np0005548789.localdomain sudo[236864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:52 np0005548789.localdomain python3.9[236866]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:52 np0005548789.localdomain sudo[236864]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:52.596 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:52.597 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:52.598 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:46:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:52.598 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:46:52 np0005548789.localdomain sudo[236952]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-crlkzpgugipnewilwegdcmkklokjddrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014411.9035506-1257-149379748432715/AnsiballZ_copy.py
Dec 06 09:46:52 np0005548789.localdomain sudo[236952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:52 np0005548789.localdomain python3.9[236954]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014411.9035506-1257-149379748432715/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:52 np0005548789.localdomain sudo[236952]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:53 np0005548789.localdomain sudo[237007]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvfnuzjzrnadbpwgnroqhhwmusfszxds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014411.9035506-1257-149379748432715/AnsiballZ_stat.py
Dec 06 09:46:53 np0005548789.localdomain sudo[237007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:53 np0005548789.localdomain python3.9[237009]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:53 np0005548789.localdomain sudo[237007]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54003 DF PROTO=TCP SPT=33278 DPT=9105 SEQ=1469673539 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DB1B6F0000000001030307) 
Dec 06 09:46:53 np0005548789.localdomain sudo[237095]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzgxtldrijiwsezrigurwkffkpknxxyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014411.9035506-1257-149379748432715/AnsiballZ_copy.py
Dec 06 09:46:53 np0005548789.localdomain sudo[237095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:54.007 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:54 np0005548789.localdomain python3.9[237097]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014411.9035506-1257-149379748432715/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:54 np0005548789.localdomain sudo[237095]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:54.129 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:46:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:54.130 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:46:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:54.130 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 09:46:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:54.131 230888 DEBUG nova.objects.instance [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:46:55 np0005548789.localdomain sudo[237205]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orobxfwaxnzceiqaivziyvoxhlnyjehb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014414.591637-1341-142028926844214/AnsiballZ_container_config_data.py
Dec 06 09:46:55 np0005548789.localdomain sudo[237205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.202 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:55 np0005548789.localdomain python3.9[237207]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Dec 06 09:46:55 np0005548789.localdomain sudo[237205]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.266 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.291 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.291 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.292 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.292 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.292 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.292 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.292 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.293 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.293 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.293 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.308 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.309 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.309 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.309 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.309 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.696 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.386s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.772 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.773 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:46:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41229 DF PROTO=TCP SPT=33374 DPT=9882 SEQ=2224775730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DB23EF0000000001030307) 
Dec 06 09:46:55 np0005548789.localdomain sudo[237337]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itxzuitrkatbpljvwdskrcnflfzxaqvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014415.536972-1368-122606801250381/AnsiballZ_container_config_hash.py
Dec 06 09:46:55 np0005548789.localdomain sudo[237337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.936 230888 WARNING nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.938 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=12917MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.938 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:46:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:55.938 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:46:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:56.006 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:46:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:56.007 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:46:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:56.007 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:46:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:56.060 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:46:56 np0005548789.localdomain python3.9[237339]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:46:56 np0005548789.localdomain sudo[237337]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:56.524 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:46:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:56.532 230888 DEBUG nova.compute.provider_tree [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:46:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:56.558 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:46:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:56.561 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:46:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:56.561 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:46:57 np0005548789.localdomain sudo[237469]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmawxksznvthlxvqwwikjjxfekvkcysq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014416.6365445-1398-234381338605144/AnsiballZ_edpm_container_manage.py
Dec 06 09:46:57 np0005548789.localdomain sudo[237469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:58 np0005548789.localdomain python3[237471]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:46:58 np0005548789.localdomain python3[237471]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "343ba269c9fe0a56d7572c8ca328dbce002017c4dd4986f43667971dd03085c2",
                                                                    "Digest": "sha256:667029e1ec7e63fffa1a096f432f6160b441ba36df1bddc9066cbd1129b82009",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:667029e1ec7e63fffa1a096f432f6160b441ba36df1bddc9066cbd1129b82009"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:21:53.58682213Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 505175293,
                                                                    "VirtualSize": 505175293,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:a47016624274f5ebad76019f5a2e465c1737f96caa539b36f90ab8e33592f415",
                                                                              "sha256:38a03f5e96658211fb28e2f87c11ffad531281d1797368f48e6cd4af7ac97c0e"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:14:56.244673147Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:14:56.960273159Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage ceilometer",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:15:37.588899909Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:15:41.197123864Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:21:19.680010224Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:21:53.584924649Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-compute && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:21:56.278821402Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 06 09:46:58 np0005548789.localdomain podman[237520]: 2025-12-06 09:46:58.389985843 +0000 UTC m=+0.096222257 container remove a37b9c3e0a94913f96d0acd0e46ec4af5b191c79194cc94d7212024b813b32c9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '728090aef247cfdd273031dadf6d1125'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible)
Dec 06 09:46:58 np0005548789.localdomain python3[237471]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute
Dec 06 09:46:58 np0005548789.localdomain podman[237533]: 
Dec 06 09:46:58 np0005548789.localdomain podman[237533]: 2025-12-06 09:46:58.498724199 +0000 UTC m=+0.089803737 container create bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute)
Dec 06 09:46:58 np0005548789.localdomain podman[237533]: 2025-12-06 09:46:58.455265965 +0000 UTC m=+0.046345543 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 06 09:46:58 np0005548789.localdomain python3[237471]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Dec 06 09:46:58 np0005548789.localdomain sudo[237469]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:46:59.078 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:46:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63666 DF PROTO=TCP SPT=54692 DPT=9101 SEQ=1152780584 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DB31EF0000000001030307) 
Dec 06 09:46:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:46:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:46:59 np0005548789.localdomain podman[237645]: 2025-12-06 09:46:59.939746186 +0000 UTC m=+0.091655014 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:46:59 np0005548789.localdomain systemd[1]: tmp-crun.5JkNAt.mount: Deactivated successfully.
Dec 06 09:46:59 np0005548789.localdomain podman[237645]: 2025-12-06 09:46:59.993466 +0000 UTC m=+0.145374788 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 06 09:47:00 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:47:00 np0005548789.localdomain podman[237646]: 2025-12-06 09:46:59.996338809 +0000 UTC m=+0.148739833 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 06 09:47:00 np0005548789.localdomain podman[237646]: 2025-12-06 09:47:00.076469023 +0000 UTC m=+0.228869947 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:47:00 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:47:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:00.205 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:00 np0005548789.localdomain sudo[237724]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uiuklrlndgvbudentzdbsrmlrfiwozxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014419.0236115-1422-15225543424136/AnsiballZ_stat.py
Dec 06 09:47:00 np0005548789.localdomain sudo[237724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:00 np0005548789.localdomain python3.9[237726]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:47:00 np0005548789.localdomain sudo[237724]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:01 np0005548789.localdomain sudo[237836]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flqbebvwrlmltdobrldwvdggsrcernht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014421.042082-1449-181507627147484/AnsiballZ_file.py
Dec 06 09:47:01 np0005548789.localdomain sudo[237836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:01 np0005548789.localdomain python3.9[237838]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:01 np0005548789.localdomain sudo[237836]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:01 np0005548789.localdomain sudo[237945]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-disjwvedqymbisllvibxzzvjxqlixsyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014421.528171-1449-63151730621133/AnsiballZ_copy.py
Dec 06 09:47:01 np0005548789.localdomain sudo[237945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54004 DF PROTO=TCP SPT=33278 DPT=9105 SEQ=1469673539 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DB3BEF0000000001030307) 
Dec 06 09:47:02 np0005548789.localdomain python3.9[237947]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014421.528171-1449-63151730621133/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:02 np0005548789.localdomain sudo[237945]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:02 np0005548789.localdomain sudo[238000]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxjzsdgjxtcftgvpyjmapfjxttrdpdjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014421.528171-1449-63151730621133/AnsiballZ_systemd.py
Dec 06 09:47:02 np0005548789.localdomain sudo[238000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:03 np0005548789.localdomain python3.9[238002]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:47:03 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:47:03 np0005548789.localdomain systemd-rc-local-generator[238023]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:47:03 np0005548789.localdomain systemd-sysv-generator[238029]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:47:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:47:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:03 np0005548789.localdomain sudo[238000]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:03 np0005548789.localdomain sudo[238090]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wemmsffeeppmhbkzabkkrmtavhungeuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014421.528171-1449-63151730621133/AnsiballZ_systemd.py
Dec 06 09:47:03 np0005548789.localdomain sudo[238090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:03 np0005548789.localdomain python3.9[238092]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:47:03 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:47:04 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:04.080 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:04 np0005548789.localdomain systemd-sysv-generator[238123]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:47:04 np0005548789.localdomain systemd-rc-local-generator[238120]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:47:04 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6a3f0d73814bb22777000701e0d23d3cb3ada68f96a18b4fb977a15c11f67d2/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 06 09:47:04 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6a3f0d73814bb22777000701e0d23d3cb3ada68f96a18b4fb977a15c11f67d2/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:47:04 np0005548789.localdomain podman[238133]: 2025-12-06 09:47:04.498897845 +0000 UTC m=+0.153745467 container init bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: tmp-crun.hlwHui.mount: Deactivated successfully.
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: + sudo -E kolla_set_configs
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: sudo: unable to send audit message: Operation not permitted
Dec 06 09:47:04 np0005548789.localdomain sudo[238154]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 09:47:04 np0005548789.localdomain sudo[238154]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:47:04 np0005548789.localdomain sudo[238154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:47:04 np0005548789.localdomain podman[238133]: 2025-12-06 09:47:04.542744841 +0000 UTC m=+0.197592423 container start bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 09:47:04 np0005548789.localdomain podman[238133]: ceilometer_agent_compute
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: Started ceilometer_agent_compute container.
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Validating config file
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Copying service configuration files
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: INFO:__main__:Writing out command to execute
Dec 06 09:47:04 np0005548789.localdomain sudo[238154]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: ++ cat /run_command
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: + ARGS=
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: + sudo kolla_copy_cacerts
Dec 06 09:47:04 np0005548789.localdomain sudo[238090]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: sudo: unable to send audit message: Operation not permitted
Dec 06 09:47:04 np0005548789.localdomain sudo[238169]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 06 09:47:04 np0005548789.localdomain sudo[238169]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:47:04 np0005548789.localdomain sudo[238169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 06 09:47:04 np0005548789.localdomain sudo[238169]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: + [[ ! -n '' ]]
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: + . kolla_extend_start
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: + umask 0022
Dec 06 09:47:04 np0005548789.localdomain ceilometer_agent_compute[238148]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 06 09:47:04 np0005548789.localdomain podman[238157]: 2025-12-06 09:47:04.655089959 +0000 UTC m=+0.098893081 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 06 09:47:04 np0005548789.localdomain podman[238157]: 2025-12-06 09:47:04.689139793 +0000 UTC m=+0.132942905 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:47:04 np0005548789.localdomain podman[238157]: unhealthy
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:47:04 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Failed with result 'exit-code'.
Dec 06 09:47:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8779 DF PROTO=TCP SPT=56060 DPT=9102 SEQ=3650887006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DB466F0000000001030307) 
Dec 06 09:47:05 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:05.239 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:05 np0005548789.localdomain sudo[238286]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrtltmwzmeifblykzfkoebvmnmsequzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014424.9753392-1521-142867968956536/AnsiballZ_systemd.py
Dec 06 09:47:05 np0005548789.localdomain sudo[238286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.326 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.326 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.326 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.326 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.326 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.326 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.326 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.327 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.327 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.327 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.327 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.327 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.327 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.327 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.327 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.328 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005548789.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.328 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.328 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.328 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.328 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.328 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.328 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.328 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.328 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.329 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.329 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.329 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.329 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.329 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.329 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.329 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.329 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.330 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.330 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.330 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.330 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.330 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.330 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.330 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.330 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.330 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.331 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.331 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.331 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.331 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.331 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.331 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.331 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.331 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.331 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.332 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.332 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.332 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.332 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.332 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.332 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.332 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.332 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.333 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.333 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.333 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.333 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.333 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.333 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.333 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.333 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.333 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.334 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.334 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.334 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.334 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.334 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.334 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.334 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.334 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.335 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.335 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.335 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.335 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.335 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.335 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.335 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.335 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.335 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.336 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.336 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.336 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.336 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.336 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.336 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.336 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.336 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.336 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.337 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.337 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.337 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.337 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.337 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.337 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.337 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.337 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.338 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.338 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.338 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.338 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.338 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.338 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.338 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.338 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.338 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.339 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.339 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.339 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.339 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.339 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.339 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.339 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.339 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.340 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.340 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.340 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.340 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.340 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.340 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.340 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.340 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.340 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.341 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.341 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.341 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.341 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.341 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.341 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.341 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.341 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.342 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.342 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.342 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.342 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.342 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.342 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.342 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.342 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.342 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.343 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.343 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.343 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.343 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.343 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.343 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.343 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.343 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.343 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.344 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.344 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.344 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.344 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.344 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.344 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.344 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.344 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.344 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.345 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.345 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.345 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.363 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.364 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.365 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.463 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.525 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.526 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.526 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.526 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.526 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.526 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.526 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.526 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.526 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.526 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.526 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.526 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.527 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.527 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.527 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.527 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.527 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005548789.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.527 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.527 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.527 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.527 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.527 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.528 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.528 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.528 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.528 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.528 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.528 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.528 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.528 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.528 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.528 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.528 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.528 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.528 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.529 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.529 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.529 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.529 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.529 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.529 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.529 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.529 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.529 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.529 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.529 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.529 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.529 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.530 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.530 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.530 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.530 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.530 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.530 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.530 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.530 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.530 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.530 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.530 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.531 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.531 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.531 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.531 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.531 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.531 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.531 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.531 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.531 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.531 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.531 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.531 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.532 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.532 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.533 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.533 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.533 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.533 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.533 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.533 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.533 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.533 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.533 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.533 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.533 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.533 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.534 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.534 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.534 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.534 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.534 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.534 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.534 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.534 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.534 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.534 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.534 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.535 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.535 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.535 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.535 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.535 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.535 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.535 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.535 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.535 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.535 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.535 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.536 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.536 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.536 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.536 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.536 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.536 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.536 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.536 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.536 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.536 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.536 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.536 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.537 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.537 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.537 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.537 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.537 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.538 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.539 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.539 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.539 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.539 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.539 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.539 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.539 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.539 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.539 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.539 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.539 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.540 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.540 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.540 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.540 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.540 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.540 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.540 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.541 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.541 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.541 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.541 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.541 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.541 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.542 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.543 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.544 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.544 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.544 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.544 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.544 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.547 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.555 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 06 09:47:05 np0005548789.localdomain python3.9[238288]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:47:05 np0005548789.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Dec 06 09:47:05 np0005548789.localdomain systemd[1]: tmp-crun.K7e0ij.mount: Deactivated successfully.
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.831 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.933 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.933 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.933 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Dec 06 09:47:05 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:05.956 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}ad4f29dde4290bcb083efcb5841fb43151872879a45528968948af6eeaac0c77" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.115 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Sat, 06 Dec 2025 09:47:05 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-87b31b8a-9cf1-41ac-800e-43788298cefa x-openstack-request-id: req-87b31b8a-9cf1-41ac-800e-43788298cefa _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.116 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "3b9dcd46-fa1b-4714-ba2b-665da2f67af6", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.116 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-87b31b8a-9cf1-41ac-800e-43788298cefa request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.118 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}ad4f29dde4290bcb083efcb5841fb43151872879a45528968948af6eeaac0c77" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.177 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Sat, 06 Dec 2025 09:47:06 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-d31e78df-e387-4ee7-940a-4db0ce94f506 x-openstack-request-id: req-d31e78df-e387-4ee7-940a-4db0ce94f506 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.178 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "3b9dcd46-fa1b-4714-ba2b-665da2f67af6", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.178 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6 used request id req-d31e78df-e387-4ee7-940a-4db0ce94f506 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.179 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.180 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.206 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 49840000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43574c69-5275-4834-a13b-d54e2a36a442', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 49840000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T09:47:06.180341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '867ad82e-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.454932007, 'message_signature': 'd9d981c30097d3a296aafbc37ca5f13d258d0769e7c742d30aaf3112282760d7'}]}, 'timestamp': '2025-12-06 09:47:06.207307', '_unique_id': '87d2e190b0584fb6aec0c587b7bce02a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.214 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.222 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b7ed0a2e-9350-4933-9334-4e5e08d3e6aa / tap86fc0b7a-fb inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.223 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a23cb439-5802-42d6-b5b7-084ba2a1bbb3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:06.219607', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '867d6896-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.469142909, 'message_signature': '94654625957fe1fcd98c6e6aa08f8e57d1c7b15ff95d6f0e14000aaff15d4517'}]}, 'timestamp': '2025-12-06 09:47:06.224080', '_unique_id': '597eb9dd3d7a481e8d58890ba6a382df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.225 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.226 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.227 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.227 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.227 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4faa2e96-ea9c-429a-b657-870df8b2f0ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:06.227875', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '867e1c0a-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.469142909, 'message_signature': '6e392f3921340eede9f263145819c6f8eb2a6f19ea37015ca57659ed515e01e7'}]}, 'timestamp': '2025-12-06 09:47:06.228488', '_unique_id': '87770a81ec4f465ba200faa9003b0d24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.229 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.230 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.231 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 52.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01036f4d-4491-4314-870a-d9bbbd35ab46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.37890625, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T09:47:06.231113', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '867e9810-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.454932007, 'message_signature': '1ca92c650d008040d5c3a704694bda2fde08d36b5aede03b0f5ae15f12938e9c'}]}, 'timestamp': '2025-12-06 09:47:06.231682', '_unique_id': 'cb4a712f36764e7f89d6d29b9b65d25a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.232 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.234 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.275 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.276 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca17492f-4922-4e05-afe8-0f928d214863', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:06.234496', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '86856da2-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.484004242, 'message_signature': 'fb6dd60d85eafabccbab3a66f3283928d6e2fd83a47cc82d195a6e772a743b15'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:06.234496', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8685876a-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.484004242, 'message_signature': '5a9fff3b3cd3305826402cdabed95c2161bbe227dcef7e249a128e0d681ac659'}]}, 'timestamp': '2025-12-06 09:47:06.277183', '_unique_id': 'c7a4ea250d454aa3bfcc142e9b121718'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.278 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.280 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.280 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42317a8e-ce9d-4c81-8a73-8215765f7a43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:06.280475', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '86861f9a-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.469142909, 'message_signature': '9eee8776d8f46eeb944ffe159b15c1079527b8285ee34cd103e3f7af18921b53'}]}, 'timestamp': '2025-12-06 09:47:06.281062', '_unique_id': 'c1fc5c29f4c341afa0e5d6a0c624f119'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.282 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.283 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.283 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.283 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.284 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.284 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 73908224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.285 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd644106-91d5-43d6-a21f-0087d09d9088', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73908224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:06.284520', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8686c0da-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.484004242, 'message_signature': '94a8812307e190162568288f41a43ea5bb83b38e8040e311b42613450ce51bb5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:06.284520', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8686d3b8-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.484004242, 'message_signature': '1b57b05e75eeea79e59279f28dbfef2dadd01499c5abea79394e9f8616752059'}]}, 'timestamp': '2025-12-06 09:47:06.285651', '_unique_id': '1db1ee2412214f91b08d4c2b1ecfc5f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.286 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.288 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.302 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.302 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bb1aa47-2716-4408-b5b4-0605aca2ca9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:06.288792', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '868973d4-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.538337013, 'message_signature': 'b22e2ded19e5ba3c14c2c1c9801b3e2f074b63ed4539d4f3655ff3e330c650f0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:06.288792', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '86898aae-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.538337013, 'message_signature': 'a50bc2378cf77b1d5a858772eaa319b77c467c6355b1b57db1f571e7020284f5'}]}, 'timestamp': '2025-12-06 09:47:06.303390', '_unique_id': '1a184208b3ff41deb36ea0eeb73be0f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.304 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.305 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.306 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1043514478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.306 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 200503964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70490da6-62dc-4c90-ba1f-e394f95114b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1043514478, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:06.306063', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '868a070e-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.484004242, 'message_signature': '26160b809231fae25f766a4535df79ce6db2ac6434fff7bb57f9b012b97c1aa5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 200503964, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:06.306063', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '868a1aaa-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.484004242, 'message_signature': '3d7aa37ac7098c5d216b9f49ecfb90619d5778016be2d91df1701566ee8ec54e'}]}, 'timestamp': '2025-12-06 09:47:06.307072', '_unique_id': 'e55fb5b7ffb042e895ff8f0fbb89a513'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.308 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.309 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.309 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.310 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ad7342b-8fcc-4a1f-8dee-a594c88169cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:06.309662', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '868a94c6-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.484004242, 'message_signature': '01f353cbc80918f180f636f67b5f23eda1aebc70474d897a72817f798fe3094b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:06.309662', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '868aa77c-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.484004242, 'message_signature': '525661498fe6a439eb97103efbf9700f078b28b7971bc6880267902adf2a994d'}]}, 'timestamp': '2025-12-06 09:47:06.310668', '_unique_id': '3bab024789734f80acec495d2608953b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.311 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.313 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.313 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1315b5f-41ba-4435-9d38-11636c0d8a9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:06.313172', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '868b1cac-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.469142909, 'message_signature': 'b0994697493d077cac2a6ae12618ab9719098c5e76184dc67abf12344a88cbb4'}]}, 'timestamp': '2025-12-06 09:47:06.313731', '_unique_id': 'b18f60de58d8436baaf93a579bd18c97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.314 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.316 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.316 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e5a2b17-a87d-45a2-90b7-d65b09b8ac91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:06.316169', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '868b9196-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.469142909, 'message_signature': '8217b55648c0850ebe34be4cdeb3ec2e4e72eebec652c1fadb861fb6a6d2ceb8'}]}, 'timestamp': '2025-12-06 09:47:06.316692', '_unique_id': 'f2fcccdf00704d968d9849c6324fb2a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.317 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.319 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.319 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62ab4f5d-a291-4851-9550-120e4013a0bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:06.319165', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '868c05ea-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.469142909, 'message_signature': '065d0a908392f84c2b77b5b818e4886d1ab69d1f432f96fe750d20248cfad22a'}]}, 'timestamp': '2025-12-06 09:47:06.319653', '_unique_id': '4ace43eee3e6418bae23c4ee58952043'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.320 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.322 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 223261606 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.322 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 30984668 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b8c5327-dffa-4968-9467-de33ab605afc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 223261606, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:06.321998', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '868c73fe-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.484004242, 'message_signature': '42723c99cffe9fbd5f204a95eafee9f14db9fefd1a3352568be180ce09655019'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30984668, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:06.321998', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '868c8ad8-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.484004242, 'message_signature': '9e6c45cc6a5c9349877974cb13febdac501abbac3fe8a9de875ff47fce566ef8'}]}, 'timestamp': '2025-12-06 09:47:06.323036', '_unique_id': 'fe29a9faa6004ab7bd284a390c3fea83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.324 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.325 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.325 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c79d3a4-8f7f-45c3-8c28-6b01301d92ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:06.325423', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '868cfbf8-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.469142909, 'message_signature': '8cc8eadf83ae49f8fcdd991119e3fdc5abee450eaecd45d2207d552718c17072'}]}, 'timestamp': '2025-12-06 09:47:06.325988', '_unique_id': 'ad2c7b755727463fbb5985b0c8c50ff3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.326 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.328 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.328 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b6f6247-2b01-439b-adc2-46e365c0bf2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:06.328277', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '868d6a16-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.469142909, 'message_signature': 'c95bcf864ceb531f80fe370dd8d12633346c2973f2695f8f1809bfe8d46b9ce4'}]}, 'timestamp': '2025-12-06 09:47:06.328814', '_unique_id': '9d362290e72b4e4991355934fe4e7ee9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.329 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.330 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.331 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.331 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.331 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.331 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.332 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4674df87-cc83-4ad7-a774-f5838974aef2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:06.331634', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '868dede2-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.538337013, 'message_signature': 'e1600d33a79214c9c69f5fe5f8fd32cff7dcdcc98c60c450c8fac956bf7e1627'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:06.331634', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '868e007a-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.538337013, 'message_signature': '9125e3213bdba1a57e4a81da0c2f32421c7dd7fdb75238b6ece4d77ecb33a4c0'}]}, 'timestamp': '2025-12-06 09:47:06.332590', '_unique_id': 'a63e710eaa3f4febafe1acecffc6781f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.333 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.334 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.334 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.335 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9b432f8-0af9-426c-9cbd-07177162b7b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:06.334874', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '868e6b0a-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.538337013, 'message_signature': '15a035843f7d818baf9bed7b0a1deca5b4d50d4a3ec82653d5eb9ffa71a27b81'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:06.334874', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '868e7da2-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.538337013, 'message_signature': '1885bb9644f4421914489c662d32e2c4256853fc2442f95b7d56f0aa85cc8771'}]}, 'timestamp': '2025-12-06 09:47:06.335832', '_unique_id': '3b07c52e5f1f42a299c9f7e921f6900a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.336 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.338 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.338 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '929ab98c-9424-4e7c-99c9-b94faba653fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:06.338146', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '868eed00-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.469142909, 'message_signature': '7f3af896d316f2a4f88b986f0a5297b14449b1626ba4ce222af200558806395d'}]}, 'timestamp': '2025-12-06 09:47:06.338692', '_unique_id': '129f1bac19854712b0b80f5d2c4fbd36'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.339 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.340 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.341 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 9015 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7417c945-1e2d-4d31-83be-f4d99fc99aef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9015, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:06.340981', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '868f59b6-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.469142909, 'message_signature': '6db924158fa550c3cbd26dbf1fc0304aff301e780d9765d3be66ab7e824a8c70'}]}, 'timestamp': '2025-12-06 09:47:06.341455', '_unique_id': '05555a75d21d4e9b96c1f8ba54d19cfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.342 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.343 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.343 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.344 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.344 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.344 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be8fa5ef-171c-45fc-ba11-302cdf301ef7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:06.344427', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '868fdf3a-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.484004242, 'message_signature': '96cc563d113516c948999c1ba965702309788787fe661d6c4f421c41ed8d3a82'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:06.344427', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '868fea98-d288-11f0-9b72-fa163e118844', 'monotonic_time': 10844.484004242, 'message_signature': '5022b0bf31103c112fcb22cd96c6a827d578163251fa29699d214dbcbfa2ab44'}]}, 'timestamp': '2025-12-06 09:47:06.345055', '_unique_id': '2bb165262c5c46fcb08e6dda14372c4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.345 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:06 np0005548789.localdomain virtqemud[203911]: End of file while reading data: Input/output error
Dec 06 09:47:06 np0005548789.localdomain virtqemud[203911]: End of file while reading data: Input/output error
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238148]: 2025-12-06 09:47:06.354 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Dec 06 09:47:06 np0005548789.localdomain systemd[1]: libpod-bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.scope: Deactivated successfully.
Dec 06 09:47:06 np0005548789.localdomain podman[238298]: 2025-12-06 09:47:06.49220996 +0000 UTC m=+0.736935938 container died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 09:47:06 np0005548789.localdomain systemd[1]: libpod-bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.scope: Consumed 1.301s CPU time.
Dec 06 09:47:06 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.timer: Deactivated successfully.
Dec 06 09:47:06 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:47:06 np0005548789.localdomain systemd[1]: tmp-crun.wRwsI2.mount: Deactivated successfully.
Dec 06 09:47:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c6a3f0d73814bb22777000701e0d23d3cb3ada68f96a18b4fb977a15c11f67d2-merged.mount: Deactivated successfully.
Dec 06 09:47:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094-userdata-shm.mount: Deactivated successfully.
Dec 06 09:47:06 np0005548789.localdomain podman[238298]: 2025-12-06 09:47:06.555989321 +0000 UTC m=+0.800715259 container cleanup bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:47:06 np0005548789.localdomain podman[238298]: ceilometer_agent_compute
Dec 06 09:47:06 np0005548789.localdomain podman[238325]: 2025-12-06 09:47:06.652630129 +0000 UTC m=+0.064423653 container cleanup bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 09:47:06 np0005548789.localdomain podman[238325]: ceilometer_agent_compute
Dec 06 09:47:06 np0005548789.localdomain systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Dec 06 09:47:06 np0005548789.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Dec 06 09:47:06 np0005548789.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Dec 06 09:47:06 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:47:06 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6a3f0d73814bb22777000701e0d23d3cb3ada68f96a18b4fb977a15c11f67d2/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 06 09:47:06 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6a3f0d73814bb22777000701e0d23d3cb3ada68f96a18b4fb977a15c11f67d2/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 06 09:47:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:47:06 np0005548789.localdomain podman[238337]: 2025-12-06 09:47:06.830543344 +0000 UTC m=+0.144356237 container init bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm)
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: + sudo -E kolla_set_configs
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: sudo: unable to send audit message: Operation not permitted
Dec 06 09:47:06 np0005548789.localdomain sudo[238357]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 09:47:06 np0005548789.localdomain sudo[238357]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:47:06 np0005548789.localdomain sudo[238357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 06 09:47:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:47:06 np0005548789.localdomain podman[238337]: 2025-12-06 09:47:06.869250007 +0000 UTC m=+0.183062910 container start bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 09:47:06 np0005548789.localdomain podman[238337]: ceilometer_agent_compute
Dec 06 09:47:06 np0005548789.localdomain systemd[1]: Started ceilometer_agent_compute container.
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Validating config file
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Copying service configuration files
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: INFO:__main__:Writing out command to execute
Dec 06 09:47:06 np0005548789.localdomain sudo[238286]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:06 np0005548789.localdomain sudo[238357]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: ++ cat /run_command
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: + ARGS=
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: + sudo kolla_copy_cacerts
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: sudo: unable to send audit message: Operation not permitted
Dec 06 09:47:06 np0005548789.localdomain sudo[238372]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 06 09:47:06 np0005548789.localdomain sudo[238372]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:47:06 np0005548789.localdomain sudo[238372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 06 09:47:06 np0005548789.localdomain sudo[238372]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: + [[ ! -n '' ]]
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: + . kolla_extend_start
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: + umask 0022
Dec 06 09:47:06 np0005548789.localdomain ceilometer_agent_compute[238351]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 06 09:47:06 np0005548789.localdomain podman[238360]: 2025-12-06 09:47:06.9761374 +0000 UTC m=+0.098118865 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 09:47:07 np0005548789.localdomain podman[238360]: 2025-12-06 09:47:07.005354441 +0000 UTC m=+0.127335866 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute)
Dec 06 09:47:07 np0005548789.localdomain podman[238360]: unhealthy
Dec 06 09:47:07 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:47:07 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Failed with result 'exit-code'.
Dec 06 09:47:07 np0005548789.localdomain sudo[238389]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:47:07 np0005548789.localdomain sudo[238389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:47:07 np0005548789.localdomain sudo[238389]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:07 np0005548789.localdomain sudo[238415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:47:07 np0005548789.localdomain sudo[238415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:47:07 np0005548789.localdomain sudo[238524]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kicmrhhklwytxmltpedglwzaxhiyqddp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014427.103299-1545-178586005545684/AnsiballZ_stat.py
Dec 06 09:47:07 np0005548789.localdomain sudo[238524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:07 np0005548789.localdomain python3.9[238538]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:07 np0005548789.localdomain sudo[238524]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58231 DF PROTO=TCP SPT=46606 DPT=9102 SEQ=1202293018 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DB51EF0000000001030307) 
Dec 06 09:47:07 np0005548789.localdomain sudo[238415]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.728 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.729 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.729 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.729 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.729 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.729 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.729 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.729 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.729 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.729 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.729 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.729 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.730 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.730 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.730 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.730 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005548789.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.730 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.730 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.730 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.730 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.730 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.730 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.730 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.730 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.731 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.731 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.731 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.731 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.731 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.731 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.731 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.731 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.731 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.731 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.731 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.731 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.731 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.732 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.732 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.732 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.732 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.732 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.732 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.732 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.732 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.732 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.732 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.732 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.732 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.733 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.733 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.733 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.733 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.733 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.733 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.733 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.733 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.733 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.733 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.733 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.733 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.733 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.734 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.734 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.734 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.734 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.734 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.734 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.734 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.734 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.734 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.734 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.734 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.734 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.735 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.735 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.735 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.735 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.735 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.735 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.735 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.735 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.735 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.735 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.735 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.735 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.735 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.736 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.736 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.736 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.736 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.736 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.736 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.736 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.736 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.736 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.736 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.736 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.736 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.737 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.737 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.737 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.737 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.737 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.737 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.737 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.737 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.737 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.737 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.737 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.737 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.738 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.738 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.738 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.738 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.738 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.738 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.738 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.738 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.738 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.738 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.738 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.738 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.739 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.739 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.739 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.739 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.739 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.739 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.739 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.739 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.739 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.739 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.739 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.739 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.739 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.740 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.740 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.740 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.740 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.740 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.740 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.740 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.740 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.740 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.740 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.740 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.740 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.741 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.741 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.741 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.741 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.741 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.741 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.741 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.741 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.741 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.741 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.741 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.741 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.741 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.741 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.742 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.742 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.742 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.758 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.759 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.759 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.767 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.865 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.865 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.865 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.865 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.865 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.865 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.866 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.866 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.866 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.866 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.866 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.866 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.866 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.867 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.867 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.867 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.867 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005548789.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.867 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.867 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.867 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.867 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.868 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.868 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.868 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.868 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.868 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.868 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.868 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.868 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.869 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.869 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.869 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.869 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.869 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.869 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.869 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.869 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.870 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.870 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.870 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.870 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.870 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.870 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.870 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.870 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.871 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.871 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.871 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.871 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.871 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.871 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.871 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.871 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.872 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.872 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.872 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.872 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.872 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.872 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.872 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.872 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.873 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.873 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.873 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.873 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.873 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.873 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.873 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.874 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.874 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.874 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.874 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.874 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.874 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.874 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.874 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.875 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.875 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.875 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.875 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.875 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.875 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.875 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.875 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.876 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.876 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.876 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.876 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.876 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.876 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.876 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.876 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.877 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.877 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.877 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.877 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.877 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.877 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.877 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.877 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.878 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.878 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.878 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.878 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.878 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.878 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.878 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.878 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.879 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.879 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.879 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.879 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.879 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.879 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.879 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.879 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.880 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.880 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.880 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.880 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.880 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.881 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.881 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.881 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.881 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.881 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.881 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.882 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.882 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.882 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.882 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.882 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.882 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.882 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.882 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.883 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.883 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.883 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.883 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.883 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.883 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.883 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.883 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.886 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.886 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.886 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.886 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.886 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.886 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.886 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.886 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.887 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.887 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.887 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.887 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.887 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.887 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.887 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.887 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.888 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.888 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.888 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.888 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.888 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.888 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.888 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.888 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.888 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.891 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.891 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.891 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.891 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.891 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.891 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.891 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.892 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.892 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.892 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.892 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.892 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.892 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.892 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.892 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.893 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.893 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.893 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.897 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Dec 06 09:47:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:07.905 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 06 09:47:07 np0005548789.localdomain sudo[238649]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojwfmpnlvyyiaecumahpxobtevakqzwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014427.103299-1545-178586005545684/AnsiballZ_copy.py
Dec 06 09:47:07 np0005548789.localdomain sudo[238649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:08 np0005548789.localdomain python3.9[238651]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014427.103299-1545-178586005545684/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:47:08 np0005548789.localdomain sudo[238649]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.224 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}172727ab3d6cbaaa6568bf97b3413b148d1c73ec77830424529759a582bd30ea" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.285 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Sat, 06 Dec 2025 09:47:08 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-87316794-dae6-4ab1-b170-88590f7f7e0e x-openstack-request-id: req-87316794-dae6-4ab1-b170-88590f7f7e0e _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.286 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "3b9dcd46-fa1b-4714-ba2b-665da2f67af6", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.286 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-87316794-dae6-4ab1-b170-88590f7f7e0e request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.288 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}172727ab3d6cbaaa6568bf97b3413b148d1c73ec77830424529759a582bd30ea" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.312 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Sat, 06 Dec 2025 09:47:08 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-82509171-38d3-47a9-be25-867586b759f2 x-openstack-request-id: req-82509171-38d3-47a9-be25-867586b759f2 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.312 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "3b9dcd46-fa1b-4714-ba2b-665da2f67af6", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.312 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6 used request id req-82509171-38d3-47a9-be25-867586b759f2 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.314 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.315 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.338 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 49850000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ee337f8-05b5-403d-87a7-4d467303c798', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 49850000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T09:47:08.315494', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '87c02c2a-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.587113354, 'message_signature': '141083752a6e378674e978b44e8a6db2cd113b9e946aa6f7aff6e3df5b0b52f9'}]}, 'timestamp': '2025-12-06 09:47:08.339291', '_unique_id': '043ec992201248f7af02ff2b1b68a493'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.347 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.350 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.354 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b7ed0a2e-9350-4933-9334-4e5e08d3e6aa / tap86fc0b7a-fb inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.354 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6184e40-16ad-45dc-a899-1dac164651b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:08.350900', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '87c298de-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.600412848, 'message_signature': '9724f294b3e72005f3fb28933cae8a0e6465eda485670d43920e77fcd5154705'}]}, 'timestamp': '2025-12-06 09:47:08.355097', '_unique_id': 'a690185a67aa41ef9f8230197a675c28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.356 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.357 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.357 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd840138-50ab-471a-8b69-065e3b8d4863', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:08.357739', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '87c316d8-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.600412848, 'message_signature': '6cae56bf076e2972b3435578c92164e131e6269695daeb69fa58fa69c4de3899'}]}, 'timestamp': '2025-12-06 09:47:08.358259', '_unique_id': '2e547c75670f42498072442690810c68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.359 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.360 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.360 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '729db4e5-66cb-4301-855c-a095ddce4baf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:08.360588', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '87c38690-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.600412848, 'message_signature': '2272b30d772f4d0d04446c172ac027f2d0f482fb0e3b18cbbb8870bc45c2bf0b'}]}, 'timestamp': '2025-12-06 09:47:08.361142', '_unique_id': '5e0c99b8d4a440fa84e66daf0139b44b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.362 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.363 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.363 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.363 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.364 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.401 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.402 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '254a3614-647c-4912-8913-76e94bd518ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:08.364318', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '87c9cd02-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.613815705, 'message_signature': 'e1d1311fef3ccce7a2ba31972cab699af31dbfd6a8e5e71d5541f082de981e4d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:08.364318', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '87c9df7c-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.613815705, 'message_signature': '0b78166d14b7caa44b0ab039cadd1a9dffe944990061def47852c22ca3b9642c'}]}, 'timestamp': '2025-12-06 09:47:08.402693', '_unique_id': '00eeceb18d8b4529b777b4ddfd08ac93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.404 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.405 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.405 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06759330-a8c3-40b3-acf7-4623dce8931c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:08.405665', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '87ca66c2-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.600412848, 'message_signature': 'ea29b49058a6bd1787702e5d19896a5b26a49da5e3055f8bbb35cb868bb08a08'}]}, 'timestamp': '2025-12-06 09:47:08.406182', '_unique_id': '5ab7cfedbe9d4cffa6c1819860dd0f1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.407 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.408 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.424 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.425 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cffeb91b-cfed-4064-88b2-1db573ec9e77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:08.408472', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '87cd45ea-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.658099055, 'message_signature': '3c70c524c9d202c3c28e03d9b322221dfd045275a70161a0564d93bccb038b27'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:08.408472', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '87cd614c-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.658099055, 'message_signature': '5c7c838ab57eba479a671623e32a962902627cfdab2a8eb375c977c189a05987'}]}, 'timestamp': '2025-12-06 09:47:08.425828', '_unique_id': '7e1b3b0291154693a7e36076eabc2bb5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.427 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.429 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.429 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.429 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.430 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.430 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1e8ac93-5aea-4d95-8dfb-5fca48559a3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:08.430216', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '87ce28c0-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.600412848, 'message_signature': '1c404a1481d74595c317e90ffda926e16b7ce5240ed56a70cf78a6c84d42b491'}]}, 'timestamp': '2025-12-06 09:47:08.430963', '_unique_id': '4a6201e1523146989debaed950930891'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.432 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.433 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.434 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 9015 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38913422-47e2-4a2c-a9bf-f9d29bb66abb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9015, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:08.434187', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '87cec406-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.600412848, 'message_signature': '9f27ed3882202641510e8cba8cbe290e8fc2996935b995f8408db66d502d1248'}]}, 'timestamp': '2025-12-06 09:47:08.434936', '_unique_id': '4055b036c2014b0fb5b4d8b569b02569'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.436 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.437 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.438 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d764d1d-33d0-4218-9206-1eca67dd8549', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:08.438150', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '87cf5ea2-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.600412848, 'message_signature': '79e12d15840017e4edb84c9d72765c1fdc1a64659286074a9e925fc0dd4c1596'}]}, 'timestamp': '2025-12-06 09:47:08.438893', '_unique_id': 'bd907a99120e4cba827c9f17db4a5b7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.440 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.442 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.442 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6f2f3ab-10c1-4877-b9ec-51594d7ca77f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:08.442582', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '87d01068-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.600412848, 'message_signature': '3baa851caed6f1a7247b65bd2b0c3a6ac486d1ae6bf3731127ce213407860660'}]}, 'timestamp': '2025-12-06 09:47:08.443444', '_unique_id': 'c69e1af819a042ff8b9c1bb75368294c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.444 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.445 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.445 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 223261606 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.445 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 30984668 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b0df14a-c349-4dcc-ba41-cd7f6cb7e74b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 223261606, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:08.445536', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '87d07a26-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.613815705, 'message_signature': '4871a9dd07ffc001fc9035d9564101dc09d88727efa47468d9bca8e39e1adbd3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30984668, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:08.445536', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '87d08aac-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.613815705, 'message_signature': '656b88e09e263ae42b7964b32d9e86f6d7e3abcd7afb16dabe4cc096067d2b64'}]}, 'timestamp': '2025-12-06 09:47:08.446371', '_unique_id': '8361f74a8c89462489af002afaf49d93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.447 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.448 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.448 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 52.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'edd27b9b-7080-4d53-9e89-4863f26aa27a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.37890625, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T09:47:08.448412', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '87d0ea2e-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.587113354, 'message_signature': 'fa4a152b247ae6d4967f2b9701b8416a8bc8d9e15ec60b9a84db020bc9a15700'}]}, 'timestamp': '2025-12-06 09:47:08.448843', '_unique_id': '1dad26a12b3d4822bbfbb5c482402937'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.449 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.450 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.450 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.451 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f325470c-e279-4663-b32d-5bceea011757', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:08.450900', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '87d14c08-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.658099055, 'message_signature': '23617741c5d38c3f00a2556fe8477a59d52d4625c0f0d099d84601a08f644ee4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:08.450900', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '87d15c20-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.658099055, 'message_signature': '2d8806066c7bca53bb0e0cf0c4cc4e555f3c2daf25f427eb01a12dfe50e96a2a'}]}, 'timestamp': '2025-12-06 09:47:08.451690', '_unique_id': '60493257326741da844de1837d9900d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain sudo[238669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.452 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.453 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.453 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.453 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.453 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.453 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1043514478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.453 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 200503964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e50cea33-9d6a-4437-9ff0-118fcd140b15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1043514478, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:08.453436', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '87d1ac02-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.613815705, 'message_signature': '96906f12f50b7fff9d1723bcba0a624e1ff69b9cf472736de27ad39dd412a53d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 200503964, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:08.453436', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '87d1b706-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.613815705, 'message_signature': '7a847846b9fa09acd6b674d6c3ad56d15381563ade9dddaf338d53a382353f9a'}]}, 'timestamp': '2025-12-06 09:47:08.453991', '_unique_id': '6c4368177e2246449e4f6aa9f13a9643'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.454 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.455 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.455 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b52f1a4-e59d-42a4-9e4f-4d4303c9d88c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:08.455331', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '87d1f6d0-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.600412848, 'message_signature': 'afc05951899ed3aa62e463000169ecf8e169cd05c2e1693dd1b8223794d5eee2'}]}, 'timestamp': '2025-12-06 09:47:08.455642', '_unique_id': 'c5f16cd6faff4003974eae47d8cdf074'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain sudo[238669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.456 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.457 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.457 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain sudo[238669]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9df690ff-4ab8-4973-bbe3-fe5e22483b3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:08.457004', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '87d23712-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.613815705, 'message_signature': '450f9701a8b1fc7f96558206a00588d141229dee53ead8d3e944cde1acafd95c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:08.457004', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '87d240e0-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.613815705, 'message_signature': '8d5c15e804d98f1281c6f79ef01d48d33a9a2a1aeab04c57769a7e12ace72247'}]}, 'timestamp': '2025-12-06 09:47:08.457520', '_unique_id': '783e7ae0f8094428beec6e772df680be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.458 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.459 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.459 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f55e46d-722d-4f38-a0df-386c794f591d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:47:08.459140', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '87d28abe-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.600412848, 'message_signature': '7875dfc9bd14af5dc33ad2ec7d9897560e7557c967633f035a032d7352414dcb'}]}, 'timestamp': '2025-12-06 09:47:08.459485', '_unique_id': '6b828fa2b006425ab7e64972b3560875'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.460 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.461 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.461 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.461 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58af9a58-7b6f-4c57-9596-145823c9aee5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:08.461204', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '87d2db36-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.613815705, 'message_signature': 'b71161818da20df873dbd485bf049b0c261dd18fd6096f7bea9f646c626e9c58'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:08.461204', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '87d2e536-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.613815705, 'message_signature': 'b708a621971869605a253ea72424fd6f6b32e5f3f7cc3c83d0d8daed8274f24e'}]}, 'timestamp': '2025-12-06 09:47:08.461725', '_unique_id': '2ab02862a7d14f968cdb13009e1ceada'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.462 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.463 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.463 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.463 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '706eb7a4-138f-4aa5-b295-73c688676eb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:08.463327', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '87d32f28-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.658099055, 'message_signature': '27bb5d4cd6b5d8eaa3810f21a5f2d0ede3d948d6d955d6c3b91b71726647bae8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:08.463327', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '87d33950-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.658099055, 'message_signature': 'b6015ea64bcdda8bbc1a55d27a743e1eb721ea45f7842c84016901a4fe2b8826'}]}, 'timestamp': '2025-12-06 09:47:08.463896', '_unique_id': '178e33c67a49493ea3a7dbd4c363431c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.464 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.465 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.465 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 73908224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.465 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd666734d-860b-42cd-9f7d-21f35447f0b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73908224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:47:08.465320', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '87d37be0-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.613815705, 'message_signature': 'f745f8184ee388da9096fb44fbaed756b25dc0939c668af9afcaf3f0ea6b13f3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:47:08.465320', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '87d385ea-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10846.613815705, 'message_signature': 'e3f4e83116a999c16d1507c59e35812faf4289a22e11dffb8f7df6966d43946d'}]}, 'timestamp': '2025-12-06 09:47:08.465868', '_unique_id': '793b3e197c4542f19725c50e9789ddec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:47:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:47:08.466 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:47:08 np0005548789.localdomain sudo[238777]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzghodrbubnisrljsedddbcbmhlygsrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014428.575961-1596-56872680104276/AnsiballZ_container_config_data.py
Dec 06 09:47:08 np0005548789.localdomain sudo[238777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:09 np0005548789.localdomain python3.9[238779]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Dec 06 09:47:09 np0005548789.localdomain sudo[238777]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:09 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:09.123 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:09 np0005548789.localdomain sshd[238780]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:47:10 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:10.272 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8781 DF PROTO=TCP SPT=56060 DPT=9102 SEQ=3650887006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DB5E2F0000000001030307) 
Dec 06 09:47:10 np0005548789.localdomain sudo[238889]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfqdvlrxrtmkuobtpfjnwcrauucgsqvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014430.5013742-1623-144808541377689/AnsiballZ_container_config_hash.py
Dec 06 09:47:10 np0005548789.localdomain sudo[238889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:11 np0005548789.localdomain python3.9[238891]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:47:11 np0005548789.localdomain sudo[238889]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:11 np0005548789.localdomain sshd[238780]: Received disconnect from 64.227.156.63 port 59186:11: Bye Bye [preauth]
Dec 06 09:47:11 np0005548789.localdomain sshd[238780]: Disconnected from authenticating user root 64.227.156.63 port 59186 [preauth]
Dec 06 09:47:12 np0005548789.localdomain sudo[238999]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibnnducnlrydtzbgydyoasepxrlblhgi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014431.9453373-1653-170897170015967/AnsiballZ_edpm_container_manage.py
Dec 06 09:47:12 np0005548789.localdomain sudo[238999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:12 np0005548789.localdomain python3[239001]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:47:12 np0005548789.localdomain podman[239037]: 
Dec 06 09:47:12 np0005548789.localdomain podman[239037]: 2025-12-06 09:47:12.774855856 +0000 UTC m=+0.076347962 container create d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible)
Dec 06 09:47:12 np0005548789.localdomain podman[239037]: 2025-12-06 09:47:12.741949118 +0000 UTC m=+0.043441214 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Dec 06 09:47:12 np0005548789.localdomain python3[239001]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Dec 06 09:47:12 np0005548789.localdomain sudo[238999]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:13 np0005548789.localdomain sudo[239183]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpigjvefqtotycrewczledjgrglmszds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014433.1767595-1677-71744956987306/AnsiballZ_stat.py
Dec 06 09:47:13 np0005548789.localdomain sudo[239183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:13 np0005548789.localdomain python3.9[239185]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:47:13 np0005548789.localdomain sudo[239183]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:14 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:14.147 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38770 DF PROTO=TCP SPT=42688 DPT=9101 SEQ=1025857698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DB6BB00000000001030307) 
Dec 06 09:47:14 np0005548789.localdomain sudo[239295]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-diqlzbqfuitiayljuvfveqcplrhaxopl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014434.2329168-1704-226813373159224/AnsiballZ_file.py
Dec 06 09:47:14 np0005548789.localdomain sudo[239295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:14 np0005548789.localdomain python3.9[239297]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:14 np0005548789.localdomain sudo[239295]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:15 np0005548789.localdomain sudo[239404]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gekvxampakjxeamaoxiqyicggrqnrrjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014434.794463-1704-153090461926392/AnsiballZ_copy.py
Dec 06 09:47:15 np0005548789.localdomain sudo[239404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:15 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:15.308 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:15 np0005548789.localdomain python3.9[239406]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014434.794463-1704-153090461926392/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:15 np0005548789.localdomain sudo[239404]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:15 np0005548789.localdomain sudo[239459]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqswpstckkyaeshgghksxgobgztglesb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014434.794463-1704-153090461926392/AnsiballZ_systemd.py
Dec 06 09:47:15 np0005548789.localdomain sudo[239459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:16 np0005548789.localdomain python3.9[239461]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:47:16 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:47:16 np0005548789.localdomain systemd-sysv-generator[239488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:47:16 np0005548789.localdomain systemd-rc-local-generator[239483]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:47:16 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:47:16 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:16 np0005548789.localdomain sudo[239459]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:16 np0005548789.localdomain sudo[239550]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzkzmenmokugmldlhfuwytgwerdnhnis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014434.794463-1704-153090461926392/AnsiballZ_systemd.py
Dec 06 09:47:16 np0005548789.localdomain sudo[239550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:17 np0005548789.localdomain python3.9[239552]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:47:17 np0005548789.localdomain podman[239554]: 2025-12-06 09:47:17.184492217 +0000 UTC m=+0.105858043 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 06 09:47:17 np0005548789.localdomain podman[239554]: 2025-12-06 09:47:17.205058031 +0000 UTC m=+0.126423867 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 09:47:17 np0005548789.localdomain systemd-rc-local-generator[239596]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:47:17 np0005548789.localdomain systemd-sysv-generator[239600]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38772 DF PROTO=TCP SPT=42688 DPT=9101 SEQ=1025857698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DB77AF0000000001030307) 
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: Starting node_exporter container...
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:47:17 np0005548789.localdomain podman[239611]: 2025-12-06 09:47:17.608397975 +0000 UTC m=+0.143112118 container init d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.625Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.625Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.625Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.625Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.626Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.626Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.626Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.626Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=arp
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=bcache
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=bonding
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=cpu
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=edac
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=filefd
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=netclass
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=netdev
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=netstat
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=nfs
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=nvme
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=softnet
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=systemd
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=xfs
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.627Z caller=node_exporter.go:117 level=info collector=zfs
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.628Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 06 09:47:17 np0005548789.localdomain node_exporter[239626]: ts=2025-12-06T09:47:17.628Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:47:17 np0005548789.localdomain podman[239611]: 2025-12-06 09:47:17.643352879 +0000 UTC m=+0.178066982 container start d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:47:17 np0005548789.localdomain podman[239611]: node_exporter
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: Started node_exporter container.
Dec 06 09:47:17 np0005548789.localdomain sudo[239550]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:17 np0005548789.localdomain podman[239635]: 2025-12-06 09:47:17.772744899 +0000 UTC m=+0.122630156 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:47:17 np0005548789.localdomain podman[239635]: 2025-12-06 09:47:17.784174783 +0000 UTC m=+0.134060030 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:47:17 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:47:18 np0005548789.localdomain sshd[239676]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:47:19 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:19.190 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:19 np0005548789.localdomain sudo[239768]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvyeajkpfxohtwfixrtidgltcybrpmid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014438.9997904-1776-87954546963182/AnsiballZ_systemd.py
Dec 06 09:47:19 np0005548789.localdomain sudo[239768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:19 np0005548789.localdomain python3.9[239770]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:47:19 np0005548789.localdomain systemd[1]: Stopping node_exporter container...
Dec 06 09:47:19 np0005548789.localdomain systemd[1]: libpod-d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.scope: Deactivated successfully.
Dec 06 09:47:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51078 DF PROTO=TCP SPT=36754 DPT=9105 SEQ=1920759448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DB80EF0000000001030307) 
Dec 06 09:47:19 np0005548789.localdomain podman[239774]: 2025-12-06 09:47:19.733442006 +0000 UTC m=+0.075408373 container died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:47:19 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.timer: Deactivated successfully.
Dec 06 09:47:19 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:47:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538-userdata-shm.mount: Deactivated successfully.
Dec 06 09:47:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c91146eb3363a08d75f235168b658b230f8ccbe671d00a8efc84b46337c2ff5e-merged.mount: Deactivated successfully.
Dec 06 09:47:19 np0005548789.localdomain podman[239774]: 2025-12-06 09:47:19.781215857 +0000 UTC m=+0.123182224 container cleanup d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:47:19 np0005548789.localdomain podman[239774]: node_exporter
Dec 06 09:47:19 np0005548789.localdomain systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 06 09:47:19 np0005548789.localdomain podman[239800]: 2025-12-06 09:47:19.878099062 +0000 UTC m=+0.069036819 container cleanup d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:47:19 np0005548789.localdomain podman[239800]: node_exporter
Dec 06 09:47:19 np0005548789.localdomain systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Dec 06 09:47:19 np0005548789.localdomain systemd[1]: Stopped node_exporter container.
Dec 06 09:47:19 np0005548789.localdomain systemd[1]: Starting node_exporter container...
Dec 06 09:47:20 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:47:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:47:20 np0005548789.localdomain podman[239813]: 2025-12-06 09:47:20.045787842 +0000 UTC m=+0.136529718 container init d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.059Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.059Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.059Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.060Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.060Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.060Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.060Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.060Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=arp
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=bcache
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=bonding
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=cpu
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=edac
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=filefd
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=netclass
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=netdev
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=netstat
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=nfs
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=nvme
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=softnet
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=systemd
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.061Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.062Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.062Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.062Z caller=node_exporter.go:117 level=info collector=xfs
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.062Z caller=node_exporter.go:117 level=info collector=zfs
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.062Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 06 09:47:20 np0005548789.localdomain node_exporter[239828]: ts=2025-12-06T09:47:20.062Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Dec 06 09:47:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:47:20 np0005548789.localdomain podman[239813]: 2025-12-06 09:47:20.08024757 +0000 UTC m=+0.170989406 container start d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:47:20 np0005548789.localdomain podman[239813]: node_exporter
Dec 06 09:47:20 np0005548789.localdomain systemd[1]: Started node_exporter container.
Dec 06 09:47:20 np0005548789.localdomain sudo[239768]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:20 np0005548789.localdomain podman[239837]: 2025-12-06 09:47:20.169840232 +0000 UTC m=+0.084675147 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:47:20 np0005548789.localdomain podman[239837]: 2025-12-06 09:47:20.182681451 +0000 UTC m=+0.097516356 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:47:20 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:47:20 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:20.349 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:20 np0005548789.localdomain sudo[239967]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjduifpdezslfnmtwyrvtbbmdzpdyrff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014440.381052-1800-196194191017888/AnsiballZ_stat.py
Dec 06 09:47:20 np0005548789.localdomain sudo[239967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:20 np0005548789.localdomain python3.9[239969]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:20 np0005548789.localdomain sudo[239967]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:22 np0005548789.localdomain sudo[240055]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzwjpiyetxkpnzcvrbqeeoczrjiyoucc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014440.381052-1800-196194191017888/AnsiballZ_copy.py
Dec 06 09:47:22 np0005548789.localdomain sudo[240055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:22 np0005548789.localdomain sshd[239676]: Received disconnect from 179.33.210.213 port 41712:11: Bye Bye [preauth]
Dec 06 09:47:22 np0005548789.localdomain sshd[239676]: Disconnected from authenticating user root 179.33.210.213 port 41712 [preauth]
Dec 06 09:47:22 np0005548789.localdomain python3.9[240057]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014440.381052-1800-196194191017888/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:47:22 np0005548789.localdomain sudo[240055]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:23 np0005548789.localdomain sudo[240165]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqclqyxmrlepjdqjdqwwgetfvqfqduwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014443.0372932-1851-240490065089208/AnsiballZ_container_config_data.py
Dec 06 09:47:23 np0005548789.localdomain sudo[240165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:23 np0005548789.localdomain python3.9[240167]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Dec 06 09:47:23 np0005548789.localdomain sudo[240165]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51079 DF PROTO=TCP SPT=36754 DPT=9105 SEQ=1920759448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DB90AF0000000001030307) 
Dec 06 09:47:24 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:24.192 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:24 np0005548789.localdomain sudo[240275]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqmmyjioemeauvnednsqfxgapebqspzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014444.289155-1879-102797082271812/AnsiballZ_container_config_hash.py
Dec 06 09:47:24 np0005548789.localdomain sudo[240275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:24 np0005548789.localdomain python3.9[240277]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:47:24 np0005548789.localdomain sudo[240275]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:25 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:25.390 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:25 np0005548789.localdomain sudo[240385]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrmpkxegvgbpcfyckgrucimfatihqmav ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014445.2481897-1908-277278884339641/AnsiballZ_edpm_container_manage.py
Dec 06 09:47:25 np0005548789.localdomain sudo[240385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:25 np0005548789.localdomain python3[240387]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:47:26 np0005548789.localdomain sshd[240414]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:47:27 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15808 DF PROTO=TCP SPT=33906 DPT=9882 SEQ=2958895315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DB9D700000000001030307) 
Dec 06 09:47:27 np0005548789.localdomain sshd[240414]: Received disconnect from 14.194.101.210 port 38980:11: Bye Bye [preauth]
Dec 06 09:47:27 np0005548789.localdomain sshd[240414]: Disconnected from authenticating user root 14.194.101.210 port 38980 [preauth]
Dec 06 09:47:27 np0005548789.localdomain podman[240401]: 2025-12-06 09:47:25.915403834 +0000 UTC m=+0.045127678 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Dec 06 09:47:27 np0005548789.localdomain podman[240470]: 
Dec 06 09:47:28 np0005548789.localdomain podman[240470]: 2025-12-06 09:47:28.006954297 +0000 UTC m=+0.084881624 container create b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible)
Dec 06 09:47:28 np0005548789.localdomain podman[240470]: 2025-12-06 09:47:27.968552925 +0000 UTC m=+0.046480352 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Dec 06 09:47:28 np0005548789.localdomain python3[240387]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Dec 06 09:47:28 np0005548789.localdomain sudo[240385]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:28 np0005548789.localdomain sudo[240614]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okjcymsrxpexhmgvneklkbcrftfpedcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014448.353047-1932-68110652496455/AnsiballZ_stat.py
Dec 06 09:47:28 np0005548789.localdomain sudo[240614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:28 np0005548789.localdomain python3.9[240616]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:47:28 np0005548789.localdomain sudo[240614]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:29 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:29.239 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:29 np0005548789.localdomain sudo[240726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udjztigzqbiflskmjfxeogjatsmyhdpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014449.271278-1959-49400457727301/AnsiballZ_file.py
Dec 06 09:47:29 np0005548789.localdomain sudo[240726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38774 DF PROTO=TCP SPT=42688 DPT=9101 SEQ=1025857698 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DBA7EF0000000001030307) 
Dec 06 09:47:29 np0005548789.localdomain python3.9[240728]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:29 np0005548789.localdomain sudo[240726]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:30 np0005548789.localdomain sudo[240835]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eulxrvvefewgnyoemlpcdocosvikmhbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014449.8067646-1959-241526902827975/AnsiballZ_copy.py
Dec 06 09:47:30 np0005548789.localdomain sudo[240835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:47:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:47:30 np0005548789.localdomain podman[240839]: 2025-12-06 09:47:30.368704485 +0000 UTC m=+0.096968398 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true)
Dec 06 09:47:30 np0005548789.localdomain podman[240839]: 2025-12-06 09:47:30.377138304 +0000 UTC m=+0.105402257 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:47:30 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:47:30 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:30.435 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:30 np0005548789.localdomain python3.9[240837]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014449.8067646-1959-241526902827975/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:30 np0005548789.localdomain podman[240838]: 2025-12-06 09:47:30.462640167 +0000 UTC m=+0.190600010 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:47:30 np0005548789.localdomain sudo[240835]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:30 np0005548789.localdomain podman[240838]: 2025-12-06 09:47:30.495306118 +0000 UTC m=+0.223265941 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 09:47:30 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:47:30 np0005548789.localdomain sudo[240931]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qesnsaoffxpucbedelszzayxdwavyasz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014449.8067646-1959-241526902827975/AnsiballZ_systemd.py
Dec 06 09:47:30 np0005548789.localdomain sudo[240931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:31 np0005548789.localdomain python3.9[240933]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:47:31 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:47:31 np0005548789.localdomain systemd-rc-local-generator[240954]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:47:31 np0005548789.localdomain systemd-sysv-generator[240957]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:47:31 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:47:31 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:31 np0005548789.localdomain sudo[240931]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:31 np0005548789.localdomain sudo[241022]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdxrdmtcezpgoyznmovxdxelaucappqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014449.8067646-1959-241526902827975/AnsiballZ_systemd.py
Dec 06 09:47:31 np0005548789.localdomain sudo[241022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:31 np0005548789.localdomain python3.9[241024]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:47:32 np0005548789.localdomain systemd-rc-local-generator[241054]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:47:32 np0005548789.localdomain systemd-sysv-generator[241057]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:47:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51080 DF PROTO=TCP SPT=36754 DPT=9105 SEQ=1920759448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DBB1EF0000000001030307) 
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: Starting podman_exporter container...
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:47:32 np0005548789.localdomain podman[241065]: 2025-12-06 09:47:32.48757635 +0000 UTC m=+0.128479003 container init b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:47:32 np0005548789.localdomain podman_exporter[241078]: ts=2025-12-06T09:47:32.501Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 06 09:47:32 np0005548789.localdomain podman_exporter[241078]: ts=2025-12-06T09:47:32.501Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 06 09:47:32 np0005548789.localdomain podman_exporter[241078]: ts=2025-12-06T09:47:32.501Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 06 09:47:32 np0005548789.localdomain podman_exporter[241078]: ts=2025-12-06T09:47:32.501Z caller=handler.go:105 level=info collector=container
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: Starting Podman API Service...
Dec 06 09:47:32 np0005548789.localdomain podman[241065]: 2025-12-06 09:47:32.522080208 +0000 UTC m=+0.162982871 container start b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:47:32 np0005548789.localdomain podman[241065]: podman_exporter
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: Started Podman API Service.
Dec 06 09:47:32 np0005548789.localdomain systemd[1]: Started podman_exporter container.
Dec 06 09:47:32 np0005548789.localdomain sudo[241022]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:32 np0005548789.localdomain podman[241090]: time="2025-12-06T09:47:32Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec 06 09:47:32 np0005548789.localdomain podman[241090]: time="2025-12-06T09:47:32Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec 06 09:47:32 np0005548789.localdomain podman[241090]: time="2025-12-06T09:47:32Z" level=info msg="Setting parallel job count to 25"
Dec 06 09:47:32 np0005548789.localdomain podman[241090]: time="2025-12-06T09:47:32Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec 06 09:47:32 np0005548789.localdomain podman[241090]: time="2025-12-06T09:47:32Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"/run/podman/podman.sock\""
Dec 06 09:47:32 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:47:32 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 06 09:47:32 np0005548789.localdomain podman[241090]: time="2025-12-06T09:47:32Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:47:32 np0005548789.localdomain podman[241089]: 2025-12-06 09:47:32.584065062 +0000 UTC m=+0.058162543 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:47:32 np0005548789.localdomain podman[241089]: 2025-12-06 09:47:32.667090516 +0000 UTC m=+0.141188017 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:47:32 np0005548789.localdomain podman[241089]: unhealthy
Dec 06 09:47:33 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:47:33 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:47:33 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Failed with result 'exit-code'.
Dec 06 09:47:33 np0005548789.localdomain sshd[241143]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:47:34 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:34.288 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:47:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:47:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58668 DF PROTO=TCP SPT=38750 DPT=9102 SEQ=2585128302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DBBB700000000001030307) 
Dec 06 09:47:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:47:34 np0005548789.localdomain sudo[241235]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aojapkqjaufeshqunnjozoaplmxemlgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014454.4934378-2031-39084979139422/AnsiballZ_systemd.py
Dec 06 09:47:34 np0005548789.localdomain sudo[241235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:35 np0005548789.localdomain python3.9[241237]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:47:35 np0005548789.localdomain systemd[1]: Stopping podman_exporter container...
Dec 06 09:47:35 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:47:32 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 2790 "" "Go-http-client/1.1"
Dec 06 09:47:35 np0005548789.localdomain systemd[1]: libpod-b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.scope: Deactivated successfully.
Dec 06 09:47:35 np0005548789.localdomain podman[241241]: 2025-12-06 09:47:35.203942299 +0000 UTC m=+0.057811071 container died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:47:35 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.timer: Deactivated successfully.
Dec 06 09:47:35 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:47:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941-userdata-shm.mount: Deactivated successfully.
Dec 06 09:47:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cd6425452938e99a947d98ed440c416f97c1a47fc1a973380479b4612f15ab3d-merged.mount: Deactivated successfully.
Dec 06 09:47:35 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:35.482 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-fcbe7548a736ce5f8ebae55fdcfeeff017a268d646edfaeb837e7d7f4a13d780-merged.mount: Deactivated successfully.
Dec 06 09:47:35 np0005548789.localdomain podman[241241]: 2025-12-06 09:47:35.587633697 +0000 UTC m=+0.441502419 container cleanup b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:47:35 np0005548789.localdomain podman[241241]: podman_exporter
Dec 06 09:47:35 np0005548789.localdomain podman[241254]: 2025-12-06 09:47:35.60089392 +0000 UTC m=+0.395953620 container cleanup b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:47:35 np0005548789.localdomain sshd[241143]: Received disconnect from 103.192.152.59 port 55438:11: Bye Bye [preauth]
Dec 06 09:47:35 np0005548789.localdomain sshd[241143]: Disconnected from authenticating user root 103.192.152.59 port 55438 [preauth]
Dec 06 09:47:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:47:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:47:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:47:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19323 DF PROTO=TCP SPT=48568 DPT=9102 SEQ=2118299854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DBC7EF0000000001030307) 
Dec 06 09:47:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:47:38 np0005548789.localdomain systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 06 09:47:38 np0005548789.localdomain podman[241269]: 2025-12-06 09:47:38.090642534 +0000 UTC m=+0.751047247 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Dec 06 09:47:38 np0005548789.localdomain podman[241281]: 2025-12-06 09:47:38.147532985 +0000 UTC m=+0.077445916 container cleanup b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:47:38 np0005548789.localdomain podman[241281]: podman_exporter
Dec 06 09:47:38 np0005548789.localdomain podman[241269]: 2025-12-06 09:47:38.175260838 +0000 UTC m=+0.835665541 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:47:38 np0005548789.localdomain podman[241269]: unhealthy
Dec 06 09:47:39 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:39.291 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:47:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:47:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:47:40 np0005548789.localdomain systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Dec 06 09:47:40 np0005548789.localdomain systemd[1]: Stopped podman_exporter container.
Dec 06 09:47:40 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:47:40 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Failed with result 'exit-code'.
Dec 06 09:47:40 np0005548789.localdomain systemd[1]: Starting podman_exporter container...
Dec 06 09:47:40 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:40.526 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58670 DF PROTO=TCP SPT=38750 DPT=9102 SEQ=2585128302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DBD32F0000000001030307) 
Dec 06 09:47:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:47:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:47:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:47:41 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:47:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:47:41 np0005548789.localdomain podman[241299]: 2025-12-06 09:47:41.592996983 +0000 UTC m=+1.413586795 container init b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:47:41 np0005548789.localdomain podman_exporter[241313]: ts=2025-12-06T09:47:41.606Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 06 09:47:41 np0005548789.localdomain podman_exporter[241313]: ts=2025-12-06T09:47:41.606Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 06 09:47:41 np0005548789.localdomain podman_exporter[241313]: ts=2025-12-06T09:47:41.606Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 06 09:47:41 np0005548789.localdomain podman_exporter[241313]: ts=2025-12-06T09:47:41.607Z caller=handler.go:105 level=info collector=container
Dec 06 09:47:41 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:47:41 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 06 09:47:41 np0005548789.localdomain podman[241090]: time="2025-12-06T09:47:41Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:47:41 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:47:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:47:41 np0005548789.localdomain podman[241299]: 2025-12-06 09:47:41.635961531 +0000 UTC m=+1.456551333 container start b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:47:41 np0005548789.localdomain podman[241299]: podman_exporter
Dec 06 09:47:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:47:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:47:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:47:42 np0005548789.localdomain systemd[1]: Started podman_exporter container.
Dec 06 09:47:42 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:47:42 np0005548789.localdomain sudo[241235]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:42 np0005548789.localdomain podman[241323]: 2025-12-06 09:47:42.39463334 +0000 UTC m=+0.755722135 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:47:42 np0005548789.localdomain podman[241323]: 2025-12-06 09:47:42.40907031 +0000 UTC m=+0.770159145 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:47:42 np0005548789.localdomain podman[241323]: unhealthy
Dec 06 09:47:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:47:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:47:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:47:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:47:43 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:47:43 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:47:43 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Failed with result 'exit-code'.
Dec 06 09:47:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:47:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cd6425452938e99a947d98ed440c416f97c1a47fc1a973380479b4612f15ab3d-merged.mount: Deactivated successfully.
Dec 06 09:47:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24014 DF PROTO=TCP SPT=34684 DPT=9101 SEQ=3076688077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DBE0E00000000001030307) 
Dec 06 09:47:44 np0005548789.localdomain sudo[241454]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-altuwbnuvqkmbpltrppkpqbxbilqevld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014464.0755699-2055-77208776298872/AnsiballZ_stat.py
Dec 06 09:47:44 np0005548789.localdomain sudo[241454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:44 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:44.341 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:44 np0005548789.localdomain python3.9[241456]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:44 np0005548789.localdomain sudo[241454]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:44 np0005548789.localdomain sudo[241542]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlbhovapgemlwrdyqlcafamgjjtmebwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014464.0755699-2055-77208776298872/AnsiballZ_copy.py
Dec 06 09:47:44 np0005548789.localdomain sudo[241542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:45 np0005548789.localdomain python3.9[241544]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014464.0755699-2055-77208776298872/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:47:45 np0005548789.localdomain sudo[241542]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:45 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:47:45 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-7d7c22414f3b3b03ee747009a3ba1860a523968c599aef8234ba1bea94f6d58e-merged.mount: Deactivated successfully.
Dec 06 09:47:45 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-7d7c22414f3b3b03ee747009a3ba1860a523968c599aef8234ba1bea94f6d58e-merged.mount: Deactivated successfully.
Dec 06 09:47:45 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:45.564 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:45 np0005548789.localdomain sshd[241562]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:47:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 06 09:47:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:47:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd-merged.mount: Deactivated successfully.
Dec 06 09:47:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd-merged.mount: Deactivated successfully.
Dec 06 09:47:46 np0005548789.localdomain sudo[241654]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knwbxhjfepsymbepqcwqhyqgbaquroth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014466.6545808-2106-204146365386380/AnsiballZ_container_config_data.py
Dec 06 09:47:46 np0005548789.localdomain sudo[241654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:47 np0005548789.localdomain python3.9[241656]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Dec 06 09:47:47 np0005548789.localdomain sudo[241654]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:47:47.277 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:47:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:47:47.277 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:47:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:47:47.279 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:47:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:47:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24016 DF PROTO=TCP SPT=34684 DPT=9101 SEQ=3076688077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DBECEF0000000001030307) 
Dec 06 09:47:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:47:47 np0005548789.localdomain sudo[241764]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ercvzohneubazkwmhztsdjdmaavujsen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014467.4546137-2134-138317371388084/AnsiballZ_container_config_hash.py
Dec 06 09:47:47 np0005548789.localdomain sudo[241764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:47:47 np0005548789.localdomain podman[241767]: 2025-12-06 09:47:47.927714806 +0000 UTC m=+0.123852935 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:47:47 np0005548789.localdomain podman[241767]: 2025-12-06 09:47:47.941239967 +0000 UTC m=+0.137378146 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0)
Dec 06 09:47:47 np0005548789.localdomain python3.9[241766]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:47:48 np0005548789.localdomain sudo[241764]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 06 09:47:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:47:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:47:48 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:47:49 np0005548789.localdomain sudo[241893]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjmqinhhqergpivgxxiwbmmspxyvtbph ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014468.3810594-2163-22370546955153/AnsiballZ_edpm_container_manage.py
Dec 06 09:47:49 np0005548789.localdomain sudo[241893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:49 np0005548789.localdomain sshd[241896]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:47:49 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:49.381 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:49 np0005548789.localdomain python3[241895]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:47:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1129 DF PROTO=TCP SPT=51478 DPT=9105 SEQ=2814942310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DBF5EF0000000001030307) 
Dec 06 09:47:49 np0005548789.localdomain sshd[241910]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:47:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:47:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:47:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:47:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:47:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:47:50 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:47:50 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:47:50 np0005548789.localdomain podman[241911]: 2025-12-06 09:47:50.497281342 +0000 UTC m=+0.205790365 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:47:50 np0005548789.localdomain podman[241911]: 2025-12-06 09:47:50.533460994 +0000 UTC m=+0.241970007 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:47:50 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:50.620 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:50 np0005548789.localdomain sshd[241896]: Received disconnect from 118.219.234.233 port 38072:11: Bye Bye [preauth]
Dec 06 09:47:50 np0005548789.localdomain sshd[241896]: Disconnected from authenticating user root 118.219.234.233 port 38072 [preauth]
Dec 06 09:47:50 np0005548789.localdomain sshd[241910]: Connection closed by 101.126.88.93 port 51094 [preauth]
Dec 06 09:47:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:47:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:47:51 np0005548789.localdomain podman[241090]: time="2025-12-06T09:47:51Z" level=error msg="Getting root fs size for \"0d3b158edc684f8600676f0dcf0cb5a14357db1593e587d02be14e52e3f8b304\": getting diffsize of layer \"3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae\" and its parent \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\": unmounting layer 3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae: replacing mount point \"/var/lib/containers/storage/overlay/3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae/merged\": device or resource busy"
Dec 06 09:47:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:47:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:47:52 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:47:52 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:47:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:47:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:47:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1130 DF PROTO=TCP SPT=51478 DPT=9105 SEQ=2814942310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DC05AF0000000001030307) 
Dec 06 09:47:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:47:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:54.382 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:47:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:47:55 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:47:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:47:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-7d7c22414f3b3b03ee747009a3ba1860a523968c599aef8234ba1bea94f6d58e-merged.mount: Deactivated successfully.
Dec 06 09:47:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:47:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:55.621 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd-merged.mount: Deactivated successfully.
Dec 06 09:47:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 06 09:47:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15811 DF PROTO=TCP SPT=33906 DPT=9882 SEQ=2958895315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DC0DF00000000001030307) 
Dec 06 09:47:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:47:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-06baa34adcac19ffd1cac321f0c14e5e32037c7b357d2eb54e065b4d177d72fd-merged.mount: Deactivated successfully.
Dec 06 09:47:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:56.460 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:56.461 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:56.481 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:56.481 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:47:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:56.482 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:47:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:57.199 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:47:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:57.199 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:47:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:57.199 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 09:47:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:57.200 230888 DEBUG nova.objects.instance [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:47:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:47:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:47:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-0dae0ae2501f0b947a8e64948b264823feec8c7ddb8b7849cb102fbfe0c75da8-merged.mount: Deactivated successfully.
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.270 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.282 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.282 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.283 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.283 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.284 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.284 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.284 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.284 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.285 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.285 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.299 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.300 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.300 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.300 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.301 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.773 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.842 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.842 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.991 230888 WARNING nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.992 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=12423MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.992 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:47:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:58.992 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:47:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:59.078 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:47:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:59.079 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:47:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:59.079 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:47:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:59.135 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:47:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:59.424 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:47:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:59.595 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:47:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:59.602 230888 DEBUG nova.compute.provider_tree [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:47:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:59.620 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:47:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:59.622 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:47:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:47:59.623 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:47:59 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:47:59 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:47:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24018 DF PROTO=TCP SPT=34684 DPT=9101 SEQ=3076688077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DC1DF00000000001030307) 
Dec 06 09:47:59 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:48:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:48:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:48:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:00.677 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:00 np0005548789.localdomain podman[241991]: 2025-12-06 09:48:00.754956739 +0000 UTC m=+0.065503367 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:48:00 np0005548789.localdomain systemd[1]: tmp-crun.SL3jlP.mount: Deactivated successfully.
Dec 06 09:48:00 np0005548789.localdomain podman[241992]: 2025-12-06 09:48:00.821698744 +0000 UTC m=+0.134642748 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec 06 09:48:00 np0005548789.localdomain podman[241992]: 2025-12-06 09:48:00.854124867 +0000 UTC m=+0.167068871 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, tcib_managed=true)
Dec 06 09:48:00 np0005548789.localdomain podman[241991]: 2025-12-06 09:48:00.873660019 +0000 UTC m=+0.184206667 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller)
Dec 06 09:48:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:01 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1131 DF PROTO=TCP SPT=51478 DPT=9105 SEQ=2814942310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DC25EF0000000001030307) 
Dec 06 09:48:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:03 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:48:03 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:48:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:04 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:04.472 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27375 DF PROTO=TCP SPT=36682 DPT=9102 SEQ=4003239329 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DC30AF0000000001030307) 
Dec 06 09:48:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:05 np0005548789.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:05 np0005548789.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:05 np0005548789.localdomain podman[241090]: time="2025-12-06T09:48:05Z" level=error msg="Getting root fs size for \"15b3289dfb7ed52ab0d10f0af104ac3227bc5ecb093aa34d613c9c265f6e2f89\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Dec 06 09:48:05 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:05 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:05 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:05.718 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-89ee8dead5a29d0553b978375c66d4bc010ba2732baca36dcfe5a54e3214c8ff-merged.mount: Deactivated successfully.
Dec 06 09:48:08 np0005548789.localdomain sudo[242044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:48:08 np0005548789.localdomain sudo[242044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:48:08 np0005548789.localdomain sudo[242044]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:48:08 np0005548789.localdomain sudo[242062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:48:08 np0005548789.localdomain sudo[242062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:48:09 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10210 DF PROTO=TCP SPT=49382 DPT=9882 SEQ=3923200607 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DC41EF0000000001030307) 
Dec 06 09:48:09 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:09.507 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 06 09:48:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:48:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 06 09:48:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:10 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:10.757 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27377 DF PROTO=TCP SPT=36682 DPT=9102 SEQ=4003239329 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DC486F0000000001030307) 
Dec 06 09:48:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:11 np0005548789.localdomain podman[242093]: 2025-12-06 09:48:11.147731904 +0000 UTC m=+0.909440243 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:48:11 np0005548789.localdomain podman[242093]: 2025-12-06 09:48:11.177552756 +0000 UTC m=+0.939261115 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:48:11 np0005548789.localdomain podman[242093]: unhealthy
Dec 06 09:48:11 np0005548789.localdomain sudo[242062]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:12 np0005548789.localdomain sudo[242140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:48:12 np0005548789.localdomain sudo[242140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:48:12 np0005548789.localdomain sudo[242140]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:13 np0005548789.localdomain rsyslogd[760]: imjournal from <localhost:sudo>: begin to drop messages due to rate-limiting
Dec 06 09:48:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:48:13 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:48:13 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Failed with result 'exit-code'.
Dec 06 09:48:13 np0005548789.localdomain podman[242158]: 2025-12-06 09:48:13.59503334 +0000 UTC m=+0.421544628 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:48:13 np0005548789.localdomain podman[242158]: 2025-12-06 09:48:13.629630328 +0000 UTC m=+0.456141606 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:48:13 np0005548789.localdomain podman[242158]: unhealthy
Dec 06 09:48:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29225 DF PROTO=TCP SPT=33728 DPT=9101 SEQ=957042335 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DC56100000000001030307) 
Dec 06 09:48:14 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:14.513 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:14 np0005548789.localdomain podman[241090]: time="2025-12-06T09:48:14Z" level=error msg="Unmounting /var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged: invalid argument"
Dec 06 09:48:14 np0005548789.localdomain podman[241090]: time="2025-12-06T09:48:14Z" level=error msg="Getting root fs size for \"15b3289dfb7ed52ab0d10f0af104ac3227bc5ecb093aa34d613c9c265f6e2f89\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": creating overlay mount to /var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged, mount_data=\"lowerdir=/var/lib/containers/storage/overlay/l/W2PMDNCWT6GHZ7HQPIDDYPHD4B,upperdir=/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/diff,workdir=/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/work,nodev,metacopy=on\": no such file or directory"
Dec 06 09:48:14 np0005548789.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:14 np0005548789.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:14 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:48:14 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Failed with result 'exit-code'.
Dec 06 09:48:14 np0005548789.localdomain podman[241923]: 2025-12-06 09:47:50.52391022 +0000 UTC m=+0.057472541 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 06 09:48:15 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:15 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:15 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:15.805 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-89ee8dead5a29d0553b978375c66d4bc010ba2732baca36dcfe5a54e3214c8ff-merged.mount: Deactivated successfully.
Dec 06 09:48:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29227 DF PROTO=TCP SPT=33728 DPT=9101 SEQ=957042335 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DC62300000000001030307) 
Dec 06 09:48:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:48:19 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:19.562 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54746 DF PROTO=TCP SPT=35724 DPT=9105 SEQ=1846442121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DC6B300000000001030307) 
Dec 06 09:48:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 06 09:48:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2444c943287df738a2d4fa7e71e9d9b0754a541b609353f29be07295aed4c411-merged.mount: Deactivated successfully.
Dec 06 09:48:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2444c943287df738a2d4fa7e71e9d9b0754a541b609353f29be07295aed4c411-merged.mount: Deactivated successfully.
Dec 06 09:48:20 np0005548789.localdomain podman[242215]: 2025-12-06 09:48:17.978574987 +0000 UTC m=+0.046158902 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 06 09:48:20 np0005548789.localdomain podman[242227]: 2025-12-06 09:48:20.418489777 +0000 UTC m=+2.181968854 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:48:20 np0005548789.localdomain podman[242227]: 2025-12-06 09:48:20.434352432 +0000 UTC m=+2.197831559 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=multipathd)
Dec 06 09:48:20 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:20.854 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:48:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 06 09:48:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 06 09:48:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:23 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:48:23 np0005548789.localdomain podman[242245]: 2025-12-06 09:48:23.111879126 +0000 UTC m=+0.457943511 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:48:23 np0005548789.localdomain podman[242245]: 2025-12-06 09:48:23.152259 +0000 UTC m=+0.498323405 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:48:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54747 DF PROTO=TCP SPT=35724 DPT=9105 SEQ=1846442121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DC7AEF0000000001030307) 
Dec 06 09:48:24 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:24.593 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:25 np0005548789.localdomain podman[242215]: 
Dec 06 09:48:25 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:48:25 np0005548789.localdomain podman[242215]: 2025-12-06 09:48:25.124818973 +0000 UTC m=+7.192402898 container create 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, release=1755695350, config_id=edpm, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal)
Dec 06 09:48:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:25 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:25.893 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:26 np0005548789.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:26 np0005548789.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:26 np0005548789.localdomain python3[241895]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 06 09:48:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:26 np0005548789.localdomain podman[241090]: time="2025-12-06T09:48:26Z" level=error msg="Getting root fs size for \"192809174d5f26e681c631d5f2ef11ed9ed7f1d8dce6529b3a87b11ae6c7b9dd\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Dec 06 09:48:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:26 np0005548789.localdomain sshd[242279]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:48:26 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43038 DF PROTO=TCP SPT=33576 DPT=9882 SEQ=4288307620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DC87B00000000001030307) 
Dec 06 09:48:28 np0005548789.localdomain sshd[242292]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:48:29 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:29 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-6427be8e8aab7c88f5d6611b3222842e716badb64f49ed94370155c549e54807-merged.mount: Deactivated successfully.
Dec 06 09:48:29 np0005548789.localdomain sshd[242292]: Received disconnect from 154.113.10.34 port 37676:11: Bye Bye [preauth]
Dec 06 09:48:29 np0005548789.localdomain sshd[242292]: Disconnected from authenticating user root 154.113.10.34 port 37676 [preauth]
Dec 06 09:48:29 np0005548789.localdomain sudo[241893]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:29 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:29.595 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29229 DF PROTO=TCP SPT=33728 DPT=9101 SEQ=957042335 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DC91F00000000001030307) 
Dec 06 09:48:29 np0005548789.localdomain sudo[242401]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-koeajirckwompnnitflwtvbzkzfrelgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014509.6430814-2187-45450466154655/AnsiballZ_stat.py
Dec 06 09:48:29 np0005548789.localdomain sudo[242401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:30 np0005548789.localdomain python3.9[242403]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:48:30 np0005548789.localdomain sudo[242401]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:30 np0005548789.localdomain sudo[242513]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjlxiqswrnedjvdbklkpayoiguqtqeny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014510.4812906-2214-218110139307402/AnsiballZ_file.py
Dec 06 09:48:30 np0005548789.localdomain sudo[242513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:30 np0005548789.localdomain python3.9[242515]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:30 np0005548789.localdomain sudo[242513]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:30 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:30.950 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-853ccb0b7aef1ea23933a0a39c3ed46ab9d9a29acf9ba782f87031dcfb79c247-merged.mount: Deactivated successfully.
Dec 06 09:48:31 np0005548789.localdomain sudo[242622]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsghohsoubmygwxnsdedkvnbeaxxqglj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014510.9813495-2214-134577820564724/AnsiballZ_copy.py
Dec 06 09:48:31 np0005548789.localdomain sudo[242622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2444c943287df738a2d4fa7e71e9d9b0754a541b609353f29be07295aed4c411-merged.mount: Deactivated successfully.
Dec 06 09:48:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:48:31 np0005548789.localdomain python3.9[242624]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014510.9813495-2214-134577820564724/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:31 np0005548789.localdomain sudo[242622]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:31 np0005548789.localdomain sudo[242677]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iktyagqirzwneaehgshqvqmywpgteqec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014510.9813495-2214-134577820564724/AnsiballZ_systemd.py
Dec 06 09:48:31 np0005548789.localdomain sudo[242677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54748 DF PROTO=TCP SPT=35724 DPT=9105 SEQ=1846442121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DC9BEF0000000001030307) 
Dec 06 09:48:32 np0005548789.localdomain python3.9[242679]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:48:32 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:48:32 np0005548789.localdomain systemd-rc-local-generator[242704]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:48:32 np0005548789.localdomain systemd-sysv-generator[242710]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:48:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:48:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:32 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2444c943287df738a2d4fa7e71e9d9b0754a541b609353f29be07295aed4c411-merged.mount: Deactivated successfully.
Dec 06 09:48:32 np0005548789.localdomain sudo[242677]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:32 np0005548789.localdomain sudo[242768]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqxuwexqurajcbfiegfooogpfobblujp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014510.9813495-2214-134577820564724/AnsiballZ_systemd.py
Dec 06 09:48:32 np0005548789.localdomain sudo[242768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:33 np0005548789.localdomain python3.9[242770]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:48:33 np0005548789.localdomain systemd-sysv-generator[242802]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:48:33 np0005548789.localdomain systemd-rc-local-generator[242798]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: Starting openstack_network_exporter container...
Dec 06 09:48:33 np0005548789.localdomain systemd[1]: tmp-crun.BHq41l.mount: Deactivated successfully.
Dec 06 09:48:33 np0005548789.localdomain podman[242811]: 2025-12-06 09:48:33.768087954 +0000 UTC m=+0.080009907 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec 06 09:48:33 np0005548789.localdomain podman[242810]: 2025-12-06 09:48:33.78400936 +0000 UTC m=+0.092969393 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:48:33 np0005548789.localdomain podman[242811]: 2025-12-06 09:48:33.796943916 +0000 UTC m=+0.108865859 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent)
Dec 06 09:48:33 np0005548789.localdomain podman[242810]: 2025-12-06 09:48:33.813404029 +0000 UTC m=+0.122364092 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:48:34 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:34 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:34.632 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4380 DF PROTO=TCP SPT=58628 DPT=9102 SEQ=4212331848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DCA5EF0000000001030307) 
Dec 06 09:48:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:34 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:48:34 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:48:34 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:48:34 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e070471e372fcbd38b59aff17e13ec5529369b06fe5533bb0efb8ae47643d94e/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 06 09:48:34 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e070471e372fcbd38b59aff17e13ec5529369b06fe5533bb0efb8ae47643d94e/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 06 09:48:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:48:34 np0005548789.localdomain podman[242812]: 2025-12-06 09:48:34.870856806 +0000 UTC m=+1.174392483 container init 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.component=ubi9-minimal-container, 
io.openshift.expose-services=, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Dec 06 09:48:34 np0005548789.localdomain openstack_network_exporter[242865]: INFO    09:48:34 main.go:48: registering *bridge.Collector
Dec 06 09:48:34 np0005548789.localdomain openstack_network_exporter[242865]: INFO    09:48:34 main.go:48: registering *coverage.Collector
Dec 06 09:48:34 np0005548789.localdomain openstack_network_exporter[242865]: INFO    09:48:34 main.go:48: registering *datapath.Collector
Dec 06 09:48:34 np0005548789.localdomain openstack_network_exporter[242865]: INFO    09:48:34 main.go:48: registering *iface.Collector
Dec 06 09:48:34 np0005548789.localdomain openstack_network_exporter[242865]: INFO    09:48:34 main.go:48: registering *memory.Collector
Dec 06 09:48:34 np0005548789.localdomain openstack_network_exporter[242865]: INFO    09:48:34 main.go:48: registering *ovnnorthd.Collector
Dec 06 09:48:34 np0005548789.localdomain openstack_network_exporter[242865]: INFO    09:48:34 main.go:48: registering *ovn.Collector
Dec 06 09:48:34 np0005548789.localdomain openstack_network_exporter[242865]: INFO    09:48:34 main.go:48: registering *ovsdbserver.Collector
Dec 06 09:48:34 np0005548789.localdomain openstack_network_exporter[242865]: INFO    09:48:34 main.go:48: registering *pmd_perf.Collector
Dec 06 09:48:34 np0005548789.localdomain openstack_network_exporter[242865]: INFO    09:48:34 main.go:48: registering *pmd_rxq.Collector
Dec 06 09:48:34 np0005548789.localdomain openstack_network_exporter[242865]: INFO    09:48:34 main.go:48: registering *vswitch.Collector
Dec 06 09:48:34 np0005548789.localdomain openstack_network_exporter[242865]: NOTICE  09:48:34 main.go:82: listening on http://:9105/metrics
Dec 06 09:48:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:48:34 np0005548789.localdomain podman[242812]: 2025-12-06 09:48:34.914367616 +0000 UTC m=+1.217903303 container start 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, distribution-scope=public, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.tags=minimal rhel9)
Dec 06 09:48:34 np0005548789.localdomain podman[242812]: openstack_network_exporter
Dec 06 09:48:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:35 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:35.990 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:48:36 np0005548789.localdomain systemd[1]: Started openstack_network_exporter container.
Dec 06 09:48:36 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:36 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:36 np0005548789.localdomain sudo[242768]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:36 np0005548789.localdomain podman[242876]: 2025-12-06 09:48:36.991399702 +0000 UTC m=+2.077329856 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, release=1755695350, maintainer=Red Hat, Inc., io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_id=edpm, architecture=x86_64)
Dec 06 09:48:37 np0005548789.localdomain podman[242876]: 2025-12-06 09:48:37.011514748 +0000 UTC m=+2.097444942 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64)
Dec 06 09:48:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58673 DF PROTO=TCP SPT=38750 DPT=9102 SEQ=2585128302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DCB1EF0000000001030307) 
Dec 06 09:48:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:38 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:48:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:39 np0005548789.localdomain sudo[243004]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slospcxjnpxsmlajpahbabcmimvpiwhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014519.34746-2286-189643671596987/AnsiballZ_systemd.py
Dec 06 09:48:39 np0005548789.localdomain sudo[243004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:39 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:39.664 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:39 np0005548789.localdomain python3.9[243006]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:48:39 np0005548789.localdomain systemd[1]: Stopping openstack_network_exporter container...
Dec 06 09:48:40 np0005548789.localdomain systemd[1]: libpod-10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.scope: Deactivated successfully.
Dec 06 09:48:40 np0005548789.localdomain podman[243010]: 2025-12-06 09:48:40.067034866 +0000 UTC m=+0.068869526 container died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Dec 06 09:48:40 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.timer: Deactivated successfully.
Dec 06 09:48:40 np0005548789.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:48:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e-userdata-shm.mount: Deactivated successfully.
Dec 06 09:48:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:48:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-17bd947360a6f617c72dcee5b3dca8b1bd8a672c96dfaa4c4c4ee08fb50892d9-merged.mount: Deactivated successfully.
Dec 06 09:48:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4382 DF PROTO=TCP SPT=58628 DPT=9102 SEQ=4212331848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DCBDAF0000000001030307) 
Dec 06 09:48:41 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:41.039 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:48:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-6427be8e8aab7c88f5d6611b3222842e716badb64f49ed94370155c549e54807-merged.mount: Deactivated successfully.
Dec 06 09:48:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e070471e372fcbd38b59aff17e13ec5529369b06fe5533bb0efb8ae47643d94e-merged.mount: Deactivated successfully.
Dec 06 09:48:41 np0005548789.localdomain podman[243010]: 2025-12-06 09:48:41.815012523 +0000 UTC m=+1.816847093 container cleanup 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, release=1755695350)
Dec 06 09:48:41 np0005548789.localdomain podman[243010]: openstack_network_exporter
Dec 06 09:48:41 np0005548789.localdomain podman[243024]: 2025-12-06 09:48:41.873703627 +0000 UTC m=+1.807405434 container cleanup 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=)
Dec 06 09:48:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:48:43 np0005548789.localdomain sshd[243037]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:48:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:48:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:48:43 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:43 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:44 np0005548789.localdomain systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 06 09:48:44 np0005548789.localdomain podman[243039]: 2025-12-06 09:48:44.033541515 +0000 UTC m=+0.202299315 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:48:44 np0005548789.localdomain podman[243039]: 2025-12-06 09:48:44.064543802 +0000 UTC m=+0.233301582 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:48:44 np0005548789.localdomain podman[243039]: unhealthy
Dec 06 09:48:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25326 DF PROTO=TCP SPT=40248 DPT=9101 SEQ=1974539489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DCCB400000000001030307) 
Dec 06 09:48:44 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:44.703 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:45 np0005548789.localdomain sshd[243037]: Received disconnect from 64.227.156.63 port 33638:11: Bye Bye [preauth]
Dec 06 09:48:45 np0005548789.localdomain sshd[243037]: Disconnected from authenticating user root 64.227.156.63 port 33638 [preauth]
Dec 06 09:48:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:48:45 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:45 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:46 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:46 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:46 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:48:46 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Failed with result 'exit-code'.
Dec 06 09:48:46 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:46.074 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:46 np0005548789.localdomain podman[243070]: 2025-12-06 09:48:46.104047871 +0000 UTC m=+0.897535649 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:48:46 np0005548789.localdomain podman[243070]: 2025-12-06 09:48:46.115197172 +0000 UTC m=+0.908684880 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:48:46 np0005548789.localdomain podman[243070]: unhealthy
Dec 06 09:48:46 np0005548789.localdomain podman[243052]: 2025-12-06 09:48:46.131936134 +0000 UTC m=+2.104667292 container cleanup 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 09:48:46 np0005548789.localdomain podman[243052]: openstack_network_exporter
Dec 06 09:48:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:47 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:48:47 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Failed with result 'exit-code'.
Dec 06 09:48:47 np0005548789.localdomain systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Dec 06 09:48:47 np0005548789.localdomain systemd[1]: Stopped openstack_network_exporter container.
Dec 06 09:48:47 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:47 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:47 np0005548789.localdomain systemd[1]: Starting openstack_network_exporter container...
Dec 06 09:48:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:48:47.278 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:48:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:48:47.279 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:48:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:48:47.280 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:48:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25328 DF PROTO=TCP SPT=40248 DPT=9101 SEQ=1974539489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DCD72F0000000001030307) 
Dec 06 09:48:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:48:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-beaf327340ccd7215a759765519263eed11e8999b460cf785f7dbab3207ce8ee-merged.mount: Deactivated successfully.
Dec 06 09:48:48 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:48 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:48 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:48:48 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e070471e372fcbd38b59aff17e13ec5529369b06fe5533bb0efb8ae47643d94e/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 06 09:48:48 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e070471e372fcbd38b59aff17e13ec5529369b06fe5533bb0efb8ae47643d94e/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 06 09:48:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:48:48 np0005548789.localdomain podman[243096]: 2025-12-06 09:48:48.146529891 +0000 UTC m=+0.881252701 container init 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41)
Dec 06 09:48:48 np0005548789.localdomain openstack_network_exporter[243110]: INFO    09:48:48 main.go:48: registering *bridge.Collector
Dec 06 09:48:48 np0005548789.localdomain openstack_network_exporter[243110]: INFO    09:48:48 main.go:48: registering *coverage.Collector
Dec 06 09:48:48 np0005548789.localdomain openstack_network_exporter[243110]: INFO    09:48:48 main.go:48: registering *datapath.Collector
Dec 06 09:48:48 np0005548789.localdomain openstack_network_exporter[243110]: INFO    09:48:48 main.go:48: registering *iface.Collector
Dec 06 09:48:48 np0005548789.localdomain openstack_network_exporter[243110]: INFO    09:48:48 main.go:48: registering *memory.Collector
Dec 06 09:48:48 np0005548789.localdomain openstack_network_exporter[243110]: INFO    09:48:48 main.go:48: registering *ovnnorthd.Collector
Dec 06 09:48:48 np0005548789.localdomain openstack_network_exporter[243110]: INFO    09:48:48 main.go:48: registering *ovn.Collector
Dec 06 09:48:48 np0005548789.localdomain openstack_network_exporter[243110]: INFO    09:48:48 main.go:48: registering *ovsdbserver.Collector
Dec 06 09:48:48 np0005548789.localdomain openstack_network_exporter[243110]: INFO    09:48:48 main.go:48: registering *pmd_perf.Collector
Dec 06 09:48:48 np0005548789.localdomain openstack_network_exporter[243110]: INFO    09:48:48 main.go:48: registering *pmd_rxq.Collector
Dec 06 09:48:48 np0005548789.localdomain openstack_network_exporter[243110]: INFO    09:48:48 main.go:48: registering *vswitch.Collector
Dec 06 09:48:48 np0005548789.localdomain openstack_network_exporter[243110]: NOTICE  09:48:48 main.go:82: listening on http://:9105/metrics
Dec 06 09:48:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:48:48 np0005548789.localdomain podman[243096]: 2025-12-06 09:48:48.183049678 +0000 UTC m=+0.917772438 container start 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, name=ubi9-minimal)
Dec 06 09:48:48 np0005548789.localdomain podman[243096]: openstack_network_exporter
Dec 06 09:48:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:48:48 np0005548789.localdomain systemd[1]: Started openstack_network_exporter container.
Dec 06 09:48:49 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:49 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:49 np0005548789.localdomain sudo[243004]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:49 np0005548789.localdomain podman[243120]: 2025-12-06 09:48:49.029835185 +0000 UTC m=+0.842097285 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=starting, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350)
Dec 06 09:48:49 np0005548789.localdomain podman[243120]: 2025-12-06 09:48:49.048228527 +0000 UTC m=+0.860490677 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, version=9.6, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64)
Dec 06 09:48:49 np0005548789.localdomain sudo[243247]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxbqqqgtqfermnpmzjtwgzlhmzqbbghb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014529.2103343-2310-165073301765183/AnsiballZ_find.py
Dec 06 09:48:49 np0005548789.localdomain sudo[243247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:49 np0005548789.localdomain python3.9[243249]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:48:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43902 DF PROTO=TCP SPT=36310 DPT=9105 SEQ=847959067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DCE06F0000000001030307) 
Dec 06 09:48:49 np0005548789.localdomain sudo[243247]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:49 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:49.748 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:50 np0005548789.localdomain sudo[243357]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kufrboklrqihmlecvukacexxuhfklwfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014530.3176324-2337-67217054844906/AnsiballZ_podman_container_info.py
Dec 06 09:48:50 np0005548789.localdomain sudo[243357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:48:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-17bd947360a6f617c72dcee5b3dca8b1bd8a672c96dfaa4c4c4ee08fb50892d9-merged.mount: Deactivated successfully.
Dec 06 09:48:50 np0005548789.localdomain python3.9[243359]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec 06 09:48:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-17bd947360a6f617c72dcee5b3dca8b1bd8a672c96dfaa4c4c4ee08fb50892d9-merged.mount: Deactivated successfully.
Dec 06 09:48:51 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:51 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:51 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:51.134 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:51 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:48:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:48:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:48:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:48:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d980d54738e5f040d62ff40bb9abb4b1931da0f4c80c1ba3031e7feabd416146-merged.mount: Deactivated successfully.
Dec 06 09:48:53 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:53 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:53 np0005548789.localdomain sudo[243357]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43903 DF PROTO=TCP SPT=36310 DPT=9105 SEQ=847959067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DCF02F0000000001030307) 
Dec 06 09:48:53 np0005548789.localdomain sudo[243479]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knyogyygygsotjpkxbltafowjzokvaej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014533.2673879-2345-60564433654523/AnsiballZ_podman_container_exec.py
Dec 06 09:48:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:48:53 np0005548789.localdomain sudo[243479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:53 np0005548789.localdomain podman[243481]: 2025-12-06 09:48:53.876370197 +0000 UTC m=+0.052379823 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 09:48:53 np0005548789.localdomain podman[243481]: 2025-12-06 09:48:53.888946981 +0000 UTC m=+0.064956617 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 09:48:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:54 np0005548789.localdomain python3.9[243482]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:48:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:48:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-beaf327340ccd7215a759765519263eed11e8999b460cf785f7dbab3207ce8ee-merged.mount: Deactivated successfully.
Dec 06 09:48:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:54.750 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:48:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:48:55 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:48:55 np0005548789.localdomain systemd[1]: Started libpod-conmon-0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.scope.
Dec 06 09:48:55 np0005548789.localdomain podman[243498]: 2025-12-06 09:48:55.33081175 +0000 UTC m=+1.297593850 container exec 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:48:55 np0005548789.localdomain podman[243509]: 2025-12-06 09:48:55.351877503 +0000 UTC m=+0.170434870 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:48:55 np0005548789.localdomain podman[243509]: 2025-12-06 09:48:55.393217417 +0000 UTC m=+0.211774824 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:48:55 np0005548789.localdomain podman[243529]: 2025-12-06 09:48:55.433678105 +0000 UTC m=+0.090819098 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:48:55 np0005548789.localdomain podman[243498]: 2025-12-06 09:48:55.438069908 +0000 UTC m=+1.404852038 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:48:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43041 DF PROTO=TCP SPT=33576 DPT=9882 SEQ=4288307620 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DCF7EF0000000001030307) 
Dec 06 09:48:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:56.163 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:48:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:48:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:48:57 np0005548789.localdomain systemd[1]: libpod-conmon-0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.scope: Deactivated successfully.
Dec 06 09:48:57 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:57 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:48:57 np0005548789.localdomain sudo[243479]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:58 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:48:58 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:58 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:58 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:48:58 np0005548789.localdomain sudo[243659]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-filqcgbffjwzucalmmkmlkzzckgwkysv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014538.036752-2353-132580767885390/AnsiballZ_podman_container_exec.py
Dec 06 09:48:58 np0005548789.localdomain sudo[243659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:58 np0005548789.localdomain python3.9[243661]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:48:58 np0005548789.localdomain systemd[1]: Started libpod-conmon-0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.scope.
Dec 06 09:48:58 np0005548789.localdomain podman[243662]: 2025-12-06 09:48:58.614684549 +0000 UTC m=+0.092853379 container exec 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 09:48:58 np0005548789.localdomain podman[243662]: 2025-12-06 09:48:58.643371196 +0000 UTC m=+0.121539966 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec 06 09:48:59 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:48:59 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:59 np0005548789.localdomain sudo[243659]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:59 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:48:59 np0005548789.localdomain systemd[1]: libpod-conmon-0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.scope: Deactivated successfully.
Dec 06 09:48:59 np0005548789.localdomain sudo[243799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysycnatdrrtoonxikccmgvrjyyeinzio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014539.2984445-2361-81768778744470/AnsiballZ_file.py
Dec 06 09:48:59 np0005548789.localdomain sudo[243799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:59.624 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:48:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:59.625 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:48:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:59.626 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:48:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:59.626 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:48:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:48:59.795 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:48:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25330 DF PROTO=TCP SPT=40248 DPT=9101 SEQ=1974539489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DD07EF0000000001030307) 
Dec 06 09:48:59 np0005548789.localdomain python3.9[243801]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:59 np0005548789.localdomain sudo[243799]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.208 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.209 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.209 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.209 230888 DEBUG nova.objects.instance [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:49:00 np0005548789.localdomain sudo[243909]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hekhcrzoukdwksgyxdosmqpbeomlidrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014540.0828798-2370-32335099596266/AnsiballZ_podman_container_info.py
Dec 06 09:49:00 np0005548789.localdomain sudo[243909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:00 np0005548789.localdomain python3.9[243911]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.625 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.646 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.647 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.648 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.648 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.648 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.648 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.649 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.649 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.649 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.650 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.680 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.680 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.681 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.681 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:49:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:00.681 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:49:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:01.083 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.401s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:49:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:01.143 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:49:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:01.144 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:49:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:01.206 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:01 np0005548789.localdomain sshd[243946]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:49:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:01.379 230888 WARNING nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:49:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:01.382 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=12474MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:49:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:01.383 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:49:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:01.383 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:49:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:49:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:01.494 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:49:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:01.494 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:49:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:01.495 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:49:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:01.541 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:49:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-9a3d7308d621b6c88232a02a8b3043ca87b4d41445011f4603e8c51c260b2591-merged.mount: Deactivated successfully.
Dec 06 09:49:01 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43904 DF PROTO=TCP SPT=36310 DPT=9105 SEQ=847959067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DD0FEF0000000001030307) 
Dec 06 09:49:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-9a3d7308d621b6c88232a02a8b3043ca87b4d41445011f4603e8c51c260b2591-merged.mount: Deactivated successfully.
Dec 06 09:49:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:02.041 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:49:02 np0005548789.localdomain sshd[243946]: Received disconnect from 64.227.102.57 port 55218:11: Bye Bye [preauth]
Dec 06 09:49:02 np0005548789.localdomain sshd[243946]: Disconnected from authenticating user root 64.227.102.57 port 55218 [preauth]
Dec 06 09:49:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:02.046 230888 DEBUG nova.compute.provider_tree [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:49:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:02.067 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:49:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:02.069 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:49:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:02.069 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:49:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:49:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:49:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:49:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:49:04 np0005548789.localdomain sudo[243909]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:04 np0005548789.localdomain sshd[243991]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:49:04 np0005548789.localdomain sudo[244079]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrikwvkdtmvdzvyqdbosovhxklausjyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014544.3897839-2378-164282730034910/AnsiballZ_podman_container_exec.py
Dec 06 09:49:04 np0005548789.localdomain sudo[244079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25620 DF PROTO=TCP SPT=37480 DPT=9102 SEQ=4226156703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DD1B2F0000000001030307) 
Dec 06 09:49:04 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:04.838 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:49:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:49:04 np0005548789.localdomain python3.9[244081]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:49:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:49:04 np0005548789.localdomain podman[244083]: 2025-12-06 09:49:04.96558266 +0000 UTC m=+0.100367869 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:49:04 np0005548789.localdomain podman[244083]: 2025-12-06 09:49:04.975064321 +0000 UTC m=+0.109849470 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:49:05 np0005548789.localdomain systemd[1]: Started libpod-conmon-5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.scope.
Dec 06 09:49:05 np0005548789.localdomain podman[244104]: 2025-12-06 09:49:05.082863226 +0000 UTC m=+0.148343796 container exec 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 09:49:05 np0005548789.localdomain podman[244104]: 2025-12-06 09:49:05.111618555 +0000 UTC m=+0.177099175 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 09:49:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:49:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully.
Dec 06 09:49:06 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:49:06 np0005548789.localdomain sshd[243991]: Received disconnect from 103.192.152.59 port 43720:11: Bye Bye [preauth]
Dec 06 09:49:06 np0005548789.localdomain sshd[243991]: Disconnected from authenticating user root 103.192.152.59 port 43720 [preauth]
Dec 06 09:49:06 np0005548789.localdomain sudo[244079]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:06 np0005548789.localdomain systemd[1]: tmp-crun.Q1SER5.mount: Deactivated successfully.
Dec 06 09:49:06 np0005548789.localdomain podman[244082]: 2025-12-06 09:49:06.179302224 +0000 UTC m=+1.304713776 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:49:06 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:06.209 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:06 np0005548789.localdomain podman[244082]: 2025-12-06 09:49:06.246278942 +0000 UTC m=+1.371690524 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 06 09:49:06 np0005548789.localdomain sudo[244255]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsqajosvvvauspnflpihzlcukmpeuyyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014546.3091009-2386-131686863295251/AnsiballZ_podman_container_exec.py
Dec 06 09:49:06 np0005548789.localdomain sudo[244255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:06 np0005548789.localdomain python3.9[244257]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:49:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.909 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.910 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.942 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.943 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3783a4bc-bd58-4854-ab43-7ebadefc1d4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:49:07.910480', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cf0a4548-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.159842952, 'message_signature': '9a4049515dcf5ac0cf9de2622c6464a350d9799f7566f306228d677bbab8f2c5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:49:07.910480', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cf0a598e-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.159842952, 'message_signature': '97646639861d4bd0ccfa8a501b75fc90b0c53f0d1e22847eadc547ee3c09071d'}]}, 'timestamp': '2025-12-06 09:49:07.943480', '_unique_id': '805969909bfb4f548cf0d81aa9a579c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.944 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.946 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.950 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 9015 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ae5c9c6-2c0f-4faf-b07a-53900c2c36da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9015, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:49:07.946266', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'cf0b7d5a-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.195654617, 'message_signature': '1b44b454286a3791f6772d30908d2cf31cf233157cbad204488bce013efd48ee'}]}, 'timestamp': '2025-12-06 09:49:07.951008', '_unique_id': 'd36ef5e400dd492b8e29f4c43d9cd5da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.952 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.953 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.953 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 223261606 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.954 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 30984668 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e0250e6-62c6-4ec8-ab17-627d50ed0932', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 223261606, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:49:07.953546', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cf0bf7a8-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.159842952, 'message_signature': '5c65740d5837ca45d69bce7a08395f46c8ee0385e0c08f84facfd53fee4abb39'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30984668, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:49:07.953546', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cf0c08ba-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.159842952, 'message_signature': 'e500710f5fb217d65b908f943f478799ebf5ece227effa1523193d7f0fda8eaf'}]}, 'timestamp': '2025-12-06 09:49:07.954509', '_unique_id': 'd374107d29d34ab495344ac413496551'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.955 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.956 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.956 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1043514478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.957 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 200503964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f364587f-516e-4ee5-805f-37e1b02288ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1043514478, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:49:07.956820', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cf0c74bc-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.159842952, 'message_signature': '5c6ef8be608d3f57ba89c87dbc67511b3caab3de26d4b99874c3467fc235a5f8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 200503964, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:49:07.956820', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cf0c8628-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.159842952, 'message_signature': '4a7b6e031fd290f32f57650b37bf3ec9373e7a7882528aa6727184cce9152d68'}]}, 'timestamp': '2025-12-06 09:49:07.957752', '_unique_id': 'beb3322972104aa387aab53237275e54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.958 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.959 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.960 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '283d9edd-5ec5-4285-a98b-731cff2d9bc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:49:07.960117', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'cf0cf644-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.195654617, 'message_signature': '50998dab1ed0ef7f12dc9a95092c886908999520a2ef384a543f4f58badc3c8e'}]}, 'timestamp': '2025-12-06 09:49:07.960639', '_unique_id': 'bc8e6c06e5f2460c84c8063c0d54c5c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.961 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.963 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86586717-4f4e-4012-ba63-291b0c1b9de1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:49:07.962982', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'cf0d6552-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.195654617, 'message_signature': '9c8eb1ae628e4683063dd23525f50eb65df2a3e34607f3667335a188eaa21452'}]}, 'timestamp': '2025-12-06 09:49:07.963470', '_unique_id': 'd4bebf1af5d545c9af06453706c41268'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.965 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.965 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.966 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93e9e38b-4af6-4485-ab2d-21b992e6f7e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:49:07.965889', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cf0dd708-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.159842952, 'message_signature': '69a590df1f4e6b2baa53dbb30f0d978540332ff83d35895a88e6f214a82be8f7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:49:07.965889', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cf0de84c-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.159842952, 'message_signature': '08e23ba1a6a1294034f889f8c927c72689a90bac864778ceadf67933c6777763'}]}, 'timestamp': '2025-12-06 09:49:07.966816', '_unique_id': 'd13dcd6357834dc398112c797f27ae35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.967 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.968 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.969 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.969 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22042558-3e8b-41c2-b52a-f30b8a342606', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:49:07.969014', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cf0e5084-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.159842952, 'message_signature': 'c13970bfa5f8bc8b9b8d851526d857fb96e59b43ef610e72818f610ec7327061'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:49:07.969014', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cf0e6236-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.159842952, 'message_signature': '79e389bb49cb82d0b4620afdd29bc94ad71d319b122a24e3584398bf36bab24f'}]}, 'timestamp': '2025-12-06 09:49:07.969938', '_unique_id': 'a26f7de8505a4bfd8eea3d322cd5cdae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.970 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.972 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.972 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 73908224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.972 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a9576d9-39e8-4139-bff8-4e1c170243f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73908224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:49:07.972150', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cf0ecaf0-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.159842952, 'message_signature': '54ba1b93bb4921786ac79575f703fad934b0fd664e0228110ca68a334c86c787'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:49:07.972150', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cf0edc66-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.159842952, 'message_signature': '069ec1faa9491332498d8af05cd8d2429c3a6c73c30baecbad91f2f1f8d1a50d'}]}, 'timestamp': '2025-12-06 09:49:07.973034', '_unique_id': '4c829e3836d448e495b00f2081aa5078'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.974 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.975 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.988 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.989 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70d4f138-b69a-423c-851f-8118868faeeb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:49:07.975322', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cf1151d0-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.224694775, 'message_signature': '30cb6c77d634e967f233c729ca8f4ae9f13593c07da18d5ba87f0c850af0d8f0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:49:07.975322', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cf1163a0-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.224694775, 'message_signature': '8b35d5a4489fc2f8107b9a07b253f6126136c8be8f9865734e6e7290d83646f4'}]}, 'timestamp': '2025-12-06 09:49:07.989608', '_unique_id': '9a193851a0df4d009ec74c3f573f0dc6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.992 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.992 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '730be5f1-7c77-4233-948e-d439c0ac9fc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:49:07.992430', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cf11e384-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.224694775, 'message_signature': '0cea59759731eb0721d9b3578189bd4333616e0d76dbc5d69315cef60b62e779'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:49:07.992430', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cf11f590-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.224694775, 'message_signature': 'beda979920d6bf355a0127822d34c8957e9169b8a3626dd928bd30fe848cc79a'}]}, 'timestamp': '2025-12-06 09:49:07.993334', '_unique_id': 'e73ce24d10a440ecb7c14779ee77a4ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.994 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.995 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.995 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ccedb40-58b9-4789-a359-8752e6afea13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:49:07.995735', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'cf1265ca-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.195654617, 'message_signature': '334f1dc56b942eff3057c8caa038cba769f96b0a244029827fe6be687128560a'}]}, 'timestamp': '2025-12-06 09:49:07.996234', '_unique_id': '047caddb0e624e5abfc6ad766df67d30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.998 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 09:49:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.998 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68688070-4dc5-45c8-b28f-fa968c349b1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:49:07.998405', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'cf12cc40-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.195654617, 'message_signature': '82cb1d9c82ce9bcd5bc3e19e91e7ab44cb63a9c3f892637b4b24d73a58847180'}]}, 'timestamp': '2025-12-06 09:49:07.998892', '_unique_id': '2459a066cc014379bc6953d29f1e9f47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:07.999 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.000 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.019 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 50860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58c45dc7-f231-4b43-8df4-03327a94f05f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50860000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T09:49:08.001051', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'cf15fef6-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.268471763, 'message_signature': '4b8fa083d906480ff6c1cf5d875a6d25c374c3d6a6a0a05e3d98628b4a62244a'}]}, 'timestamp': '2025-12-06 09:49:08.019879', '_unique_id': '7516ec85b5bf41b085973c230d32f4d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.022 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.022 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.022 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf564e45-a939-4faa-aa80-08031208b8e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:49:08.022379', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cf1674b2-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.224694775, 'message_signature': '59bf587a12649d4d00ead0d2cf8705c82df67d7eb6702d3fbc86a07811b4ea97'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:49:08.022379', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cf168614-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.224694775, 'message_signature': 'cfb01f1c7f4b77502036be93a0d9b04fc674d3f4724dbc485d88a5720ab4d734'}]}, 'timestamp': '2025-12-06 09:49:08.023245', '_unique_id': 'bf70d9e8db814c13b128ce29818c2fa6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.024 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.025 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.025 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '198a596b-501e-49a4-afb7-cc3ec4bb0f22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:49:08.025431', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'cf16ec1c-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.195654617, 'message_signature': '75245c907348d62b24d4da0f69e2d4a2666a9b19b2668c2873e42127b9747c57'}]}, 'timestamp': '2025-12-06 09:49:08.025917', '_unique_id': '861787415b3945d299176ddbec96e916'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.028 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.028 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 52.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c86fc72a-c7e9-4c84-ac40-8a8d7d21aefc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.37890625, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T09:49:08.028655', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'cf176be2-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.268471763, 'message_signature': '9030e411c04801c57783f4604491e0d92aed24ec27859442ac1f0965291afcd5'}]}, 'timestamp': '2025-12-06 09:49:08.029142', '_unique_id': 'f978807066f44e7c897a362ab5077595'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.030 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.031 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.031 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8335827f-cafd-4466-af96-3049b1f0686f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:49:08.031360', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'cf17d3d4-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.195654617, 'message_signature': '2b941259253d19ac35403e6b63c6ad20b617c3360897e531851ea4aab94a5b21'}]}, 'timestamp': '2025-12-06 09:49:08.031858', '_unique_id': '25151c169e554401a383ec78d587d6ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.032 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.034 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.034 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cc3ef5c-406b-4ace-85ed-d9220c4cfca9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:49:08.034461', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'cf184d82-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.195654617, 'message_signature': 'd0bfa46df4d1885aef92b479ddcc6405e264c2752944cff0d8c144b0e1415814'}]}, 'timestamp': '2025-12-06 09:49:08.034968', '_unique_id': '9bdfca20622c4c428a40405404cd663a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.036 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.037 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbdde104-54c3-4ad8-923c-5a73aad42e7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:49:08.037111', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'cf18b498-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.195654617, 'message_signature': '500caa7d98736a4451b97119ba7d0a01d4b4ce0ea602c35e3dd2dfb01a2b99ec'}]}, 'timestamp': '2025-12-06 09:49:08.037576', '_unique_id': '335d1486e57a4860a69faac30487a877'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.038 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.039 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.040 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57a06fce-b019-4218-b0fd-0c730dee5f72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:49:08.039979', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'cf19248c-d288-11f0-aaf2-fa163e118844', 'monotonic_time': 10966.195654617, 'message_signature': '3446da458d1a77ab6d8c811a35f5e0700b57b5baae082ed6e2275c7aa0b4f621'}]}, 'timestamp': '2025-12-06 09:49:08.040440', '_unique_id': '3db1d2cec90f44339daeccb2c911c4bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.042 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:49:08.042 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:49:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:08 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:49:08 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:08 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:08 np0005548789.localdomain systemd[1]: libpod-conmon-5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.scope: Deactivated successfully.
Dec 06 09:49:08 np0005548789.localdomain systemd[1]: Started libpod-conmon-5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.scope.
Dec 06 09:49:08 np0005548789.localdomain podman[244258]: 2025-12-06 09:49:08.8045427 +0000 UTC m=+1.986115938 container exec 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:49:08 np0005548789.localdomain podman[244258]: 2025-12-06 09:49:08.838486868 +0000 UTC m=+2.020060095 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 09:49:09 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:09 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:09 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56211 DF PROTO=TCP SPT=58098 DPT=9882 SEQ=3484834428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DD2DEF0000000001030307) 
Dec 06 09:49:09 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:09.884 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:10 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:10 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:10 np0005548789.localdomain sudo[244255]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25622 DF PROTO=TCP SPT=37480 DPT=9102 SEQ=4226156703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DD32EF0000000001030307) 
Dec 06 09:49:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:11 np0005548789.localdomain sudo[244396]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtwonjlsfewspcpdguuoutsuxdskefxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014550.8616128-2394-269605162353557/AnsiballZ_file.py
Dec 06 09:49:11 np0005548789.localdomain sudo[244396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:11 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:11.212 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:11 np0005548789.localdomain python3.9[244398]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:11 np0005548789.localdomain sudo[244396]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:11 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:11 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:11 np0005548789.localdomain systemd[1]: libpod-conmon-5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.scope: Deactivated successfully.
Dec 06 09:49:11 np0005548789.localdomain sudo[244506]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oovhhwetjazyvkbpgixieypfeuhqyoea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014551.609781-2403-115962892915416/AnsiballZ_podman_container_info.py
Dec 06 09:49:11 np0005548789.localdomain sudo[244506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:12 np0005548789.localdomain python3.9[244508]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec 06 09:49:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:12 np0005548789.localdomain sudo[244521]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:49:12 np0005548789.localdomain sudo[244521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:49:12 np0005548789.localdomain sudo[244521]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:13 np0005548789.localdomain sudo[244539]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:49:13 np0005548789.localdomain sudo[244539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:49:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 06 09:49:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-9a3d7308d621b6c88232a02a8b3043ca87b4d41445011f4603e8c51c260b2591-merged.mount: Deactivated successfully.
Dec 06 09:49:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-9a3d7308d621b6c88232a02a8b3043ca87b4d41445011f4603e8c51c260b2591-merged.mount: Deactivated successfully.
Dec 06 09:49:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29262 DF PROTO=TCP SPT=35880 DPT=9101 SEQ=2812379677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DD40700000000001030307) 
Dec 06 09:49:14 np0005548789.localdomain sudo[244539]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:49:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:49:14 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:14.886 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:15 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:49:15 np0005548789.localdomain sudo[244506]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:15 np0005548789.localdomain sudo[244597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:49:15 np0005548789.localdomain sudo[244597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:49:15 np0005548789.localdomain sudo[244597]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:15 np0005548789.localdomain sudo[244714]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvftwcmucixwcydlbuffkliqtbnzpupc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014555.3592474-2411-183733771988976/AnsiballZ_podman_container_exec.py
Dec 06 09:49:15 np0005548789.localdomain sudo[244714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:15 np0005548789.localdomain python3.9[244716]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:49:15 np0005548789.localdomain systemd[1]: Started libpod-conmon-44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.scope.
Dec 06 09:49:15 np0005548789.localdomain podman[244717]: 2025-12-06 09:49:15.935031004 +0000 UTC m=+0.112681385 container exec 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 09:49:15 np0005548789.localdomain podman[244717]: 2025-12-06 09:49:15.967168874 +0000 UTC m=+0.144819215 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 09:49:16 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:16.215 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:49:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:49:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29264 DF PROTO=TCP SPT=35880 DPT=9101 SEQ=2812379677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DD4C6F0000000001030307) 
Dec 06 09:49:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:49:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5fc0aec4e92feba574efc8a2831ff4547cdf669c9ba74f30ce1106436a335beb-merged.mount: Deactivated successfully.
Dec 06 09:49:17 np0005548789.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:17 np0005548789.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:17 np0005548789.localdomain sudo[244714]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:17 np0005548789.localdomain podman[244747]: 2025-12-06 09:49:17.774163483 +0000 UTC m=+1.318461309 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:49:17 np0005548789.localdomain podman[244747]: 2025-12-06 09:49:17.80815661 +0000 UTC m=+1.352454456 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:49:17 np0005548789.localdomain podman[244747]: unhealthy
Dec 06 09:49:17 np0005548789.localdomain podman[244759]: 2025-12-06 09:49:17.837858574 +0000 UTC m=+0.339768577 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:49:17 np0005548789.localdomain podman[244759]: 2025-12-06 09:49:17.847029104 +0000 UTC m=+0.348939097 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:49:17 np0005548789.localdomain podman[244759]: unhealthy
Dec 06 09:49:18 np0005548789.localdomain sudo[244896]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubgsojnfkgudqfjghtfbtgwmdxkzccef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014557.9811127-2419-212480186942019/AnsiballZ_podman_container_exec.py
Dec 06 09:49:18 np0005548789.localdomain sudo[244896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:18 np0005548789.localdomain python3.9[244898]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:49:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33626 DF PROTO=TCP SPT=44208 DPT=9105 SEQ=2201017667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DD55AF0000000001030307) 
Dec 06 09:49:19 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:19.922 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:20 np0005548789.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:20 np0005548789.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:20 np0005548789.localdomain podman[241090]: time="2025-12-06T09:49:20Z" level=error msg="Getting root fs size for \"329949042ceb45cabbb88c012af20b9f8b51f1de61f81364a852037377f8c8c1\": getting diffsize of layer \"3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34\" and its parent \"f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958\": unmounting layer 3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34: replacing mount point \"/var/lib/containers/storage/overlay/3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34/merged\": device or resource busy"
Dec 06 09:49:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:20 np0005548789.localdomain systemd[1]: libpod-conmon-44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.scope: Deactivated successfully.
Dec 06 09:49:20 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:49:20 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Failed with result 'exit-code'.
Dec 06 09:49:20 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:20 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:20 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:49:20 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Failed with result 'exit-code'.
Dec 06 09:49:20 np0005548789.localdomain systemd[1]: Started libpod-conmon-44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.scope.
Dec 06 09:49:20 np0005548789.localdomain podman[244899]: 2025-12-06 09:49:20.658006776 +0000 UTC m=+2.111185443 container exec 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:49:20 np0005548789.localdomain podman[244899]: 2025-12-06 09:49:20.666094612 +0000 UTC m=+2.119273279 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 09:49:21 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:21.261 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:49:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:23 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:23 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:23 np0005548789.localdomain sudo[244896]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:23 np0005548789.localdomain podman[244928]: 2025-12-06 09:49:23.437735734 +0000 UTC m=+2.094413661 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., version=9.6, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 09:49:23 np0005548789.localdomain podman[244928]: 2025-12-06 09:49:23.45628771 +0000 UTC m=+2.112965597 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, managed_by=edpm_ansible, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 09:49:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-7114dad9ca5bd35640cc71d1bc50a0dfc385dbab46b008e450dae4492651614e-merged.mount: Deactivated successfully.
Dec 06 09:49:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33627 DF PROTO=TCP SPT=44208 DPT=9105 SEQ=2201017667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DD656F0000000001030307) 
Dec 06 09:49:23 np0005548789.localdomain sudo[245055]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxsehahnydxbmoiqjkntbqflonmdofaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014563.5755808-2427-29482742753226/AnsiballZ_file.py
Dec 06 09:49:23 np0005548789.localdomain sudo[245055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:24 np0005548789.localdomain python3.9[245057]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:24 np0005548789.localdomain sudo[245055]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:24 np0005548789.localdomain sudo[245165]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhlflkzqastryaoeokhzftuhlddypxeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014564.3513405-2436-241204850321925/AnsiballZ_podman_container_info.py
Dec 06 09:49:24 np0005548789.localdomain sudo[245165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:24 np0005548789.localdomain python3.9[245167]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Dec 06 09:49:24 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:24.978 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:49:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56212 DF PROTO=TCP SPT=58098 DPT=9882 SEQ=3484834428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DD6DEF0000000001030307) 
Dec 06 09:49:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:49:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:49:26 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:26 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:26 np0005548789.localdomain systemd[1]: libpod-conmon-44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.scope: Deactivated successfully.
Dec 06 09:49:26 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:49:26 np0005548789.localdomain podman[245179]: 2025-12-06 09:49:26.261398182 +0000 UTC m=+0.421081405 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:49:26 np0005548789.localdomain podman[245179]: 2025-12-06 09:49:26.284412814 +0000 UTC m=+0.444096027 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:49:26 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:26.317 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:49:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:28 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:28 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:28 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:49:28 np0005548789.localdomain podman[245198]: 2025-12-06 09:49:28.26682145 +0000 UTC m=+0.428765390 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:49:28 np0005548789.localdomain podman[245198]: 2025-12-06 09:49:28.285165729 +0000 UTC m=+0.447109629 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:49:28 np0005548789.localdomain systemd[1]: tmp-crun.UJXYQy.mount: Deactivated successfully.
Dec 06 09:49:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:29 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:29 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:29 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:29 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:49:29 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:29 np0005548789.localdomain sudo[245165]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29266 DF PROTO=TCP SPT=35880 DPT=9101 SEQ=2812379677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DD7BEF0000000001030307) 
Dec 06 09:49:29 np0005548789.localdomain sudo[245328]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtxcdbwtyimcllzdvgvkbofsvihzyffa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014569.6867752-2444-153893314998526/AnsiballZ_podman_container_exec.py
Dec 06 09:49:29 np0005548789.localdomain sudo[245328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:30 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:30.020 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:30 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:30 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:30 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:30 np0005548789.localdomain python3.9[245330]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:49:30 np0005548789.localdomain systemd[1]: Started libpod-conmon-bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.scope.
Dec 06 09:49:30 np0005548789.localdomain podman[245331]: 2025-12-06 09:49:30.339999492 +0000 UTC m=+0.111666974 container exec bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true)
Dec 06 09:49:30 np0005548789.localdomain podman[245331]: 2025-12-06 09:49:30.370317177 +0000 UTC m=+0.141984629 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 09:49:31 np0005548789.localdomain sshd[245358]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:49:31 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:31.354 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:32 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5fc0aec4e92feba574efc8a2831ff4547cdf669c9ba74f30ce1106436a335beb-merged.mount: Deactivated successfully.
Dec 06 09:49:32 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33628 DF PROTO=TCP SPT=44208 DPT=9105 SEQ=2201017667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DD85EF0000000001030307) 
Dec 06 09:49:32 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5fc0aec4e92feba574efc8a2831ff4547cdf669c9ba74f30ce1106436a335beb-merged.mount: Deactivated successfully.
Dec 06 09:49:32 np0005548789.localdomain sudo[245328]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:32 np0005548789.localdomain sudo[245470]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahtdphlmavvzgvlavrpaawbbepaeljgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014572.610148-2452-107694071329867/AnsiballZ_podman_container_exec.py
Dec 06 09:49:32 np0005548789.localdomain sudo[245470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:33 np0005548789.localdomain python3.9[245472]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:49:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:49:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-16e4342af8bf5958b38bc295034feee0dd1522d1c796e48d3acbadb880cc49ff-merged.mount: Deactivated successfully.
Dec 06 09:49:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-16e4342af8bf5958b38bc295034feee0dd1522d1c796e48d3acbadb880cc49ff-merged.mount: Deactivated successfully.
Dec 06 09:49:34 np0005548789.localdomain systemd[1]: libpod-conmon-bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.scope: Deactivated successfully.
Dec 06 09:49:34 np0005548789.localdomain systemd[1]: Started libpod-conmon-bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.scope.
Dec 06 09:49:34 np0005548789.localdomain podman[245473]: 2025-12-06 09:49:34.577832426 +0000 UTC m=+1.433723043 container exec bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible)
Dec 06 09:49:34 np0005548789.localdomain podman[245473]: 2025-12-06 09:49:34.611340657 +0000 UTC m=+1.467231354 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec 06 09:49:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46456 DF PROTO=TCP SPT=52298 DPT=9102 SEQ=885430671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DD902F0000000001030307) 
Dec 06 09:49:35 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:35.024 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:36 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:36.360 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:49:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:37 np0005548789.localdomain sudo[245470]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:37 np0005548789.localdomain podman[245503]: 2025-12-06 09:49:37.093794365 +0000 UTC m=+0.336696284 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 09:49:37 np0005548789.localdomain podman[245503]: 2025-12-06 09:49:37.130017669 +0000 UTC m=+0.372919588 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 06 09:49:37 np0005548789.localdomain sudo[245627]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgrygxazzlrktyalwooqgmgaaqqgwfrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014577.2351065-2460-8660580092089/AnsiballZ_file.py
Dec 06 09:49:37 np0005548789.localdomain sudo[245627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:37 np0005548789.localdomain python3.9[245629]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4385 DF PROTO=TCP SPT=58628 DPT=9102 SEQ=4212331848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DD9BF00000000001030307) 
Dec 06 09:49:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:49:37 np0005548789.localdomain sudo[245627]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:49:38 np0005548789.localdomain sudo[245737]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuoynygpfgqqtjviplrsztmzkadmasax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014578.028381-2469-260781889695683/AnsiballZ_podman_container_info.py
Dec 06 09:49:38 np0005548789.localdomain sudo[245737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:38 np0005548789.localdomain python3.9[245739]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Dec 06 09:49:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:49:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:39 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:49:39 np0005548789.localdomain systemd[1]: libpod-conmon-bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.scope: Deactivated successfully.
Dec 06 09:49:39 np0005548789.localdomain podman[245751]: 2025-12-06 09:49:39.140238931 +0000 UTC m=+0.296339343 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:49:39 np0005548789.localdomain podman[245751]: 2025-12-06 09:49:39.210231725 +0000 UTC m=+0.366332157 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Dec 06 09:49:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:49:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:40 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:40.067 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:40 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:49:40 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:40 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:40 np0005548789.localdomain sudo[245737]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46458 DF PROTO=TCP SPT=52298 DPT=9102 SEQ=885430671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DDA7EF0000000001030307) 
Dec 06 09:49:41 np0005548789.localdomain sudo[245883]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbmncvxraysksshqhekwzlzwgsekzqns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014580.9272015-2477-273153807573107/AnsiballZ_podman_container_exec.py
Dec 06 09:49:41 np0005548789.localdomain sudo[245883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:41 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:41.363 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:41 np0005548789.localdomain python3.9[245885]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:49:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:41 np0005548789.localdomain systemd[1]: Started libpod-conmon-d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.scope.
Dec 06 09:49:41 np0005548789.localdomain podman[245886]: 2025-12-06 09:49:41.593268661 +0000 UTC m=+0.134029357 container exec d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:49:41 np0005548789.localdomain podman[245886]: 2025-12-06 09:49:41.623310877 +0000 UTC m=+0.164071563 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:49:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:49:41 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:42 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:42 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:42 np0005548789.localdomain sudo[245883]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:42 np0005548789.localdomain sudo[246022]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehrvmxutdyotxibtbahvuaeqfopjmdrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014582.232951-2485-143613618897943/AnsiballZ_podman_container_exec.py
Dec 06 09:49:42 np0005548789.localdomain sudo[246022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:42 np0005548789.localdomain python3.9[246024]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:49:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25678 DF PROTO=TCP SPT=56714 DPT=9101 SEQ=1622402260 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DDB5A10000000001030307) 
Dec 06 09:49:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:49:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-7114dad9ca5bd35640cc71d1bc50a0dfc385dbab46b008e450dae4492651614e-merged.mount: Deactivated successfully.
Dec 06 09:49:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-7114dad9ca5bd35640cc71d1bc50a0dfc385dbab46b008e450dae4492651614e-merged.mount: Deactivated successfully.
Dec 06 09:49:44 np0005548789.localdomain systemd[1]: libpod-conmon-d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.scope: Deactivated successfully.
Dec 06 09:49:44 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:44 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:44 np0005548789.localdomain systemd[1]: Started libpod-conmon-d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.scope.
Dec 06 09:49:44 np0005548789.localdomain podman[246025]: 2025-12-06 09:49:44.637543363 +0000 UTC m=+1.874438085 container exec d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:49:44 np0005548789.localdomain podman[246025]: 2025-12-06 09:49:44.670188688 +0000 UTC m=+1.907083360 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:49:45 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:45.069 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:45 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:45 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:45 np0005548789.localdomain sshd[241562]: fatal: Timeout before authentication for 45.78.222.162 port 59092
Dec 06 09:49:46 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:46.365 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:46 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11191 DF PROTO=TCP SPT=39922 DPT=9105 SEQ=1207020566 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DDBECF0000000001030307) 
Dec 06 09:49:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:49:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:49:47 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:47 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:47 np0005548789.localdomain sudo[246022]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:49:47.279 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:49:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:49:47.279 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:49:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:49:47.281 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:49:47 np0005548789.localdomain sudo[246163]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzzlpfgbvvydkdcngzayfreluhbhixxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014587.384287-2493-245517438129630/AnsiballZ_file.py
Dec 06 09:49:47 np0005548789.localdomain sudo[246163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:47 np0005548789.localdomain python3.9[246165]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:48 np0005548789.localdomain sudo[246163]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:48 np0005548789.localdomain sudo[246273]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqekbqidczzxnzjhxqznutvrdhioevob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014588.7072086-2502-187374162860956/AnsiballZ_podman_container_info.py
Dec 06 09:49:48 np0005548789.localdomain sudo[246273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:49:49 np0005548789.localdomain systemd[1]: libpod-conmon-d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.scope: Deactivated successfully.
Dec 06 09:49:49 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:49 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:49 np0005548789.localdomain python3.9[246275]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec 06 09:49:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11193 DF PROTO=TCP SPT=39922 DPT=9105 SEQ=1207020566 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DDCAEF0000000001030307) 
Dec 06 09:49:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:49:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:50 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:50.127 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:49:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-926f9ae59ae4620dbf9c3ec16ecc12b1f2dc289864a1e6ab337d5dcdccd2a7fe-merged.mount: Deactivated successfully.
Dec 06 09:49:50 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:49:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:49:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:51 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:51.368 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 06 09:49:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 06 09:49:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 06 09:49:52 np0005548789.localdomain sudo[246273]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:52 np0005548789.localdomain podman[246289]: 2025-12-06 09:49:52.255156866 +0000 UTC m=+1.424289675 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:49:52 np0005548789.localdomain podman[246289]: 2025-12-06 09:49:52.269028979 +0000 UTC m=+1.438161788 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:49:52 np0005548789.localdomain podman[246289]: unhealthy
Dec 06 09:49:52 np0005548789.localdomain podman[246290]: 2025-12-06 09:49:52.311897555 +0000 UTC m=+1.474800894 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 09:49:52 np0005548789.localdomain podman[246290]: 2025-12-06 09:49:52.349123239 +0000 UTC m=+1.512026588 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 09:49:52 np0005548789.localdomain sudo[246437]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmtstssyrkqyluvipnlrbvntvejucsxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014592.469578-2510-205972019128054/AnsiballZ_podman_container_exec.py
Dec 06 09:49:52 np0005548789.localdomain sudo[246437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:52 np0005548789.localdomain python3.9[246439]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:49:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:49:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11194 DF PROTO=TCP SPT=39922 DPT=9105 SEQ=1207020566 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DDDAB00000000001030307) 
Dec 06 09:49:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 06 09:49:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 06 09:49:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 06 09:49:54 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:49:54 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Failed with result 'exit-code'.
Dec 06 09:49:54 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:49:54 np0005548789.localdomain sshd[246452]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:49:54 np0005548789.localdomain systemd[1]: Started libpod-conmon-b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.scope.
Dec 06 09:49:54 np0005548789.localdomain podman[246440]: 2025-12-06 09:49:54.356852487 +0000 UTC m=+1.430255686 container exec b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:49:54 np0005548789.localdomain podman[246440]: 2025-12-06 09:49:54.386639206 +0000 UTC m=+1.460042425 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:49:54 np0005548789.localdomain sshd[246470]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:49:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:55.145 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:49:55 np0005548789.localdomain sshd[246452]: Invalid user admin from 78.128.112.74 port 42280
Dec 06 09:49:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 06 09:49:55 np0005548789.localdomain sshd[246452]: Connection closed by invalid user admin 78.128.112.74 port 42280 [preauth]
Dec 06 09:49:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:55.939 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:55.940 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:55.973 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:55.973 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:49:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:55.974 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:49:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 06 09:49:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:56.327 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:49:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:56.328 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:49:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:56.329 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 09:49:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:56.329 230888 DEBUG nova.objects.instance [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:49:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:56.371 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:49:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:49:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:49:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-16e4342af8bf5958b38bc295034feee0dd1522d1c796e48d3acbadb880cc49ff-merged.mount: Deactivated successfully.
Dec 06 09:49:56 np0005548789.localdomain sudo[246437]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:56 np0005548789.localdomain podman[246471]: 2025-12-06 09:49:56.911449994 +0000 UTC m=+0.256311443 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=edpm, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 09:49:56 np0005548789.localdomain podman[246471]: 2025-12-06 09:49:56.919860191 +0000 UTC m=+0.264721680 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Dec 06 09:49:56 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13386 DF PROTO=TCP SPT=48498 DPT=9882 SEQ=2470392513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DDE7300000000001030307) 
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.068 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.094 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.094 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.095 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.095 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.096 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.096 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.096 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.096 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.097 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.114 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.115 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.115 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.115 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.116 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:49:57 np0005548789.localdomain sudo[246618]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcoyqjzoyhpgcrgsicmemqntldxvlsft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014597.1360614-2518-114830737568061/AnsiballZ_podman_container_exec.py
Dec 06 09:49:57 np0005548789.localdomain sudo[246618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.516 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:49:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.619 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.619 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:49:57 np0005548789.localdomain python3.9[246620]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:49:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:49:57 np0005548789.localdomain systemd[1]: libpod-conmon-b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.scope: Deactivated successfully.
Dec 06 09:49:57 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:49:57 np0005548789.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:57 np0005548789.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:49:57 np0005548789.localdomain systemd[1]: Started libpod-conmon-b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.scope.
Dec 06 09:49:57 np0005548789.localdomain podman[246623]: 2025-12-06 09:49:57.740039661 +0000 UTC m=+0.083435665 container exec b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:49:57 np0005548789.localdomain podman[246623]: 2025-12-06 09:49:57.769484278 +0000 UTC m=+0.112880212 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.810 230888 WARNING nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.811 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=12439MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.811 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.811 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.890 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.890 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.891 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:49:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:57.947 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:49:58 np0005548789.localdomain podman[241090]: time="2025-12-06T09:49:58Z" level=error msg="Getting root fs size for \"44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6\": getting diffsize of layer \"ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9\" and its parent \"cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa\": unmounting layer ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9: replacing mount point \"/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/merged\": device or resource busy"
Dec 06 09:49:58 np0005548789.localdomain sudo[246618]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:58.453 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:49:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:58.460 230888 DEBUG nova.compute.provider_tree [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:49:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:58.475 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:49:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:58.478 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:49:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:58.478 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:49:58 np0005548789.localdomain systemd[1]: libpod-conmon-b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.scope: Deactivated successfully.
Dec 06 09:49:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:49:58 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:49:58 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:49:58 np0005548789.localdomain podman[246742]: 2025-12-06 09:49:58.668063378 +0000 UTC m=+0.098766272 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 09:49:58 np0005548789.localdomain podman[246742]: 2025-12-06 09:49:58.679863017 +0000 UTC m=+0.110565911 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 09:49:58 np0005548789.localdomain sudo[246799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvqohoidxigbjpnyusqvbjhkrolnnpya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014598.4532528-2526-223091293817447/AnsiballZ_file.py
Dec 06 09:49:58 np0005548789.localdomain sudo[246799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:58 np0005548789.localdomain python3.9[246801]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:58 np0005548789.localdomain sudo[246799]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25682 DF PROTO=TCP SPT=56714 DPT=9101 SEQ=1622402260 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DDF1F00000000001030307) 
Dec 06 09:49:59 np0005548789.localdomain sudo[246909]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cplfaekfzmlouyjzrgbyhidsbpudhhmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014599.2452645-2535-159865854357512/AnsiballZ_podman_container_info.py
Dec 06 09:49:59 np0005548789.localdomain sudo[246909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:49:59 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:49:59 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:49:59 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:49:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:49:59.884 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:49:59 np0005548789.localdomain python3.9[246911]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec 06 09:50:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:00.179 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 06 09:50:00 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:50:00 np0005548789.localdomain podman[246912]: 2025-12-06 09:50:00.433936103 +0000 UTC m=+0.672454528 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:50:00 np0005548789.localdomain podman[246912]: 2025-12-06 09:50:00.44464994 +0000 UTC m=+0.683168415 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:50:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:01.394 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11195 DF PROTO=TCP SPT=39922 DPT=9105 SEQ=1207020566 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DDFBEF0000000001030307) 
Dec 06 09:50:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:50:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:50:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:50:02 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:50:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:50:04 np0005548789.localdomain sudo[246909]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:50:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33253 DF PROTO=TCP SPT=35130 DPT=9102 SEQ=2491027423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DE056F0000000001030307) 
Dec 06 09:50:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:04 np0005548789.localdomain sudo[247054]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nygwnlbbsycldajlgibjixwdxwzgvllx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014604.6098793-2543-7185025910210/AnsiballZ_podman_container_exec.py
Dec 06 09:50:04 np0005548789.localdomain sudo[247054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:05 np0005548789.localdomain python3.9[247056]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:50:05 np0005548789.localdomain systemd[1]: Started libpod-conmon-10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.scope.
Dec 06 09:50:05 np0005548789.localdomain podman[247057]: 2025-12-06 09:50:05.138292477 +0000 UTC m=+0.074563374 container exec 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 09:50:05 np0005548789.localdomain podman[247057]: 2025-12-06 09:50:05.174118319 +0000 UTC m=+0.110389266 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., config_id=edpm)
Dec 06 09:50:05 np0005548789.localdomain sshd[246470]: error: kex_exchange_identification: read: Connection timed out
Dec 06 09:50:05 np0005548789.localdomain sshd[246470]: banner exchange: Connection from 125.124.183.254 port 57884: Connection timed out
Dec 06 09:50:05 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:05.213 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:50:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:50:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:50:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-926f9ae59ae4620dbf9c3ec16ecc12b1f2dc289864a1e6ab337d5dcdccd2a7fe-merged.mount: Deactivated successfully.
Dec 06 09:50:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-926f9ae59ae4620dbf9c3ec16ecc12b1f2dc289864a1e6ab337d5dcdccd2a7fe-merged.mount: Deactivated successfully.
Dec 06 09:50:05 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:05 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:06 np0005548789.localdomain systemd[1]: libpod-conmon-10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.scope: Deactivated successfully.
Dec 06 09:50:06 np0005548789.localdomain sudo[247054]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:06 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:06.458 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:50:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:50:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:50:06 np0005548789.localdomain sudo[247193]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzjepdixwayykrbocisffnxgkdtpkziq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014606.212669-2551-253213223872673/AnsiballZ_podman_container_exec.py
Dec 06 09:50:06 np0005548789.localdomain sudo[247193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:06 np0005548789.localdomain sshd[247196]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:50:06 np0005548789.localdomain python3.9[247195]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 06 09:50:06 np0005548789.localdomain systemd[1]: Started libpod-conmon-10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.scope.
Dec 06 09:50:06 np0005548789.localdomain podman[247198]: 2025-12-06 09:50:06.996860168 +0000 UTC m=+0.121819654 container exec 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, distribution-scope=public)
Dec 06 09:50:07 np0005548789.localdomain podman[247198]: 2025-12-06 09:50:07.030195864 +0000 UTC m=+0.155155320 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vendor=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=)
Dec 06 09:50:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 06 09:50:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25625 DF PROTO=TCP SPT=37480 DPT=9102 SEQ=4226156703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DE11F10000000001030307) 
Dec 06 09:50:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 06 09:50:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 06 09:50:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:50:08 np0005548789.localdomain sudo[247193]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:50:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.1 total, 600.0 interval
                                                          Cumulative writes: 5761 writes, 25K keys, 5761 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5761 writes, 760 syncs, 7.58 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:50:09 np0005548789.localdomain sudo[247337]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfeanikhszpjtzfonondbqazscoxhavk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014608.93078-2559-26592325404119/AnsiballZ_file.py
Dec 06 09:50:09 np0005548789.localdomain sudo[247337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:50:09 np0005548789.localdomain python3.9[247339]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:50:09 np0005548789.localdomain sudo[247337]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:10 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:10.252 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:10 np0005548789.localdomain sshd[247196]: Received disconnect from 179.33.210.213 port 44812:11: Bye Bye [preauth]
Dec 06 09:50:10 np0005548789.localdomain sshd[247196]: Disconnected from authenticating user root 179.33.210.213 port 44812 [preauth]
Dec 06 09:50:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:50:10 np0005548789.localdomain sshd[247379]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:50:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33255 DF PROTO=TCP SPT=35130 DPT=9102 SEQ=2491027423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DE1D2F0000000001030307) 
Dec 06 09:50:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 06 09:50:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 06 09:50:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c229f79c70cf5be9a27371d03399d655b2b0280f5e9159c8f223d964c49a7e53-merged.mount: Deactivated successfully.
Dec 06 09:50:11 np0005548789.localdomain systemd[1]: libpod-conmon-10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.scope: Deactivated successfully.
Dec 06 09:50:11 np0005548789.localdomain podman[247367]: 2025-12-06 09:50:11.433555523 +0000 UTC m=+1.017559527 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 06 09:50:11 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:11.505 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:11 np0005548789.localdomain podman[247367]: 2025-12-06 09:50:11.524154664 +0000 UTC m=+1.108158658 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 09:50:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:50:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 06 09:50:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:50:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.2 total, 600.0 interval
                                                          Cumulative writes: 4879 writes, 21K keys, 4879 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4879 writes, 669 syncs, 7.29 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 09:50:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2bd01f86bd06174222a9d55fe041ff06edb278c28aedc59c96738054f88e995d-merged.mount: Deactivated successfully.
Dec 06 09:50:13 np0005548789.localdomain sshd[247379]: Received disconnect from 45.78.222.162 port 33974:11: Bye Bye [preauth]
Dec 06 09:50:13 np0005548789.localdomain sshd[247379]: Disconnected from authenticating user root 45.78.222.162 port 33974 [preauth]
Dec 06 09:50:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:13 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:50:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37106 DF PROTO=TCP SPT=36736 DPT=9101 SEQ=458424556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DE2AD00000000001030307) 
Dec 06 09:50:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:50:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:50:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:50:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:15 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:50:15 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:50:15 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:15.289 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:15 np0005548789.localdomain sudo[247394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:50:15 np0005548789.localdomain sudo[247394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:50:15 np0005548789.localdomain sudo[247394]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:15 np0005548789.localdomain sudo[247412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:50:15 np0005548789.localdomain sudo[247412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:50:15 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:15 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:16 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:16.539 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-70249a3a7715ea2081744d13dd83fad2e62b9b24ab69f2af1c4f45ccd311c7a7-merged.mount: Deactivated successfully.
Dec 06 09:50:17 np0005548789.localdomain podman[247340]: 2025-12-06 09:50:17.156662554 +0000 UTC m=+7.951975471 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:50:17 np0005548789.localdomain podman[247340]: 2025-12-06 09:50:17.16408014 +0000 UTC m=+7.959393087 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 06 09:50:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37108 DF PROTO=TCP SPT=36736 DPT=9101 SEQ=458424556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DE36EF0000000001030307) 
Dec 06 09:50:17 np0005548789.localdomain sudo[247412]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:18 np0005548789.localdomain systemd[1]: tmp-crun.7LpCF4.mount: Deactivated successfully.
Dec 06 09:50:18 np0005548789.localdomain sudo[247467]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:50:18 np0005548789.localdomain sudo[247467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:50:18 np0005548789.localdomain sudo[247467]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:50:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:50:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:50:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:19 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:50:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50399 DF PROTO=TCP SPT=56594 DPT=9105 SEQ=4047791579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DE3FEF0000000001030307) 
Dec 06 09:50:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-29aef16efd0d0d4913740c423de7a8c374d7bce415829f4c5401764f17811d20-merged.mount: Deactivated successfully.
Dec 06 09:50:20 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:20.321 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:21 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:50:21 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:50:21 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:50:21 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:21.577 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:50:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:50:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:50:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:50:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50400 DF PROTO=TCP SPT=56594 DPT=9105 SEQ=4047791579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DE4FAF0000000001030307) 
Dec 06 09:50:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:50:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:50:24 np0005548789.localdomain podman[247485]: 2025-12-06 09:50:24.424600421 +0000 UTC m=+0.052609978 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:50:24 np0005548789.localdomain podman[247485]: 2025-12-06 09:50:24.434057499 +0000 UTC m=+0.062067076 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:50:24 np0005548789.localdomain podman[247485]: unhealthy
Dec 06 09:50:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:25 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:50:25 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Failed with result 'exit-code'.
Dec 06 09:50:25 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:25 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:25 np0005548789.localdomain podman[247486]: 2025-12-06 09:50:25.177610098 +0000 UTC m=+0.802298715 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:50:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:50:25 np0005548789.localdomain podman[247486]: 2025-12-06 09:50:25.211622301 +0000 UTC m=+0.836310898 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm)
Dec 06 09:50:25 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:25.355 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13389 DF PROTO=TCP SPT=48498 DPT=9882 SEQ=2470392513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DE57F00000000001030307) 
Dec 06 09:50:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:26 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:26.614 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:26 np0005548789.localdomain sshd[242279]: fatal: Timeout before authentication for 122.13.25.186 port 39764
Dec 06 09:50:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:50:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:28 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:50:28 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:28 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:28 np0005548789.localdomain podman[247525]: 2025-12-06 09:50:28.078422544 +0000 UTC m=+0.303980846 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, config_id=edpm, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container)
Dec 06 09:50:28 np0005548789.localdomain podman[247525]: 2025-12-06 09:50:28.093349287 +0000 UTC m=+0.318907589 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, architecture=x86_64)
Dec 06 09:50:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:29 np0005548789.localdomain sshd[247544]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:50:29 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:29 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37110 DF PROTO=TCP SPT=36736 DPT=9101 SEQ=458424556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DE67EF0000000001030307) 
Dec 06 09:50:30 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:30 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:50:30 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:30 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:30 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:30.407 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:50:30 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:30 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:30 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:30 np0005548789.localdomain systemd[1]: tmp-crun.TIBjHu.mount: Deactivated successfully.
Dec 06 09:50:30 np0005548789.localdomain podman[247546]: 2025-12-06 09:50:30.708318838 +0000 UTC m=+0.103335730 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3)
Dec 06 09:50:30 np0005548789.localdomain podman[247546]: 2025-12-06 09:50:30.72220774 +0000 UTC m=+0.117224642 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 06 09:50:31 np0005548789.localdomain sshd[247544]: Received disconnect from 103.192.152.59 port 49756:11: Bye Bye [preauth]
Dec 06 09:50:31 np0005548789.localdomain sshd[247544]: Disconnected from authenticating user root 103.192.152.59 port 49756 [preauth]
Dec 06 09:50:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:31 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:31.650 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50401 DF PROTO=TCP SPT=56594 DPT=9105 SEQ=4047791579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DE6FF00000000001030307) 
Dec 06 09:50:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:50:33 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:33 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c3718780a2d803326e6f6ad6743b607a5e1167128961c2a398f3fd685de43043-merged.mount: Deactivated successfully.
Dec 06 09:50:33 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:33 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:50:33 np0005548789.localdomain podman[247566]: 2025-12-06 09:50:33.3808718 +0000 UTC m=+0.533551390 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:50:33 np0005548789.localdomain podman[247566]: 2025-12-06 09:50:33.396182445 +0000 UTC m=+0.548861965 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:50:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:50:34 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:50:34 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:34 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:34 np0005548789.localdomain podman[241090]: time="2025-12-06T09:50:34Z" level=error msg="Getting root fs size for \"77cac28f3c09b9f832ae0c5e203a7ac268e6e556a22ddb4e08fa5fd08b32fce5\": unmounting layer c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6: replacing mount point \"/var/lib/containers/storage/overlay/c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6/merged\": device or resource busy"
Dec 06 09:50:34 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:34 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6529 DF PROTO=TCP SPT=36412 DPT=9102 SEQ=3117396038 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DE7AAF0000000001030307) 
Dec 06 09:50:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:50:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:35 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:35.445 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:50:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-62c1e5bed2cbe019aeb004cf1ae8b59e235260475c84680568f7ac998ec16abe-merged.mount: Deactivated successfully.
Dec 06 09:50:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-62c1e5bed2cbe019aeb004cf1ae8b59e235260475c84680568f7ac998ec16abe-merged.mount: Deactivated successfully.
Dec 06 09:50:36 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:36.686 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:50:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:50:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-29aef16efd0d0d4913740c423de7a8c374d7bce415829f4c5401764f17811d20-merged.mount: Deactivated successfully.
Dec 06 09:50:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:50:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:50:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:39 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35427 DF PROTO=TCP SPT=48878 DPT=9882 SEQ=1154267759 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DE8BEF0000000001030307) 
Dec 06 09:50:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:39 np0005548789.localdomain sshd[247589]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:50:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:50:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ad3810d21826b8b406da9b30fe54843a6684913af1956a8eb23d071f707914a6-merged.mount: Deactivated successfully.
Dec 06 09:50:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:40 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:40.474 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6531 DF PROTO=TCP SPT=36412 DPT=9102 SEQ=3117396038 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DE926F0000000001030307) 
Dec 06 09:50:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:50:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:50:41 np0005548789.localdomain sshd[247590]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:50:41 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:41.731 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:43 np0005548789.localdomain sshd[247590]: Received disconnect from 14.194.101.210 port 46142:11: Bye Bye [preauth]
Dec 06 09:50:43 np0005548789.localdomain sshd[247590]: Disconnected from authenticating user root 14.194.101.210 port 46142 [preauth]
Dec 06 09:50:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:50:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f4cfdd88bd7d8f375c1733109d5a26e8b3ffb2befe75a3de2b4653848e69b6e4-merged.mount: Deactivated successfully.
Dec 06 09:50:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:50:43 np0005548789.localdomain podman[247592]: 2025-12-06 09:50:43.92838465 +0000 UTC m=+0.087963423 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:50:43 np0005548789.localdomain podman[247592]: 2025-12-06 09:50:43.959688362 +0000 UTC m=+0.119267115 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 06 09:50:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4267 DF PROTO=TCP SPT=55426 DPT=9101 SEQ=2285950285 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DEA0000000000001030307) 
Dec 06 09:50:45 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:50:45 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:50:45 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:45.496 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:45 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:50:45 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:50:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:46 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:46.776 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:50:47.280 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:50:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:50:47.280 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:50:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:50:47.281 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:50:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:50:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4269 DF PROTO=TCP SPT=55426 DPT=9101 SEQ=2285950285 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DEABEF0000000001030307) 
Dec 06 09:50:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:50:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:50:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:50:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:50:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:50:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c3718780a2d803326e6f6ad6743b607a5e1167128961c2a398f3fd685de43043-merged.mount: Deactivated successfully.
Dec 06 09:50:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c3718780a2d803326e6f6ad6743b607a5e1167128961c2a398f3fd685de43043-merged.mount: Deactivated successfully.
Dec 06 09:50:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60645 DF PROTO=TCP SPT=56846 DPT=9105 SEQ=3654085747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DEB52F0000000001030307) 
Dec 06 09:50:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:50:49 np0005548789.localdomain podman[247616]: 2025-12-06 09:50:49.928141291 +0000 UTC m=+0.088087467 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:50:49 np0005548789.localdomain podman[247616]: 2025-12-06 09:50:49.937083532 +0000 UTC m=+0.097029718 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 09:50:50 np0005548789.localdomain sshd[247589]: error: kex_exchange_identification: read: Connection timed out
Dec 06 09:50:50 np0005548789.localdomain sshd[247589]: banner exchange: Connection from 125.208.17.16 port 43678: Connection timed out
Dec 06 09:50:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:50:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:50:50 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:50.545 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:51 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:50:51 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:51 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:50:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ceb10c7340fd1e23819ce0ca67ef154f878e8bf4b0dc6dca13387cdceeb0c7ee-merged.mount: Deactivated successfully.
Dec 06 09:50:51 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:51.805 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:52.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:52.501 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 09:50:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:52.525 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 09:50:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:52.526 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:52.527 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 09:50:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:52.547 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d8311fd89fa9ff9a4d8824219b7d14d00721d421cc1a51c3601cb914a56f4bfc-merged.mount: Deactivated successfully.
Dec 06 09:50:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-62c1e5bed2cbe019aeb004cf1ae8b59e235260475c84680568f7ac998ec16abe-merged.mount: Deactivated successfully.
Dec 06 09:50:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:53.571 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60646 DF PROTO=TCP SPT=56846 DPT=9105 SEQ=3654085747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DEC4EF0000000001030307) 
Dec 06 09:50:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:50:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:50:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:54.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:54.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:54.501 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:50:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:50:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:50:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:50:55 np0005548789.localdomain podman[247633]: 2025-12-06 09:50:55.377096688 +0000 UTC m=+0.072311488 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:50:55 np0005548789.localdomain podman[247633]: 2025-12-06 09:50:55.412135243 +0000 UTC m=+0.107350003 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:50:55 np0005548789.localdomain podman[247633]: unhealthy
Dec 06 09:50:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:55.497 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:55.547 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:50:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:55 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:55 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:50:55 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Failed with result 'exit-code'.
Dec 06 09:50:55 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:55 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:56 np0005548789.localdomain sshd[247655]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:50:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:56.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:56.501 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:50:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:56.501 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:50:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:56.841 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:50:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:57 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7934 DF PROTO=TCP SPT=33454 DPT=9882 SEQ=3339928409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DED1AF0000000001030307) 
Dec 06 09:50:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:50:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ad3810d21826b8b406da9b30fe54843a6684913af1956a8eb23d071f707914a6-merged.mount: Deactivated successfully.
Dec 06 09:50:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:50:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:57.307 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:50:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:57.308 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:50:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:57.308 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 09:50:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:57.308 230888 DEBUG nova.objects.instance [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:50:57 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:57 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:50:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ad3810d21826b8b406da9b30fe54843a6684913af1956a8eb23d071f707914a6-merged.mount: Deactivated successfully.
Dec 06 09:50:57 np0005548789.localdomain sshd[247655]: Received disconnect from 118.219.234.233 port 58944:11: Bye Bye [preauth]
Dec 06 09:50:57 np0005548789.localdomain sshd[247655]: Disconnected from authenticating user root 118.219.234.233 port 58944 [preauth]
Dec 06 09:50:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:50:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:50:58 np0005548789.localdomain podman[247657]: 2025-12-06 09:50:58.168771059 +0000 UTC m=+0.082982973 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 09:50:58 np0005548789.localdomain podman[247657]: 2025-12-06 09:50:58.17440587 +0000 UTC m=+0.088617824 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 06 09:50:58 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:50:58 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:50:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:58.437 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 09:50:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:58.449 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:50:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:58.449 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 09:50:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:58.449 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:58.450 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:58.450 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:50:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:58.463 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:50:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:58.463 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:50:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:58.463 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:50:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:58.464 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:50:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:58.464 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:50:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:58.874 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:50:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:58.940 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:50:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:58.940 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.102 230888 WARNING nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.104 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=12379MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.104 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.104 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.199 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.199 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.199 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.280 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Refreshing inventories for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.328 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Updating ProviderTree inventory for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.328 230888 DEBUG nova.compute.provider_tree [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Updating inventory in ProviderTree for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.343 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Refreshing aggregate associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.362 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Refreshing trait associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,HW_CPU_X86_AVX2,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,HW_CPU_X86_BMI,HW_CPU_X86_FMA3,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.400 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:50:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4271 DF PROTO=TCP SPT=55426 DPT=9101 SEQ=2285950285 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DEDBEF0000000001030307) 
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.860 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.865 230888 DEBUG nova.compute.provider_tree [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.884 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.887 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.887 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:50:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:50:59.938 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:51:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:00.550 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:51:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-1c3c28666d804509c2b20602368c8cd77799587988a2d1a1789a75cb16c60a47-merged.mount: Deactivated successfully.
Dec 06 09:51:01 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:01 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:51:01 np0005548789.localdomain podman[247721]: 2025-12-06 09:51:01.036513629 +0000 UTC m=+0.289141605 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Dec 06 09:51:01 np0005548789.localdomain podman[247721]: 2025-12-06 09:51:01.046428791 +0000 UTC m=+0.299056767 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.6)
Dec 06 09:51:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:51:01 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:51:01 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:01.875 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:02 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:02 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:02 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:02 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60647 DF PROTO=TCP SPT=56846 DPT=9105 SEQ=3654085747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DEE5EF0000000001030307) 
Dec 06 09:51:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:51:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:51:02 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f4cfdd88bd7d8f375c1733109d5a26e8b3ffb2befe75a3de2b4653848e69b6e4-merged.mount: Deactivated successfully.
Dec 06 09:51:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:51:03 np0005548789.localdomain systemd[1]: tmp-crun.imx8Ia.mount: Deactivated successfully.
Dec 06 09:51:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:51:03 np0005548789.localdomain podman[247743]: 2025-12-06 09:51:03.597377808 +0000 UTC m=+0.103043791 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 09:51:03 np0005548789.localdomain podman[247743]: 2025-12-06 09:51:03.610984751 +0000 UTC m=+0.116650724 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:51:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2b1be86c3700ae830e3281535879d3e256788ac5e6b2a708dc79f8a857d5536b-merged.mount: Deactivated successfully.
Dec 06 09:51:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:51:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5443 DF PROTO=TCP SPT=36416 DPT=9102 SEQ=456130462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DEEFEF0000000001030307) 
Dec 06 09:51:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:51:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:51:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:51:05 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:51:05 np0005548789.localdomain podman[247760]: 2025-12-06 09:51:05.142332253 +0000 UTC m=+0.639267332 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:51:05 np0005548789.localdomain podman[247760]: 2025-12-06 09:51:05.184176284 +0000 UTC m=+0.681111353 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:51:05 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:05.590 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:51:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:51:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:51:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:51:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22-merged.mount: Deactivated successfully.
Dec 06 09:51:06 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:51:06 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:06.915 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33258 DF PROTO=TCP SPT=35130 DPT=9102 SEQ=2491027423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DEFBEF0000000001030307) 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.909 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.910 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.913 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ff20e72-2bbb-473d-8b89-f2471e45802b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:51:07.910294', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '168c6aae-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.159655837, 'message_signature': 'b9b168f5d251471dd08e9bde9877a4b94769148f85ed73c2159ab10ec221bb15'}]}, 'timestamp': '2025-12-06 09:51:07.914220', '_unique_id': '97f3fa6839ed4bd6ab3aefeb12aef7b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.915 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.916 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.943 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1043514478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.944 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 200503964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fa33231-6641-4245-aca0-217cd266e3dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1043514478, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:51:07.917006', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '169105f0-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.166370752, 'message_signature': '26f5e567d553065744332f3616dbe8fdf3e5e4102a906a380910136ee567ade2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 200503964, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:51:07.917006', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '16911a5e-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.166370752, 'message_signature': 'c5b9feb0fa1688813098b54dca7f7e60745230f23ea5035739a43aa6c6b2633e'}]}, 'timestamp': '2025-12-06 09:51:07.944865', '_unique_id': '572eb7b225bb496db20882736a54fd3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.946 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.947 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.947 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 223261606 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.948 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 30984668 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f872585-8c1e-4c21-a26d-e4584ce3077a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 223261606, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:51:07.947537', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '169197ae-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.166370752, 'message_signature': '276d2285a9d98b22abc20dca5888bc67a833d5d1e58721e97ff665ba626a885a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30984668, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:51:07.947537', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1691a898-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.166370752, 'message_signature': '329241b317517315e9bd6c1275baa224f307ab751b217128dabd71be32b9e93d'}]}, 'timestamp': '2025-12-06 09:51:07.948489', '_unique_id': 'a564f0f122164985ae13502163963c6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.949 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.950 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.950 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '604a8479-b1ea-434b-b48a-27fe4fc71645', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:51:07.950854', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '16921832-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.159655837, 'message_signature': 'e081ef298d5983f9f54a385513ce5d57f313b76cf305f0c557120e3e3d880558'}]}, 'timestamp': '2025-12-06 09:51:07.951345', '_unique_id': 'ccac64520d6a4b26aa7c1623060d287b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.952 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.953 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.953 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c96317ba-fe24-4e23-8c61-b8266a69a67e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:51:07.953858', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '16928d3a-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.159655837, 'message_signature': '4a0dbf7c3126b505e6f4e6272c9b17749b152a2657e6fbbe6da483b271e0084e'}]}, 'timestamp': '2025-12-06 09:51:07.954349', '_unique_id': '278ebb65eeb142ce845c9fdf09123d02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.955 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.956 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.969 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 51890000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e158184f-f9b2-4cbc-a666-1157836386c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 51890000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T09:51:07.956680', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '1694ed32-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.218505765, 'message_signature': '3b09892d42a4b4f8681c706c25ff6979438522f89f39a70f8f634da259c582e1'}]}, 'timestamp': '2025-12-06 09:51:07.970036', '_unique_id': '0a94f70450484adcb4ed757482bf1d09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.971 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.972 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.981 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.982 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24935a8b-f8cd-4e4d-8308-796dc509f650', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:51:07.972645', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1696daf2-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.222009421, 'message_signature': '914f4db59a44286c592ec6938dd9b088519e0fd6cc17e0310c0424182a6405de'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:51:07.972645', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1696f384-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.222009421, 'message_signature': '79eaf9057cb3ade8fbfe24abbec5aa3736bd9dcf29b4115cb1ccdfffbd9f3ac1'}]}, 'timestamp': '2025-12-06 09:51:07.983165', '_unique_id': '4bec5e2155884d9997777f529a281267'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.984 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.985 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.986 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.986 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95ecaadd-2974-430a-95fa-54bd5b3a5e1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:51:07.986116', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '16977908-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.159655837, 'message_signature': '94e67649f77a6c7f210e2baf2f33fd6b7babaf2b6ece656d9fb8d8eff0b6a8e2'}]}, 'timestamp': '2025-12-06 09:51:07.986598', '_unique_id': '501d9ad1af27468782100ba18fcf0d46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.987 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.988 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.988 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 73908224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.989 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8d7e2b1-9acd-4028-bdeb-e0e8878a543f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73908224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:51:07.988866', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1697e424-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.166370752, 'message_signature': 'bd3622246ba940e5297a9df26611dfffd07d88489a62b3e376c8544b5e121e81'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:51:07.988866', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1697f41e-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.166370752, 'message_signature': 'e624eec9b650fb5eff2dd7da2e0c187a61c217280efa4d84e76bc6f5c0710266'}]}, 'timestamp': '2025-12-06 09:51:07.989709', '_unique_id': '914ecd7127ad46a1a51853621c688f3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.991 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.991 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.992 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13b6f13a-aee7-498c-8021-fca62b2ce0d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:51:07.991846', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '169857ba-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.166370752, 'message_signature': '734407b23cd69cb8ef181da45f24c434d9ecea41204ebcd41539fe775cfe93df'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:51:07.991846', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '16986624-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.166370752, 'message_signature': '6f8ac59008a03b5ab91d08e5685c8fa744913ce22e54a1c7cadb18f01313d4a6'}]}, 'timestamp': '2025-12-06 09:51:07.992598', '_unique_id': 'cf30fff4ab2f480493638dff70d9f3ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.994 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.994 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.994 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92ea3caa-bf20-4af1-bf83-e67aea89ccde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:51:07.994214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1698b444-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.222009421, 'message_signature': 'bcca38975874d60a475f912b633a1ab43a087840343a72bbf1a5b75e4b284340'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:51:07.994214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1698c2d6-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.222009421, 'message_signature': '130afffa70858cccad649c356a7781469b082ba5ab095cb01d1a3f4ca677c90c'}]}, 'timestamp': '2025-12-06 09:51:07.994957', '_unique_id': 'bbdf5b31e3e94030be07d622e07c72c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.995 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.996 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.996 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd387dc4d-f3ca-4233-a09c-f1b7eed78e91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:51:07.996723', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '1699179a-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.159655837, 'message_signature': 'ae73107f2e69d23d1094e6bc4915ec864eaf7b42719599ac02158875d151acb6'}]}, 'timestamp': '2025-12-06 09:51:07.997130', '_unique_id': 'c538fbd84a014ece814aeb1a5b54588d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.998 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.998 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:07.999 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f34e726c-44bd-400a-ad2d-91c2a61e1f3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:51:07.998710', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '169964d4-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.222009421, 'message_signature': '01d43c91d3406daf1d8d8815f453f33ab09a1149cf615a2efa8406aeeb6a76ee'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:51:07.998710', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '16997244-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.222009421, 'message_signature': 'c6fd14bcb2c54db8e83fcaa483da5deb52ff1841c04acb070e104807afc15a56'}]}, 'timestamp': '2025-12-06 09:51:07.999471', '_unique_id': '3d7adcee7bc44bf4b64bd3a372ace281'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 ERROR oslo_messaging.notify.messaging 
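The traceback above bottoms out in `amqp/transport.py` at `self.sock.connect(sa)` with `ConnectionRefusedError: [Errno 111]`, which kombu re-raises as `OperationalError`: the TCP connection to the RabbitMQ broker is being actively refused, so the broker process is down or not listening on the configured port. A minimal reachability sketch is below; the host and port are assumptions (substitute the actual `transport_url` from the ceilometer configuration — 5672 is only the AMQP default), and this checks TCP reachability only, not AMQP authentication.

```python
import socket

# Hypothetical broker endpoint -- replace with the host/port from the
# transport_url in the service's oslo.messaging configuration.
BROKER_HOST = "127.0.0.1"
BROKER_PORT = 5672  # AMQP default; deployments may differ

def broker_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the broker port succeeds.

    ECONNREFUSED (errno 111), as seen in the log, means the port is
    closed; a timeout instead suggests a firewall dropping packets.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError as exc:
        print(f"connection to {host}:{port} failed: {exc}")
        return False

if __name__ == "__main__":
    status = "reachable" if broker_reachable(BROKER_HOST, BROKER_PORT) else "unreachable"
    print(f"{BROKER_HOST}:{BROKER_PORT} is {status}")
```

If this reports the port unreachable, the next step is on the broker side (is the RabbitMQ service running, and bound to the expected interface?) rather than in ceilometer itself; the agent's retry loop will reconnect on its own once the broker accepts connections again.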
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.001 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.001 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.001 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '865861a6-00fd-4e03-8f9f-476b458fbfa3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:51:08.001228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1699c60e-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.166370752, 'message_signature': '20528459c0bf41a283e410a22136c0c33fa4b90847863ef49cff4257d5e86fe2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:51:08.001228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1699d446-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.166370752, 'message_signature': '538108bc49e530681b6998459400655bbc657abe5faa20cc507f9a268cc5babe'}]}, 'timestamp': '2025-12-06 09:51:08.001947', '_unique_id': '94b53fc7a1a34e0ebc6ea82fd6ffbbf1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.002 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.003 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.003 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b80312e4-3b44-4cab-9ea8-236aec409db3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:51:08.003565', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '169a1f6e-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.159655837, 'message_signature': '8f23bb63afe535ee94e1fccf9d33a1c207809462cc8e8a64e54792fc73da06d9'}]}, 'timestamp': '2025-12-06 09:51:08.004042', '_unique_id': 'd1d0753735bb49a7ad4f814a5f7c9be6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.005 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.006 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 52.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cdfb727-a657-4f14-99e5-da12b743ad47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.37890625, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T09:51:08.006021', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '169a80da-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.218505765, 'message_signature': 'e0f88b8579bf3e768ac257619deac4cc067d161667809cbd93ac4fd01ab71c9e'}]}, 'timestamp': '2025-12-06 09:51:08.006402', '_unique_id': 'cb1bdcb81a5c40229ea73b289feec511'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.007 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.008 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 9015 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fae8552c-0e11-4d34-8667-c67e40d1bba0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9015, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:51:08.008017', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '169acd6a-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.159655837, 'message_signature': '1c61e47020bd38154e5b87cf03fd55d97b554b209dedc030a465b693419f4e7d'}]}, 'timestamp': '2025-12-06 09:51:08.008337', '_unique_id': 'b0be327885304fe9b0d6daa6bbad9144'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.009 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.010 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '092f2cd8-a81d-43df-8724-2aa2a0dff770', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:51:08.009919', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '169b17ac-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.166370752, 'message_signature': 'c18cc1671d72e39a2c3833a2868d5ab2f9017ae199767aa4e961d5bc0fdc77f0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:51:08.009919', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '169b2788-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.166370752, 'message_signature': '4ac670c500f0bfaa13e70e9ca7db029bdee6853c65123c37de278d59bd9ebfa5'}]}, 'timestamp': '2025-12-06 09:51:08.010627', '_unique_id': 'cc1bde2779e140dabc3d4b86d08fe9cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.012 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87a60b9d-57fe-487a-8207-6b08e3074a5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:51:08.012342', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '169b77e2-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.159655837, 'message_signature': '2d3d00c1a87e644df44818e0cd61abc1c5a1f1bee6a4dd718de12fbd88d8ff84'}]}, 'timestamp': '2025-12-06 09:51:08.012729', '_unique_id': '67a7aeca452544b0a361af02aa9643fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.014 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.014 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e13b7503-5a11-4a30-82f7-8b43f3322230', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:51:08.014226', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '169bbffe-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.159655837, 'message_signature': '26fbe56abf0dbbe044f6e0ca10180a758c9c2e84069dcc5a2be5a2a58b44b1f5'}]}, 'timestamp': '2025-12-06 09:51:08.014565', '_unique_id': 'c04540c5e41d42dfa546dfb876e1e469'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.016 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.016 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35b0931b-69d1-4e48-95f9-fdb6eff95082', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:51:08.016215', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '169c0dc4-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11086.159655837, 'message_signature': '5bea4b84b154dc405f960060dbe24f9fc1b5235d704a0b1800978b1d07d14945'}]}, 'timestamp': '2025-12-06 09:51:08.016575', '_unique_id': '565542dcb4684e30be99b15385c49b45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:51:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:51:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:51:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:51:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:51:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:51:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:51:09 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8-merged.mount: Deactivated successfully.
Dec 06 09:51:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:51:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:51:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d-merged.mount: Deactivated successfully.
Dec 06 09:51:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ceb10c7340fd1e23819ce0ca67ef154f878e8bf4b0dc6dca13387cdceeb0c7ee-merged.mount: Deactivated successfully.
Dec 06 09:51:10 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:10.593 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:10 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:10 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5445 DF PROTO=TCP SPT=36416 DPT=9102 SEQ=456130462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DF07AF0000000001030307) 
Dec 06 09:51:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:11 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:11.933 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:51:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:51:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ef5a2dfca972201470637fe24151a27f03799cbb5a942d988431f305c9ea334c-merged.mount: Deactivated successfully.
Dec 06 09:51:13 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:14 np0005548789.localdomain sshd[247783]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:51:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=554 DF PROTO=TCP SPT=35250 DPT=9101 SEQ=1771142700 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DF15300000000001030307) 
Dec 06 09:51:15 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:15 np0005548789.localdomain sshd[247783]: Received disconnect from 154.113.10.34 port 41532:11: Bye Bye [preauth]
Dec 06 09:51:15 np0005548789.localdomain sshd[247783]: Disconnected from authenticating user root 154.113.10.34 port 41532 [preauth]
Dec 06 09:51:15 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:15.596 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:51:15 np0005548789.localdomain systemd[1]: tmp-crun.ZsQPW2.mount: Deactivated successfully.
Dec 06 09:51:15 np0005548789.localdomain podman[247785]: 2025-12-06 09:51:15.933174295 +0000 UTC m=+0.091230103 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 09:51:15 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:15 np0005548789.localdomain podman[247785]: 2025-12-06 09:51:15.966132607 +0000 UTC m=+0.124188425 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 09:51:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:51:16 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:16 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:51:16 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:16 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:16 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:16.969 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=556 DF PROTO=TCP SPT=35250 DPT=9101 SEQ=1771142700 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DF212F0000000001030307) 
Dec 06 09:51:17 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:17 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:19 np0005548789.localdomain sudo[247806]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:51:19 np0005548789.localdomain sudo[247806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:51:19 np0005548789.localdomain sudo[247806]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:19 np0005548789.localdomain sudo[247824]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:51:19 np0005548789.localdomain sudo[247824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:51:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49750 DF PROTO=TCP SPT=42680 DPT=9105 SEQ=3580060358 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DF2A700000000001030307) 
Dec 06 09:51:19 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:51:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-1c3c28666d804509c2b20602368c8cd77799587988a2d1a1789a75cb16c60a47-merged.mount: Deactivated successfully.
Dec 06 09:51:20 np0005548789.localdomain sudo[247824]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:20 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:20.635 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:51:20 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:20 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:20 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:20 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:20 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:20 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:21 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:51:21 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:21 np0005548789.localdomain podman[247875]: 2025-12-06 09:51:21.465070338 +0000 UTC m=+0.093039341 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:51:21 np0005548789.localdomain podman[247875]: 2025-12-06 09:51:21.470286838 +0000 UTC m=+0.098255811 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 09:51:21 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:21.972 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:51:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-55528a480405885214081d53ccb18f2317128ddef63aa2fac6c624569bda37fd-merged.mount: Deactivated successfully.
Dec 06 09:51:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-55528a480405885214081d53ccb18f2317128ddef63aa2fac6c624569bda37fd-merged.mount: Deactivated successfully.
Dec 06 09:51:23 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:51:23 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:23 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:23 np0005548789.localdomain sudo[247893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:51:23 np0005548789.localdomain sudo[247893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:51:23 np0005548789.localdomain sudo[247893]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49751 DF PROTO=TCP SPT=42680 DPT=9105 SEQ=3580060358 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DF3A2F0000000001030307) 
Dec 06 09:51:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:51:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:51:24 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:24 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2b1be86c3700ae830e3281535879d3e256788ac5e6b2a708dc79f8a857d5536b-merged.mount: Deactivated successfully.
Dec 06 09:51:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:51:25 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:25 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:25 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:25 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:25.676 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7937 DF PROTO=TCP SPT=33454 DPT=9882 SEQ=3339928409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DF41EF0000000001030307) 
Dec 06 09:51:26 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:26 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:26 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:51:26 np0005548789.localdomain podman[247911]: 2025-12-06 09:51:26.180851728 +0000 UTC m=+0.088023018 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:51:26 np0005548789.localdomain podman[247911]: 2025-12-06 09:51:26.191984488 +0000 UTC m=+0.099155778 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:51:26 np0005548789.localdomain podman[247911]: unhealthy
Dec 06 09:51:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:27 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:27.001 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:51:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:51:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:51:27 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:51:27 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Failed with result 'exit-code'.
Dec 06 09:51:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 06 09:51:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8-merged.mount: Deactivated successfully.
Dec 06 09:51:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-39b7e1aa52a8c5999fa4a60374f7871a9f2a511ae7fd70de9cb8a012d08ea2a8-merged.mount: Deactivated successfully.
Dec 06 09:51:29 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:51:29 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:51:29 np0005548789.localdomain sshd[231235]: Received disconnect from 192.168.122.30 port 34622:11: disconnected by user
Dec 06 09:51:29 np0005548789.localdomain sshd[231235]: Disconnected from user zuul 192.168.122.30 port 34622
Dec 06 09:51:29 np0005548789.localdomain sshd[231232]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:51:29 np0005548789.localdomain systemd[1]: session-56.scope: Deactivated successfully.
Dec 06 09:51:29 np0005548789.localdomain systemd[1]: session-56.scope: Consumed 1min 13.794s CPU time.
Dec 06 09:51:29 np0005548789.localdomain systemd-logind[766]: Session 56 logged out. Waiting for processes to exit.
Dec 06 09:51:29 np0005548789.localdomain systemd-logind[766]: Removed session 56.
Dec 06 09:51:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=558 DF PROTO=TCP SPT=35250 DPT=9101 SEQ=1771142700 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DF51EF0000000001030307) 
Dec 06 09:51:30 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:51:30 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:30 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:30 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:30.719 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:51:31 np0005548789.localdomain sshd[245358]: fatal: Timeout before authentication for 14.103.123.73 port 14026
Dec 06 09:51:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:31 np0005548789.localdomain podman[247934]: 2025-12-06 09:51:31.297920719 +0000 UTC m=+0.089752751 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 09:51:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:31 np0005548789.localdomain podman[247934]: 2025-12-06 09:51:31.336342001 +0000 UTC m=+0.128174033 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.schema-version=1.0)
Dec 06 09:51:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49752 DF PROTO=TCP SPT=42680 DPT=9105 SEQ=3580060358 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DF59F00000000001030307) 
Dec 06 09:51:32 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:32.032 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:51:32 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:51:32 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c7edd91eaf927f0cc6c745dda6c529d67c09cf793c73be3335bece938eeb713d-merged.mount: Deactivated successfully.
Dec 06 09:51:32 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:51:32 np0005548789.localdomain podman[247951]: 2025-12-06 09:51:32.640799343 +0000 UTC m=+0.558551702 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.6)
Dec 06 09:51:32 np0005548789.localdomain podman[247951]: 2025-12-06 09:51:32.65607423 +0000 UTC m=+0.573826609 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Dec 06 09:51:33 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:51:33 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Dec 06 09:51:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully.
Dec 06 09:51:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ef5a2dfca972201470637fe24151a27f03799cbb5a942d988431f305c9ea334c-merged.mount: Deactivated successfully.
Dec 06 09:51:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42420 DF PROTO=TCP SPT=59776 DPT=9102 SEQ=3036102195 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DF64F00000000001030307) 
Dec 06 09:51:35 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:35.750 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:51:35 np0005548789.localdomain podman[247973]: 2025-12-06 09:51:35.893829279 +0000 UTC m=+0.089760281 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 06 09:51:35 np0005548789.localdomain podman[247973]: 2025-12-06 09:51:35.930188559 +0000 UTC m=+0.126119591 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Dec 06 09:51:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:51:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:51:37 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:37.038 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:51:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:37 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:37 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:51:37 np0005548789.localdomain podman[247990]: 2025-12-06 09:51:37.639661446 +0000 UTC m=+0.394943598 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:51:37 np0005548789.localdomain podman[247990]: 2025-12-06 09:51:37.673006413 +0000 UTC m=+0.428288515 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:51:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:39 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17317 DF PROTO=TCP SPT=34306 DPT=9882 SEQ=4063139873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DF77F00000000001030307) 
Dec 06 09:51:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:39 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:39 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:39 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:51:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:40 np0005548789.localdomain podman[241090]: time="2025-12-06T09:51:40Z" level=error msg="Getting root fs size for \"b995fb7c95a2e9bcba8263167f958b9d9a1aa5d19b80b46a281a635d52d8c08d\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Dec 06 09:51:40 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:40 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42422 DF PROTO=TCP SPT=59776 DPT=9102 SEQ=3036102195 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DF7CB00000000001030307) 
Dec 06 09:51:40 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:40.792 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:42 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:42.071 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-761acef0a8624aecb4fcf3ced8fe890c9e02485b13878d736631aa63ee8f2874-merged.mount: Deactivated successfully.
Dec 06 09:51:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:51:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-55528a480405885214081d53ccb18f2317128ddef63aa2fac6c624569bda37fd-merged.mount: Deactivated successfully.
Dec 06 09:51:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 06 09:51:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 06 09:51:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64808 DF PROTO=TCP SPT=45624 DPT=9101 SEQ=1152627846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DF8A600000000001030307) 
Dec 06 09:51:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:51:45 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:51:45 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:45.794 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:45 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 06 09:51:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:51:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:51:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:51:46 np0005548789.localdomain podman[248013]: 2025-12-06 09:51:46.902053512 +0000 UTC m=+0.069921146 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 09:51:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:51:46 np0005548789.localdomain podman[248013]: 2025-12-06 09:51:46.982262329 +0000 UTC m=+0.150129943 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Dec 06 09:51:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 06 09:51:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5d82191509656bbf6f64f1f50570f9d09f17aadb036e941dc9fdbfc1b9557da8-merged.mount: Deactivated successfully.
Dec 06 09:51:47 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:47.074 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:51:47.280 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:51:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:51:47.281 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:51:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:51:47.282 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:51:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64810 DF PROTO=TCP SPT=45624 DPT=9101 SEQ=1152627846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DF966F0000000001030307) 
Dec 06 09:51:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 06 09:51:48 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:51:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:51:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47232 DF PROTO=TCP SPT=51018 DPT=9105 SEQ=1215328337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DF9F6F0000000001030307) 
Dec 06 09:51:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:51:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:51:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:51:50 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:50.839 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:51:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:51:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:52.112 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:52 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:52 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:51:53 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:53 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:53 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:53 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:53 np0005548789.localdomain podman[248036]: 2025-12-06 09:51:53.661253052 +0000 UTC m=+0.074600269 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:51:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:53 np0005548789.localdomain podman[248036]: 2025-12-06 09:51:53.693116524 +0000 UTC m=+0.106463721 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 06 09:51:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47233 DF PROTO=TCP SPT=51018 DPT=9105 SEQ=1215328337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DFAF2F0000000001030307) 
Dec 06 09:51:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:54 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:54.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:51:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:51:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c7edd91eaf927f0cc6c745dda6c529d67c09cf793c73be3335bece938eeb713d-merged.mount: Deactivated successfully.
Dec 06 09:51:55 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:51:55 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:55 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:51:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:55.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:55.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:55.501 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:51:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:51:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:51:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:55.875 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17318 DF PROTO=TCP SPT=34306 DPT=9882 SEQ=4063139873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DFB7EF0000000001030307) 
Dec 06 09:51:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:56.496 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:56.497 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:57.156 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:51:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:57.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:57.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:57.533 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:51:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:57.534 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:51:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:57.534 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:51:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:57.534 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:51:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:57.534 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:51:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:51:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:51:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:51:57 np0005548789.localdomain systemd[1]: tmp-crun.8OXsCK.mount: Deactivated successfully.
Dec 06 09:51:57 np0005548789.localdomain podman[248074]: 2025-12-06 09:51:57.932952026 +0000 UTC m=+0.091653810 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:51:57 np0005548789.localdomain podman[248074]: 2025-12-06 09:51:57.965912711 +0000 UTC m=+0.124614495 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:51:57 np0005548789.localdomain podman[248074]: unhealthy
Dec 06 09:51:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:57.993 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.065 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.065 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.236 230888 WARNING nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.238 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=12259MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.239 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.239 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.325 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.325 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.325 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.369 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:51:58 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:51:58 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ef97a8ce410352459d2cd2d839f0c4f3d007fd27d6a886085c43dfe3ff9df394-merged.mount: Deactivated successfully.
Dec 06 09:51:58 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:51:58 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Failed with result 'exit-code'.
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.894 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.903 230888 DEBUG nova.compute.provider_tree [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.925 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.929 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:51:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:58.930 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:51:58 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Dec 06 09:51:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64812 DF PROTO=TCP SPT=45624 DPT=9101 SEQ=1152627846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DFC5EF0000000001030307) 
Dec 06 09:51:59 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Dec 06 09:51:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:59.931 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:51:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:59.932 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:51:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:51:59.932 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:52:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:52:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:52:00 np0005548789.localdomain sshd[248121]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:00.276 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:52:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:00.276 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:52:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:00.276 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 09:52:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:00.277 230888 DEBUG nova.objects.instance [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:52:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:00.731 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 09:52:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:00.747 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:52:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:00.747 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 09:52:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:00.748 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:00.748 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:00.928 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:52:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:52:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:02 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47234 DF PROTO=TCP SPT=51018 DPT=9105 SEQ=1215328337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DFCFEF0000000001030307) 
Dec 06 09:52:02 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:02.197 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:02 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:02 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:02 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:02 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:02 np0005548789.localdomain sshd[248123]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:52:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:02 np0005548789.localdomain podman[248124]: 2025-12-06 09:52:02.851409823 +0000 UTC m=+0.077345662 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible)
Dec 06 09:52:02 np0005548789.localdomain podman[248124]: 2025-12-06 09:52:02.860623044 +0000 UTC m=+0.086558803 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 06 09:52:03 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:52:03 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:03 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:52:03 np0005548789.localdomain podman[248143]: 2025-12-06 09:52:03.317072948 +0000 UTC m=+0.063895241 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, version=9.6, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Dec 06 09:52:03 np0005548789.localdomain podman[248143]: 2025-12-06 09:52:03.35707083 +0000 UTC m=+0.103893133 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-type=git, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 06 09:52:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:52:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-30c5044896505b9166e77885065d3af47cd1d5cde049e01332c2cc6c18ba5026-merged.mount: Deactivated successfully.
Dec 06 09:52:04 np0005548789.localdomain sshd[248163]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50212 DF PROTO=TCP SPT=55900 DPT=9102 SEQ=2677644358 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DFDA2F0000000001030307) 
Dec 06 09:52:05 np0005548789.localdomain sshd[248163]: Received disconnect from 64.227.102.57 port 41472:11: Bye Bye [preauth]
Dec 06 09:52:05 np0005548789.localdomain sshd[248163]: Disconnected from authenticating user root 64.227.102.57 port 41472 [preauth]
Dec 06 09:52:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:05 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:52:05 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:05.956 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:07 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:07.233 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:07 np0005548789.localdomain sshd[248165]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:52:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5448 DF PROTO=TCP SPT=36416 DPT=9102 SEQ=456130462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DFE5EF0000000001030307) 
Dec 06 09:52:07 np0005548789.localdomain podman[248167]: 2025-12-06 09:52:07.755081679 +0000 UTC m=+0.074470235 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 06 09:52:07 np0005548789.localdomain podman[248167]: 2025-12-06 09:52:07.775194923 +0000 UTC m=+0.094583479 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:52:08 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:52:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 06 09:52:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-761acef0a8624aecb4fcf3ced8fe890c9e02485b13878d736631aa63ee8f2874-merged.mount: Deactivated successfully.
Dec 06 09:52:09 np0005548789.localdomain sshd[248165]: Received disconnect from 14.194.101.210 port 46712:11: Bye Bye [preauth]
Dec 06 09:52:09 np0005548789.localdomain sshd[248165]: Disconnected from authenticating user root 14.194.101.210 port 46712 [preauth]
Dec 06 09:52:09 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:09 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:52:09 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 06 09:52:09 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 06 09:52:09 np0005548789.localdomain podman[248184]: 2025-12-06 09:52:09.920123602 +0000 UTC m=+0.079902510 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:52:09 np0005548789.localdomain podman[248184]: 2025-12-06 09:52:09.956226784 +0000 UTC m=+0.116005732 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:52:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50214 DF PROTO=TCP SPT=55900 DPT=9102 SEQ=2677644358 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DFF1F00000000001030307) 
Dec 06 09:52:10 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:10.988 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:52:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 06 09:52:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4b9c41fe9442d39f0f731cbd431e2ad53f3df5a873cab9bbccc810ab289d4d69-merged.mount: Deactivated successfully.
Dec 06 09:52:11 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:52:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:11 np0005548789.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:11 np0005548789.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:12 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:12.273 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:12 np0005548789.localdomain sshd[248121]: ssh_dispatch_run_fatal: Connection from 120.48.175.241 port 51192: Connection timed out [preauth]
Dec 06 09:52:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60-merged.mount: Deactivated successfully.
Dec 06 09:52:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa-merged.mount: Deactivated successfully.
Dec 06 09:52:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:52:12 np0005548789.localdomain sshd[248123]: error: kex_exchange_identification: read: Connection timed out
Dec 06 09:52:12 np0005548789.localdomain sshd[248123]: banner exchange: Connection from 114.80.34.158 port 58352: Connection timed out
Dec 06 09:52:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9-merged.mount: Deactivated successfully.
Dec 06 09:52:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:14 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2411 DF PROTO=TCP SPT=52028 DPT=9101 SEQ=2118355275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52DFFF900000000001030307) 
Dec 06 09:52:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-9893e3b6825fe8589fe9eca74d23476479efd735e739d26cb203759d2b267e35-merged.mount: Deactivated successfully.
Dec 06 09:52:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ea63802099ebb85258cb7d2a1bbd57ddeec51406b466437719c2fc7b376d5b79-merged.mount: Deactivated successfully.
Dec 06 09:52:14 np0005548789.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:14 np0005548789.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:14 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:15 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:15 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:15.991 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 06 09:52:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 06 09:52:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:17 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:17.325 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:17 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2413 DF PROTO=TCP SPT=52028 DPT=9101 SEQ=2118355275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E00BAF0000000001030307) 
Dec 06 09:52:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:52:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:18 np0005548789.localdomain podman[248207]: 2025-12-06 09:52:18.548543655 +0000 UTC m=+0.075396833 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:52:18 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:18 np0005548789.localdomain podman[248207]: 2025-12-06 09:52:18.582195212 +0000 UTC m=+0.109048400 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:52:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:19 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:19 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:19 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:52:19 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:19 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=663 DF PROTO=TCP SPT=36852 DPT=9105 SEQ=4223212149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E014AF0000000001030307) 
Dec 06 09:52:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 06 09:52:21 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:21.028 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:21 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2c8811abddbaacb6f32387bcfbf857078338be762178d55f30b77fc76cc9dc19-merged.mount: Deactivated successfully.
Dec 06 09:52:21 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:21 np0005548789.localdomain sshd[248233]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:52:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:52:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:52:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:22.358 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:22 np0005548789.localdomain sshd[248233]: Received disconnect from 118.219.234.233 port 60708:11: Bye Bye [preauth]
Dec 06 09:52:22 np0005548789.localdomain sshd[248233]: Disconnected from authenticating user root 118.219.234.233 port 60708 [preauth]
Dec 06 09:52:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:52:23 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=664 DF PROTO=TCP SPT=36852 DPT=9105 SEQ=4223212149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E0246F0000000001030307) 
Dec 06 09:52:23 np0005548789.localdomain sudo[248235]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:52:23 np0005548789.localdomain sudo[248235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:52:23 np0005548789.localdomain sudo[248235]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:23 np0005548789.localdomain sudo[248253]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:52:23 np0005548789.localdomain sudo[248253]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:52:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ef97a8ce410352459d2cd2d839f0c4f3d007fd27d6a886085c43dfe3ff9df394-merged.mount: Deactivated successfully.
Dec 06 09:52:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Dec 06 09:52:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Dec 06 09:52:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:52:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:52:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:52:25 np0005548789.localdomain podman[248286]: 2025-12-06 09:52:25.364944461 +0000 UTC m=+0.163000937 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:52:25 np0005548789.localdomain podman[248286]: 2025-12-06 09:52:25.398266609 +0000 UTC m=+0.196323105 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:52:25 np0005548789.localdomain sudo[248253]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:25 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21472 DF PROTO=TCP SPT=43446 DPT=9882 SEQ=3741622451 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E02BF00000000001030307) 
Dec 06 09:52:26 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:26.075 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:26 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:52:27 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:27.389 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:27 np0005548789.localdomain sudo[248323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:52:27 np0005548789.localdomain sudo[248323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:52:27 np0005548789.localdomain sudo[248323]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:52:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:52:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:52:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 06 09:52:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:52:28 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:28 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:29 np0005548789.localdomain podman[248341]: 2025-12-06 09:52:29.071495123 +0000 UTC m=+0.085571154 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:52:29 np0005548789.localdomain podman[248341]: 2025-12-06 09:52:29.081464356 +0000 UTC m=+0.095540407 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:52:29 np0005548789.localdomain podman[248341]: unhealthy
Dec 06 09:52:29 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:29 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2415 DF PROTO=TCP SPT=52028 DPT=9101 SEQ=2118355275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E03BF00000000001030307) 
Dec 06 09:52:29 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:30 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:52:30 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e232d99afeeb95c94065c4aa6c90831e0f37d94aede849daf1e3af8b69b5b465-merged.mount: Deactivated successfully.
Dec 06 09:52:30 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:52:30 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Failed with result 'exit-code'.
Dec 06 09:52:31 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:31.115 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:52:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:52:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:52:31 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=665 DF PROTO=TCP SPT=36852 DPT=9105 SEQ=4223212149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E043EF0000000001030307) 
Dec 06 09:52:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 06 09:52:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-30c5044896505b9166e77885065d3af47cd1d5cde049e01332c2cc6c18ba5026-merged.mount: Deactivated successfully.
Dec 06 09:52:32 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:32.433 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:32 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:52:33 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:52:33 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:52:33 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:52:33 np0005548789.localdomain podman[248362]: 2025-12-06 09:52:33.5863815 +0000 UTC m=+0.126940217 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:52:33 np0005548789.localdomain podman[248362]: 2025-12-06 09:52:33.599209061 +0000 UTC m=+0.139767778 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:52:34 np0005548789.localdomain sshd[248378]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:34 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:52:34 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:34 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47989 DF PROTO=TCP SPT=39156 DPT=9102 SEQ=1060486510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E04F6F0000000001030307) 
Dec 06 09:52:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:52:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-93af1cca3c5bc2fd6a62da9c5b24ae5188daa60c95e07f180cd96e6bf14ae456-merged.mount: Deactivated successfully.
Dec 06 09:52:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-93af1cca3c5bc2fd6a62da9c5b24ae5188daa60c95e07f180cd96e6bf14ae456-merged.mount: Deactivated successfully.
Dec 06 09:52:35 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:36 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:36.154 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:52:36 np0005548789.localdomain podman[248380]: 2025-12-06 09:52:36.399437685 +0000 UTC m=+0.061542109 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=)
Dec 06 09:52:36 np0005548789.localdomain podman[248380]: 2025-12-06 09:52:36.413115413 +0000 UTC m=+0.075219837 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64)
Dec 06 09:52:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:37 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:37.471 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42425 DF PROTO=TCP SPT=59776 DPT=9102 SEQ=3036102195 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E05BEF0000000001030307) 
Dec 06 09:52:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:38 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:38 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:52:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:52:38 np0005548789.localdomain podman[248400]: 2025-12-06 09:52:38.67594225 +0000 UTC m=+0.053439682 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:52:38 np0005548789.localdomain podman[248400]: 2025-12-06 09:52:38.683547743 +0000 UTC m=+0.061045205 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 09:52:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:39 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:52:40 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:40 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:40 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:52:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47991 DF PROTO=TCP SPT=39156 DPT=9102 SEQ=1060486510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E067300000000001030307) 
Dec 06 09:52:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:40 np0005548789.localdomain podman[241090]: time="2025-12-06T09:52:40Z" level=error msg="Getting root fs size for \"e444006757a84d45c953d1ef31bc530b8507f3f86e52f8ba7761eaf744cfae6a\": getting diffsize of layer \"efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf\" and its parent \"c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6\": unmounting layer efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf: replacing mount point \"/var/lib/containers/storage/overlay/efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf/merged\": device or resource busy"
Dec 06 09:52:40 np0005548789.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:40 np0005548789.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:40 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:40 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:41 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:41.189 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:41 np0005548789.localdomain sshd[248419]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:52:41 np0005548789.localdomain podman[248421]: 2025-12-06 09:52:41.681998587 +0000 UTC m=+0.092968149 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:52:41 np0005548789.localdomain podman[248421]: 2025-12-06 09:52:41.71516959 +0000 UTC m=+0.126139132 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:52:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:42 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:42.493 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:42 np0005548789.localdomain sshd[248419]: Invalid user anonymous from 45.135.232.92 port 53102
Dec 06 09:52:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:43 np0005548789.localdomain sshd[248419]: Connection reset by invalid user anonymous 45.135.232.92 port 53102 [preauth]
Dec 06 09:52:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-9893e3b6825fe8589fe9eca74d23476479efd735e739d26cb203759d2b267e35-merged.mount: Deactivated successfully.
Dec 06 09:52:43 np0005548789.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:43 np0005548789.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:43 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:52:43 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:43 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:43 np0005548789.localdomain sshd[248442]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:44 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29155 DF PROTO=TCP SPT=45346 DPT=9101 SEQ=3498639843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E074C00000000001030307) 
Dec 06 09:52:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:45 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 06 09:52:45 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 06 09:52:45 np0005548789.localdomain sshd[248444]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:45 np0005548789.localdomain sshd[248446]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:45 np0005548789.localdomain sshd[248446]: Accepted publickey for zuul from 192.168.122.30 port 51394 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:52:45 np0005548789.localdomain systemd-logind[766]: New session 57 of user zuul.
Dec 06 09:52:45 np0005548789.localdomain systemd[1]: Started Session 57 of User zuul.
Dec 06 09:52:45 np0005548789.localdomain sshd[248446]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:52:46 np0005548789.localdomain sudo[248540]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yopqxegertefpxgzneljlyrcdeajtyur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014765.913921-3036-242435743087170/AnsiballZ_file.py
Dec 06 09:52:46 np0005548789.localdomain sudo[248540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:46 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:46.236 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:46 np0005548789.localdomain python3.9[248542]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:46 np0005548789.localdomain sudo[248540]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:52:46 np0005548789.localdomain sshd[248444]: Received disconnect from 154.113.10.34 port 60432:11: Bye Bye [preauth]
Dec 06 09:52:46 np0005548789.localdomain sshd[248444]: Disconnected from authenticating user root 154.113.10.34 port 60432 [preauth]
Dec 06 09:52:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:46 np0005548789.localdomain kernel: overlayfs: lowerdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Dec 06 09:52:46 np0005548789.localdomain podman[241090]: time="2025-12-06T09:52:46Z" level=error msg="Unable to write json: \"write unix /run/podman/podman.sock->@: write: broken pipe\""
Dec 06 09:52:46 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:47:32 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 4096 "" "Go-http-client/1.1"
Dec 06 09:52:46 np0005548789.localdomain sshd[248442]: Connection reset by authenticating user root 45.135.232.92 port 53108 [preauth]
Dec 06 09:52:46 np0005548789.localdomain sshd[248590]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:47 np0005548789.localdomain sudo[248651]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnlviubqtufbvvcmfkywduksmqbkiitq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014766.6536703-3063-25062690318884/AnsiballZ_stat.py
Dec 06 09:52:47 np0005548789.localdomain sudo[248651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:47 np0005548789.localdomain python3.9[248653]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:52:47 np0005548789.localdomain sudo[248651]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:52:47.281 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:52:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:52:47.281 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:52:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:52:47.283 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:52:47 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29157 DF PROTO=TCP SPT=45346 DPT=9101 SEQ=3498639843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E080AF0000000001030307) 
Dec 06 09:52:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-b65786436fb22482b0004739b377157c398689a3d99c4b69d96c34a55e2067d5-merged.mount: Deactivated successfully.
Dec 06 09:52:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:47 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:47.494 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:47 np0005548789.localdomain sudo[248740]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khrdmhwvcqadrwkmajkgqggufqadujjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014766.6536703-3063-25062690318884/AnsiballZ_copy.py
Dec 06 09:52:47 np0005548789.localdomain sudo[248740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:47 np0005548789.localdomain python3.9[248742]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014766.6536703-3063-25062690318884/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:47 np0005548789.localdomain sudo[248740]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:48 np0005548789.localdomain sudo[248850]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-auyvaroaowfpfnwhfsitofnkjaticoid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014768.3181663-3111-247024931490594/AnsiballZ_file.py
Dec 06 09:52:48 np0005548789.localdomain sudo[248850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:48 np0005548789.localdomain python3.9[248852]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:48 np0005548789.localdomain sudo[248850]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:49 np0005548789.localdomain sshd[248590]: Connection reset by authenticating user root 45.135.232.92 port 23392 [preauth]
Dec 06 09:52:49 np0005548789.localdomain sshd[248941]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:52:49 np0005548789.localdomain sudo[248961]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gphqrjanaixtrrgqxgjuxgekuwfkiqol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014769.0587957-3135-218845960526434/AnsiballZ_stat.py
Dec 06 09:52:49 np0005548789.localdomain sudo[248961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:49 np0005548789.localdomain podman[248963]: 2025-12-06 09:52:49.417621156 +0000 UTC m=+0.074537867 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:52:49 np0005548789.localdomain podman[248963]: 2025-12-06 09:52:49.454999927 +0000 UTC m=+0.111916658 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:52:49 np0005548789.localdomain python3.9[248969]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:52:49 np0005548789.localdomain sudo[248961]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43056 DF PROTO=TCP SPT=50928 DPT=9105 SEQ=1194409489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E089F00000000001030307) 
Dec 06 09:52:49 np0005548789.localdomain sudo[249044]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xskggrbynvdaehbnxjmbprhrhvlvidmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014769.0587957-3135-218845960526434/AnsiballZ_file.py
Dec 06 09:52:49 np0005548789.localdomain sudo[249044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:49 np0005548789.localdomain python3.9[249046]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:50 np0005548789.localdomain sudo[249044]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d39fd500dccd0614704d889eaaf9068fe2575a3bb203d70cd1f6b19969ae7a25-merged.mount: Deactivated successfully.
Dec 06 09:52:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2c8811abddbaacb6f32387bcfbf857078338be762178d55f30b77fc76cc9dc19-merged.mount: Deactivated successfully.
Dec 06 09:52:50 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:52:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2c8811abddbaacb6f32387bcfbf857078338be762178d55f30b77fc76cc9dc19-merged.mount: Deactivated successfully.
Dec 06 09:52:50 np0005548789.localdomain sudo[249154]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arzxcaqtxoximwdevhqcnboikejncnuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014770.3231823-3171-255908316444336/AnsiballZ_stat.py
Dec 06 09:52:50 np0005548789.localdomain sudo[249154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:50 np0005548789.localdomain python3.9[249156]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:52:50 np0005548789.localdomain sudo[249154]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:50 np0005548789.localdomain sshd[248941]: Invalid user admin1 from 45.135.232.92 port 23406
Dec 06 09:52:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:52:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:52:51 np0005548789.localdomain sudo[249211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vblwyifwfznkoelhlhbdpbjffedgbnwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014770.3231823-3171-255908316444336/AnsiballZ_file.py
Dec 06 09:52:51 np0005548789.localdomain sudo[249211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:51 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:51.274 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:51 np0005548789.localdomain sshd[248941]: Connection reset by invalid user admin1 45.135.232.92 port 23406 [preauth]
Dec 06 09:52:51 np0005548789.localdomain python3.9[249213]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.rdwz6bqo recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:51 np0005548789.localdomain sudo[249211]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:52:51 np0005548789.localdomain sshd[249231]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:52:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:52:51 np0005548789.localdomain sudo[249323]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-niejcdefpqyfkgmetqqkrfmcydgkxabm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014771.5353122-3207-107632862498007/AnsiballZ_stat.py
Dec 06 09:52:51 np0005548789.localdomain sudo[249323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:52 np0005548789.localdomain python3.9[249325]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:52:52 np0005548789.localdomain sudo[249323]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:52 np0005548789.localdomain sudo[249380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbkidqsbcqhixmnfzfnmwkqzomasgouu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014771.5353122-3207-107632862498007/AnsiballZ_file.py
Dec 06 09:52:52 np0005548789.localdomain sudo[249380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:52 np0005548789.localdomain python3.9[249382]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:52 np0005548789.localdomain sudo[249380]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:52:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:52.538 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:52 np0005548789.localdomain sshd[249231]: Invalid user user from 45.135.232.92 port 23412
Dec 06 09:52:53 np0005548789.localdomain sshd[249231]: Connection reset by invalid user user 45.135.232.92 port 23412 [preauth]
Dec 06 09:52:53 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43057 DF PROTO=TCP SPT=50928 DPT=9105 SEQ=1194409489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E099B00000000001030307) 
Dec 06 09:52:53 np0005548789.localdomain sudo[249490]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxbbbbrywmrcfwgtcffcyylmuynaxyna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014772.7865164-3246-60219126334064/AnsiballZ_command.py
Dec 06 09:52:53 np0005548789.localdomain sudo[249490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:54 np0005548789.localdomain python3.9[249492]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:52:54 np0005548789.localdomain sudo[249490]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:52:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:52:54 np0005548789.localdomain sudo[249601]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bntccemyyhidaswfqspupdwqnqnsuaqk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014774.3075247-3270-3896832867017/AnsiballZ_edpm_nftables_from_files.py
Dec 06 09:52:54 np0005548789.localdomain sudo[249601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:54 np0005548789.localdomain python3[249603]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 06 09:52:54 np0005548789.localdomain sudo[249601]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:55.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:55.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:55.501 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:52:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully.
Dec 06 09:52:55 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54637 DF PROTO=TCP SPT=48354 DPT=9882 SEQ=605703276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E0A1EF0000000001030307) 
Dec 06 09:52:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:56.311 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:56.496 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:56.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:56 np0005548789.localdomain sshd[249659]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:52:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:52:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:56 np0005548789.localdomain sudo[249713]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adnlwkehcpkvlfuzngqfejiqptvdglio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014775.26292-3294-77212559425076/AnsiballZ_stat.py
Dec 06 09:52:56 np0005548789.localdomain sudo[249713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:52:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:52:57 np0005548789.localdomain podman[249716]: 2025-12-06 09:52:57.001405298 +0000 UTC m=+0.079984033 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 09:52:57 np0005548789.localdomain podman[249716]: 2025-12-06 09:52:57.010422934 +0000 UTC m=+0.089001649 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 09:52:57 np0005548789.localdomain python3.9[249715]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:52:57 np0005548789.localdomain sudo[249713]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:57 np0005548789.localdomain sudo[249788]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txwvscvrfesbnybghjgyvtdjgxvhqavr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014775.26292-3294-77212559425076/AnsiballZ_file.py
Dec 06 09:52:57 np0005548789.localdomain sudo[249788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:57 np0005548789.localdomain python3.9[249792]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:57 np0005548789.localdomain sudo[249788]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:57.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:52:57 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:52:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:57.525 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:52:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:57.525 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:52:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:57.526 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:52:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:57.526 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:52:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:57.526 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:52:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:57.584 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:52:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:52:58 np0005548789.localdomain sudo[249920]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iomqdicdtbznkszghelechbmmjldmdsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014777.7148077-3330-192025668780229/AnsiballZ_stat.py
Dec 06 09:52:58 np0005548789.localdomain sudo[249920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:58.049 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:52:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:58.113 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:52:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:58.114 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:52:58 np0005548789.localdomain python3.9[249922]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:52:58 np0005548789.localdomain sudo[249920]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:58.348 230888 WARNING nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:52:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:58.349 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=12357MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:52:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:58.350 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:52:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:58.350 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:52:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:58.459 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:52:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:58.460 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:52:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:58.460 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:52:58 np0005548789.localdomain sudo[249979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdqzbjhlggbtziinngojjqbbkvtnzcbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014777.7148077-3330-192025668780229/AnsiballZ_file.py
Dec 06 09:52:58 np0005548789.localdomain sudo[249979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:58.508 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:52:58 np0005548789.localdomain python3.9[249981]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:58 np0005548789.localdomain sudo[249979]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:58.988 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:52:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:58.994 230888 DEBUG nova.compute.provider_tree [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:52:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:59.023 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:52:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:59.026 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:52:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:52:59.026 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:52:59 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 06 09:52:59 np0005548789.localdomain sudo[250111]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ioikqakeurchgxkbbcilbxmcfpldhdym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014779.020825-3366-180844039200617/AnsiballZ_stat.py
Dec 06 09:52:59 np0005548789.localdomain sudo[250111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:59 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e232d99afeeb95c94065c4aa6c90831e0f37d94aede849daf1e3af8b69b5b465-merged.mount: Deactivated successfully.
Dec 06 09:52:59 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29159 DF PROTO=TCP SPT=45346 DPT=9101 SEQ=3498639843 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E0AFEF0000000001030307) 
Dec 06 09:52:59 np0005548789.localdomain python3.9[250113]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:52:59 np0005548789.localdomain sudo[250111]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:59 np0005548789.localdomain sudo[250168]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjqkdgmdvjfblgdfiglejjddaxkllqce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014779.020825-3366-180844039200617/AnsiballZ_file.py
Dec 06 09:52:59 np0005548789.localdomain sudo[250168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:00.026 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:00.027 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:53:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:00.027 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:53:00 np0005548789.localdomain python3.9[250170]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:00 np0005548789.localdomain sudo[250168]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:53:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:53:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:00.300 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:53:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:00.300 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:53:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:00.301 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 09:53:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:00.301 230888 DEBUG nova.objects.instance [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:53:00 np0005548789.localdomain sshd[249659]: Received disconnect from 179.33.210.213 port 47712:11: Bye Bye [preauth]
Dec 06 09:53:00 np0005548789.localdomain sshd[249659]: Disconnected from authenticating user root 179.33.210.213 port 47712 [preauth]
Dec 06 09:53:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:53:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:53:00 np0005548789.localdomain podman[250246]: 2025-12-06 09:53:00.637940852 +0000 UTC m=+0.086650956 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:53:00 np0005548789.localdomain sudo[250295]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwsrowaixkyzqtejwwrxrllgzpujbrip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014780.3503091-3402-37576116419873/AnsiballZ_stat.py
Dec 06 09:53:00 np0005548789.localdomain sudo[250295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:00 np0005548789.localdomain podman[250246]: 2025-12-06 09:53:00.676155279 +0000 UTC m=+0.124865413 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:53:00 np0005548789.localdomain podman[250246]: unhealthy
Dec 06 09:53:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:00.682 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 09:53:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:00.701 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:53:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:00.702 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 09:53:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:00.702 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:00 np0005548789.localdomain python3.9[250304]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:00 np0005548789.localdomain sudo[250295]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:01 np0005548789.localdomain sudo[250359]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyjrtmfdzcohzlvcgdrxiivkdslfxapz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014780.3503091-3402-37576116419873/AnsiballZ_file.py
Dec 06 09:53:01 np0005548789.localdomain sudo[250359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4470c8636ef8d59ecd85925ad81ff603b150c7b82e82b0e5d5ff653ec51e0d36-merged.mount: Deactivated successfully.
Dec 06 09:53:01 np0005548789.localdomain python3.9[250361]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:01 np0005548789.localdomain sudo[250359]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:01.346 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:01.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:01.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:01 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-48fc1270cbb31781d8896eae0014e3b5a5e48738fd6cff2aa76953f22a08ee71-merged.mount: Deactivated successfully.
Dec 06 09:53:01 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:53:01 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Failed with result 'exit-code'.
Dec 06 09:53:02 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43058 DF PROTO=TCP SPT=50928 DPT=9105 SEQ=1194409489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E0B9EF0000000001030307) 
Dec 06 09:53:02 np0005548789.localdomain sudo[250469]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-haqxuikvgpzwcbbgjtwyzcmtlomuyzdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014781.7375464-3438-93791276502329/AnsiballZ_stat.py
Dec 06 09:53:02 np0005548789.localdomain sudo[250469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:02 np0005548789.localdomain python3.9[250471]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:02 np0005548789.localdomain sudo[250469]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:53:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:53:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:53:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:02.621 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:02 np0005548789.localdomain sudo[250559]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbngqpscspcefrkrfolyjumjloqoqqxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014781.7375464-3438-93791276502329/AnsiballZ_copy.py
Dec 06 09:53:02 np0005548789.localdomain sudo[250559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:02 np0005548789.localdomain python3.9[250561]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014781.7375464-3438-93791276502329/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:02 np0005548789.localdomain sudo[250559]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:53:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:53:03 np0005548789.localdomain sudo[250669]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltektcmrosjhmdpztykkhgkupnzmaajx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014783.2922199-3483-50966586704866/AnsiballZ_file.py
Dec 06 09:53:03 np0005548789.localdomain sudo[250669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:53:03 np0005548789.localdomain python3.9[250671]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:03 np0005548789.localdomain sudo[250669]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:04 np0005548789.localdomain sudo[250779]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhgzevmgpdaroimcfcrgqxmbsawrltye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014784.0612755-3507-171333930603006/AnsiballZ_command.py
Dec 06 09:53:04 np0005548789.localdomain sudo[250779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 06 09:53:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-93af1cca3c5bc2fd6a62da9c5b24ae5188daa60c95e07f180cd96e6bf14ae456-merged.mount: Deactivated successfully.
Dec 06 09:53:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:53:04 np0005548789.localdomain systemd[1]: tmp-crun.Yt1Dad.mount: Deactivated successfully.
Dec 06 09:53:04 np0005548789.localdomain podman[250782]: 2025-12-06 09:53:04.542394575 +0000 UTC m=+0.062866340 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 09:53:04 np0005548789.localdomain podman[250782]: 2025-12-06 09:53:04.55401606 +0000 UTC m=+0.074487865 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:53:04 np0005548789.localdomain python3.9[250781]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:53:04 np0005548789.localdomain sudo[250779]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37408 DF PROTO=TCP SPT=39094 DPT=9102 SEQ=3143690640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E0C4B00000000001030307) 
Dec 06 09:53:05 np0005548789.localdomain sudo[250909]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vljjpshjtruptwlesndrigzujzyldkel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014784.7693725-3531-187319243992378/AnsiballZ_blockinfile.py
Dec 06 09:53:05 np0005548789.localdomain sudo[250909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:05 np0005548789.localdomain python3.9[250911]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:05 np0005548789.localdomain sudo[250909]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:06 np0005548789.localdomain sudo[251019]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyypyapepzinczbmjpyueqaezuikqmtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014785.7855194-3558-21171269061568/AnsiballZ_command.py
Dec 06 09:53:06 np0005548789.localdomain sudo[251019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:06 np0005548789.localdomain python3.9[251021]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:53:06 np0005548789.localdomain sudo[251019]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:06 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:06.388 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:53:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:53:06 np0005548789.localdomain sudo[251130]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwjdxjncnezkvtwyigqxwbdpmvcdkqsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014786.5502846-3582-267154619744805/AnsiballZ_stat.py
Dec 06 09:53:06 np0005548789.localdomain sudo[251130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:53:07 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:53:07 np0005548789.localdomain python3.9[251132]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:53:07 np0005548789.localdomain sudo[251130]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:07 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:07.649 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.910 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.915 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8061d5ec-d1e5-45b1-a90e-0faefae53400', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:53:07.911416', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '5e1353c4-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.160798147, 'message_signature': 'faa19256398da8f8cb2bf2c36fd154e5ca8ee92610084d64c42ad2268280b4f8'}]}, 'timestamp': '2025-12-06 09:53:07.916576', '_unique_id': 'ade0775e81a449498261f5dd6c50d1c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.918 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.919 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.935 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.935 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09c4ecf3-b4f5-4144-9b3c-86d65e45cc50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:53:07.919652', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e16458e-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.169034368, 'message_signature': '5c66b569918abec801a4428aa5d13e7ebc4be6d7da8e6e8b392ad58de4aa3e4d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:53:07.919652', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e165d6c-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.169034368, 'message_signature': '16e6ec5e6283f126e371c63c463f9da92271e4b66f314a2a841ddb57e61b6330'}]}, 'timestamp': '2025-12-06 09:53:07.936412', '_unique_id': 'ce2042e9ddc24825a26498bad2c3f90d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.937 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.939 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.939 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a98ea52b-8c83-4d7f-86b5-eabaf5d4523e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:53:07.939160', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '5e16dd28-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.160798147, 'message_signature': 'a7dcc098dee191de2c862095923c741e4b701ff3afe6843c98ae790f2d63b7fc'}]}, 'timestamp': '2025-12-06 09:53:07.939834', '_unique_id': '07a04ddb7abd40a79a980e30ec09a465'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.940 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.941 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.942 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.942 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 9015 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82e44c72-ecb1-4e59-9e27-447236dcf6d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9015, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:53:07.942341', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '5e175820-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.160798147, 'message_signature': '638186a0db1fefcafc864dee2eab05d5e9c9402df13d0a63fd9ab72b468e3b74'}]}, 'timestamp': '2025-12-06 09:53:07.942997', '_unique_id': 'c30085d4e62d40e8aec74b663532f3c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.944 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.945 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.969 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 52.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37cc8874-cf60-4404-a5a8-c298cf36ee56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.37890625, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T09:53:07.945270', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '5e1b900c-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.219094307, 'message_signature': '7bb93c9f360c910f215f95372518cdc077cca36fcce9c76b51cf91cac9bb2b66'}]}, 'timestamp': '2025-12-06 09:53:07.970525', '_unique_id': '1034e695a95e4d63bcbebc40592ea98d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.971 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:07.972 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.017 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.018 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6861ec8-fc40-4d0b-bf87-1961698b4647', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:53:07.973075', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e22e1f4-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.222432068, 'message_signature': '212d887b59f4af12a2120de39103b06aeeffdd61303b57f9278f0deb81988466'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:53:07.973075', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e22f9be-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.222432068, 'message_signature': 'c16816476b49f7997c66862678301712fd6fccdf26923445f02801c3db210523'}]}, 'timestamp': '2025-12-06 09:53:08.019119', '_unique_id': 'f338bdec00b44db188703c0de9f57d9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.022 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.022 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1a9658a-533b-4138-9c67-73eefe334dbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:53:08.022188', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '5e238708-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.160798147, 'message_signature': '503c2c9a44d4e5de657805673155d53ed0b478c848f56af565db811d40fced19'}]}, 'timestamp': '2025-12-06 09:53:08.022794', '_unique_id': '7abce80a8d05415194c71e9b0935c30c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.025 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.025 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.026 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f87fbd6f-f465-46f8-8b7e-af4efac6b1db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:53:08.025843', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e2417fe-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.222432068, 'message_signature': 'f2cd0bba075109db84522701e1d3a5f6c03a6093fbade9215bc7a6137a5bce64'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:53:08.025843', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e2429d8-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.222432068, 'message_signature': '82c2d70d9dc87bb60d974e4cd80c8670fecc63fd7442f388cf93652c914f6ab8'}]}, 'timestamp': '2025-12-06 09:53:08.026983', '_unique_id': 'eee4412cc020436e90b2d6cfad4c8310'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.028 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.029 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.029 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.029 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 52900000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11d2369a-b9ce-4079-b19a-aea4cabe5879', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 52900000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T09:53:08.029595', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '5e24ac50-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.219094307, 'message_signature': '15f4aa807f34852850c2e01329665220f598750d3d44dfd1f3e0ac5a58217fef'}]}, 'timestamp': '2025-12-06 09:53:08.030274', '_unique_id': 'cb2f301158654369b2776022b449ca53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.031 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.032 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.033 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '522186f5-fcc6-45ce-8f94-b77b458e9369', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:53:08.033012', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '5e25308a-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.160798147, 'message_signature': '11c886fd7a9fd0b62a6446ce3931fa32495c9f6701f387b60f0874c84166d9d3'}]}, 'timestamp': '2025-12-06 09:53:08.033621', '_unique_id': '1ce3267c64c94dfb8dcf6f499ee73809'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.036 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.036 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae4f38d3-0182-4ecc-9758-57c737f3eafc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:53:08.036688', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '5e25bea6-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.160798147, 'message_signature': '68055742319fe2a6ca3131cebe5f483c4d8556a0a93f10a83a8b1a5cd907dcbd'}]}, 'timestamp': '2025-12-06 09:53:08.037236', '_unique_id': 'd6222eb721264c58b94be032810e0c89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.038 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.039 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.039 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aca91d6b-f6a3-4dc2-a307-78e61853aaf1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:53:08.039605', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '5e26300c-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.160798147, 'message_signature': 'a47d638d126f7e8c2fb20299092162ce6fbd854ffb5548d5c132166825dfc1c5'}]}, 'timestamp': '2025-12-06 09:53:08.040123', '_unique_id': 'afee215ec5234e63b1655f0e200924c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.042 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.042 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.042 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32a844e0-2a0d-4186-b502-26623b15d1d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:53:08.042334', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e26988a-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.222432068, 'message_signature': '8dbb5c8bc3d818ae3d2cba437028df43b30b1e6324699771557f9d2ecded28b5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:53:08.042334', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e26ac76-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.222432068, 'message_signature': 'ed2c2fdf9483646f37221b1c1d547768ca66778da6c4b020d44677b9af11d865'}]}, 'timestamp': '2025-12-06 09:53:08.043291', '_unique_id': '8ec3ceaa54fd4391846917a5b20a9c68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.045 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 73908224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.046 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f67d4808-1ebd-43d3-aa2e-e8cdc5485f83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73908224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:53:08.045556', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e2717ec-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.222432068, 'message_signature': 'b177d4a2636eaf80af96206e80b09367f79c558b49d4867ea58a43cc792e3736'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:53:08.045556', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e2728cc-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.222432068, 'message_signature': 'ca43bdc130619a191c544c7f545771594642b146affd18aca147fb9f1e51cfc6'}]}, 'timestamp': '2025-12-06 09:53:08.046461', '_unique_id': '23622e2f6582494bb32ceeee3d19ccc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.047 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.049 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.049 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 223261606 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.050 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 30984668 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba0d5500-7207-416b-b13b-09f9c0bf326b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 223261606, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:53:08.049381', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e27afb8-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.222432068, 'message_signature': '9797d64163a42b808613de9587dc493a9ebd57e1f2f32909e5189640bf4b539a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30984668, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:53:08.049381', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e27cfca-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.222432068, 'message_signature': 'ddc784c477d94de899bf5f8e412da9b380c0846f4d7dc20536802c0a5f803628'}]}, 'timestamp': '2025-12-06 09:53:08.050793', '_unique_id': 'e485986731fe477cbbffc018bdf7cd63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.051 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.052 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.053 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04696f41-eb2d-4050-a53f-38d17c8e60d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:53:08.053106', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '5e283dca-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.160798147, 'message_signature': '2a4ea5a4a59affe6bc9449808078fed12f1d45d719319cb5eb824a41097e8adb'}]}, 'timestamp': '2025-12-06 09:53:08.053580', '_unique_id': '676306252452421ea93a02353d0e4379'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.054 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.056 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.056 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1043514478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.056 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 200503964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cead8c6-2f6f-4c0e-8272-a8ae84ba15dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1043514478, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:53:08.056465', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e28c18c-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.222432068, 'message_signature': 'c6c389cd58c21aed89d1ac230de68a6b31d716c9abd1ef87d682a36516b0a4b0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 200503964, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:53:08.056465', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e28d474-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.222432068, 'message_signature': '4d4404f9160b21ea430eb797abc02620c91c2dadc9e8df57d7d11bc225fb8be4'}]}, 'timestamp': '2025-12-06 09:53:08.057413', '_unique_id': '388937c7a760411ba0b5281c5777156f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.058 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.059 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.059 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd74d1c6d-e7af-42be-bda6-0e6683aefbec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:53:08.059644', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '5e293acc-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.160798147, 'message_signature': '075e6bb6d4409b360cfd10ea7116a6072249a8fd1606ceea919cfbe801169c34'}]}, 'timestamp': '2025-12-06 09:53:08.059973', '_unique_id': 'f2db8e46636a4bc2b2c5233c3dc5fe13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.060 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.061 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.061 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.061 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d6805ce-f10f-42d6-b116-99ed28283dec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:53:08.061491', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e29819e-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.169034368, 'message_signature': 'b6ec1777fc1bc0857b8969db46f5b8b681c3914ba559394733fbcd28dc9b1dbb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:53:08.061491', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e298d38-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.169034368, 'message_signature': 'd35cf96507e79ddfd4fcffa3ad47aa5a74691fe463cc9a6a5ee44c494e783b7a'}]}, 'timestamp': '2025-12-06 09:53:08.062066', '_unique_id': '4cfc03878442460c8d0bca88a39458a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.062 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.063 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.063 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.064 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73fbd725-73b5-435b-9e14-4c3eeff683ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:53:08.063795', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5e29de78-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.169034368, 'message_signature': 'e84956117dd4a6f94f073b9df39c072c8a769ed1d76dfb4e92d787de38c1fe8b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:53:08.063795', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5e29e968-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.169034368, 'message_signature': 'ab4b5e57acfc0f367c72d92fd6312ddd732be765e2a6038920332d29819cdb60'}]}, 'timestamp': '2025-12-06 09:53:08.064420', '_unique_id': 'a08f8f869c904c95aa3ca4a86439a6aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.065 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c70c8db4-6791-4aa1-81cf-e158974f9f21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:53:08.065856', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '5e2a2d42-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11206.160798147, 'message_signature': '670186d80942056766c3c3348b60df68a203623a469058d3ef8968dcb7354e5f'}]}, 'timestamp': '2025-12-06 09:53:08.066176', '_unique_id': 'e7101390596448ef9e7181e12e22aa6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:53:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:53:08.066 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:53:08 np0005548789.localdomain sudo[251242]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwbztibwmbhdstwnmexqkpixyxgovsfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014787.8882875-3606-135867345271766/AnsiballZ_command.py
Dec 06 09:53:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:53:08 np0005548789.localdomain sudo[251242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:08 np0005548789.localdomain systemd[1]: tmp-crun.P4cE80.mount: Deactivated successfully.
Dec 06 09:53:08 np0005548789.localdomain podman[251244]: 2025-12-06 09:53:08.292682481 +0000 UTC m=+0.097040393 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6)
Dec 06 09:53:08 np0005548789.localdomain podman[251244]: 2025-12-06 09:53:08.307067431 +0000 UTC m=+0.111425343 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, config_id=edpm, io.buildah.version=1.33.7, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 09:53:08 np0005548789.localdomain python3.9[251245]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:53:08 np0005548789.localdomain sudo[251242]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:53:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:53:08 np0005548789.localdomain sudo[251376]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqxgwirenqvsdnbtikpwjtfxoqwgewwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014788.6229014-3631-39524850389688/AnsiballZ_file.py
Dec 06 09:53:08 np0005548789.localdomain sudo[251376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:09 np0005548789.localdomain python3.9[251378]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:09 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 06 09:53:09 np0005548789.localdomain sudo[251376]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:09 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:53:09 np0005548789.localdomain sshd[248446]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:53:09 np0005548789.localdomain systemd[1]: session-57.scope: Deactivated successfully.
Dec 06 09:53:09 np0005548789.localdomain systemd[1]: session-57.scope: Consumed 12.865s CPU time.
Dec 06 09:53:09 np0005548789.localdomain systemd-logind[766]: Session 57 logged out. Waiting for processes to exit.
Dec 06 09:53:09 np0005548789.localdomain systemd-logind[766]: Removed session 57.
Dec 06 09:53:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:53:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:53:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 06 09:53:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:53:10 np0005548789.localdomain podman[251396]: 2025-12-06 09:53:10.405193741 +0000 UTC m=+0.084450330 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:53:10 np0005548789.localdomain podman[251396]: 2025-12-06 09:53:10.42451331 +0000 UTC m=+0.103769889 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 09:53:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37410 DF PROTO=TCP SPT=39094 DPT=9102 SEQ=3143690640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E0DC6F0000000001030307) 
Dec 06 09:53:11 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:53:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 06 09:53:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:53:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 06 09:53:11 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:11.424 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:12 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:12.680 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 06 09:53:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:53:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-b65786436fb22482b0004739b377157c398689a3d99c4b69d96c34a55e2067d5-merged.mount: Deactivated successfully.
Dec 06 09:53:13 np0005548789.localdomain podman[251416]: 2025-12-06 09:53:13.697973841 +0000 UTC m=+0.119440988 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:53:13 np0005548789.localdomain podman[251416]: 2025-12-06 09:53:13.708126281 +0000 UTC m=+0.129593478 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:53:13 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:47:41 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 142938 "" "Go-http-client/1.1"
Dec 06 09:53:13 np0005548789.localdomain podman_exporter[241313]: ts=2025-12-06T09:53:13.846Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 06 09:53:13 np0005548789.localdomain podman_exporter[241313]: ts=2025-12-06T09:53:13.846Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 06 09:53:13 np0005548789.localdomain podman_exporter[241313]: ts=2025-12-06T09:53:13.846Z caller=tls_config.go:316 level=info msg="TLS is disabled." http2=false address=[::]:9882
Dec 06 09:53:13 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:53:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-b65786436fb22482b0004739b377157c398689a3d99c4b69d96c34a55e2067d5-merged.mount: Deactivated successfully.
Dec 06 09:53:15 np0005548789.localdomain sshd[251440]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:53:16 np0005548789.localdomain sshd[251440]: Received disconnect from 64.227.102.57 port 54802:11: Bye Bye [preauth]
Dec 06 09:53:16 np0005548789.localdomain sshd[251440]: Disconnected from authenticating user root 64.227.102.57 port 54802 [preauth]
Dec 06 09:53:16 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:16.465 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:16 np0005548789.localdomain sshd[251442]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:53:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:53:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:53:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:53:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:53:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:53:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:53:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:53:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:53:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:53:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:53:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:53:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:53:16 np0005548789.localdomain sshd[251442]: Accepted publickey for zuul from 192.168.122.30 port 34548 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:53:16 np0005548789.localdomain systemd-logind[766]: New session 58 of user zuul.
Dec 06 09:53:16 np0005548789.localdomain systemd[1]: Started Session 58 of User zuul.
Dec 06 09:53:16 np0005548789.localdomain sshd[251442]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:53:17 np0005548789.localdomain sudo[251558]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vukhpraagpgyhksmtybyqjyklwywebly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014796.8337803-27-242875110228670/AnsiballZ_file.py
Dec 06 09:53:17 np0005548789.localdomain sudo[251558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:17 np0005548789.localdomain python3.9[251560]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:17 np0005548789.localdomain sudo[251558]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:17 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:17.702 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:18 np0005548789.localdomain sudo[251668]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvkrygqxcvcrnncqxmoanycrupfjttsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014797.7449756-27-168271811292288/AnsiballZ_file.py
Dec 06 09:53:18 np0005548789.localdomain sudo[251668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:18 np0005548789.localdomain python3.9[251670]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:18 np0005548789.localdomain sudo[251668]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:18 np0005548789.localdomain sudo[251778]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubyigoepamsuoewsbihgdnaxfvmejazd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014798.3828127-27-211035911364829/AnsiballZ_file.py
Dec 06 09:53:18 np0005548789.localdomain sudo[251778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:18 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37411 DF PROTO=TCP SPT=39094 DPT=9102 SEQ=3143690640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E0FBEF0000000001030307) 
Dec 06 09:53:18 np0005548789.localdomain python3.9[251780]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:18 np0005548789.localdomain sudo[251778]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:19 np0005548789.localdomain python3.9[251888]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:20 np0005548789.localdomain python3.9[251974]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_sriov_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014799.1369522-105-82285190758108/.source.yaml follow=False _original_basename=neutron_sriov_agent.yaml.j2 checksum=d3942d8476d006ea81540d2a1d96dd9d67f33f5f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:53:20 np0005548789.localdomain podman[252063]: 2025-12-06 09:53:20.926894741 +0000 UTC m=+0.090338739 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 09:53:20 np0005548789.localdomain podman[252063]: 2025-12-06 09:53:20.967361887 +0000 UTC m=+0.130805905 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Dec 06 09:53:20 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:53:21 np0005548789.localdomain python3.9[252094]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:21 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:21.500 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:21 np0005548789.localdomain python3.9[252193]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014800.630643-150-78252792069057/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:22 np0005548789.localdomain python3.9[252301]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:22.740 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:23 np0005548789.localdomain sshd[252368]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:53:23 np0005548789.localdomain python3.9[252389]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014801.750178-150-82396591782611/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:23 np0005548789.localdomain podman[241090]: time="2025-12-06T09:53:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:53:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:53:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144556 "" "Go-http-client/1.1"
Dec 06 09:53:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:53:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16344 "" "Go-http-client/1.1"
Dec 06 09:53:24 np0005548789.localdomain python3.9[252497]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:24 np0005548789.localdomain python3.9[252585]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014803.5554872-150-77588433591239/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=be5326ad1944d74af3431e31b8b21bf15602795e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:24 np0005548789.localdomain sshd[252368]: Received disconnect from 123.160.164.187 port 48764:11: Bye Bye [preauth]
Dec 06 09:53:24 np0005548789.localdomain sshd[252368]: Disconnected from authenticating user root 123.160.164.187 port 48764 [preauth]
Dec 06 09:53:26 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:26.503 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:26 np0005548789.localdomain python3.9[252693]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:27 np0005548789.localdomain python3.9[252779]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014806.102695-324-163288845116448/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=b14e40a972b9e05d0e95a7e875b3201eda2c4b6d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:27 np0005548789.localdomain sudo[252842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:53:27 np0005548789.localdomain sudo[252842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:53:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:53:27 np0005548789.localdomain sudo[252842]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:27 np0005548789.localdomain sudo[252884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 09:53:27 np0005548789.localdomain sudo[252884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:53:27 np0005548789.localdomain podman[252869]: 2025-12-06 09:53:27.732808864 +0000 UTC m=+0.089208738 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:53:27 np0005548789.localdomain podman[252869]: 2025-12-06 09:53:27.764682744 +0000 UTC m=+0.121082608 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:53:27 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:53:27 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:27.794 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:28 np0005548789.localdomain python3.9[252941]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:53:28 np0005548789.localdomain sudo[252884]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:28 np0005548789.localdomain sudo[252983]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:53:28 np0005548789.localdomain sudo[252983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:53:28 np0005548789.localdomain sudo[252983]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:28 np0005548789.localdomain sudo[253054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:53:28 np0005548789.localdomain sudo[253054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:53:28 np0005548789.localdomain sudo[253108]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bszbppeqggzqgzncmslaadtsbptshiel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014808.29464-396-79878749245996/AnsiballZ_file.py
Dec 06 09:53:28 np0005548789.localdomain sudo[253108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:28 np0005548789.localdomain python3.9[253110]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:28 np0005548789.localdomain sudo[253108]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:29 np0005548789.localdomain sudo[253054]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:29 np0005548789.localdomain sudo[253250]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evoepzxfrjntbtuafhxeidimnkvcxfmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014809.1466424-420-142135719761792/AnsiballZ_stat.py
Dec 06 09:53:29 np0005548789.localdomain sudo[253250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:29 np0005548789.localdomain python3.9[253252]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:29 np0005548789.localdomain sudo[253250]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:29 np0005548789.localdomain sudo[253307]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynflpbjwijrppdglbxkkdmrbjuagfkpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014809.1466424-420-142135719761792/AnsiballZ_file.py
Dec 06 09:53:29 np0005548789.localdomain sudo[253307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:30 np0005548789.localdomain python3.9[253309]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:30 np0005548789.localdomain sudo[253307]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:30 np0005548789.localdomain sudo[253417]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-toizqnahkdptopvksirucrprwvqyhbty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014810.273268-420-215437063189238/AnsiballZ_stat.py
Dec 06 09:53:30 np0005548789.localdomain sudo[253417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:30 np0005548789.localdomain python3.9[253419]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:30 np0005548789.localdomain sudo[253417]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:30 np0005548789.localdomain sudo[253474]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oywltvddvxwybtunnsbhycuuvmpiqupo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014810.273268-420-215437063189238/AnsiballZ_file.py
Dec 06 09:53:30 np0005548789.localdomain sudo[253474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:31 np0005548789.localdomain python3.9[253476]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:31 np0005548789.localdomain sudo[253474]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:31 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:31.505 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:53:31 np0005548789.localdomain sudo[253597]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-taenykljmhkxxrplanqraugbhlctgzhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014811.6328838-489-133673670072583/AnsiballZ_file.py
Dec 06 09:53:31 np0005548789.localdomain podman[253565]: 2025-12-06 09:53:31.930358231 +0000 UTC m=+0.084798164 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:53:31 np0005548789.localdomain sudo[253597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:31 np0005548789.localdomain podman[253565]: 2025-12-06 09:53:31.965720628 +0000 UTC m=+0.120160571 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:53:31 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 09:53:32 np0005548789.localdomain python3.9[253609]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:32 np0005548789.localdomain sudo[253597]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:32 np0005548789.localdomain sudo[253717]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxpxqeuvzjtvpkjnhgqincqmclpefeub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014812.3621304-513-68100548093043/AnsiballZ_stat.py
Dec 06 09:53:32 np0005548789.localdomain sudo[253717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:32 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:32.813 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:32 np0005548789.localdomain python3.9[253719]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:32 np0005548789.localdomain sudo[253717]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:33 np0005548789.localdomain sudo[253747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:53:33 np0005548789.localdomain sudo[253747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:53:33 np0005548789.localdomain sudo[253747]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:33 np0005548789.localdomain sudo[253792]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asolgubizayvsgcndfjekzjqyyhdsjck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014812.3621304-513-68100548093043/AnsiballZ_file.py
Dec 06 09:53:33 np0005548789.localdomain sudo[253792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:33 np0005548789.localdomain python3.9[253794]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:33 np0005548789.localdomain sudo[253792]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:33 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31165 DF PROTO=TCP SPT=54340 DPT=9102 SEQ=2818979063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E135B80000000001030307) 
Dec 06 09:53:34 np0005548789.localdomain sudo[253902]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqkmjwunieorgrzwytfnkxuhljevhvug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014813.66071-549-17660827620128/AnsiballZ_stat.py
Dec 06 09:53:34 np0005548789.localdomain sudo[253902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:34 np0005548789.localdomain python3.9[253904]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:34 np0005548789.localdomain sudo[253902]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:34 np0005548789.localdomain sshd[253923]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:53:34 np0005548789.localdomain sudo[253961]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iyluslbrhqcdegxbapddkdkmshtcktcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014813.66071-549-17660827620128/AnsiballZ_file.py
Dec 06 09:53:34 np0005548789.localdomain sudo[253961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31166 DF PROTO=TCP SPT=54340 DPT=9102 SEQ=2818979063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E139B00000000001030307) 
Dec 06 09:53:34 np0005548789.localdomain python3.9[253963]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:34 np0005548789.localdomain sudo[253961]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:35 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37412 DF PROTO=TCP SPT=39094 DPT=9102 SEQ=3143690640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E13BEF0000000001030307) 
Dec 06 09:53:35 np0005548789.localdomain sshd[253923]: Received disconnect from 14.194.101.210 port 59048:11: Bye Bye [preauth]
Dec 06 09:53:35 np0005548789.localdomain sshd[253923]: Disconnected from authenticating user root 14.194.101.210 port 59048 [preauth]
Dec 06 09:53:36 np0005548789.localdomain sudo[254071]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bltmcxniilhknulonopwhvqjmbfeksol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014815.211852-585-281129874479626/AnsiballZ_systemd.py
Dec 06 09:53:36 np0005548789.localdomain sudo[254071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:36 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:36.546 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:36 np0005548789.localdomain python3.9[254073]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:53:36 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:53:36 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31167 DF PROTO=TCP SPT=54340 DPT=9102 SEQ=2818979063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E141AF0000000001030307) 
Dec 06 09:53:36 np0005548789.localdomain systemd-sysv-generator[254105]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:53:36 np0005548789.localdomain systemd-rc-local-generator[254101]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:53:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:53:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:36 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:53:37 np0005548789.localdomain sudo[254071]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:37 np0005548789.localdomain podman[254113]: 2025-12-06 09:53:37.167526033 +0000 UTC m=+0.085641690 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 09:53:37 np0005548789.localdomain podman[254113]: 2025-12-06 09:53:37.178692923 +0000 UTC m=+0.096808610 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Dec 06 09:53:37 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:53:37 np0005548789.localdomain sudo[254239]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oaqldgpxrwpdwrrsbkomcjpitetphdzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014817.3643723-609-172724863272591/AnsiballZ_stat.py
Dec 06 09:53:37 np0005548789.localdomain sudo[254239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:37 np0005548789.localdomain python3.9[254241]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47994 DF PROTO=TCP SPT=39156 DPT=9102 SEQ=1060486510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E145F00000000001030307) 
Dec 06 09:53:37 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:37.843 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:37 np0005548789.localdomain sudo[254239]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:38 np0005548789.localdomain sudo[254296]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmhbiyyxfqmlxggtkgfsymwapmaktgbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014817.3643723-609-172724863272591/AnsiballZ_file.py
Dec 06 09:53:38 np0005548789.localdomain sudo[254296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:39 np0005548789.localdomain python3.9[254298]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:39 np0005548789.localdomain sudo[254296]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:39 np0005548789.localdomain sudo[254406]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szfbwkjvzykbudgsfrsflvjzdnrvrcsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014819.3175924-645-267487610530229/AnsiballZ_stat.py
Dec 06 09:53:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:53:39 np0005548789.localdomain sudo[254406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:39 np0005548789.localdomain systemd[1]: tmp-crun.GY50DI.mount: Deactivated successfully.
Dec 06 09:53:39 np0005548789.localdomain podman[254408]: 2025-12-06 09:53:39.721533371 +0000 UTC m=+0.090671053 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, release=1755695350, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41)
Dec 06 09:53:39 np0005548789.localdomain podman[254408]: 2025-12-06 09:53:39.762189099 +0000 UTC m=+0.131326791 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64)
Dec 06 09:53:39 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:53:39 np0005548789.localdomain python3.9[254409]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:39 np0005548789.localdomain sudo[254406]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:40 np0005548789.localdomain sudo[254484]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyhiucgbrjutjkflacwpiifpbxcyyzqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014819.3175924-645-267487610530229/AnsiballZ_file.py
Dec 06 09:53:40 np0005548789.localdomain sudo[254484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:40 np0005548789.localdomain python3.9[254486]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:40 np0005548789.localdomain sudo[254484]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31168 DF PROTO=TCP SPT=54340 DPT=9102 SEQ=2818979063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E1516F0000000001030307) 
Dec 06 09:53:40 np0005548789.localdomain sudo[254594]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwqzuggcarcfdlsahyjxgmuhmmpmkhot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014820.5070393-681-176320667418725/AnsiballZ_systemd.py
Dec 06 09:53:40 np0005548789.localdomain sudo[254594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:41 np0005548789.localdomain python3.9[254596]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:53:41 np0005548789.localdomain podman[254598]: 2025-12-06 09:53:41.220866648 +0000 UTC m=+0.073138149 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:53:41 np0005548789.localdomain podman[254598]: 2025-12-06 09:53:41.232567114 +0000 UTC m=+0.084838635 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 06 09:53:41 np0005548789.localdomain systemd-rc-local-generator[254638]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:53:41 np0005548789.localdomain systemd-sysv-generator[254643]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: Starting Create netns directory...
Dec 06 09:53:41 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:41.583 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:53:41 np0005548789.localdomain systemd[1]: Finished Create netns directory.
Dec 06 09:53:41 np0005548789.localdomain sudo[254594]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:42 np0005548789.localdomain sudo[254765]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adaeaxfacybgfmsgtivgaplzyuzjrvul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014822.0912898-711-185689890264016/AnsiballZ_file.py
Dec 06 09:53:42 np0005548789.localdomain sudo[254765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:42 np0005548789.localdomain python3.9[254767]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:53:42 np0005548789.localdomain sudo[254765]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:42 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:42.881 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:43 np0005548789.localdomain sudo[254875]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upzccwozmoancoqegymuxrxdijxtmoos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014822.924094-735-152506000362410/AnsiballZ_stat.py
Dec 06 09:53:43 np0005548789.localdomain sudo[254875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:43 np0005548789.localdomain python3.9[254877]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:53:43 np0005548789.localdomain sudo[254875]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:43 np0005548789.localdomain sudo[254963]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxdxpuojbhhemdxdrpdaiqfqsavozwda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014822.924094-735-152506000362410/AnsiballZ_copy.py
Dec 06 09:53:43 np0005548789.localdomain sudo[254963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:43 np0005548789.localdomain python3.9[254965]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014822.924094-735-152506000362410/.source.json _original_basename=.e2qnmrj5 follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:44 np0005548789.localdomain sudo[254963]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:44 np0005548789.localdomain sudo[255073]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpttjwulaplywhclcwhajgkaxvsxevtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014824.1931212-780-131512812567025/AnsiballZ_file.py
Dec 06 09:53:44 np0005548789.localdomain sudo[255073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:53:44 np0005548789.localdomain podman[255076]: 2025-12-06 09:53:44.636344465 +0000 UTC m=+0.085409513 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:53:44 np0005548789.localdomain podman[255076]: 2025-12-06 09:53:44.646242696 +0000 UTC m=+0.095307764 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:53:44 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:53:44 np0005548789.localdomain python3.9[255075]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:44 np0005548789.localdomain sudo[255073]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:45 np0005548789.localdomain sudo[255207]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isakextxrvexorlmgmxgxnilehxcvvew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014824.9977725-804-118440686607140/AnsiballZ_stat.py
Dec 06 09:53:45 np0005548789.localdomain sudo[255207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:45 np0005548789.localdomain sudo[255207]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:45 np0005548789.localdomain sudo[255295]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgdpzcmexcxjdktmhtqqkkcyintfkuja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014824.9977725-804-118440686607140/AnsiballZ_copy.py
Dec 06 09:53:45 np0005548789.localdomain sudo[255295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:46 np0005548789.localdomain sudo[255295]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:46 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:46.610 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:53:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:53:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:53:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:53:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:53:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:53:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:53:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:53:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:53:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:53:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:53:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:53:46 np0005548789.localdomain sudo[255405]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eeshtgssrygvldigncghvohthesqbwup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014826.4604206-855-91600288105506/AnsiballZ_container_config_data.py
Dec 06 09:53:46 np0005548789.localdomain sudo[255405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:47 np0005548789.localdomain python3.9[255407]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False
Dec 06 09:53:47 np0005548789.localdomain sudo[255405]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:53:47.282 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:53:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:53:47.283 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:53:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:53:47.284 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:53:47 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:47.908 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:47 np0005548789.localdomain sudo[255515]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mharcfyrhjufgqanhynuruaqzdedzbsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014827.382209-882-252609928604843/AnsiballZ_container_config_hash.py
Dec 06 09:53:47 np0005548789.localdomain sudo[255515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:48 np0005548789.localdomain python3.9[255517]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:53:48 np0005548789.localdomain sudo[255515]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31169 DF PROTO=TCP SPT=54340 DPT=9102 SEQ=2818979063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E171EF0000000001030307) 
Dec 06 09:53:49 np0005548789.localdomain sshd[255573]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:53:49 np0005548789.localdomain sudo[255627]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftefrmodvfzecepscfuzzxvziwjxtims ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014828.6440127-909-150163484775845/AnsiballZ_podman_container_info.py
Dec 06 09:53:49 np0005548789.localdomain sudo[255627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:50 np0005548789.localdomain python3.9[255629]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:53:50 np0005548789.localdomain sshd[255573]: Received disconnect from 118.219.234.233 port 34234:11: Bye Bye [preauth]
Dec 06 09:53:50 np0005548789.localdomain sshd[255573]: Disconnected from authenticating user root 118.219.234.233 port 34234 [preauth]
Dec 06 09:53:50 np0005548789.localdomain sudo[255627]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:51 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:51.613 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:53:51 np0005548789.localdomain podman[255674]: 2025-12-06 09:53:51.923910316 +0000 UTC m=+0.082786072 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 06 09:53:51 np0005548789.localdomain podman[255674]: 2025-12-06 09:53:51.968816374 +0000 UTC m=+0.127692150 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:53:51 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:53:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:52.945 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:53 np0005548789.localdomain podman[241090]: time="2025-12-06T09:53:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:53:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:53:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144554 "" "Go-http-client/1.1"
Dec 06 09:53:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:53:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16341 "" "Go-http-client/1.1"
Dec 06 09:53:54 np0005548789.localdomain sudo[255789]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxqmrtuevkkntmuapkngintybmnxblug ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014833.5936062-948-79642549149308/AnsiballZ_edpm_container_manage.py
Dec 06 09:53:54 np0005548789.localdomain sudo[255789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:54 np0005548789.localdomain python3[255791]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:53:54 np0005548789.localdomain podman[255829]: 
Dec 06 09:53:54 np0005548789.localdomain podman[255829]: 2025-12-06 09:53:54.748613051 +0000 UTC m=+0.079633817 container create 70f95d7e4bfd5cee908c3c7be0e2b7e08743ddef8dd6616c182a2c1efdbfaf9f (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, container_name=neutron_sriov_agent, org.label-schema.build-date=20251125, config_id=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a614c97fd2f5526d7833dab0b1f1d737d057a7718ab0a9ef0bb2a6a0f92b1cb2'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 09:53:54 np0005548789.localdomain podman[255829]: 2025-12-06 09:53:54.702125994 +0000 UTC m=+0.033146790 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Dec 06 09:53:54 np0005548789.localdomain python3[255791]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a614c97fd2f5526d7833dab0b1f1d737d057a7718ab0a9ef0bb2a6a0f92b1cb2 --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a614c97fd2f5526d7833dab0b1f1d737d057a7718ab0a9ef0bb2a6a0f92b1cb2'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Dec 06 09:53:54 np0005548789.localdomain sudo[255789]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:55 np0005548789.localdomain sudo[255972]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-siqqjddnwjdleejunsxwbrafxinostbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014835.1421962-972-134864536565470/AnsiballZ_stat.py
Dec 06 09:53:55 np0005548789.localdomain sudo[255972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:55.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:55 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:55.501 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:53:55 np0005548789.localdomain python3.9[255974]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:53:55 np0005548789.localdomain sudo[255972]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:56 np0005548789.localdomain sudo[256084]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtkwfizjjttkbjevsisgnaetidmekhok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014835.941337-999-22186140544938/AnsiballZ_file.py
Dec 06 09:53:56 np0005548789.localdomain sudo[256084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:56 np0005548789.localdomain python3.9[256086]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:56 np0005548789.localdomain sudo[256084]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:56.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:56 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:56.656 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:56 np0005548789.localdomain sudo[256139]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsxfzjfrphpijnjbichvokivzrhtovqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014835.941337-999-22186140544938/AnsiballZ_stat.py
Dec 06 09:53:56 np0005548789.localdomain sudo[256139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:56 np0005548789.localdomain python3.9[256141]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:53:56 np0005548789.localdomain sudo[256139]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:57 np0005548789.localdomain sudo[256248]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqesbiwwhyybbfjhommuikuwxxksxwzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014836.9681005-999-266948782211046/AnsiballZ_copy.py
Dec 06 09:53:57 np0005548789.localdomain sudo[256248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:57.496 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:57.531 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:57 np0005548789.localdomain python3.9[256250]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014836.9681005-999-266948782211046/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:53:57 np0005548789.localdomain sudo[256248]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:53:57 np0005548789.localdomain sudo[256314]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpevtgzxwzcgunpvluketqqcpavyisfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014836.9681005-999-266948782211046/AnsiballZ_systemd.py
Dec 06 09:53:57 np0005548789.localdomain podman[256283]: 2025-12-06 09:53:57.919475978 +0000 UTC m=+0.074863632 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:53:57 np0005548789.localdomain sudo[256314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:57.947 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:53:57 np0005548789.localdomain podman[256283]: 2025-12-06 09:53:57.950465331 +0000 UTC m=+0.105852995 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 09:53:57 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:53:58 np0005548789.localdomain python3.9[256322]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:53:58 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:53:58 np0005548789.localdomain systemd-rc-local-generator[256346]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:53:58 np0005548789.localdomain systemd-sysv-generator[256351]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:53:58 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:53:58 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:58.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:58.502 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:53:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:58.532 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:53:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:58.532 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:53:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:58.533 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:53:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:58.533 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:53:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:58.534 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:53:58 np0005548789.localdomain sudo[256314]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:58 np0005548789.localdomain sudo[256431]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhvgphzgvtuquznwmdeikqvnwwdgtcaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014836.9681005-999-266948782211046/AnsiballZ_systemd.py
Dec 06 09:53:58 np0005548789.localdomain sudo[256431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.034 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.155 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.155 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:53:59 np0005548789.localdomain python3.9[256433]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.321 230888 WARNING nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.323 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=12272MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.323 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.324 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:53:59 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:53:59 np0005548789.localdomain systemd-rc-local-generator[256459]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:53:59 np0005548789.localdomain systemd-sysv-generator[256463]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.429 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.429 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.430 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:53:59 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.472 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:53:59 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:53:59 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:53:59 np0005548789.localdomain systemd[1]: Starting neutron_sriov_agent container...
Dec 06 09:53:59 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:53:59 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98c1ea5949717770416a8f964126962a3573c9a4287144b7b66754fd2b30c8ab/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 06 09:53:59 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98c1ea5949717770416a8f964126962a3573c9a4287144b7b66754fd2b30c8ab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 09:53:59 np0005548789.localdomain podman[256496]: 2025-12-06 09:53:59.788739411 +0000 UTC m=+0.118668676 container init 70f95d7e4bfd5cee908c3c7be0e2b7e08743ddef8dd6616c182a2c1efdbfaf9f (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a614c97fd2f5526d7833dab0b1f1d737d057a7718ab0a9ef0bb2a6a0f92b1cb2'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:53:59 np0005548789.localdomain podman[256496]: 2025-12-06 09:53:59.797981782 +0000 UTC m=+0.127911057 container start 70f95d7e4bfd5cee908c3c7be0e2b7e08743ddef8dd6616c182a2c1efdbfaf9f (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a614c97fd2f5526d7833dab0b1f1d737d057a7718ab0a9ef0bb2a6a0f92b1cb2'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 09:53:59 np0005548789.localdomain podman[256496]: neutron_sriov_agent
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: + sudo -E kolla_set_configs
Dec 06 09:53:59 np0005548789.localdomain systemd[1]: Started neutron_sriov_agent container.
Dec 06 09:53:59 np0005548789.localdomain sudo[256431]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.895 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.900 230888 DEBUG nova.compute.provider_tree [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Validating config file
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Copying service configuration files
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Writing out command to execute
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/652b6bdc-40ce-45b7-8aa5-3bca79987993.pid.haproxy
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/652b6bdc-40ce-45b7-8aa5-3bca79987993.conf
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.922 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.926 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:53:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:53:59.926 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: ++ cat /run_command
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: + CMD=/usr/bin/neutron-sriov-nic-agent
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: + ARGS=
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: + sudo kolla_copy_cacerts
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: + [[ ! -n '' ]]
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: + . kolla_extend_start
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: + umask 0022
Dec 06 09:53:59 np0005548789.localdomain neutron_sriov_agent[256508]: + exec /usr/bin/neutron-sriov-nic-agent
Dec 06 09:54:00 np0005548789.localdomain sudo[256632]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykyrfizzwwkqjdysgwroacjwvchwqspo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014840.1793692-1083-199220598745290/AnsiballZ_systemd.py
Dec 06 09:54:00 np0005548789.localdomain sudo[256632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:00 np0005548789.localdomain python3.9[256634]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:54:00 np0005548789.localdomain systemd[1]: Stopping neutron_sriov_agent container...
Dec 06 09:54:00 np0005548789.localdomain systemd[1]: tmp-crun.7Xqlfk.mount: Deactivated successfully.
Dec 06 09:54:00 np0005548789.localdomain systemd[1]: libpod-70f95d7e4bfd5cee908c3c7be0e2b7e08743ddef8dd6616c182a2c1efdbfaf9f.scope: Deactivated successfully.
Dec 06 09:54:00 np0005548789.localdomain podman[256638]: 2025-12-06 09:54:00.816947658 +0000 UTC m=+0.063009991 container died 70f95d7e4bfd5cee908c3c7be0e2b7e08743ddef8dd6616c182a2c1efdbfaf9f (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a614c97fd2f5526d7833dab0b1f1d737d057a7718ab0a9ef0bb2a6a0f92b1cb2'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_managed=true, container_name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=neutron_sriov_agent)
Dec 06 09:54:00 np0005548789.localdomain podman[256638]: 2025-12-06 09:54:00.850680665 +0000 UTC m=+0.096742988 container cleanup 70f95d7e4bfd5cee908c3c7be0e2b7e08743ddef8dd6616c182a2c1efdbfaf9f (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a614c97fd2f5526d7833dab0b1f1d737d057a7718ab0a9ef0bb2a6a0f92b1cb2'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:54:00 np0005548789.localdomain podman[256638]: neutron_sriov_agent
Dec 06 09:54:00 np0005548789.localdomain podman[256663]: 2025-12-06 09:54:00.918620074 +0000 UTC m=+0.043620359 container cleanup 70f95d7e4bfd5cee908c3c7be0e2b7e08743ddef8dd6616c182a2c1efdbfaf9f (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_id=neutron_sriov_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a614c97fd2f5526d7833dab0b1f1d737d057a7718ab0a9ef0bb2a6a0f92b1cb2'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_managed=true, container_name=neutron_sriov_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 06 09:54:00 np0005548789.localdomain podman[256663]: neutron_sriov_agent
Dec 06 09:54:00 np0005548789.localdomain systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully.
Dec 06 09:54:00 np0005548789.localdomain systemd[1]: Stopped neutron_sriov_agent container.
Dec 06 09:54:00 np0005548789.localdomain systemd[1]: Starting neutron_sriov_agent container...
Dec 06 09:54:01 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:54:01 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98c1ea5949717770416a8f964126962a3573c9a4287144b7b66754fd2b30c8ab/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 06 09:54:01 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98c1ea5949717770416a8f964126962a3573c9a4287144b7b66754fd2b30c8ab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 09:54:01 np0005548789.localdomain podman[256675]: 2025-12-06 09:54:01.028636394 +0000 UTC m=+0.085842834 container init 70f95d7e4bfd5cee908c3c7be0e2b7e08743ddef8dd6616c182a2c1efdbfaf9f (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a614c97fd2f5526d7833dab0b1f1d737d057a7718ab0a9ef0bb2a6a0f92b1cb2'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=neutron_sriov_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=neutron_sriov_agent)
Dec 06 09:54:01 np0005548789.localdomain podman[256675]: 2025-12-06 09:54:01.036199155 +0000 UTC m=+0.093405585 container start 70f95d7e4bfd5cee908c3c7be0e2b7e08743ddef8dd6616c182a2c1efdbfaf9f (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a614c97fd2f5526d7833dab0b1f1d737d057a7718ab0a9ef0bb2a6a0f92b1cb2'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/config-data/ansible-generated/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Dec 06 09:54:01 np0005548789.localdomain podman[256675]: neutron_sriov_agent
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: + sudo -E kolla_set_configs
Dec 06 09:54:01 np0005548789.localdomain systemd[1]: Started neutron_sriov_agent container.
Dec 06 09:54:01 np0005548789.localdomain sudo[256632]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Validating config file
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Copying service configuration files
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Writing out command to execute
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/652b6bdc-40ce-45b7-8aa5-3bca79987993.pid.haproxy
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/652b6bdc-40ce-45b7-8aa5-3bca79987993.conf
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: ++ cat /run_command
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: + CMD=/usr/bin/neutron-sriov-nic-agent
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: + ARGS=
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: + sudo kolla_copy_cacerts
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: + [[ ! -n '' ]]
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: + . kolla_extend_start
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: + umask 0022
Dec 06 09:54:01 np0005548789.localdomain neutron_sriov_agent[256690]: + exec /usr/bin/neutron-sriov-nic-agent
Dec 06 09:54:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:01.699 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:02.030 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:02.030 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:54:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:02.031 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:54:02 np0005548789.localdomain sshd[251442]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:54:02 np0005548789.localdomain systemd[1]: session-58.scope: Deactivated successfully.
Dec 06 09:54:02 np0005548789.localdomain systemd[1]: session-58.scope: Consumed 23.118s CPU time.
Dec 06 09:54:02 np0005548789.localdomain systemd-logind[766]: Session 58 logged out. Waiting for processes to exit.
Dec 06 09:54:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:54:02 np0005548789.localdomain systemd-logind[766]: Removed session 58.
Dec 06 09:54:02 np0005548789.localdomain podman[256722]: 2025-12-06 09:54:02.31355606 +0000 UTC m=+0.079346227 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:54:02 np0005548789.localdomain podman[256722]: 2025-12-06 09:54:02.323488723 +0000 UTC m=+0.089278880 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:54:02 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 09:54:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:02.526 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:54:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:02.527 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:54:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:02.527 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 09:54:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:02.527 230888 DEBUG nova.objects.instance [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:54:02 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 09:54:02.646 2 INFO neutron.common.config [-] Logging enabled!
Dec 06 09:54:02 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 09:54:02.646 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev43
Dec 06 09:54:02 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 09:54:02.647 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Dec 06 09:54:02 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 09:54:02.647 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Dec 06 09:54:02 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 09:54:02.647 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Dec 06 09:54:02 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 09:54:02.647 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Dec 06 09:54:02 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 09:54:02.647 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005548789.localdomain'}
Dec 06 09:54:02 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 09:54:02.648 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-66b29cf2-88ec-4f57-9798-638d8a894885 - - - - - -] RPC agent_id: nic-switch-agent.np0005548789.localdomain
Dec 06 09:54:02 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 09:54:02.652 2 INFO neutron.agent.agent_extensions_manager [None req-66b29cf2-88ec-4f57-9798-638d8a894885 - - - - - -] Loaded agent extensions: ['qos']
Dec 06 09:54:02 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 09:54:02.653 2 INFO neutron.agent.agent_extensions_manager [None req-66b29cf2-88ec-4f57-9798-638d8a894885 - - - - - -] Initializing agent extension 'qos'
Dec 06 09:54:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:02.951 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:03.110 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 09:54:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:03.133 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:54:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:03.134 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 09:54:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:03.135 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:03.135 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:03 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 09:54:03.171 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-66b29cf2-88ec-4f57-9798-638d8a894885 - - - - - -] Agent initialized successfully, now running... 
Dec 06 09:54:03 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 09:54:03.171 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-66b29cf2-88ec-4f57-9798-638d8a894885 - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Dec 06 09:54:03 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 09:54:03.172 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-66b29cf2-88ec-4f57-9798-638d8a894885 - - - - - -] Agent out of sync with plugin!
Dec 06 09:54:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:03.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:03 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63142 DF PROTO=TCP SPT=51822 DPT=9102 SEQ=3063132591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E1AAE60000000001030307) 
Dec 06 09:54:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63143 DF PROTO=TCP SPT=51822 DPT=9102 SEQ=3063132591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E1AEEF0000000001030307) 
Dec 06 09:54:05 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31170 DF PROTO=TCP SPT=54340 DPT=9102 SEQ=2818979063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E1B1EF0000000001030307) 
Dec 06 09:54:06 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:06.747 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:06 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63144 DF PROTO=TCP SPT=51822 DPT=9102 SEQ=3063132591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E1B6EF0000000001030307) 
Dec 06 09:54:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37413 DF PROTO=TCP SPT=39094 DPT=9102 SEQ=3143690640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E1B9EF0000000001030307) 
Dec 06 09:54:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:54:07 np0005548789.localdomain systemd[1]: tmp-crun.94gzkP.mount: Deactivated successfully.
Dec 06 09:54:07 np0005548789.localdomain podman[256747]: 2025-12-06 09:54:07.927868246 +0000 UTC m=+0.088875008 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Dec 06 09:54:07 np0005548789.localdomain podman[256747]: 2025-12-06 09:54:07.938347136 +0000 UTC m=+0.099353948 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:54:07 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:54:08 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:08.003 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:09 np0005548789.localdomain sshd[256766]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:54:09 np0005548789.localdomain sshd[256766]: Accepted publickey for zuul from 192.168.122.30 port 44782 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:54:09 np0005548789.localdomain systemd-logind[766]: New session 59 of user zuul.
Dec 06 09:54:09 np0005548789.localdomain systemd[1]: Started Session 59 of User zuul.
Dec 06 09:54:09 np0005548789.localdomain sshd[256766]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:54:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:54:09 np0005548789.localdomain podman[256808]: 2025-12-06 09:54:09.922502028 +0000 UTC m=+0.081675918 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=edpm)
Dec 06 09:54:09 np0005548789.localdomain podman[256808]: 2025-12-06 09:54:09.964239969 +0000 UTC m=+0.123413809 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, name=ubi9-minimal, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 09:54:09 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:54:10 np0005548789.localdomain python3.9[256896]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:54:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63145 DF PROTO=TCP SPT=51822 DPT=9102 SEQ=3063132591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E1C6AF0000000001030307) 
Dec 06 09:54:11 np0005548789.localdomain sudo[257008]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fclptyhasltblzojvqcwrwqvjmzjscti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014851.2489328-66-206675302102307/AnsiballZ_setup.py
Dec 06 09:54:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:54:11 np0005548789.localdomain sudo[257008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:11 np0005548789.localdomain systemd[1]: tmp-crun.1PhMV0.mount: Deactivated successfully.
Dec 06 09:54:11 np0005548789.localdomain podman[257010]: 2025-12-06 09:54:11.690311642 +0000 UTC m=+0.086099484 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:54:11 np0005548789.localdomain podman[257010]: 2025-12-06 09:54:11.699233073 +0000 UTC m=+0.095020945 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec 06 09:54:11 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:54:11 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:11.784 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:11 np0005548789.localdomain python3.9[257011]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:54:12 np0005548789.localdomain sudo[257008]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:12 np0005548789.localdomain sudo[257090]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdgntckqhyzxyqulyojzsohvklocdhqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014851.2489328-66-206675302102307/AnsiballZ_dnf.py
Dec 06 09:54:12 np0005548789.localdomain sudo[257090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:12 np0005548789.localdomain python3.9[257092]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:54:13 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:13.005 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:54:14 np0005548789.localdomain sshd[257106]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:54:14 np0005548789.localdomain podman[257095]: 2025-12-06 09:54:14.944401674 +0000 UTC m=+0.105495904 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:54:14 np0005548789.localdomain podman[257095]: 2025-12-06 09:54:14.957012578 +0000 UTC m=+0.118106838 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:54:14 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:54:15 np0005548789.localdomain sudo[257090]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:16 np0005548789.localdomain sshd[257106]: Received disconnect from 154.113.10.34 port 60230:11: Bye Bye [preauth]
Dec 06 09:54:16 np0005548789.localdomain sshd[257106]: Disconnected from authenticating user root 154.113.10.34 port 60230 [preauth]
Dec 06 09:54:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:54:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:54:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:54:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:54:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:54:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:54:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:54:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:54:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:54:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:54:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:54:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:54:16 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:16.786 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:16 np0005548789.localdomain sudo[257225]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfjocbspryvifnwwvcsxaufiftfqqsod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014856.4044533-102-197312073214672/AnsiballZ_systemd.py
Dec 06 09:54:16 np0005548789.localdomain sudo[257225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:17 np0005548789.localdomain python3.9[257227]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:54:17 np0005548789.localdomain sudo[257225]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:18 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:18.061 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:19 np0005548789.localdomain sudo[257338]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwzjabxzdebtdlbnaduielwlvnyyajco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014858.647763-129-123664741126339/AnsiballZ_file.py
Dec 06 09:54:19 np0005548789.localdomain sudo[257338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63146 DF PROTO=TCP SPT=51822 DPT=9102 SEQ=3063132591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E1E7EF0000000001030307) 
Dec 06 09:54:19 np0005548789.localdomain python3.9[257340]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:19 np0005548789.localdomain sudo[257338]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:19 np0005548789.localdomain sudo[257448]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nildmthoutmhslxycogdfzdngzschuwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014859.4663186-129-82885944889579/AnsiballZ_file.py
Dec 06 09:54:19 np0005548789.localdomain sudo[257448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:19 np0005548789.localdomain python3.9[257450]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:19 np0005548789.localdomain sudo[257448]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:20 np0005548789.localdomain sudo[257558]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvjadsgtruoeihcntqdcqusicusglgep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014860.1332529-129-246294601776956/AnsiballZ_file.py
Dec 06 09:54:20 np0005548789.localdomain sudo[257558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:20 np0005548789.localdomain python3.9[257560]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:20 np0005548789.localdomain sudo[257558]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:21 np0005548789.localdomain sudo[257668]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-janylomclmiiweyxnovzlopycmzbgfoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014860.7614377-129-50050475363185/AnsiballZ_file.py
Dec 06 09:54:21 np0005548789.localdomain sudo[257668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:21 np0005548789.localdomain python3.9[257670]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:21 np0005548789.localdomain sudo[257668]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:21 np0005548789.localdomain sudo[257778]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghtaaycnhtczmxfhvhkrpqgmmvbezbud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014861.359426-129-47246071558567/AnsiballZ_file.py
Dec 06 09:54:21 np0005548789.localdomain sudo[257778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:21 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:21.824 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:21 np0005548789.localdomain python3.9[257780]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:21 np0005548789.localdomain sudo[257778]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:22 np0005548789.localdomain sudo[257888]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfiqshnnqjiuztvrxvcenydwqqewkqob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014862.0899343-129-195381672628530/AnsiballZ_file.py
Dec 06 09:54:22 np0005548789.localdomain sudo[257888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:54:22 np0005548789.localdomain podman[257891]: 2025-12-06 09:54:22.489070766 +0000 UTC m=+0.084493434 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, config_id=ovn_controller)
Dec 06 09:54:22 np0005548789.localdomain podman[257891]: 2025-12-06 09:54:22.558822371 +0000 UTC m=+0.154245019 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 09:54:22 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:54:22 np0005548789.localdomain python3.9[257890]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:22 np0005548789.localdomain sudo[257888]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:23 np0005548789.localdomain sudo[258023]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzhjrgpbjgpztwbcovgcrtitolcddpit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014862.7657394-129-86470038216505/AnsiballZ_file.py
Dec 06 09:54:23 np0005548789.localdomain sudo[258023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:23 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:23.100 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:23 np0005548789.localdomain python3.9[258025]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:23 np0005548789.localdomain sudo[258023]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:23 np0005548789.localdomain podman[241090]: time="2025-12-06T09:54:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:54:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:54:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146512 "" "Go-http-client/1.1"
Dec 06 09:54:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:54:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16790 "" "Go-http-client/1.1"
Dec 06 09:54:24 np0005548789.localdomain sshd[258113]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:54:24 np0005548789.localdomain sudo[258135]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dflyxttedgegxqiklrzwtgrbdmfuvxqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014863.6276286-279-42539910414647/AnsiballZ_stat.py
Dec 06 09:54:24 np0005548789.localdomain sudo[258135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:24 np0005548789.localdomain python3.9[258137]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:24 np0005548789.localdomain sudo[258135]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:24 np0005548789.localdomain sshd[258113]: Received disconnect from 64.227.102.57 port 37214:11: Bye Bye [preauth]
Dec 06 09:54:24 np0005548789.localdomain sshd[258113]: Disconnected from authenticating user root 64.227.102.57 port 37214 [preauth]
Dec 06 09:54:24 np0005548789.localdomain sudo[258223]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjyrcokiiiniwvcsquymdscfmpskxoto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014863.6276286-279-42539910414647/AnsiballZ_copy.py
Dec 06 09:54:24 np0005548789.localdomain sudo[258223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:25 np0005548789.localdomain python3.9[258225]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014863.6276286-279-42539910414647/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=3ebfe8ab1da42a1c6ca52429f61716009c5fd177 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:25 np0005548789.localdomain sudo[258223]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:25 np0005548789.localdomain python3.9[258333]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:26 np0005548789.localdomain python3.9[258419]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014865.3005524-324-70876228751438/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:26 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:26.858 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:27 np0005548789.localdomain python3.9[258527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:27 np0005548789.localdomain python3.9[258613]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014866.8069844-324-276643992862257/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:28 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:28.134 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:28 np0005548789.localdomain python3.9[258721]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:54:28 np0005548789.localdomain podman[258771]: 2025-12-06 09:54:28.924404661 +0000 UTC m=+0.087767434 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 09:54:28 np0005548789.localdomain podman[258771]: 2025-12-06 09:54:28.933208979 +0000 UTC m=+0.096571692 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 06 09:54:28 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:54:29 np0005548789.localdomain python3.9[258821]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014867.8143377-324-91614150848626/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=691408578fd11e570f58ae61223b69d952ffde6b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:31 np0005548789.localdomain python3.9[258933]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:31 np0005548789.localdomain python3.9[259019]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014870.6470678-498-232294212083785/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=b14e40a972b9e05d0e95a7e875b3201eda2c4b6d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:31 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:31.897 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:32 np0005548789.localdomain python3.9[259127]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:54:32 np0005548789.localdomain podman[259214]: 2025-12-06 09:54:32.914622414 +0000 UTC m=+0.078142111 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:54:32 np0005548789.localdomain podman[259214]: 2025-12-06 09:54:32.925368831 +0000 UTC m=+0.088888538 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:54:32 np0005548789.localdomain python3.9[259213]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014871.9302413-543-206756337855355/.source follow=False _original_basename=haproxy.j2 checksum=e4288860049c1baef23f6e1bb6c6f91acb5432e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:32 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 09:54:33 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:33.173 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:33 np0005548789.localdomain sudo[259271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:54:33 np0005548789.localdomain sudo[259271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:54:33 np0005548789.localdomain sudo[259271]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:33 np0005548789.localdomain sudo[259316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:54:33 np0005548789.localdomain sudo[259316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:54:33 np0005548789.localdomain python3.9[259380]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:33 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23385 DF PROTO=TCP SPT=52566 DPT=9102 SEQ=3998205506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E220170000000001030307) 
Dec 06 09:54:34 np0005548789.localdomain sudo[259316]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:34 np0005548789.localdomain python3.9[259486]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014873.2303293-543-260222796690234/.source follow=False _original_basename=dnsmasq.j2 checksum=efc19f376a79c40570368e9c2b979cde746f1ea8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:34 np0005548789.localdomain sshd[248378]: fatal: Timeout before authentication for 45.78.222.162 port 50868
Dec 06 09:54:34 np0005548789.localdomain sudo[259587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:54:34 np0005548789.localdomain sudo[259587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:54:34 np0005548789.localdomain sudo[259587]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23386 DF PROTO=TCP SPT=52566 DPT=9102 SEQ=3998205506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E2242F0000000001030307) 
Dec 06 09:54:34 np0005548789.localdomain python3.9[259623]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:35 np0005548789.localdomain python3.9[259679]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:35 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63147 DF PROTO=TCP SPT=51822 DPT=9102 SEQ=3063132591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E227EF0000000001030307) 
Dec 06 09:54:36 np0005548789.localdomain python3.9[259787]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:36 np0005548789.localdomain python3.9[259873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014875.49364-630-186377125682918/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:36 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23387 DF PROTO=TCP SPT=52566 DPT=9102 SEQ=3998205506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E22C300000000001030307) 
Dec 06 09:54:36 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:36.949 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:37 np0005548789.localdomain python3.9[259981]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:54:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31171 DF PROTO=TCP SPT=54340 DPT=9102 SEQ=2818979063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E22FF00000000001030307) 
Dec 06 09:54:37 np0005548789.localdomain sudo[260091]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwfbjpjsrfaifhskopvxklixeasakpwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014877.6849244-735-163095794165971/AnsiballZ_file.py
Dec 06 09:54:38 np0005548789.localdomain sudo[260091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:54:38 np0005548789.localdomain systemd[1]: tmp-crun.P6QCnP.mount: Deactivated successfully.
Dec 06 09:54:38 np0005548789.localdomain podman[260094]: 2025-12-06 09:54:38.09811079 +0000 UTC m=+0.081146432 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 09:54:38 np0005548789.localdomain podman[260094]: 2025-12-06 09:54:38.105851856 +0000 UTC m=+0.088887468 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:54:38 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:54:38 np0005548789.localdomain python3.9[260093]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:38 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:38.229 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:38 np0005548789.localdomain sudo[260091]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:38 np0005548789.localdomain sudo[260222]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rocmiignxhbaoggvymqrddjgfuajeasv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014878.5214205-759-149374816440202/AnsiballZ_stat.py
Dec 06 09:54:38 np0005548789.localdomain sudo[260222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:39 np0005548789.localdomain python3.9[260224]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:39 np0005548789.localdomain sudo[260222]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:39 np0005548789.localdomain sudo[260279]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alzehxkdiwjidokqwbcubzesdcljwlpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014878.5214205-759-149374816440202/AnsiballZ_file.py
Dec 06 09:54:39 np0005548789.localdomain sudo[260279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:39 np0005548789.localdomain python3.9[260281]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:39 np0005548789.localdomain sudo[260279]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:39 np0005548789.localdomain sudo[260389]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcqgocsdnqjsuonegzitcwzrnagarmrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014879.5750284-759-13760968964190/AnsiballZ_stat.py
Dec 06 09:54:39 np0005548789.localdomain sudo[260389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:40 np0005548789.localdomain python3.9[260391]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:40 np0005548789.localdomain sudo[260389]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:40 np0005548789.localdomain sudo[260446]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijczxluarcbefewvlhtgbunevqdxvgux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014879.5750284-759-13760968964190/AnsiballZ_file.py
Dec 06 09:54:40 np0005548789.localdomain sudo[260446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:54:40 np0005548789.localdomain systemd[1]: tmp-crun.KO2smi.mount: Deactivated successfully.
Dec 06 09:54:40 np0005548789.localdomain podman[260449]: 2025-12-06 09:54:40.457371878 +0000 UTC m=+0.091338314 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 09:54:40 np0005548789.localdomain podman[260449]: 2025-12-06 09:54:40.475226761 +0000 UTC m=+0.109193197 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-type=git)
Dec 06 09:54:40 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:54:40 np0005548789.localdomain python3.9[260448]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:40 np0005548789.localdomain sudo[260446]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23388 DF PROTO=TCP SPT=52566 DPT=9102 SEQ=3998205506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E23BEF0000000001030307) 
Dec 06 09:54:41 np0005548789.localdomain sudo[260577]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otqnnpiuppbymkjoelemubsyqdsifzfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014880.9487195-828-133681248747956/AnsiballZ_file.py
Dec 06 09:54:41 np0005548789.localdomain sudo[260577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:41 np0005548789.localdomain python3.9[260579]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:54:41 np0005548789.localdomain sudo[260577]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:54:41 np0005548789.localdomain podman[260635]: 2025-12-06 09:54:41.924904446 +0000 UTC m=+0.079559624 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3)
Dec 06 09:54:41 np0005548789.localdomain podman[260635]: 2025-12-06 09:54:41.962698596 +0000 UTC m=+0.117353824 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd)
Dec 06 09:54:41 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:41.989 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:41 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:54:42 np0005548789.localdomain sudo[260706]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrjbpopalsffzqgtykcwxofwdkcuunay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014881.757656-852-45647326861480/AnsiballZ_stat.py
Dec 06 09:54:42 np0005548789.localdomain sudo[260706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:42 np0005548789.localdomain python3.9[260708]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:42 np0005548789.localdomain sudo[260706]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:42 np0005548789.localdomain sudo[260763]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scxxczswyidoearugwjpuwkryymehsuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014881.757656-852-45647326861480/AnsiballZ_file.py
Dec 06 09:54:42 np0005548789.localdomain sudo[260763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:42 np0005548789.localdomain python3.9[260765]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:54:42 np0005548789.localdomain sudo[260763]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:43 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:43.264 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:43 np0005548789.localdomain sudo[260873]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfdsovmsovrxuoofhdhpabjaweqpsbze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014883.019795-888-251959384069276/AnsiballZ_stat.py
Dec 06 09:54:43 np0005548789.localdomain sudo[260873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:43 np0005548789.localdomain python3.9[260875]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:43 np0005548789.localdomain sudo[260873]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:43 np0005548789.localdomain sudo[260930]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tztqguokqnalolevwkjlmspecsljgxiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014883.019795-888-251959384069276/AnsiballZ_file.py
Dec 06 09:54:43 np0005548789.localdomain sudo[260930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:44 np0005548789.localdomain python3.9[260932]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:54:44 np0005548789.localdomain sudo[260930]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:44 np0005548789.localdomain sudo[261040]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzjujtkezwdzgdkhvhfbdvnmkmzmvhvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014884.2461927-924-263747925667211/AnsiballZ_systemd.py
Dec 06 09:54:44 np0005548789.localdomain sudo[261040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:44 np0005548789.localdomain python3.9[261042]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:54:44 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:54:44 np0005548789.localdomain systemd-sysv-generator[261069]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:54:44 np0005548789.localdomain systemd-rc-local-generator[261066]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:54:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:44 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:45 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:54:45 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:45 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:45 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:45 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:54:45 np0005548789.localdomain sudo[261040]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:45 np0005548789.localdomain podman[261080]: 2025-12-06 09:54:45.259744906 +0000 UTC m=+0.078585124 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:54:45 np0005548789.localdomain podman[261080]: 2025-12-06 09:54:45.269504894 +0000 UTC m=+0.088345072 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:54:45 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:54:45 np0005548789.localdomain sudo[261209]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elxfbzizepptjycpctmxutrdzqxahoex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014885.5251956-948-160284147242773/AnsiballZ_stat.py
Dec 06 09:54:45 np0005548789.localdomain sudo[261209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:46 np0005548789.localdomain python3.9[261211]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:46 np0005548789.localdomain sudo[261209]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:46 np0005548789.localdomain sudo[261266]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iotkmvyqtlalpcnnudwhmyftfeaeeunf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014885.5251956-948-160284147242773/AnsiballZ_file.py
Dec 06 09:54:46 np0005548789.localdomain sudo[261266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:46 np0005548789.localdomain python3.9[261268]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:54:46 np0005548789.localdomain sudo[261266]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:54:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:54:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:54:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:54:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:54:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:54:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:54:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:54:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:54:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:54:46 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:46.991 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:47 np0005548789.localdomain sudo[261376]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnmlhpofutdstbhzvtrhvmrtvmbhiebx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014886.9058857-984-153240359914741/AnsiballZ_stat.py
Dec 06 09:54:47 np0005548789.localdomain sudo[261376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:54:47.284 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:54:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:54:47.284 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:54:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:54:47.285 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:54:47 np0005548789.localdomain python3.9[261378]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:47 np0005548789.localdomain sudo[261376]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:47 np0005548789.localdomain sudo[261433]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiakcjesfimajwyknhiqghqddopkoobt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014886.9058857-984-153240359914741/AnsiballZ_file.py
Dec 06 09:54:47 np0005548789.localdomain sudo[261433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:47 np0005548789.localdomain python3.9[261435]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:54:47 np0005548789.localdomain sudo[261433]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:48 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:48.268 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:48 np0005548789.localdomain sudo[261543]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jyxbhzypuljzinyrotpebezkrkxftdhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014888.1433485-1020-16235499468701/AnsiballZ_systemd.py
Dec 06 09:54:48 np0005548789.localdomain sudo[261543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:48 np0005548789.localdomain python3.9[261545]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:54:48 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:54:48 np0005548789.localdomain systemd-rc-local-generator[261569]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:54:48 np0005548789.localdomain systemd-sysv-generator[261572]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:54:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23389 DF PROTO=TCP SPT=52566 DPT=9102 SEQ=3998205506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E25BEF0000000001030307) 
Dec 06 09:54:49 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:54:49 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:49 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:49 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:49 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:54:49 np0005548789.localdomain systemd[1]: Starting Create netns directory...
Dec 06 09:54:49 np0005548789.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:54:49 np0005548789.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:54:49 np0005548789.localdomain systemd[1]: Finished Create netns directory.
Dec 06 09:54:49 np0005548789.localdomain sudo[261543]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:50 np0005548789.localdomain sudo[261694]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcxilodxyuemrcciothqrcciiertboby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014889.824908-1050-226200357422167/AnsiballZ_file.py
Dec 06 09:54:50 np0005548789.localdomain sudo[261694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:50 np0005548789.localdomain python3.9[261696]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:54:50 np0005548789.localdomain sudo[261694]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:50 np0005548789.localdomain sudo[261804]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsnvmdvygormictuozxytnsegoukqgaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014890.5519438-1074-216842161076285/AnsiballZ_stat.py
Dec 06 09:54:50 np0005548789.localdomain sudo[261804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:51 np0005548789.localdomain python3.9[261806]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:54:51 np0005548789.localdomain sudo[261804]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:51 np0005548789.localdomain sudo[261892]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-maghczucwigpjetvmhrucwdfvmyykahp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014890.5519438-1074-216842161076285/AnsiballZ_copy.py
Dec 06 09:54:51 np0005548789.localdomain sudo[261892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:51 np0005548789.localdomain python3.9[261894]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014890.5519438-1074-216842161076285/.source.json _original_basename=.un01xqj8 follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:54:51 np0005548789.localdomain sudo[261892]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:52.037 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:52 np0005548789.localdomain sudo[262002]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwpzwhgjjzdsbohsabxhgwzxtcnyfllj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014891.831062-1119-276541471390577/AnsiballZ_file.py
Dec 06 09:54:52 np0005548789.localdomain sudo[262002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:52 np0005548789.localdomain python3.9[262004]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:54:52 np0005548789.localdomain sudo[262002]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:54:52 np0005548789.localdomain sudo[262120]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsuedcjmeqwlynqaneksdojttqcjxpwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014892.6007261-1143-163311044997215/AnsiballZ_stat.py
Dec 06 09:54:52 np0005548789.localdomain sudo[262120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:52 np0005548789.localdomain podman[262093]: 2025-12-06 09:54:52.942858376 +0000 UTC m=+0.102208754 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:54:52 np0005548789.localdomain podman[262093]: 2025-12-06 09:54:52.984783763 +0000 UTC m=+0.144134261 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 09:54:52 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:54:53 np0005548789.localdomain sudo[262120]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:53.309 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:53 np0005548789.localdomain sudo[262223]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwdpzdpsciysvrhtoevmafxhtxwvyxpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014892.6007261-1143-163311044997215/AnsiballZ_copy.py
Dec 06 09:54:53 np0005548789.localdomain sudo[262223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:53 np0005548789.localdomain sudo[262223]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:53 np0005548789.localdomain podman[241090]: time="2025-12-06T09:54:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:54:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:54:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146512 "" "Go-http-client/1.1"
Dec 06 09:54:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:54:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16792 "" "Go-http-client/1.1"
Dec 06 09:54:54 np0005548789.localdomain sudo[262333]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idxakwnckkrjjvlwbnuvvbbdrjuajnek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014894.2679055-1194-126156839150476/AnsiballZ_container_config_data.py
Dec 06 09:54:54 np0005548789.localdomain sudo[262333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:54 np0005548789.localdomain python3.9[262335]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False
Dec 06 09:54:54 np0005548789.localdomain sudo[262333]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:55 np0005548789.localdomain sudo[262443]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzecxzjaxohfagqhtvazxachvqaetypz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014895.1524024-1221-39141073951730/AnsiballZ_container_config_hash.py
Dec 06 09:54:55 np0005548789.localdomain sudo[262443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:55 np0005548789.localdomain python3.9[262445]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:54:55 np0005548789.localdomain sudo[262443]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:56 np0005548789.localdomain sudo[262553]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jusqjiolwmzffwyfnjsihtsgqrgcdylx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014896.233913-1248-258772699620791/AnsiballZ_podman_container_info.py
Dec 06 09:54:56 np0005548789.localdomain sudo[262553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:56 np0005548789.localdomain python3.9[262555]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:54:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:57.069 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:57 np0005548789.localdomain sshd[262580]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:54:57 np0005548789.localdomain sudo[262553]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:57.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:57.501 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:54:58 np0005548789.localdomain sshd[262602]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:54:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:58.346 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:54:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:58.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:58 np0005548789.localdomain sshd[262580]: Received disconnect from 14.194.101.210 port 34874:11: Bye Bye [preauth]
Dec 06 09:54:58 np0005548789.localdomain sshd[262580]: Disconnected from authenticating user root 14.194.101.210 port 34874 [preauth]
Dec 06 09:54:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:59.496 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:59.503 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:59.503 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:54:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:59.529 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:54:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:59.529 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:54:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:59.529 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:54:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:59.530 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:54:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:59.530 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:54:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:54:59 np0005548789.localdomain systemd[1]: tmp-crun.GGMk4C.mount: Deactivated successfully.
Dec 06 09:54:59 np0005548789.localdomain podman[262624]: 2025-12-06 09:54:59.937411795 +0000 UTC m=+0.090978982 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:54:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:54:59.949 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:54:59 np0005548789.localdomain podman[262624]: 2025-12-06 09:54:59.971820653 +0000 UTC m=+0.125387810 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:54:59 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.022 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.023 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.211 230888 WARNING nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.212 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=12138MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.213 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.213 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.493 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.494 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.494 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.547 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:55:00 np0005548789.localdomain sshd[262602]: Received disconnect from 45.78.222.162 port 46368:11: Bye Bye [preauth]
Dec 06 09:55:00 np0005548789.localdomain sshd[262602]: Disconnected from authenticating user root 45.78.222.162 port 46368 [preauth]
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.967 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.972 230888 DEBUG nova.compute.provider_tree [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.990 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.991 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:55:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:00.991 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.778s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:55:01 np0005548789.localdomain sudo[262757]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-goujjtcvevyjruslikfakveuihapyivb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014900.7075293-1287-21809251239792/AnsiballZ_edpm_container_manage.py
Dec 06 09:55:01 np0005548789.localdomain sudo[262757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:01 np0005548789.localdomain python3[262759]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:55:01 np0005548789.localdomain podman[262796]: 
Dec 06 09:55:01 np0005548789.localdomain podman[262796]: 2025-12-06 09:55:01.707999243 +0000 UTC m=+0.083925727 container create 7bf9e3fbc2679b0a9aa1c570de190e25e525d02579c8f25638444d044bf524fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0abd10feeed3687554ab703b297d02adae88194eeb13180e65a2b251617e0bfc'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_managed=true, container_name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=neutron_dhcp)
Dec 06 09:55:01 np0005548789.localdomain podman[262796]: 2025-12-06 09:55:01.669963744 +0000 UTC m=+0.045890228 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 09:55:01 np0005548789.localdomain python3[262759]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0abd10feeed3687554ab703b297d02adae88194eeb13180e65a2b251617e0bfc --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0abd10feeed3687554ab703b297d02adae88194eeb13180e65a2b251617e0bfc'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 09:55:01 np0005548789.localdomain sudo[262757]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:01.989 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:01.990 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:55:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:01.990 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:55:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:02.110 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:02 np0005548789.localdomain sudo[262942]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnpxfdkixjrrahhwyqanoaovggbzsqil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014902.2814612-1311-117155539644954/AnsiballZ_stat.py
Dec 06 09:55:02 np0005548789.localdomain sudo[262942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:02.589 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:55:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:02.589 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:55:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:02.589 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 09:55:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:02.590 230888 DEBUG nova.objects.instance [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:55:02 np0005548789.localdomain python3.9[262944]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:55:02 np0005548789.localdomain sudo[262942]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:03.062 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 09:55:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:03.082 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:55:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:03.082 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 09:55:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:03.386 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:03 np0005548789.localdomain sudo[263054]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-seftxacpflfrwaxrurddykdjctrydhxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014903.1406667-1338-146020289613541/AnsiballZ_file.py
Dec 06 09:55:03 np0005548789.localdomain sudo[263054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:55:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:03.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:03.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:03 np0005548789.localdomain podman[263057]: 2025-12-06 09:55:03.565809008 +0000 UTC m=+0.087436394 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:55:03 np0005548789.localdomain podman[263057]: 2025-12-06 09:55:03.603171616 +0000 UTC m=+0.124798992 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:55:03 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 09:55:03 np0005548789.localdomain python3.9[263056]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:03 np0005548789.localdomain sudo[263054]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:03 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=433 DF PROTO=TCP SPT=32932 DPT=9102 SEQ=690632304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E295470000000001030307) 
Dec 06 09:55:03 np0005548789.localdomain sudo[263133]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgmuaahxgxejfqtwujcyfeimxrxnmoih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014903.1406667-1338-146020289613541/AnsiballZ_stat.py
Dec 06 09:55:03 np0005548789.localdomain sudo[263133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:04 np0005548789.localdomain python3.9[263135]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:55:04 np0005548789.localdomain sudo[263133]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:04 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:04.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:04 np0005548789.localdomain sudo[263242]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdqrxhzuwrtmnabnszlymzbayrbxtcmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014904.207366-1338-103556372797297/AnsiballZ_copy.py
Dec 06 09:55:04 np0005548789.localdomain sudo[263242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=434 DF PROTO=TCP SPT=32932 DPT=9102 SEQ=690632304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E2996F0000000001030307) 
Dec 06 09:55:04 np0005548789.localdomain python3.9[263244]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014904.207366-1338-103556372797297/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:04 np0005548789.localdomain sudo[263242]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:05 np0005548789.localdomain sudo[263297]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twnlpzkqmzcbgtterhoedeuofgxrmsvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014904.207366-1338-103556372797297/AnsiballZ_systemd.py
Dec 06 09:55:05 np0005548789.localdomain sudo[263297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:05 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23390 DF PROTO=TCP SPT=52566 DPT=9102 SEQ=3998205506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E29BEF0000000001030307) 
Dec 06 09:55:05 np0005548789.localdomain python3.9[263299]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:55:05 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:55:05 np0005548789.localdomain systemd-sysv-generator[263325]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:55:05 np0005548789.localdomain systemd-rc-local-generator[263321]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:55:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:55:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:05 np0005548789.localdomain sudo[263297]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:06 np0005548789.localdomain sudo[263388]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urwygcrjnryysxuetrmqsqrgnobhqqre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014904.207366-1338-103556372797297/AnsiballZ_systemd.py
Dec 06 09:55:06 np0005548789.localdomain sudo[263388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:06 np0005548789.localdomain python3.9[263390]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:55:06 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:55:06 np0005548789.localdomain systemd-rc-local-generator[263410]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:55:06 np0005548789.localdomain systemd-sysv-generator[263418]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:55:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:55:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:06 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:55:06 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=435 DF PROTO=TCP SPT=32932 DPT=9102 SEQ=690632304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E2A16F0000000001030307) 
Dec 06 09:55:06 np0005548789.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Dec 06 09:55:06 np0005548789.localdomain systemd[1]: tmp-crun.GFsWjj.mount: Deactivated successfully.
Dec 06 09:55:06 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:55:06 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc33178382dbe9faba5ad94b6a495cb65d25500ed99f88ead4db6dde47b2ec33/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 06 09:55:06 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc33178382dbe9faba5ad94b6a495cb65d25500ed99f88ead4db6dde47b2ec33/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 09:55:06 np0005548789.localdomain podman[263430]: 2025-12-06 09:55:06.938187433 +0000 UTC m=+0.114115357 container init 7bf9e3fbc2679b0a9aa1c570de190e25e525d02579c8f25638444d044bf524fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0abd10feeed3687554ab703b297d02adae88194eeb13180e65a2b251617e0bfc'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, org.label-schema.license=GPLv2, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 09:55:06 np0005548789.localdomain podman[263430]: 2025-12-06 09:55:06.949111326 +0000 UTC m=+0.125039260 container start 7bf9e3fbc2679b0a9aa1c570de190e25e525d02579c8f25638444d044bf524fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0abd10feeed3687554ab703b297d02adae88194eeb13180e65a2b251617e0bfc'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=neutron_dhcp_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 09:55:06 np0005548789.localdomain podman[263430]: neutron_dhcp_agent
Dec 06 09:55:06 np0005548789.localdomain neutron_dhcp_agent[263443]: + sudo -E kolla_set_configs
Dec 06 09:55:06 np0005548789.localdomain systemd[1]: Started neutron_dhcp_agent container.
Dec 06 09:55:06 np0005548789.localdomain sudo[263388]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Validating config file
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Copying service configuration files
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Writing out command to execute
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/00c6e44062d81bae38ea1c96678049e54d3f27d226bb6f9651816ab13eb94f06
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/652b6bdc-40ce-45b7-8aa5-3bca79987993.pid.haproxy
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/652b6bdc-40ce-45b7-8aa5-3bca79987993.conf
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: ++ cat /run_command
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: + CMD=/usr/bin/neutron-dhcp-agent
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: + ARGS=
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: + sudo kolla_copy_cacerts
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: + [[ ! -n '' ]]
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: + . kolla_extend_start
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: Running command: '/usr/bin/neutron-dhcp-agent'
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: + umask 0022
Dec 06 09:55:07 np0005548789.localdomain neutron_dhcp_agent[263443]: + exec /usr/bin/neutron-dhcp-agent
Dec 06 09:55:07 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:07.159 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.910 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.911 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.934 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 53930000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f62cece7-d51d-4849-803a-a589ad24c51a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 53930000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T09:55:07.911896', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a59cb050-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.183350297, 'message_signature': '195febe4a2ed606e10cc4554ede1d0f9c573695b8658f452094bd4571bd310dc'}]}, 'timestamp': '2025-12-06 09:55:07.935078', '_unique_id': '83d1e32f5bd9437ebace1dbd2e57bec7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.936 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.937 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.940 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4d73264-8e41-4060-952e-3fa766055d31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:55:07.938017', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'a59dac9e-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.18738478, 'message_signature': 'ace22b14a1f2226797c362ceb560ee2990928cdd0816134d035c64b9ed8fcd5c'}]}, 'timestamp': '2025-12-06 09:55:07.941552', '_unique_id': '9170c513d76d4ad18621e4cad8507618'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.943 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.944 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 09:55:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63148 DF PROTO=TCP SPT=51822 DPT=9102 SEQ=3063132591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E2A5F00000000001030307) 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.956 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.957 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5118f837-9720-4a22-92f2-075899abfffb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:55:07.944486', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5a00c82-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.193874437, 'message_signature': 'eb8b9b92ffe6d27f966b74bed37e7a9d3b9f270954e8966995198f25eea792cd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:55:07.944486', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5a02190-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.193874437, 'message_signature': 'ba95c61cee0b7692ce3dbdb1877e79466b36113f5a1dac21cb318091fc63fae9'}]}, 'timestamp': '2025-12-06 09:55:07.957508', '_unique_id': 'f0e8a2fc12eb46f1bc47c1966560023d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.958 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.960 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.960 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5566d426-3b89-41f4-986b-2f0a4030b9da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:55:07.960181', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'a5a09c56-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.18738478, 'message_signature': 'b909a53ac4d19c00193557af71dbcd3e4b67e1c597b4a4bbe13471955fb342da'}]}, 'timestamp': '2025-12-06 09:55:07.960673', '_unique_id': 'b8fcb7f0f21c482e9770b305be5e8aa5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.961 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.963 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 9015 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99ff6862-0256-499a-8ce8-a3b7c2f0a66f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9015, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:55:07.963129', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'a5a10f60-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.18738478, 'message_signature': 'ba1994f18ad1e07c34196aeb003fab56249566367481930a2181fe583a2a9551'}]}, 'timestamp': '2025-12-06 09:55:07.963623', '_unique_id': '02b4f99e121040e997de4f65dec03303'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.966 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.998 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:07.998 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5b908dd-45a7-4e85-8f24-c0c26ad20ac1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:55:07.966263', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5a66e7e-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.21563596, 'message_signature': 'c24176437acd5477950c80a7ae2cd35f3c718d0864b80498a9751997c0dc0d64'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:55:07.966263', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5a683aa-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.21563596, 'message_signature': '8e9bd9b94bac294ae33c474eec9ff9a4c04cc5a33d9345d1cd045c4e251e674d'}]}, 'timestamp': '2025-12-06 09:55:07.999346', '_unique_id': '31ae399b61644513802103feadee04fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.000 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.001 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.002 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.002 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.002 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '541755ae-28ed-4408-9d5e-585ca68cfe40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:55:08.002296', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5a70910-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.193874437, 'message_signature': '174f2e1ce9cca8ec937c34a050e119c84ebc851dfaa97609e733667af5eac389'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:55:08.002296', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5a71b1c-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.193874437, 'message_signature': '85c5af9a54a8731c8eb93a0ad19dc865fe4d0586e8f6802665cc42f67a56dd31'}]}, 'timestamp': '2025-12-06 09:55:08.003213', '_unique_id': '41d0b117ee0b473185d9dadf79b58d2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.005 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.005 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f51e9d1-2d49-4d47-9043-fa7dcc3566f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:55:08.005671', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'a5a79150-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.18738478, 'message_signature': '0fff0b325e284eb262d45f8498441b81df81d18053cc43c0452cb2d44d02f1ca'}]}, 'timestamp': '2025-12-06 09:55:08.006274', '_unique_id': 'be4f99b596f248e6b8e8a0281fa74354'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.008 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.008 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1043514478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.009 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 200503964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba663443-8334-470f-9d50-1e97ad93e9ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1043514478, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:55:08.008560', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5a7fe7e-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.21563596, 'message_signature': '8d87d49f428a033d3bbd1a9e8901416d9245716faa0259c3a8edb4b39cdb7f07'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 200503964, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:55:08.008560', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5a80f4a-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.21563596, 'message_signature': '56d106f8aa9ce28dbb7836dde14dec211b3f422db3ef8f2e7d10b7c0d6d79b83'}]}, 'timestamp': '2025-12-06 09:55:08.009457', '_unique_id': '067599e807bc4c2fb295edf6e32d44f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.011 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.011 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.012 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14644a4c-2ccf-4648-9499-1fbce0ed3535', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:55:08.011954', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5a88240-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.193874437, 'message_signature': 'd7a8d2e26b3558a7260428e22ba69d6d946dc9298c838a507049ec666c66cf4c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:55:08.011954', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5a892b2-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.193874437, 'message_signature': 'f0aae6599b341f6b5d29fa9f52e2f00e4b901380e295fd2538faccbcc514cf64'}]}, 'timestamp': '2025-12-06 09:55:08.012856', '_unique_id': 'dbe2b6feac8c4187a9e1cff2ef5146d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.014 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.015 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.015 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.015 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.016 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00139ef7-13c0-4a6b-88bd-d0e1a59930e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:55:08.015569', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5a91228-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.21563596, 'message_signature': '5ca89a455632162702dd348cffda5c46c50c304cdb19e1ef1cc52448e569a3ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:55:08.015569', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5a9263c-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.21563596, 'message_signature': '4d364f040cca24fd3a153628a4fe925059aa6ccf50219de701d8f7e627f90d46'}]}, 'timestamp': '2025-12-06 09:55:08.016660', '_unique_id': '737a16d72c66435ea8b7613a20cfd93f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.018 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.019 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94b0808b-ad32-4a1e-bbcb-0aee0996ed22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:55:08.019114', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'a5a99a0e-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.18738478, 'message_signature': '258a2c99402a8ed9cb5189e78ec03065fe6c445976734ad83e68333fcf7ae41b'}]}, 'timestamp': '2025-12-06 09:55:08.019594', '_unique_id': '1603b8452b08400793fde10d75aea2f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.021 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.022 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6be0a3f-1dd0-4afd-bbea-4616b76abad4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:55:08.021960', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'a5aa0c78-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.18738478, 'message_signature': '09eb191ac533088442cb2fd5a217bfe8539235aef43159272bc75e7e482faf19'}]}, 'timestamp': '2025-12-06 09:55:08.022526', '_unique_id': 'e6be44300ff24e75bb74b8644062fdaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.024 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.024 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57f92971-b7dc-4cfa-aa50-92b2402c2178', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:55:08.024796', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'a5aa7956-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.18738478, 'message_signature': '498eee4533a3f2ddb98d61d52e9544046de4b7472e35ee508b23fb06c13ca716'}]}, 'timestamp': '2025-12-06 09:55:08.025347', '_unique_id': '6ffa19764e8a497db1a5e6e35321f882'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.028 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.028 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48e5b932-98ab-4a7a-9f0d-a81ee5d586b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:55:08.028194', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'a5aaff2a-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.18738478, 'message_signature': '83c7ef9c7404e8bba46ada99f44d2a6600b8761e73799b8f2fbbd9d550f8a99c'}]}, 'timestamp': '2025-12-06 09:55:08.028846', '_unique_id': '2ac12d8404ed46748765d58ef28ebf7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.030 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.032 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.032 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 52.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22e3c8ae-12ca-4d05-a0f3-e15641689ac4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.37890625, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T09:55:08.032344', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a5aba330-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.183350297, 'message_signature': 'b361943befdf84c4ac19ff6cc7de380d7076540c0cbb163f509a509bd433a37a'}]}, 'timestamp': '2025-12-06 09:55:08.033027', '_unique_id': '8c7a58773457403c881ef071a6453ba9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.034 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.035 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.035 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e043c49a-7da8-4c52-ab87-2fc313aec956', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:55:08.035061', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5ac0532-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.21563596, 'message_signature': '4bf0f7aa904842419092ef84264ca8665bfe2c38ac729b8e26e9a4f4bcff8fa8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:55:08.035061', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5ac0ef6-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.21563596, 'message_signature': '94ae8635fa2f333b45b2cd41f78568829f124904540dee1dcbe2a37abc197ad6'}]}, 'timestamp': '2025-12-06 09:55:08.035580', '_unique_id': '125cee46969349a7895e9efb0fbb2c12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.036 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bee77569-f58d-4d3b-9e06-c8d8cddbcc8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:55:08.037083', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'a5ac5438-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.18738478, 'message_signature': 'b744181154edbc3627afc7438dd296e4691c4f4c58664a85b45dc31d5ecfdbfa'}]}, 'timestamp': '2025-12-06 09:55:08.037389', '_unique_id': '405249e3808c497f8cb08b1068641bd8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.037 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.038 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.039 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 223261606 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.039 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 30984668 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf9b3341-7826-4e70-a456-57d266d3d0c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 223261606, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:55:08.038992', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5ac9e98-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.21563596, 'message_signature': 'd363dab73549c68aafca545d66e3d1f818534fa695459e9e97bd21caad4b8f72'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30984668, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:55:08.038992', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5aca8d4-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.21563596, 'message_signature': '9d802114817c00dcea293f1f1849d3e010613d66ccf955fb5cc3e26328641fdd'}]}, 'timestamp': '2025-12-06 09:55:08.039521', '_unique_id': 'ffff8174d62e41ac8ad3a500c98ef3f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.040 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69953f07-1886-4a41-8cc6-029736954866', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:55:08.040875', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'a5ace830-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.18738478, 'message_signature': '3f992ce39d732bc919dcd9ef0d719e27b7e1d29ca9a98f7861ffbbb1945fcbfc'}]}, 'timestamp': '2025-12-06 09:55:08.041170', '_unique_id': 'cc1f269a8f8c4062a067e4e1943f7f92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.042 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.042 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 73908224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.042 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea226495-6eca-4d88-8e9b-1cdc2d5fb459', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73908224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:55:08.042538', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a5ad2930-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.21563596, 'message_signature': '8d3a3da7843fde7db8a4020238b44a9e2f5e6c97d494512b715df0061f816730'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:55:08.042538', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a5ad3470-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11326.21563596, 'message_signature': 'efc2e7e1516f64e46904909c0f26a6c199cb381e5e3c10fb361f4cd3c7d7207d'}]}, 'timestamp': '2025-12-06 09:55:08.043097', '_unique_id': 'becacf04495b43e7a2df691fa729a316'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:55:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:55:08.043 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:55:08 np0005548789.localdomain neutron_dhcp_agent[263443]: 2025-12-06 09:55:08.372 263447 INFO neutron.common.config [-] Logging enabled!
Dec 06 09:55:08 np0005548789.localdomain neutron_dhcp_agent[263443]: 2025-12-06 09:55:08.373 263447 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43
Dec 06 09:55:08 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:08.417 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:08 np0005548789.localdomain neutron_dhcp_agent[263443]: 2025-12-06 09:55:08.769 263447 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Dec 06 09:55:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:55:08 np0005548789.localdomain podman[263514]: 2025-12-06 09:55:08.887276358 +0000 UTC m=+0.052710837 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 06 09:55:08 np0005548789.localdomain podman[263514]: 2025-12-06 09:55:08.922342895 +0000 UTC m=+0.087777394 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:55:08 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:55:09 np0005548789.localdomain sudo[263586]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irrlvlnbntndtnwigcxqjxftxwxvbxlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014908.7717814-1422-242696354277094/AnsiballZ_systemd.py
Dec 06 09:55:09 np0005548789.localdomain sudo[263586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:09 np0005548789.localdomain python3.9[263588]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:55:09 np0005548789.localdomain systemd[1]: Stopping neutron_dhcp_agent container...
Dec 06 09:55:09 np0005548789.localdomain systemd[1]: libpod-7bf9e3fbc2679b0a9aa1c570de190e25e525d02579c8f25638444d044bf524fb.scope: Deactivated successfully.
Dec 06 09:55:09 np0005548789.localdomain systemd[1]: libpod-7bf9e3fbc2679b0a9aa1c570de190e25e525d02579c8f25638444d044bf524fb.scope: Consumed 2.136s CPU time.
Dec 06 09:55:09 np0005548789.localdomain podman[263592]: 2025-12-06 09:55:09.74890215 +0000 UTC m=+0.383369387 container died 7bf9e3fbc2679b0a9aa1c570de190e25e525d02579c8f25638444d044bf524fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0abd10feeed3687554ab703b297d02adae88194eeb13180e65a2b251617e0bfc'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=neutron_dhcp, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=neutron_dhcp_agent)
Dec 06 09:55:09 np0005548789.localdomain systemd[1]: tmp-crun.3QGSKy.mount: Deactivated successfully.
Dec 06 09:55:09 np0005548789.localdomain podman[263592]: 2025-12-06 09:55:09.798717928 +0000 UTC m=+0.433185145 container cleanup 7bf9e3fbc2679b0a9aa1c570de190e25e525d02579c8f25638444d044bf524fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0abd10feeed3687554ab703b297d02adae88194eeb13180e65a2b251617e0bfc'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 09:55:09 np0005548789.localdomain podman[263592]: neutron_dhcp_agent
Dec 06 09:55:09 np0005548789.localdomain podman[263631]: error opening file `/run/crun/7bf9e3fbc2679b0a9aa1c570de190e25e525d02579c8f25638444d044bf524fb/status`: No such file or directory
Dec 06 09:55:09 np0005548789.localdomain podman[263620]: 2025-12-06 09:55:09.882741507 +0000 UTC m=+0.046171858 container cleanup 7bf9e3fbc2679b0a9aa1c570de190e25e525d02579c8f25638444d044bf524fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_dhcp, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0abd10feeed3687554ab703b297d02adae88194eeb13180e65a2b251617e0bfc'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:55:09 np0005548789.localdomain podman[263620]: neutron_dhcp_agent
Dec 06 09:55:09 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-fc33178382dbe9faba5ad94b6a495cb65d25500ed99f88ead4db6dde47b2ec33-merged.mount: Deactivated successfully.
Dec 06 09:55:09 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7bf9e3fbc2679b0a9aa1c570de190e25e525d02579c8f25638444d044bf524fb-userdata-shm.mount: Deactivated successfully.
Dec 06 09:55:09 np0005548789.localdomain systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully.
Dec 06 09:55:09 np0005548789.localdomain systemd[1]: Stopped neutron_dhcp_agent container.
Dec 06 09:55:09 np0005548789.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Dec 06 09:55:10 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:55:10 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc33178382dbe9faba5ad94b6a495cb65d25500ed99f88ead4db6dde47b2ec33/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 06 09:55:10 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc33178382dbe9faba5ad94b6a495cb65d25500ed99f88ead4db6dde47b2ec33/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 09:55:10 np0005548789.localdomain podman[263633]: 2025-12-06 09:55:10.016291255 +0000 UTC m=+0.100199083 container init 7bf9e3fbc2679b0a9aa1c570de190e25e525d02579c8f25638444d044bf524fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0abd10feeed3687554ab703b297d02adae88194eeb13180e65a2b251617e0bfc'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, tcib_managed=true)
Dec 06 09:55:10 np0005548789.localdomain podman[263633]: 2025-12-06 09:55:10.023926367 +0000 UTC m=+0.107834165 container start 7bf9e3fbc2679b0a9aa1c570de190e25e525d02579c8f25638444d044bf524fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0abd10feeed3687554ab703b297d02adae88194eeb13180e65a2b251617e0bfc'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:55:10 np0005548789.localdomain podman[263633]: neutron_dhcp_agent
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: + sudo -E kolla_set_configs
Dec 06 09:55:10 np0005548789.localdomain systemd[1]: Started neutron_dhcp_agent container.
Dec 06 09:55:10 np0005548789.localdomain sudo[263586]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Validating config file
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Copying service configuration files
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Writing out command to execute
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/00c6e44062d81bae38ea1c96678049e54d3f27d226bb6f9651816ab13eb94f06
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/652b6bdc-40ce-45b7-8aa5-3bca79987993.pid.haproxy
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/652b6bdc-40ce-45b7-8aa5-3bca79987993.conf
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: ++ cat /run_command
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: + CMD=/usr/bin/neutron-dhcp-agent
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: + ARGS=
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: + sudo kolla_copy_cacerts
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: + [[ ! -n '' ]]
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: + . kolla_extend_start
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: Running command: '/usr/bin/neutron-dhcp-agent'
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: + umask 0022
Dec 06 09:55:10 np0005548789.localdomain neutron_dhcp_agent[263648]: + exec /usr/bin/neutron-dhcp-agent
Dec 06 09:55:10 np0005548789.localdomain sshd[256766]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:55:10 np0005548789.localdomain systemd[1]: session-59.scope: Deactivated successfully.
Dec 06 09:55:10 np0005548789.localdomain systemd[1]: session-59.scope: Consumed 33.786s CPU time.
Dec 06 09:55:10 np0005548789.localdomain systemd-logind[766]: Session 59 logged out. Waiting for processes to exit.
Dec 06 09:55:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:55:10 np0005548789.localdomain systemd-logind[766]: Removed session 59.
Dec 06 09:55:10 np0005548789.localdomain podman[263680]: 2025-12-06 09:55:10.701441523 +0000 UTC m=+0.064124665 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, version=9.6, name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 09:55:10 np0005548789.localdomain podman[263680]: 2025-12-06 09:55:10.712930042 +0000 UTC m=+0.075613214 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Dec 06 09:55:10 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:55:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=436 DF PROTO=TCP SPT=32932 DPT=9102 SEQ=690632304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E2B12F0000000001030307) 
Dec 06 09:55:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:11.353 263652 INFO neutron.common.config [-] Logging enabled!
Dec 06 09:55:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:11.353 263652 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43
Dec 06 09:55:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:11.761 263652 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Dec 06 09:55:12 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:12.199 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:55:12 np0005548789.localdomain podman[263700]: 2025-12-06 09:55:12.916000914 +0000 UTC m=+0.077202513 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 09:55:12 np0005548789.localdomain podman[263700]: 2025-12-06 09:55:12.933174777 +0000 UTC m=+0.094376406 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 09:55:12 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:55:13 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:13.436 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:13 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:13.863 263652 INFO neutron.agent.dhcp.agent [None req-71b21f07-3eac-4577-97ea-8b42aa86de4e - - - - - -] All active networks have been fetched through RPC.
Dec 06 09:55:13 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:13.864 263652 INFO neutron.agent.dhcp.agent [-] Starting network 8e238f59-5792-4ff4-95af-f993c8e9e14f dhcp configuration
Dec 06 09:55:15 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:55:15.578 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 09:55:15 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:55:15.579 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 09:55:15 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:55:15.580 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:55:15 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:15.618 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:55:15 np0005548789.localdomain podman[263719]: 2025-12-06 09:55:15.902030821 +0000 UTC m=+0.073671966 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:55:15 np0005548789.localdomain podman[263719]: 2025-12-06 09:55:15.905712613 +0000 UTC m=+0.077353718 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:55:15 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:55:16 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:16.101 263652 INFO oslo.privsep.daemon [None req-b0d34454-06a5-4c4f-a89b-eb8189250d6a - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp17_t2f8b/privsep.sock']
Dec 06 09:55:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:55:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:55:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:55:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:55:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:55:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:55:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:55:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:55:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:55:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:55:16 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:16.735 263652 INFO oslo.privsep.daemon [None req-b0d34454-06a5-4c4f-a89b-eb8189250d6a - - - - - -] Spawned new privsep daemon via rootwrap
Dec 06 09:55:16 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:16.619 263746 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 09:55:16 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:16.624 263746 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 09:55:16 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:16.628 263746 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Dec 06 09:55:16 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:16.629 263746 INFO oslo.privsep.daemon [-] privsep daemon running as pid 263746
Dec 06 09:55:16 np0005548789.localdomain sshd[263750]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:55:17 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:17.202 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:17.304 263652 INFO oslo.privsep.daemon [None req-b0d34454-06a5-4c4f-a89b-eb8189250d6a - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpn7e0d44c/privsep.sock']
Dec 06 09:55:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:17.965 263652 INFO oslo.privsep.daemon [None req-b0d34454-06a5-4c4f-a89b-eb8189250d6a - - - - - -] Spawned new privsep daemon via rootwrap
Dec 06 09:55:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:17.851 263757 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 09:55:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:17.855 263757 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 09:55:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:17.859 263757 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 06 09:55:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:17.859 263757 INFO oslo.privsep.daemon [-] privsep daemon running as pid 263757
Dec 06 09:55:18 np0005548789.localdomain sshd[263750]: Received disconnect from 118.219.234.233 port 35996:11: Bye Bye [preauth]
Dec 06 09:55:18 np0005548789.localdomain sshd[263750]: Disconnected from authenticating user root 118.219.234.233 port 35996 [preauth]
Dec 06 09:55:18 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:18.434 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:18 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:18.956 263652 INFO oslo.privsep.daemon [None req-b0d34454-06a5-4c4f-a89b-eb8189250d6a - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp54h0iqk4/privsep.sock']
Dec 06 09:55:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=437 DF PROTO=TCP SPT=32932 DPT=9102 SEQ=690632304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E2D1EF0000000001030307) 
Dec 06 09:55:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:19.503 263652 INFO oslo.privsep.daemon [None req-b0d34454-06a5-4c4f-a89b-eb8189250d6a - - - - - -] Spawned new privsep daemon via rootwrap
Dec 06 09:55:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:19.423 263769 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 09:55:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:19.428 263769 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 09:55:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:19.431 263769 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 06 09:55:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:19.431 263769 INFO oslo.privsep.daemon [-] privsep daemon running as pid 263769
Dec 06 09:55:20 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:20.815 263652 INFO neutron.agent.linux.ip_lib [None req-b0d34454-06a5-4c4f-a89b-eb8189250d6a - - - - - -] Device tape1277966-bb cannot be used as it has no MAC address
Dec 06 09:55:20 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:20.932 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:20 np0005548789.localdomain kernel: device tape1277966-bb entered promiscuous mode
Dec 06 09:55:20 np0005548789.localdomain NetworkManager[5973]: <info>  [1765014920.9394] manager: (tape1277966-bb): new Generic device (/org/freedesktop/NetworkManager/Devices/15)
Dec 06 09:55:20 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:55:20Z|00047|binding|INFO|Claiming lport e1277966-bb4e-4c31-a08b-185a772cbf5b for this chassis.
Dec 06 09:55:20 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:55:20Z|00048|binding|INFO|e1277966-bb4e-4c31-a08b-185a772cbf5b: Claiming unknown
Dec 06 09:55:20 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:20.941 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:20 np0005548789.localdomain systemd-udevd[263784]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 09:55:20 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:55:20Z|00049|binding|INFO|Setting lport e1277966-bb4e-4c31-a08b-185a772cbf5b ovn-installed in OVS
Dec 06 09:55:20 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:20.948 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:20 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:20.952 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:20 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:20.981 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:21 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:21.018 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:21 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:21.045 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:21 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:55:21Z|00050|binding|INFO|Setting lport e1277966-bb4e-4c31-a08b-185a772cbf5b up in Southbound
Dec 06 09:55:21 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:55:21.246 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.172/24', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-8e238f59-5792-4ff4-95af-f993c8e9e14f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e238f59-5792-4ff4-95af-f993c8e9e14f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ae43cb4c-3e04-441f-9177-31d5e45dfad9, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=e1277966-bb4e-4c31-a08b-185a772cbf5b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 09:55:21 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:55:21.249 160509 INFO neutron.agent.ovn.metadata.agent [-] Port e1277966-bb4e-4c31-a08b-185a772cbf5b in datapath 8e238f59-5792-4ff4-95af-f993c8e9e14f bound to our chassis
Dec 06 09:55:21 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:55:21.251 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port e972a0a4-c434-4624-85e8-2a72a8f17075 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 09:55:21 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:55:21.251 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e238f59-5792-4ff4-95af-f993c8e9e14f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 09:55:21 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:55:21.255 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb7eda4-b5db-49c2-9350-a8f2ab894a21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:55:21 np0005548789.localdomain podman[263840]: 2025-12-06 09:55:21.853948542 +0000 UTC m=+0.091804097 container create e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:55:21 np0005548789.localdomain systemd[1]: Started libpod-conmon-e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7.scope.
Dec 06 09:55:21 np0005548789.localdomain podman[263840]: 2025-12-06 09:55:21.810459188 +0000 UTC m=+0.048314743 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 09:55:21 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:55:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9a1d5715f59223f418c72b8a9e9ba377238db2c64e0646a31ed33f6c0f74cb2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 09:55:21 np0005548789.localdomain podman[263840]: 2025-12-06 09:55:21.932061041 +0000 UTC m=+0.169916586 container init e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:55:21 np0005548789.localdomain podman[263840]: 2025-12-06 09:55:21.944054516 +0000 UTC m=+0.181910071 container start e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:55:21 np0005548789.localdomain dnsmasq[263859]: started, version 2.85 cachesize 150
Dec 06 09:55:21 np0005548789.localdomain dnsmasq[263859]: DNS service limited to local subnets
Dec 06 09:55:21 np0005548789.localdomain dnsmasq[263859]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 09:55:21 np0005548789.localdomain dnsmasq[263859]: warning: no upstream servers configured
Dec 06 09:55:21 np0005548789.localdomain dnsmasq-dhcp[263859]: DHCP, static leases only on 192.168.122.0, lease time 1d
Dec 06 09:55:21 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 09:55:21 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 09:55:21 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 09:55:22 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:22.004 263652 INFO neutron.agent.dhcp.agent [None req-49f0d709-dacf-4695-b813-900c4037fee4 - - - - - -] Finished network 8e238f59-5792-4ff4-95af-f993c8e9e14f dhcp configuration
Dec 06 09:55:22 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:22.005 263652 INFO neutron.agent.dhcp.agent [None req-71b21f07-3eac-4577-97ea-8b42aa86de4e - - - - - -] Synchronizing state complete
Dec 06 09:55:22 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:22.160 263652 INFO neutron.agent.dhcp.agent [None req-71b21f07-3eac-4577-97ea-8b42aa86de4e - - - - - -] DHCP agent started
Dec 06 09:55:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:22.205 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:22 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 09:55:22.562 263652 INFO neutron.agent.dhcp.agent [None req-4f2ad4bb-00c6-46fe-8820-f28dba401957 - - - - - -] DHCP configuration for ports {'55ddb56c-afe2-4248-b1cd-f45aef0a3725', '49b140a4-d9f8-482f-b1ba-2b28b09c2e14', '75f7252a-6b17-46d4-b761-60a0a33ef03b'} is completed
Dec 06 09:55:23 np0005548789.localdomain sshd[263860]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:55:23 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:23.474 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:55:23 np0005548789.localdomain sshd[263860]: Accepted publickey for zuul from 192.168.122.30 port 49502 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:55:23 np0005548789.localdomain systemd-logind[766]: New session 60 of user zuul.
Dec 06 09:55:23 np0005548789.localdomain systemd[1]: Started Session 60 of User zuul.
Dec 06 09:55:23 np0005548789.localdomain sshd[263860]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:55:23 np0005548789.localdomain podman[263862]: 2025-12-06 09:55:23.644267761 +0000 UTC m=+0.090535569 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 09:55:23 np0005548789.localdomain podman[263862]: 2025-12-06 09:55:23.681575538 +0000 UTC m=+0.127843346 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 09:55:23 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:55:23 np0005548789.localdomain podman[241090]: time="2025-12-06T09:55:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:55:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:55:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150643 "" "Go-http-client/1.1"
Dec 06 09:55:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:55:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17695 "" "Go-http-client/1.1"
Dec 06 09:55:24 np0005548789.localdomain python3.9[263996]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:55:26 np0005548789.localdomain python3.9[264109]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:55:26 np0005548789.localdomain network[264126]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:55:26 np0005548789.localdomain network[264127]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:55:26 np0005548789.localdomain network[264128]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:55:27 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:27.235 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:28 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:28.504 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:29 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:55:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:55:30 np0005548789.localdomain systemd[1]: tmp-crun.heiBBZ.mount: Deactivated successfully.
Dec 06 09:55:30 np0005548789.localdomain podman[264223]: 2025-12-06 09:55:30.107204617 +0000 UTC m=+0.086970480 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Dec 06 09:55:30 np0005548789.localdomain podman[264223]: 2025-12-06 09:55:30.116108428 +0000 UTC m=+0.095874331 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 09:55:30 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:55:30 np0005548789.localdomain sshd[264255]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:55:30 np0005548789.localdomain sshd[264255]: Received disconnect from 64.227.102.57 port 52878:11: Bye Bye [preauth]
Dec 06 09:55:30 np0005548789.localdomain sshd[264255]: Disconnected from authenticating user root 64.227.102.57 port 52878 [preauth]
Dec 06 09:55:32 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:32.285 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:33 np0005548789.localdomain sudo[264380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhzcifdiihkedsdvrlzgopiqttxauebi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014933.1493497-102-222189026765198/AnsiballZ_setup.py
Dec 06 09:55:33 np0005548789.localdomain sudo[264380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:33 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:33.506 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:33 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52212 DF PROTO=TCP SPT=53674 DPT=9102 SEQ=1596883805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E30A770000000001030307) 
Dec 06 09:55:33 np0005548789.localdomain python3.9[264382]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:55:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:55:33 np0005548789.localdomain systemd[1]: tmp-crun.Crcxfw.mount: Deactivated successfully.
Dec 06 09:55:33 np0005548789.localdomain podman[264387]: 2025-12-06 09:55:33.924787441 +0000 UTC m=+0.091660993 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:55:33 np0005548789.localdomain podman[264387]: 2025-12-06 09:55:33.930988639 +0000 UTC m=+0.097862141 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:55:33 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 09:55:34 np0005548789.localdomain sudo[264380]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:34 np0005548789.localdomain sudo[264465]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgbnfajbsigfxezjijjnymgdisrudklu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014933.1493497-102-222189026765198/AnsiballZ_dnf.py
Dec 06 09:55:34 np0005548789.localdomain sudo[264465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52213 DF PROTO=TCP SPT=53674 DPT=9102 SEQ=1596883805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E30E6F0000000001030307) 
Dec 06 09:55:34 np0005548789.localdomain python3.9[264467]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:55:34 np0005548789.localdomain sudo[264468]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:55:34 np0005548789.localdomain sudo[264468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:55:34 np0005548789.localdomain sudo[264468]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:34 np0005548789.localdomain sudo[264487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 09:55:34 np0005548789.localdomain sudo[264487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:55:35 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=438 DF PROTO=TCP SPT=32932 DPT=9102 SEQ=690632304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E311EF0000000001030307) 
Dec 06 09:55:35 np0005548789.localdomain systemd[1]: tmp-crun.E8O3ts.mount: Deactivated successfully.
Dec 06 09:55:35 np0005548789.localdomain podman[264579]: 2025-12-06 09:55:35.739249645 +0000 UTC m=+0.099131421 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, release=1763362218, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, ceph=True, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, com.redhat.component=rhceph-container)
Dec 06 09:55:35 np0005548789.localdomain podman[264579]: 2025-12-06 09:55:35.873864235 +0000 UTC m=+0.233745961 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, vcs-type=git, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, release=1763362218, io.buildah.version=1.41.4, com.redhat.component=rhceph-container)
Dec 06 09:55:36 np0005548789.localdomain sudo[264487]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:36 np0005548789.localdomain sudo[264645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:55:36 np0005548789.localdomain sudo[264645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:55:36 np0005548789.localdomain sudo[264645]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:36 np0005548789.localdomain sudo[264663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:55:36 np0005548789.localdomain sudo[264663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:55:36 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52214 DF PROTO=TCP SPT=53674 DPT=9102 SEQ=1596883805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E316700000000001030307) 
Dec 06 09:55:37 np0005548789.localdomain sudo[264663]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:37 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:37.289 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23391 DF PROTO=TCP SPT=52566 DPT=9102 SEQ=3998205506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E319EF0000000001030307) 
Dec 06 09:55:37 np0005548789.localdomain sudo[264714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:55:37 np0005548789.localdomain sudo[264714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:55:37 np0005548789.localdomain sudo[264714]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:38 np0005548789.localdomain sudo[264465]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:38 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:38.548 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:38 np0005548789.localdomain sudo[264839]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjeaveyperukbexvpklwodpmtbmxzhxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014938.294821-138-237236388902559/AnsiballZ_stat.py
Dec 06 09:55:38 np0005548789.localdomain sudo[264839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:38 np0005548789.localdomain python3.9[264841]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:55:38 np0005548789.localdomain sudo[264839]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:39 np0005548789.localdomain rsyslogd[760]: imjournal: 8252 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec 06 09:55:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:55:39 np0005548789.localdomain sudo[264955]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxctqhoxtfbnbhydmjuscxrromvjezfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014939.3742497-168-67575055311579/AnsiballZ_command.py
Dec 06 09:55:39 np0005548789.localdomain sudo[264955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:39 np0005548789.localdomain podman[264942]: 2025-12-06 09:55:39.848852589 +0000 UTC m=+0.068774731 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=edpm)
Dec 06 09:55:39 np0005548789.localdomain podman[264942]: 2025-12-06 09:55:39.86135096 +0000 UTC m=+0.081273162 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 09:55:39 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:55:40 np0005548789.localdomain python3.9[264964]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:55:40 np0005548789.localdomain sudo[264955]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:40 np0005548789.localdomain sudo[265079]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vynxgzgricahsckflondodzdgzyyhqnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014940.3958187-198-241691081658715/AnsiballZ_stat.py
Dec 06 09:55:40 np0005548789.localdomain sudo[265079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52215 DF PROTO=TCP SPT=53674 DPT=9102 SEQ=1596883805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E326300000000001030307) 
Dec 06 09:55:40 np0005548789.localdomain sshd[265082]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:55:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:55:40 np0005548789.localdomain python3.9[265081]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:55:40 np0005548789.localdomain sudo[265079]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:40 np0005548789.localdomain podman[265084]: 2025-12-06 09:55:40.925261496 +0000 UTC m=+0.087393068 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64)
Dec 06 09:55:40 np0005548789.localdomain podman[265084]: 2025-12-06 09:55:40.939193182 +0000 UTC m=+0.101324754 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Dec 06 09:55:40 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:55:41 np0005548789.localdomain sudo[265211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aegooodrzyhvmtovlodeuokdnstwdlyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014941.4844124-231-188054486291326/AnsiballZ_lineinfile.py
Dec 06 09:55:41 np0005548789.localdomain sudo[265211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:41 np0005548789.localdomain sshd[265082]: Received disconnect from 154.113.10.34 port 44712:11: Bye Bye [preauth]
Dec 06 09:55:41 np0005548789.localdomain sshd[265082]: Disconnected from authenticating user root 154.113.10.34 port 44712 [preauth]
Dec 06 09:55:42 np0005548789.localdomain python3.9[265213]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:42 np0005548789.localdomain sudo[265211]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:42 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:42.337 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:43 np0005548789.localdomain sudo[265321]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-guwuwgaghbcmwzxjdhnbqqxqdkeuipfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014942.6148653-258-91821955001430/AnsiballZ_systemd_service.py
Dec 06 09:55:43 np0005548789.localdomain sudo[265321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:55:43 np0005548789.localdomain podman[265324]: 2025-12-06 09:55:43.342062611 +0000 UTC m=+0.082965134 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 09:55:43 np0005548789.localdomain podman[265324]: 2025-12-06 09:55:43.38073201 +0000 UTC m=+0.121634513 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 09:55:43 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:55:43 np0005548789.localdomain python3.9[265323]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:55:43 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:43.552 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:43 np0005548789.localdomain sudo[265321]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:44 np0005548789.localdomain sudo[265452]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bychtvkydnjalmrkcclieaddogutbjfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014943.92149-282-205860866808330/AnsiballZ_systemd_service.py
Dec 06 09:55:44 np0005548789.localdomain sudo[265452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:44 np0005548789.localdomain sshd[265455]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:55:44 np0005548789.localdomain python3.9[265454]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:55:44 np0005548789.localdomain sudo[265452]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:46 np0005548789.localdomain sudo[265566]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jovnjfcmjwicwfwpwejtuyybzvxzbzqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014946.1354024-315-57481644958231/AnsiballZ_service_facts.py
Dec 06 09:55:46 np0005548789.localdomain sudo[265566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:55:46 np0005548789.localdomain podman[265568]: 2025-12-06 09:55:46.500483872 +0000 UTC m=+0.075935669 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:55:46 np0005548789.localdomain podman[265568]: 2025-12-06 09:55:46.532006374 +0000 UTC m=+0.107458171 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:55:46 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:55:46 np0005548789.localdomain python3.9[265569]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:55:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:55:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:55:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:55:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:55:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:55:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:55:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:55:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:55:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:55:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:55:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:55:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:55:46 np0005548789.localdomain network[265608]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:55:46 np0005548789.localdomain network[265609]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:55:46 np0005548789.localdomain network[265610]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:55:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:55:47.285 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:55:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:55:47.286 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:55:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:55:47.287 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:55:47 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:47.382 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:55:48 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:48.558 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:48 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52216 DF PROTO=TCP SPT=53674 DPT=9102 SEQ=1596883805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E345EF0000000001030307) 
Dec 06 09:55:49 np0005548789.localdomain sshd[265455]: Received disconnect from 179.33.210.213 port 50414:11: Bye Bye [preauth]
Dec 06 09:55:49 np0005548789.localdomain sshd[265455]: Disconnected from authenticating user root 179.33.210.213 port 50414 [preauth]
Dec 06 09:55:49 np0005548789.localdomain sudo[265566]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:50 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:55:50Z|00051|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 06 09:55:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:52.430 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:52.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:53.523 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:53.523 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 09:55:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:53.558 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 09:55:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:53.587 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:55:53 np0005548789.localdomain podman[265752]: 2025-12-06 09:55:53.903814762 +0000 UTC m=+0.068630676 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:55:53 np0005548789.localdomain podman[241090]: time="2025-12-06T09:55:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:55:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:55:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150643 "" "Go-http-client/1.1"
Dec 06 09:55:54 np0005548789.localdomain podman[265752]: 2025-12-06 09:55:54.016519473 +0000 UTC m=+0.181335367 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 06 09:55:54 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:55:54 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:55:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17712 "" "Go-http-client/1.1"
Dec 06 09:55:54 np0005548789.localdomain sudo[265869]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yuklpkfxekxmaxkmjdzbcytzajnnnleu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014954.3474212-345-132517136777290/AnsiballZ_file.py
Dec 06 09:55:54 np0005548789.localdomain sudo[265869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:54 np0005548789.localdomain python3.9[265871]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 09:55:54 np0005548789.localdomain sudo[265869]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:55 np0005548789.localdomain sudo[265979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nldzabmxkrqiyikiuiahakwvnjpaerii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014955.262835-369-198511673380970/AnsiballZ_modprobe.py
Dec 06 09:55:55 np0005548789.localdomain sudo[265979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:55 np0005548789.localdomain python3.9[265981]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 06 09:55:55 np0005548789.localdomain sudo[265979]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:56 np0005548789.localdomain sudo[266089]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnmrkmrokqsaittvuesxgpnwhtodfrtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014956.1880472-393-266289901890538/AnsiballZ_stat.py
Dec 06 09:55:56 np0005548789.localdomain sudo[266089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:56 np0005548789.localdomain python3.9[266091]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:56 np0005548789.localdomain sudo[266089]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:56 np0005548789.localdomain sudo[266146]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgwxhwhmcncibzdwxbmeubomelrotxjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014956.1880472-393-266289901890538/AnsiballZ_file.py
Dec 06 09:55:56 np0005548789.localdomain sudo[266146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:57 np0005548789.localdomain python3.9[266148]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:57 np0005548789.localdomain sudo[266146]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:57 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:57.479 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:57 np0005548789.localdomain sudo[266256]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyjntiomxyaodsrbxcouekpnuvwbzfcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014957.515632-432-78353664530583/AnsiballZ_lineinfile.py
Dec 06 09:55:57 np0005548789.localdomain sudo[266256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:57 np0005548789.localdomain python3.9[266258]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:57 np0005548789.localdomain sudo[266256]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:58.537 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:58 np0005548789.localdomain sudo[266366]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjksjxvxiwieyrqunsdbjoqereazisjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014958.2635312-459-82322429808777/AnsiballZ_file.py
Dec 06 09:55:58 np0005548789.localdomain sudo[266366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:58.635 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:55:58 np0005548789.localdomain python3.9[266368]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:55:58 np0005548789.localdomain sudo[266366]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:59 np0005548789.localdomain sudo[266476]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eeclfordesxvxdjcbxgvaeofrhvmatjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014959.0956688-486-206723369828535/AnsiballZ_stat.py
Dec 06 09:55:59 np0005548789.localdomain sudo[266476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:59.497 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:59.533 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:55:59 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:55:59.534 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:55:59 np0005548789.localdomain python3.9[266478]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:55:59 np0005548789.localdomain sudo[266476]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:00 np0005548789.localdomain sudo[266588]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brzgjkhqyjwjotndiwthopeegikkhdii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014960.0759296-513-142367318338163/AnsiballZ_stat.py
Dec 06 09:56:00 np0005548789.localdomain sudo[266588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:56:00 np0005548789.localdomain podman[266591]: 2025-12-06 09:56:00.456489236 +0000 UTC m=+0.077993612 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 09:56:00 np0005548789.localdomain podman[266591]: 2025-12-06 09:56:00.466235423 +0000 UTC m=+0.087739809 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 06 09:56:00 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:56:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:00.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:00.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:00.501 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:56:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:00.501 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:56:00 np0005548789.localdomain python3.9[266590]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:56:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:00.592 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:56:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:00.592 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:56:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:00.593 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 09:56:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:00.593 230888 DEBUG nova.objects.instance [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:56:00 np0005548789.localdomain sudo[266588]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.109 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.129 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.129 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.130 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.130 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.155 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.156 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.156 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.156 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.157 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:56:01 np0005548789.localdomain sudo[266718]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkrisrbvhfdmrtoisquuvlfbdhewqnyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014960.9050522-541-35526760185130/AnsiballZ_command.py
Dec 06 09:56:01 np0005548789.localdomain sudo[266718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:01 np0005548789.localdomain python3.9[266721]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:56:01 np0005548789.localdomain sudo[266718]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.598 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.697 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.698 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.891 230888 WARNING nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.893 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11866MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.893 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:56:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:01.894 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.194 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.194 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.194 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:56:02 np0005548789.localdomain sudo[266851]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovosjqsxfeqfztkhqjruaptmkxuqtqpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014961.790818-570-118887398265578/AnsiballZ_replace.py
Dec 06 09:56:02 np0005548789.localdomain sudo[266851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.268 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Refreshing inventories for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.333 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Updating ProviderTree inventory for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.333 230888 DEBUG nova.compute.provider_tree [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Updating inventory in ProviderTree for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.349 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Refreshing aggregate associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.385 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Refreshing trait associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,HW_CPU_X86_AVX2,HW_CPU_X86_SSE2,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AESNI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,HW_CPU_X86_BMI,HW_CPU_X86_FMA3,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NODE,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 09:56:02 np0005548789.localdomain python3.9[266853]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.421 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:56:02 np0005548789.localdomain systemd-journald[47810]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.7 (252 of 333 items), suggesting rotation.
Dec 06 09:56:02 np0005548789.localdomain systemd-journald[47810]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 09:56:02 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:56:02 np0005548789.localdomain sudo[266851]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:02 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.530 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.870 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.877 230888 DEBUG nova.compute.provider_tree [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.903 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.906 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.906 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.907 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:02.907 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 09:56:03 np0005548789.localdomain sudo[266984]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-auvbzrkkxqitzayukugnsqxtyxuojayr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014962.7167587-597-118134964144551/AnsiballZ_lineinfile.py
Dec 06 09:56:03 np0005548789.localdomain sudo[266984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:03 np0005548789.localdomain python3.9[266986]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:03 np0005548789.localdomain sudo[266984]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:03 np0005548789.localdomain sudo[267094]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orfhtkmsghitsyygcabypphazrizqjtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014963.33339-597-97605505989443/AnsiballZ_lineinfile.py
Dec 06 09:56:03 np0005548789.localdomain sudo[267094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:03.641 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:03 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32400 DF PROTO=TCP SPT=53894 DPT=9102 SEQ=3828970947 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E37FA80000000001030307) 
Dec 06 09:56:03 np0005548789.localdomain python3.9[267096]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:03 np0005548789.localdomain sudo[267094]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:04 np0005548789.localdomain sudo[267204]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntnyhgmfxkgjuxqmsirvnkytsoxyfxqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014964.0235271-597-165753910723914/AnsiballZ_lineinfile.py
Dec 06 09:56:04 np0005548789.localdomain sudo[267204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:56:04 np0005548789.localdomain systemd[1]: tmp-crun.z1xR0e.mount: Deactivated successfully.
Dec 06 09:56:04 np0005548789.localdomain podman[267207]: 2025-12-06 09:56:04.386993496 +0000 UTC m=+0.070925766 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:56:04 np0005548789.localdomain podman[267207]: 2025-12-06 09:56:04.392790433 +0000 UTC m=+0.076722723 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:56:04 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 09:56:04 np0005548789.localdomain python3.9[267206]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:04 np0005548789.localdomain sudo[267204]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32401 DF PROTO=TCP SPT=53894 DPT=9102 SEQ=3828970947 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E383AF0000000001030307) 
Dec 06 09:56:05 np0005548789.localdomain sudo[267337]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfwswchfkesufvbcaoxndlznprkmuvcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014964.6753783-597-237058346046532/AnsiballZ_lineinfile.py
Dec 06 09:56:05 np0005548789.localdomain sudo[267337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:05 np0005548789.localdomain python3.9[267339]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:05 np0005548789.localdomain sudo[267337]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:05 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52217 DF PROTO=TCP SPT=53674 DPT=9102 SEQ=1596883805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E385F00000000001030307) 
Dec 06 09:56:05 np0005548789.localdomain sudo[267447]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxdcrssryeqxtmwjdvyjxlxrpqtakhpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014965.4558957-684-51512300868159/AnsiballZ_stat.py
Dec 06 09:56:05 np0005548789.localdomain sudo[267447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:05 np0005548789.localdomain python3.9[267449]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:56:05 np0005548789.localdomain sudo[267447]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:06 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:06.294 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:06 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:06.294 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:06 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:06.295 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:06 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32402 DF PROTO=TCP SPT=53894 DPT=9102 SEQ=3828970947 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E38BAF0000000001030307) 
Dec 06 09:56:06 np0005548789.localdomain sudo[267559]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eybkwdfuaicirpzfyfeghhsfnrwctbkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014966.5338032-714-186857585638137/AnsiballZ_file.py
Dec 06 09:56:06 np0005548789.localdomain sudo[267559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:07 np0005548789.localdomain python3.9[267561]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:56:07 np0005548789.localdomain sudo[267559]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:07 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:07.574 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:07 np0005548789.localdomain sudo[267669]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sieubkuxidvpmflnlzmmqkpwwjpgmdui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014967.275939-738-64952499051230/AnsiballZ_stat.py
Dec 06 09:56:07 np0005548789.localdomain sudo[267669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:07 np0005548789.localdomain python3.9[267671]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:07 np0005548789.localdomain sudo[267669]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=439 DF PROTO=TCP SPT=32932 DPT=9102 SEQ=690632304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E38FF00000000001030307) 
Dec 06 09:56:08 np0005548789.localdomain sudo[267726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxyqrubntgkgeenwnarvcbkiuvhporvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014967.275939-738-64952499051230/AnsiballZ_file.py
Dec 06 09:56:08 np0005548789.localdomain sudo[267726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:08 np0005548789.localdomain python3.9[267728]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:56:08 np0005548789.localdomain sudo[267726]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:08 np0005548789.localdomain sudo[267836]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jitmheruqwpmwnfjewzholtkhchenvan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014968.415154-738-237145334351681/AnsiballZ_stat.py
Dec 06 09:56:08 np0005548789.localdomain sudo[267836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:08 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:08.697 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:08 np0005548789.localdomain python3.9[267838]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:08 np0005548789.localdomain sudo[267836]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:09 np0005548789.localdomain sudo[267893]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzipfwxbyumvwofrahyhmiwjnlaaccgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014968.415154-738-237145334351681/AnsiballZ_file.py
Dec 06 09:56:09 np0005548789.localdomain sudo[267893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:09 np0005548789.localdomain python3.9[267895]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:56:09 np0005548789.localdomain sudo[267893]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:09 np0005548789.localdomain sudo[268003]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmpidavxgwzleojazvznauvdtyhkhghh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014969.7135615-807-264713008539350/AnsiballZ_file.py
Dec 06 09:56:10 np0005548789.localdomain sudo[268003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:56:10 np0005548789.localdomain systemd[1]: tmp-crun.BsYvoK.mount: Deactivated successfully.
Dec 06 09:56:10 np0005548789.localdomain podman[268006]: 2025-12-06 09:56:10.125074002 +0000 UTC m=+0.098727704 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 09:56:10 np0005548789.localdomain podman[268006]: 2025-12-06 09:56:10.136175601 +0000 UTC m=+0.109829313 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 09:56:10 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:56:10 np0005548789.localdomain python3.9[268005]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:10 np0005548789.localdomain sudo[268003]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:10 np0005548789.localdomain sudo[268130]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujvsdhvruobmwnpzmkmkxcoplzpixced ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014970.4345515-831-278011183154787/AnsiballZ_stat.py
Dec 06 09:56:10 np0005548789.localdomain sudo[268130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32403 DF PROTO=TCP SPT=53894 DPT=9102 SEQ=3828970947 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E39B6F0000000001030307) 
Dec 06 09:56:10 np0005548789.localdomain python3.9[268132]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:10 np0005548789.localdomain sudo[268130]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:11 np0005548789.localdomain sudo[268187]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcgnlmzgvipdzulkuhrpoyplxfxadxqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014970.4345515-831-278011183154787/AnsiballZ_file.py
Dec 06 09:56:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:56:11 np0005548789.localdomain sudo[268187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:11 np0005548789.localdomain podman[268189]: 2025-12-06 09:56:11.26993942 +0000 UTC m=+0.099941492 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, config_id=edpm, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41)
Dec 06 09:56:11 np0005548789.localdomain podman[268189]: 2025-12-06 09:56:11.285110683 +0000 UTC m=+0.115112695 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Dec 06 09:56:11 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:56:11 np0005548789.localdomain python3.9[268190]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:11 np0005548789.localdomain sudo[268187]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:11 np0005548789.localdomain sudo[268315]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptoxyqdyqqnoieubswnikfobyestteev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014971.663904-867-280217684247926/AnsiballZ_stat.py
Dec 06 09:56:11 np0005548789.localdomain sudo[268315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:12 np0005548789.localdomain python3.9[268317]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:12 np0005548789.localdomain sudo[268315]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:12 np0005548789.localdomain sudo[268372]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eaadnfeqdoakweqvzcxtkzadvggezbzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014971.663904-867-280217684247926/AnsiballZ_file.py
Dec 06 09:56:12 np0005548789.localdomain sudo[268372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:12 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:12.619 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:12 np0005548789.localdomain python3.9[268374]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:12 np0005548789.localdomain sudo[268372]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:13 np0005548789.localdomain sudo[268482]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iihbdxmswzvkialexazdevdfkejgrmkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014972.9895983-903-222693542850971/AnsiballZ_systemd.py
Dec 06 09:56:13 np0005548789.localdomain sudo[268482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:56:13 np0005548789.localdomain systemd[1]: tmp-crun.yTrVZh.mount: Deactivated successfully.
Dec 06 09:56:13 np0005548789.localdomain podman[268485]: 2025-12-06 09:56:13.680117312 +0000 UTC m=+0.086218353 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 09:56:13 np0005548789.localdomain podman[268485]: 2025-12-06 09:56:13.691149648 +0000 UTC m=+0.097250689 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:56:13 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:56:13 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:13.736 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:13 np0005548789.localdomain python3.9[268484]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:56:13 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:56:14 np0005548789.localdomain systemd-sysv-generator[268533]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:56:14 np0005548789.localdomain systemd-rc-local-generator[268528]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:56:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:56:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:14 np0005548789.localdomain sudo[268482]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:15 np0005548789.localdomain sudo[268650]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkyrunxcsqcvyddwnqdoghuacbnfyimd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014975.6088881-927-91293577025038/AnsiballZ_stat.py
Dec 06 09:56:15 np0005548789.localdomain sudo[268650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:16 np0005548789.localdomain python3.9[268652]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:16 np0005548789.localdomain sudo[268650]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:16 np0005548789.localdomain sudo[268707]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ugcznitawuehhuypxdheknjberdsxwsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014975.6088881-927-91293577025038/AnsiballZ_file.py
Dec 06 09:56:16 np0005548789.localdomain sudo[268707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:16 np0005548789.localdomain python3.9[268709]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:16 np0005548789.localdomain sudo[268707]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:56:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:56:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:56:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:56:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:56:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:56:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:56:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:56:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:56:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:56:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:56:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:56:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:56:16 np0005548789.localdomain systemd[1]: tmp-crun.zdmXB7.mount: Deactivated successfully.
Dec 06 09:56:16 np0005548789.localdomain podman[268751]: 2025-12-06 09:56:16.925415905 +0000 UTC m=+0.085380498 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:56:16 np0005548789.localdomain podman[268751]: 2025-12-06 09:56:16.965068406 +0000 UTC m=+0.125033009 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:56:16 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:56:17 np0005548789.localdomain sudo[268840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmxihwjqufkbsjerhzpykqbvqppsgqpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014976.7992554-963-234266280572930/AnsiballZ_stat.py
Dec 06 09:56:17 np0005548789.localdomain sudo[268840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:17 np0005548789.localdomain python3.9[268842]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:17 np0005548789.localdomain sudo[268840]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:17 np0005548789.localdomain sudo[268897]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwoyhtzpapubvktiysjwszvfwpyvjewq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014976.7992554-963-234266280572930/AnsiballZ_file.py
Dec 06 09:56:17 np0005548789.localdomain sudo[268897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:17 np0005548789.localdomain sshd[268900]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:56:17 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:17.671 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:17 np0005548789.localdomain python3.9[268899]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:17 np0005548789.localdomain sudo[268897]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:18 np0005548789.localdomain sudo[269009]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrwzassfpoikmhieqmkxkoypdtuywhjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014978.0972528-999-61378142046320/AnsiballZ_systemd.py
Dec 06 09:56:18 np0005548789.localdomain sudo[269009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:18 np0005548789.localdomain python3.9[269011]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:56:18 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:56:18 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:18.781 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:18 np0005548789.localdomain systemd-rc-local-generator[269039]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:56:18 np0005548789.localdomain systemd-sysv-generator[269043]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:56:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:18 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:56:19 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:19 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:19 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:19 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:19 np0005548789.localdomain sshd[268900]: Received disconnect from 14.194.101.210 port 56198:11: Bye Bye [preauth]
Dec 06 09:56:19 np0005548789.localdomain sshd[268900]: Disconnected from authenticating user root 14.194.101.210 port 56198 [preauth]
Dec 06 09:56:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32404 DF PROTO=TCP SPT=53894 DPT=9102 SEQ=3828970947 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E3BBEF0000000001030307) 
Dec 06 09:56:19 np0005548789.localdomain systemd[1]: Starting Create netns directory...
Dec 06 09:56:19 np0005548789.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:56:19 np0005548789.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:56:19 np0005548789.localdomain systemd[1]: Finished Create netns directory.
Dec 06 09:56:19 np0005548789.localdomain sudo[269009]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:19 np0005548789.localdomain sudo[269162]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eifqguinzdilfjafcxhslcqwtrbpchgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014979.69771-1029-112695400613648/AnsiballZ_file.py
Dec 06 09:56:19 np0005548789.localdomain sudo[269162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:20 np0005548789.localdomain python3.9[269164]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:56:20 np0005548789.localdomain sudo[269162]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:20 np0005548789.localdomain sudo[269272]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-spuzwbrafgxgkslpdjlghiozbclarxho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014980.4850776-1053-68269899742070/AnsiballZ_stat.py
Dec 06 09:56:20 np0005548789.localdomain sudo[269272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:20 np0005548789.localdomain python3.9[269274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:21 np0005548789.localdomain sudo[269272]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:21 np0005548789.localdomain sudo[269329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rajefrwdsatehdmvoissfqwnbzlxibna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014980.4850776-1053-68269899742070/AnsiballZ_file.py
Dec 06 09:56:21 np0005548789.localdomain sudo[269329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:21 np0005548789.localdomain python3.9[269331]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/multipathd/ _original_basename=healthcheck recurse=False state=file path=/var/lib/openstack/healthchecks/multipathd/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:56:21 np0005548789.localdomain sudo[269329]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:22 np0005548789.localdomain sudo[269439]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slxbhbdsjtcziazsoprywvdzkgqapofp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014981.9413486-1095-99913072475964/AnsiballZ_file.py
Dec 06 09:56:22 np0005548789.localdomain sudo[269439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:22 np0005548789.localdomain python3.9[269441]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:56:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:22.379 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:56:22 np0005548789.localdomain sudo[269439]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:22.411 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Triggering sync for uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 06 09:56:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:22.412 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:56:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:22.412 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:56:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:22.471 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:56:22 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:22.720 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:23 np0005548789.localdomain sudo[269549]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frzmvuwojpilvvmfoilixttudcklxrxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014982.7628143-1119-207108348819894/AnsiballZ_stat.py
Dec 06 09:56:23 np0005548789.localdomain sudo[269549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:23 np0005548789.localdomain python3.9[269551]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:23 np0005548789.localdomain sudo[269549]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:23 np0005548789.localdomain sudo[269606]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwuuajcvqcioouexusogpnxsgaqixjse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014982.7628143-1119-207108348819894/AnsiballZ_file.py
Dec 06 09:56:23 np0005548789.localdomain sudo[269606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:23 np0005548789.localdomain python3.9[269608]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/multipathd.json _original_basename=.kh1soc9z recurse=False state=file path=/var/lib/kolla/config_files/multipathd.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:23 np0005548789.localdomain sudo[269606]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:23 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:23.814 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:23 np0005548789.localdomain podman[241090]: time="2025-12-06T09:56:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:56:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:56:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150643 "" "Go-http-client/1.1"
Dec 06 09:56:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:56:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17722 "" "Go-http-client/1.1"
Dec 06 09:56:24 np0005548789.localdomain sudo[269716]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roagfnxkphidqsylckavlqwkscojwcag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014984.1627133-1155-160813162053448/AnsiballZ_file.py
Dec 06 09:56:24 np0005548789.localdomain sudo[269716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:56:24 np0005548789.localdomain podman[269718]: 2025-12-06 09:56:24.547360838 +0000 UTC m=+0.088308657 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 09:56:24 np0005548789.localdomain podman[269718]: 2025-12-06 09:56:24.589089731 +0000 UTC m=+0.130037500 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 09:56:24 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:56:24 np0005548789.localdomain python3.9[269719]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:24 np0005548789.localdomain sudo[269716]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:25 np0005548789.localdomain sudo[269851]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbbmpzxaghsqwokkuudygcyidxdeuatl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014984.880667-1179-73136901254017/AnsiballZ_stat.py
Dec 06 09:56:25 np0005548789.localdomain sudo[269851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:25 np0005548789.localdomain sudo[269851]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:25 np0005548789.localdomain sudo[269908]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehzmuxynkdojqapvjnkgdslskhkrwyrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014984.880667-1179-73136901254017/AnsiballZ_file.py
Dec 06 09:56:25 np0005548789.localdomain sudo[269908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:25 np0005548789.localdomain sudo[269908]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:26 np0005548789.localdomain sudo[270018]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uorcbkmseibhrluelcdzetwhqpiywnjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014986.4235911-1221-194883930944964/AnsiballZ_container_config_data.py
Dec 06 09:56:26 np0005548789.localdomain sudo[270018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:27 np0005548789.localdomain python3.9[270020]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 06 09:56:27 np0005548789.localdomain sudo[270018]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:27 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:27.780 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:27 np0005548789.localdomain sudo[270128]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qquzkqsqenyosdluahsbgvnpfjgmarne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014987.4910276-1248-231769577061542/AnsiballZ_container_config_hash.py
Dec 06 09:56:27 np0005548789.localdomain sudo[270128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:28 np0005548789.localdomain python3.9[270130]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:56:28 np0005548789.localdomain sudo[270128]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:28 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:28.816 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:28 np0005548789.localdomain sudo[270238]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkntgpzyjsvdrfezmllnxkevwfipaahi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014988.5395787-1276-3188204547085/AnsiballZ_podman_container_info.py
Dec 06 09:56:28 np0005548789.localdomain sudo[270238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:29 np0005548789.localdomain python3.9[270240]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:56:29 np0005548789.localdomain sudo[270238]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:56:30 np0005548789.localdomain podman[270285]: 2025-12-06 09:56:30.916541719 +0000 UTC m=+0.080474928 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:56:30 np0005548789.localdomain podman[270285]: 2025-12-06 09:56:30.950190266 +0000 UTC m=+0.114123455 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 09:56:30 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:56:32 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:32.823 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:33 np0005548789.localdomain sudo[270393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdnctnkmolkkjbuaodynkitlwixpkqix ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014993.0687265-1314-63011956670651/AnsiballZ_edpm_container_manage.py
Dec 06 09:56:33 np0005548789.localdomain sudo[270393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:33 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11733 DF PROTO=TCP SPT=45218 DPT=9102 SEQ=596911704 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E3F4D70000000001030307) 
Dec 06 09:56:33 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:33.818 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:33 np0005548789.localdomain python3[270395]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:56:34 np0005548789.localdomain python3[270395]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7",
                                                                    "Digest": "sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:11:02.031267563Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 249482216,
                                                                    "VirtualSize": 249482216,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/a6426b16bb5884060eaf559f46c5a81bf85811eff8d5d75aaee95a48f0b492cc/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:8c448567789503f6c5be645a12473dfc27734872532d528b6ee764c214f9f2f3"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:24.212273596Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:11:01.523582443Z",
                                                                              "created_by": "/bin/sh -c dnf -y install device-mapper-multipath iscsi-initiator-utils && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:11:03.162365736Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 06 09:56:34 np0005548789.localdomain sudo[270393]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:34 np0005548789.localdomain sshd[270540]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:56:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11734 DF PROTO=TCP SPT=45218 DPT=9102 SEQ=596911704 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E3F8EF0000000001030307) 
Dec 06 09:56:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:56:34 np0005548789.localdomain systemd[1]: tmp-crun.6ocq2N.mount: Deactivated successfully.
Dec 06 09:56:34 np0005548789.localdomain podman[270548]: 2025-12-06 09:56:34.915648983 +0000 UTC m=+0.074110314 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:56:34 np0005548789.localdomain sudo[270583]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-udsjqaibabwbfssfskzobeewcsedxlky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014994.46459-1338-272620767615632/AnsiballZ_stat.py
Dec 06 09:56:34 np0005548789.localdomain sudo[270583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:34 np0005548789.localdomain podman[270548]: 2025-12-06 09:56:34.950671812 +0000 UTC m=+0.109133093 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:56:34 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 09:56:35 np0005548789.localdomain python3.9[270592]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:56:35 np0005548789.localdomain sudo[270583]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:35 np0005548789.localdomain sshd[270540]: Received disconnect from 64.227.102.57 port 56692:11: Bye Bye [preauth]
Dec 06 09:56:35 np0005548789.localdomain sshd[270540]: Disconnected from authenticating user root 64.227.102.57 port 56692 [preauth]
Dec 06 09:56:35 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32405 DF PROTO=TCP SPT=53894 DPT=9102 SEQ=3828970947 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E3FBEF0000000001030307) 
Dec 06 09:56:35 np0005548789.localdomain sudo[270702]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcdvfxztkohqfofkgwhqgyuyggbvkvcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014995.5492303-1365-192550627560994/AnsiballZ_file.py
Dec 06 09:56:35 np0005548789.localdomain sudo[270702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:36 np0005548789.localdomain python3.9[270704]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:36 np0005548789.localdomain sudo[270702]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:36 np0005548789.localdomain sudo[270757]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klpxcrfopvmcdtmnlsxyzdnhtggtapce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014995.5492303-1365-192550627560994/AnsiballZ_stat.py
Dec 06 09:56:36 np0005548789.localdomain sudo[270757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:36 np0005548789.localdomain python3.9[270759]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:56:36 np0005548789.localdomain sudo[270757]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:36 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11735 DF PROTO=TCP SPT=45218 DPT=9102 SEQ=596911704 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E400EF0000000001030307) 
Dec 06 09:56:37 np0005548789.localdomain sudo[270866]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwymxxzifnoqlsbkegfovywbtuvogkqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014996.5451448-1365-203454329109261/AnsiballZ_copy.py
Dec 06 09:56:37 np0005548789.localdomain sudo[270866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:37 np0005548789.localdomain python3.9[270868]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014996.5451448-1365-203454329109261/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:37 np0005548789.localdomain sudo[270866]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52218 DF PROTO=TCP SPT=53674 DPT=9102 SEQ=1596883805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E403EF0000000001030307) 
Dec 06 09:56:37 np0005548789.localdomain sudo[270921]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzaptiafnikpuomzraoqobtvapxypgcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014996.5451448-1365-203454329109261/AnsiballZ_systemd.py
Dec 06 09:56:37 np0005548789.localdomain sudo[270921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:37 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:37.869 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:38 np0005548789.localdomain sudo[270924]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:56:38 np0005548789.localdomain sudo[270924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:56:38 np0005548789.localdomain sudo[270924]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:38 np0005548789.localdomain python3.9[270923]: ansible-systemd Invoked with state=started name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:56:38 np0005548789.localdomain sudo[270942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:56:38 np0005548789.localdomain sudo[270942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:56:38 np0005548789.localdomain sudo[270921]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:38 np0005548789.localdomain sudo[270942]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:38 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:38.821 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:39 np0005548789.localdomain sudo[271027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:56:39 np0005548789.localdomain sudo[271027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:56:39 np0005548789.localdomain sudo[271027]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:39 np0005548789.localdomain python3.9[271118]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:56:40 np0005548789.localdomain sudo[271226]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-matgxsfflzmyajyagjetcpqsagqpugpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015000.2504182-1467-60646400112293/AnsiballZ_file.py
Dec 06 09:56:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:56:40 np0005548789.localdomain sudo[271226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:40 np0005548789.localdomain systemd[1]: tmp-crun.lyPeF1.mount: Deactivated successfully.
Dec 06 09:56:40 np0005548789.localdomain podman[271228]: 2025-12-06 09:56:40.681456737 +0000 UTC m=+0.103536092 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm)
Dec 06 09:56:40 np0005548789.localdomain podman[271228]: 2025-12-06 09:56:40.696135065 +0000 UTC m=+0.118214470 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 09:56:40 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:56:40 np0005548789.localdomain python3.9[271229]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:40 np0005548789.localdomain sudo[271226]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11736 DF PROTO=TCP SPT=45218 DPT=9102 SEQ=596911704 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E410AF0000000001030307) 
Dec 06 09:56:41 np0005548789.localdomain sudo[271354]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-quljvvyfnfknbqiifexpfhjdgvxhpzmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015001.482225-1504-186754165283424/AnsiballZ_file.py
Dec 06 09:56:41 np0005548789.localdomain sudo[271354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:56:41 np0005548789.localdomain systemd[1]: tmp-crun.mf2P2V.mount: Deactivated successfully.
Dec 06 09:56:41 np0005548789.localdomain podman[271357]: 2025-12-06 09:56:41.882672605 +0000 UTC m=+0.104601555 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, architecture=x86_64)
Dec 06 09:56:41 np0005548789.localdomain podman[271357]: 2025-12-06 09:56:41.903194871 +0000 UTC m=+0.125123781 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41)
Dec 06 09:56:41 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:56:41 np0005548789.localdomain python3.9[271356]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 09:56:41 np0005548789.localdomain sudo[271354]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:42 np0005548789.localdomain sudo[271484]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnqgzjtqqjyqqthygzqwawxrkzflzdvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015002.2098002-1527-59767850653129/AnsiballZ_modprobe.py
Dec 06 09:56:42 np0005548789.localdomain sudo[271484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:42 np0005548789.localdomain python3.9[271486]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 06 09:56:42 np0005548789.localdomain sudo[271484]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:42 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:42.924 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:43 np0005548789.localdomain sudo[271594]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alcczdmcnlylnwtjrnwzpniytzcjxfyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015003.0506806-1551-184429263286119/AnsiballZ_stat.py
Dec 06 09:56:43 np0005548789.localdomain sudo[271594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:43 np0005548789.localdomain python3.9[271596]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:43 np0005548789.localdomain sudo[271594]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:43 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:43.825 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:56:43 np0005548789.localdomain sudo[271661]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddqyeujsrxlgcpavdzhfacizsohruchu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015003.0506806-1551-184429263286119/AnsiballZ_file.py
Dec 06 09:56:43 np0005548789.localdomain sudo[271661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:43 np0005548789.localdomain podman[271634]: 2025-12-06 09:56:43.932844936 +0000 UTC m=+0.088430750 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 09:56:44 np0005548789.localdomain podman[271634]: 2025-12-06 09:56:44.004660188 +0000 UTC m=+0.160246182 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 06 09:56:44 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:56:44 np0005548789.localdomain python3.9[271664]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:44 np0005548789.localdomain sudo[271661]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:44 np0005548789.localdomain sudo[271778]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrjydhefgrxluhzjppnswykynfzmbmhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015004.364538-1590-205961366530803/AnsiballZ_lineinfile.py
Dec 06 09:56:44 np0005548789.localdomain sudo[271778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:44 np0005548789.localdomain python3.9[271780]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:44 np0005548789.localdomain sudo[271778]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:45 np0005548789.localdomain sudo[271888]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtmwljmsrnxamijtrmlwkyzpqvtlluiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015005.2619116-1617-34223191593440/AnsiballZ_dnf.py
Dec 06 09:56:45 np0005548789.localdomain sudo[271888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:45 np0005548789.localdomain python3.9[271890]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:56:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:56:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:56:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:56:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:56:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:56:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:56:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:56:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:56:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:56:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:56:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:56:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:56:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:56:47.287 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:56:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:56:47.287 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:56:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:56:47.289 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:56:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:56:47 np0005548789.localdomain podman[271893]: 2025-12-06 09:56:47.759504636 +0000 UTC m=+0.081136348 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:56:47 np0005548789.localdomain podman[271893]: 2025-12-06 09:56:47.772210534 +0000 UTC m=+0.093842216 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:56:47 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:56:47 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:47.927 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:48 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:48.868 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:49 np0005548789.localdomain sudo[271888]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11737 DF PROTO=TCP SPT=45218 DPT=9102 SEQ=596911704 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E431EF0000000001030307) 
Dec 06 09:56:50 np0005548789.localdomain python3.9[272021]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:56:51 np0005548789.localdomain sudo[272133]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kijlwioldmezgkjtmqpyjdrxadbrqtkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015010.774125-1669-140770578167569/AnsiballZ_file.py
Dec 06 09:56:51 np0005548789.localdomain sudo[272133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:51 np0005548789.localdomain python3.9[272135]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:51 np0005548789.localdomain sudo[272133]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:52 np0005548789.localdomain sudo[272243]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yicadnclskkalbgcpgsyerahejiatcdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015011.8310502-1702-245255927342006/AnsiballZ_systemd_service.py
Dec 06 09:56:52 np0005548789.localdomain sudo[272243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:52 np0005548789.localdomain python3.9[272245]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:56:52 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:56:52 np0005548789.localdomain systemd-rc-local-generator[272272]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:56:52 np0005548789.localdomain systemd-sysv-generator[272277]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:56:52 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:56:52 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:56:52 np0005548789.localdomain sudo[272243]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:52 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:52.964 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:53 np0005548789.localdomain python3.9[272389]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:56:53 np0005548789.localdomain sshd[272390]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:56:53 np0005548789.localdomain network[272408]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:56:53 np0005548789.localdomain network[272409]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:56:53 np0005548789.localdomain network[272410]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:56:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:53.871 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:53 np0005548789.localdomain podman[241090]: time="2025-12-06T09:56:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:56:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:56:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150643 "" "Go-http-client/1.1"
Dec 06 09:56:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:56:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17723 "" "Go-http-client/1.1"
Dec 06 09:56:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:56:54 np0005548789.localdomain podman[272433]: 2025-12-06 09:56:54.727477268 +0000 UTC m=+0.081029025 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:56:54 np0005548789.localdomain podman[272433]: 2025-12-06 09:56:54.766091626 +0000 UTC m=+0.119643353 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 09:56:54 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:56:54 np0005548789.localdomain sshd[272390]: Received disconnect from 118.219.234.233 port 37764:11: Bye Bye [preauth]
Dec 06 09:56:54 np0005548789.localdomain sshd[272390]: Disconnected from authenticating user root 118.219.234.233 port 37764 [preauth]
Dec 06 09:56:55 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:56:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:58.004 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:56:58.874 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:56:59 np0005548789.localdomain sudo[272666]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfjmmxvmkenpryaqyplhvlavuiwtxhuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015019.209068-1759-23864607315640/AnsiballZ_systemd_service.py
Dec 06 09:56:59 np0005548789.localdomain sudo[272666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:59 np0005548789.localdomain python3.9[272668]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:56:59 np0005548789.localdomain sudo[272666]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:00 np0005548789.localdomain sudo[272777]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bapykagisspdxueedruciyscwvjffjep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015019.9950202-1759-19209792268171/AnsiballZ_systemd_service.py
Dec 06 09:57:00 np0005548789.localdomain sudo[272777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:00.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:00.501 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:57:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:00.501 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:57:00 np0005548789.localdomain python3.9[272779]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:00 np0005548789.localdomain sudo[272777]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:00.940 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:57:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:00.940 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:57:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:00.941 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 09:57:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:00.941 230888 DEBUG nova.objects.instance [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:57:01 np0005548789.localdomain sudo[272888]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwttghbginhgkxoedlvvvgxhmpttpzjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015020.8542392-1759-195896976207510/AnsiballZ_systemd_service.py
Dec 06 09:57:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:57:01 np0005548789.localdomain sudo[272888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:01 np0005548789.localdomain systemd[1]: tmp-crun.rpIhmt.mount: Deactivated successfully.
Dec 06 09:57:01 np0005548789.localdomain podman[272890]: 2025-12-06 09:57:01.303136142 +0000 UTC m=+0.109636648 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:57:01 np0005548789.localdomain podman[272890]: 2025-12-06 09:57:01.332284691 +0000 UTC m=+0.138785217 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 09:57:01 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:57:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:01.399 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 09:57:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:01.421 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:57:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:01.422 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 09:57:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:01.423 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:01.423 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:01.423 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:57:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:01.424 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:01.453 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:57:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:01.453 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:57:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:01.454 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:57:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:01.454 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:57:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:01.455 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:57:01 np0005548789.localdomain python3.9[272891]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:01 np0005548789.localdomain sudo[272888]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:01 np0005548789.localdomain sudo[273035]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyflzjwszfgqaifchwflilxiheaekgdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015021.6709259-1759-257361184140953/AnsiballZ_systemd_service.py
Dec 06 09:57:01 np0005548789.localdomain sudo[273035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:01.917 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.119 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.120 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.281 230888 WARNING nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.282 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11838MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.283 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.283 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:57:02 np0005548789.localdomain python3.9[273039]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:02 np0005548789.localdomain sudo[273035]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.388 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.389 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.389 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.452 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:57:02 np0005548789.localdomain sudo[273168]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmgkdtodytivirypwnqlekuggfrzsmzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015022.4480631-1759-222599027615696/AnsiballZ_systemd_service.py
Dec 06 09:57:02 np0005548789.localdomain sudo[273168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.905 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.912 230888 DEBUG nova.compute.provider_tree [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.934 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.937 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:57:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:02.937 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.654s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:57:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:03.007 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:03.016 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:03.016 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:03 np0005548789.localdomain python3.9[273170]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:03 np0005548789.localdomain sudo[273168]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:03 np0005548789.localdomain sudo[273281]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mubppfirommtlzqsxvjtcxmjgbcdxpqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015023.267106-1759-245968888178277/AnsiballZ_systemd_service.py
Dec 06 09:57:03 np0005548789.localdomain sudo[273281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:03 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48528 DF PROTO=TCP SPT=37038 DPT=9102 SEQ=3499426967 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E46A070000000001030307) 
Dec 06 09:57:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:03.876 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:03 np0005548789.localdomain python3.9[273283]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:03 np0005548789.localdomain sudo[273281]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:04 np0005548789.localdomain sudo[273392]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgeajkrpflusykpoanxkqfrnryrxkptt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015024.04562-1759-230667639345363/AnsiballZ_systemd_service.py
Dec 06 09:57:04 np0005548789.localdomain sudo[273392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:04 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:04.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:04 np0005548789.localdomain python3.9[273394]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:04 np0005548789.localdomain sudo[273392]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48529 DF PROTO=TCP SPT=37038 DPT=9102 SEQ=3499426967 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E46E2F0000000001030307) 
Dec 06 09:57:05 np0005548789.localdomain sudo[273503]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbnvktheqeqvrydrwbwlkticyaampvwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015024.7939713-1759-202438072722253/AnsiballZ_systemd_service.py
Dec 06 09:57:05 np0005548789.localdomain sudo[273503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:57:05 np0005548789.localdomain systemd[1]: tmp-crun.fJQYTY.mount: Deactivated successfully.
Dec 06 09:57:05 np0005548789.localdomain podman[273505]: 2025-12-06 09:57:05.237074857 +0000 UTC m=+0.090233275 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:57:05 np0005548789.localdomain podman[273505]: 2025-12-06 09:57:05.24631826 +0000 UTC m=+0.099476688 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:57:05 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 09:57:05 np0005548789.localdomain python3.9[273506]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:05 np0005548789.localdomain sudo[273503]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:05 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11738 DF PROTO=TCP SPT=45218 DPT=9102 SEQ=596911704 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E471EF0000000001030307) 
Dec 06 09:57:06 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:06.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:06 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:06.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:57:06 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48530 DF PROTO=TCP SPT=37038 DPT=9102 SEQ=3499426967 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E4762F0000000001030307) 
Dec 06 09:57:07 np0005548789.localdomain sudo[273636]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsocrzrhoaldisyvkqyhfoykwhhkusdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015026.9095998-1936-254022229685943/AnsiballZ_file.py
Dec 06 09:57:07 np0005548789.localdomain sudo[273636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:07 np0005548789.localdomain python3.9[273638]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:07 np0005548789.localdomain sudo[273636]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:07 np0005548789.localdomain sudo[273746]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayzjcxfescraazqcyfbqjkumymoqxkxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015027.4766827-1936-244846887498105/AnsiballZ_file.py
Dec 06 09:57:07 np0005548789.localdomain sudo[273746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32406 DF PROTO=TCP SPT=53894 DPT=9102 SEQ=3828970947 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E479EF0000000001030307) 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.911 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.915 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 9461 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62dfa7d2-f5e9-4bd7-b4b3-0ff7bbda66a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9461, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:57:07.912306', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ed207af6-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.161664054, 'message_signature': '3966393b3e25aac4ca3ed7b0834410c6fbdac21627e3958b0df6d6fe933a1a34'}]}, 'timestamp': '2025-12-06 09:57:07.917141', '_unique_id': '534bc53231c84fbb9013b6d570a86167'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.918 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.933 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.934 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ee6098c-9ebb-489e-863e-7bd23d1d6d87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:57:07.920453', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed231e28-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.169846523, 'message_signature': '22b905e7f1223ecd88b846e3e6617db996c512b3babd052d9f8855f0649f72a6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:57:07.920453', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed233304-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.169846523, 'message_signature': 'b8308019fd51a3975d5701c03eb50c1125db62be760fd53770e40cb0729eeac2'}]}, 'timestamp': '2025-12-06 09:57:07.934800', '_unique_id': '0ddec6eebfa44eed980a5f74e51d90e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.936 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.937 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 09:57:07 np0005548789.localdomain python3.9[273748]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.970 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.970 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50586b76-fd9b-4f87-9d32-723ddcd924dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:57:07.937539', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed28b11c-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.186909024, 'message_signature': 'ade3b3e8584038870ed9366e3528f1b372153be46135e12f5af981ec2e7f9089'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:57:07.937539', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed28c65c-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.186909024, 'message_signature': '838878e4faa21653b95a476d2214f81aa30d8e2e94a56af155ea179601da82b1'}]}, 'timestamp': '2025-12-06 09:57:07.971291', '_unique_id': '334c72e5bfe44343b40739a28088e5bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.972 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.973 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 09:57:07 np0005548789.localdomain sudo[273746]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.992 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 52.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fca32807-7ffb-48e5-819e-3cab18562a42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.37890625, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T09:57:07.974132', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'ed2c23c4-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.241890373, 'message_signature': '7ea4d356a2276580414d4ae3c52caad1ab69424a3ef4a0c0973a21eeea8d1e2f'}]}, 'timestamp': '2025-12-06 09:57:07.993353', '_unique_id': 'f8f5573a02cb4676bac67f7560ff2086'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.994 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.996 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.996 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.996 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7753410b-f092-42d9-8f32-f3309e3a3e9a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:57:07.996183', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed2ca650-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.169846523, 'message_signature': '9f3dc85f3a6ed3e9e29b54e07fb4fe67266996bc3712265124ef85fd1bfa45a8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:57:07.996183', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed2cb866-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.169846523, 'message_signature': '22ea942adca4ce7360245f968422372e55d34a5a899bdb7eb5dc1126661dcc56'}]}, 'timestamp': '2025-12-06 09:57:07.997099', '_unique_id': 'faad32db3b6e47c2a7e6ec163d25d421'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.998 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.999 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:07.999 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.000 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53da2765-f05c-4943-bed1-2bc6b8e286df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:57:07.999665', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed2d30b6-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.169846523, 'message_signature': 'fae15123509e8891b7884ab14e2aab43a69fc34ed02075b092d5718b191e8de3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:57:07.999665', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed2d434e-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.169846523, 'message_signature': '3a1447ae1541164cacd513252efd092271fbe93e8485ea763b08c3736c3c3e72'}]}, 'timestamp': '2025-12-06 09:57:08.000660', '_unique_id': 'cc7d07be4fa4443485a557e5bf845c80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.002 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.003 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 524 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.003 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4b9675a-ab2f-4be8-bfec-163236929546', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 524, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:57:08.003117', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed2db54a-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.186909024, 'message_signature': '30449e3ea13ebadacdf55be0157cb95ba812cb45d5e634cbe2bfca65ae69ab63'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:57:08.003117', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed2dc788-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.186909024, 'message_signature': 'bb35d5a5d0200399bdb58ce67d5a52b6e4ba8d29bdb7831575443105cd4a843b'}]}, 'timestamp': '2025-12-06 09:57:08.004074', '_unique_id': '550789307d9c42a0a8572f02502da1d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.005 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.006 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.006 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1043514478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.006 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 200503964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae02039d-eb4a-49b1-9a71-646c1bd97c05', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1043514478, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:57:08.006488', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed2e38e4-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.186909024, 'message_signature': '04287330ced4e155e8f57df4a3ff0c697c637e62bcf50b3e91257262c8f811a7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 200503964, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:57:08.006488', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed2e4b54-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.186909024, 'message_signature': '77a3d0d7ae2b7b88a96cda1728edf1bbed26bb1c4cc30b0b86e8225559a4869b'}]}, 'timestamp': '2025-12-06 09:57:08.007414', '_unique_id': '0a2308ccd3c94a23bf1772751fe77581'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.009 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.009 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.010 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7108e5b1-f667-4b01-a97b-9569f53c6c8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:57:08.009883', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed2ebdaa-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.186909024, 'message_signature': '3e6632db63175e1bbde779abf779fed9ca0c8104d6b2c07f060f417dc11ec07e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:57:08.009883', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed2ed09c-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.186909024, 'message_signature': '301c1e4d3aae45e003c16d0ccb27e6cfa2711842c5b1ee3846ed2557f92d063d'}]}, 'timestamp': '2025-12-06 09:57:08.010895', '_unique_id': '3fa75abda40244869d8e836f98cc393d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.013 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.013 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.013 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a224913a-83bd-458c-ae76-edc8605a26f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:57:08.013802', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ed2f5724-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.161664054, 'message_signature': '59817b026f94bb7e2428f849dc6b6546e00d5762cd85c343f9e0e1eb55591d9b'}]}, 'timestamp': '2025-12-06 09:57:08.014297', '_unique_id': '31f47b0abe134254b03de19e9d3a2e8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.016 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1fb34b5-1544-4b08-b6c1-dabf32483eaf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:57:08.016560', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ed2fc506-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.161664054, 'message_signature': '758f20508ecc9d001dd8003f780b9d611f42a1f40678ef3c282717ea122644ac'}]}, 'timestamp': '2025-12-06 09:57:08.017114', '_unique_id': 'e4be80e9a0bc4037946daf182129f7f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.018 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.019 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.019 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9696c544-ca6e-478c-bc7a-8f150fdb23e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:57:08.019922', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ed30463e-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.161664054, 'message_signature': 'a3d0dafa47f014c2e657f1a38e135e0171c30579ead55422a94a4e3713274d75'}]}, 'timestamp': '2025-12-06 09:57:08.020459', '_unique_id': '196390bf837c493eb40e5670fa022888'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.023 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.023 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a21aec9a-4818-4c1e-9b20-2ee31e077df8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:57:08.023733', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ed30e260-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.161664054, 'message_signature': '3939103945f8d4258791de5d6902e36b2c824d7a931c32853ea3cf529626bb7b'}]}, 'timestamp': '2025-12-06 09:57:08.024540', '_unique_id': '7aefa85d777247fe9ba183216cdb19a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.027 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.027 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba304367-e8c3-4708-89cc-9c8881696cee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:57:08.027744', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ed317a54-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.161664054, 'message_signature': 'd899aa9f18cf85f74cf80df8f10a0574fe36fcbcd45ff3a2a512985879b6f5d4'}]}, 'timestamp': '2025-12-06 09:57:08.028434', '_unique_id': 'b7ae27dd944246b7a791eb8f0f9e40bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.045 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.045 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:08.047 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9d1e5a0-380d-4111-afa2-e4d48cebf4bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:57:08.045517', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ed343000-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.161664054, 'message_signature': '2b381c4e940f98cda2b0b2d98e848ff8ecadae2f59dbc94a70aeed13f4102f6f'}]}, 'timestamp': '2025-12-06 09:57:08.046079', '_unique_id': '04bd5d079aef446e8602eebf53d5928f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.048 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.049 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.049 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3240774-79ca-4b99-87a0-89db06101102', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:57:08.049896', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ed34d866-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.161664054, 'message_signature': '3d6839866218a66352c1b51a1c03c9624e957b6ecfcf1bd8957ca77e9b820167'}]}, 'timestamp': '2025-12-06 09:57:08.050378', '_unique_id': '204825a98a3541b9ad79f8ac75240bf8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.051 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.052 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.052 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a2bcb7f-a193-46c7-9a08-a4df9f9a8dcf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 446, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:57:08.052631', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ed354418-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.161664054, 'message_signature': '1fa61a8fb3184da8d21c5c1563bcbbffcd719a07174b688118637fbd2aa58896'}]}, 'timestamp': '2025-12-06 09:57:08.053053', '_unique_id': '2cdd3a7e492646e39e2f13d7f87df5da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.053 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.054 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.054 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 223261606 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.054 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 30984668 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '282d62f4-6d84-4f88-a99e-aa7de173c1a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 223261606, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:57:08.054572', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed358c7a-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.186909024, 'message_signature': 'f7bb277b6c202677ccb767432f8c63626a8309da89905353665c6f911db63983'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30984668, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:57:08.054572', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed359774-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.186909024, 'message_signature': '7ac531f3dcfe11b06575d59f8a4957619f911ab9eb5adc509e93c6870caed47a'}]}, 'timestamp': '2025-12-06 09:57:08.055154', '_unique_id': '29630b38cfd045f5b8a11dffae61dd09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.055 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.056 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.056 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 73908224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a7f2147-6af7-407d-b550-87547d54fb4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73908224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T09:57:08.056704', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ed35e03a-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.186909024, 'message_signature': '46d0e153b97648f0abe486278f990dec6674c3b2145fecc8508284a16cba70cf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T09:57:08.056704', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ed35ea3a-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.186909024, 'message_signature': 'b132b75a7c77771c07fa77fdf85f0460e8b772fd39ffd39d33816348277b2782'}]}, 'timestamp': '2025-12-06 09:57:08.057272', '_unique_id': 'a605e28691324433a8b01a9e7953b056'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.057 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.058 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.058 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 54990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59fdac27-99bc-44a4-8d85-97afbfc570bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54990000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T09:57:08.058618', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'ed3629f0-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.241890373, 'message_signature': 'e82db03b163d325c68f3993e718eac53f0147c2ed5b3c2d117ecea4151ba293f'}]}, 'timestamp': '2025-12-06 09:57:08.058915', '_unique_id': 'c2166662e9214158851d58d8240590e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.059 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.060 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.060 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 93 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83ebaf13-9ac5-477c-96c8-f4c561834867', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 93, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T09:57:08.060284', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ed366a6e-d289-11f0-aaf2-fa163e118844', 'monotonic_time': 11446.161664054, 'message_signature': '66f205a0dae88bd8ac671c9831dbd2e3c61964b9c5fb6900433798cdf5c8ee30'}]}, 'timestamp': '2025-12-06 09:57:08.060609', '_unique_id': 'a976aca861ee47f0b5aff47b253f189c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 09:57:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:57:08.061 12 ERROR oslo_messaging.notify.messaging 
Dec 06 09:57:08 np0005548789.localdomain sudo[273856]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbkbmkdxfmtzgmbafcwidmoyxkhgtqck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015028.1366162-1936-250352519870532/AnsiballZ_file.py
Dec 06 09:57:08 np0005548789.localdomain sudo[273856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:08 np0005548789.localdomain python3.9[273858]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:08 np0005548789.localdomain sudo[273856]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:08 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:08.878 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:09 np0005548789.localdomain sudo[273966]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwfxxohikroefufbiuubbklzrgvfkexd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015028.7822068-1936-177493104388046/AnsiballZ_file.py
Dec 06 09:57:09 np0005548789.localdomain sudo[273966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:09 np0005548789.localdomain python3.9[273968]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:09 np0005548789.localdomain sudo[273966]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:09 np0005548789.localdomain sudo[274076]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxlfgvzyutkjvivghqfpmmbmdyibskmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015029.4615886-1936-8595201563478/AnsiballZ_file.py
Dec 06 09:57:09 np0005548789.localdomain sudo[274076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:09 np0005548789.localdomain python3.9[274078]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:09 np0005548789.localdomain sudo[274076]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:10 np0005548789.localdomain sudo[274186]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-valvkhpualsywgzbsvlbxohgwbdzbduw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015030.0894146-1936-211301662029569/AnsiballZ_file.py
Dec 06 09:57:10 np0005548789.localdomain sudo[274186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:10 np0005548789.localdomain python3.9[274188]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:10 np0005548789.localdomain sudo[274186]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:10 np0005548789.localdomain sshd[274245]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:57:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48531 DF PROTO=TCP SPT=37038 DPT=9102 SEQ=3499426967 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E485F00000000001030307) 
Dec 06 09:57:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:57:10 np0005548789.localdomain sudo[274310]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yikerebvcijjakezygddofnzeooasxek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015030.6627533-1936-14584641943431/AnsiballZ_file.py
Dec 06 09:57:10 np0005548789.localdomain podman[274276]: 2025-12-06 09:57:10.934024639 +0000 UTC m=+0.087722879 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:57:10 np0005548789.localdomain sudo[274310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:10 np0005548789.localdomain podman[274276]: 2025-12-06 09:57:10.968267864 +0000 UTC m=+0.121966084 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:57:10 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:57:11 np0005548789.localdomain python3.9[274319]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:11 np0005548789.localdomain sudo[274310]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:11 np0005548789.localdomain sudo[274427]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syzumxppadsgkermcnaxltcxbievlijr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015031.2811744-1936-105777249451387/AnsiballZ_file.py
Dec 06 09:57:11 np0005548789.localdomain sudo[274427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:11 np0005548789.localdomain python3.9[274429]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:11 np0005548789.localdomain sudo[274427]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:11 np0005548789.localdomain sshd[274245]: Received disconnect from 154.113.10.34 port 38726:11: Bye Bye [preauth]
Dec 06 09:57:11 np0005548789.localdomain sshd[274245]: Disconnected from authenticating user root 154.113.10.34 port 38726 [preauth]
Dec 06 09:57:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:57:12 np0005548789.localdomain podman[274447]: 2025-12-06 09:57:12.054643086 +0000 UTC m=+0.081459817 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public)
Dec 06 09:57:12 np0005548789.localdomain podman[274447]: 2025-12-06 09:57:12.068011454 +0000 UTC m=+0.094828215 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, release=1755695350, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6)
Dec 06 09:57:12 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:57:12 np0005548789.localdomain sudo[274557]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlsncgwfnndjzqqkvpzylgyrhbljezvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015032.4685867-2107-127461350110102/AnsiballZ_file.py
Dec 06 09:57:12 np0005548789.localdomain sudo[274557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:12 np0005548789.localdomain python3.9[274559]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:13 np0005548789.localdomain sudo[274557]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:13 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:13.091 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:13 np0005548789.localdomain sudo[274667]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpoijsiqiryqpftoizyzzzsospincgks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015033.1162632-2107-201420886057778/AnsiballZ_file.py
Dec 06 09:57:13 np0005548789.localdomain sudo[274667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:13 np0005548789.localdomain python3.9[274669]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:13 np0005548789.localdomain sudo[274667]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:13 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:13.880 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:14 np0005548789.localdomain sudo[274777]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkqgheykxhvfikkzikwgpdigiciyzefr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015033.724347-2107-72791942855057/AnsiballZ_file.py
Dec 06 09:57:14 np0005548789.localdomain sudo[274777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:14 np0005548789.localdomain python3.9[274779]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:14 np0005548789.localdomain sudo[274777]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:14 np0005548789.localdomain sudo[274887]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqledougezxevmskhmykizgsstucwgav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015034.4248528-2107-77925166195531/AnsiballZ_file.py
Dec 06 09:57:14 np0005548789.localdomain sudo[274887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:57:14 np0005548789.localdomain podman[274890]: 2025-12-06 09:57:14.943974904 +0000 UTC m=+0.096153846 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 09:57:14 np0005548789.localdomain podman[274890]: 2025-12-06 09:57:14.954807054 +0000 UTC m=+0.106985996 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
Dec 06 09:57:14 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:57:15 np0005548789.localdomain python3.9[274889]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:15 np0005548789.localdomain sudo[274887]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:15 np0005548789.localdomain sudo[275016]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmjdpmjuhxekxiuqidajabtpvqndpazb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015035.2106469-2107-139092669618281/AnsiballZ_file.py
Dec 06 09:57:15 np0005548789.localdomain sudo[275016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:15 np0005548789.localdomain python3.9[275018]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:15 np0005548789.localdomain sudo[275016]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:16 np0005548789.localdomain sudo[275126]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmcmwcthkfdnvmabvniwqebdhlmppcdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015035.8642194-2107-168063702739301/AnsiballZ_file.py
Dec 06 09:57:16 np0005548789.localdomain sudo[275126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:16 np0005548789.localdomain python3.9[275128]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:16 np0005548789.localdomain sudo[275126]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:57:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:57:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:57:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:57:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:57:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:57:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:57:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:57:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:57:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:57:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:57:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:57:16 np0005548789.localdomain sudo[275236]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snlgnnllwvqgbeumuanvagcnbcjprdam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015036.4828649-2107-117286501157962/AnsiballZ_file.py
Dec 06 09:57:16 np0005548789.localdomain sudo[275236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:17 np0005548789.localdomain python3.9[275238]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:17 np0005548789.localdomain sudo[275236]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:17 np0005548789.localdomain sudo[275346]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmrghhldfkjcmxuquyuiqvxttlkhvlaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015037.1673734-2107-29704767199606/AnsiballZ_file.py
Dec 06 09:57:17 np0005548789.localdomain sudo[275346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:17 np0005548789.localdomain python3.9[275348]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:17 np0005548789.localdomain sudo[275346]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:57:17 np0005548789.localdomain systemd[1]: tmp-crun.IcmrjZ.mount: Deactivated successfully.
Dec 06 09:57:17 np0005548789.localdomain podman[275366]: 2025-12-06 09:57:17.917923415 +0000 UTC m=+0.080622832 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:57:17 np0005548789.localdomain podman[275366]: 2025-12-06 09:57:17.926734634 +0000 UTC m=+0.089434061 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 09:57:17 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:57:18 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:18.139 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:18 np0005548789.localdomain sudo[275479]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfsmdbtlheverktgkhbruebwrgsolazm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015038.2900074-2281-23554386726593/AnsiballZ_command.py
Dec 06 09:57:18 np0005548789.localdomain sudo[275479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:18 np0005548789.localdomain python3.9[275481]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:18 np0005548789.localdomain sudo[275479]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:18 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:18.884 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48532 DF PROTO=TCP SPT=37038 DPT=9102 SEQ=3499426967 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E4A5EF0000000001030307) 
Dec 06 09:57:19 np0005548789.localdomain python3.9[275591]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:57:20 np0005548789.localdomain sudo[275699]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxbgysgzeqrbhnaoigohkjzdcbyjwzeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015039.9907928-2335-135494129003817/AnsiballZ_systemd_service.py
Dec 06 09:57:20 np0005548789.localdomain sudo[275699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:20 np0005548789.localdomain python3.9[275701]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:57:20 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 09:57:20 np0005548789.localdomain systemd-rc-local-generator[275728]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:57:20 np0005548789.localdomain systemd-sysv-generator[275732]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:57:20 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:20 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:20 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:20 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:20 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 09:57:20 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:20 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:20 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:20 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 09:57:21 np0005548789.localdomain sudo[275699]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:21 np0005548789.localdomain sudo[275845]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izbjprotvosjsyzknslkqwxzsgidfuaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015041.2930734-2359-276742532226085/AnsiballZ_command.py
Dec 06 09:57:21 np0005548789.localdomain sudo[275845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:21 np0005548789.localdomain python3.9[275847]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:21 np0005548789.localdomain sudo[275845]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:21 np0005548789.localdomain sshd[275849]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:57:22 np0005548789.localdomain sudo[275957]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsldosnlegnmvwkbegcleeugwojoodgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015041.9636865-2359-78833853078670/AnsiballZ_command.py
Dec 06 09:57:22 np0005548789.localdomain sudo[275957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:22 np0005548789.localdomain python3.9[275959]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:22 np0005548789.localdomain sudo[275957]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:22 np0005548789.localdomain sudo[276068]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-duvkcbqrshldzxjrhzujbbkvqvmhxvib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015042.5527587-2359-58815834590588/AnsiballZ_command.py
Dec 06 09:57:22 np0005548789.localdomain sudo[276068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:23 np0005548789.localdomain python3.9[276070]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:23 np0005548789.localdomain sudo[276068]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:23 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:23.142 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:23 np0005548789.localdomain sudo[276179]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwzuudjqzkpayoxfpcweujtauspopndm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015043.1761634-2359-149732002349052/AnsiballZ_command.py
Dec 06 09:57:23 np0005548789.localdomain sudo[276179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:23 np0005548789.localdomain python3.9[276181]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:23 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:23.887 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:23 np0005548789.localdomain podman[241090]: time="2025-12-06T09:57:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:57:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:57:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150643 "" "Go-http-client/1.1"
Dec 06 09:57:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:57:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17717 "" "Go-http-client/1.1"
Dec 06 09:57:24 np0005548789.localdomain sudo[276179]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:57:24 np0005548789.localdomain systemd[1]: tmp-crun.LeOjCl.mount: Deactivated successfully.
Dec 06 09:57:24 np0005548789.localdomain podman[276204]: 2025-12-06 09:57:24.933295642 +0000 UTC m=+0.094275058 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:57:24 np0005548789.localdomain podman[276204]: 2025-12-06 09:57:24.995204811 +0000 UTC m=+0.156184227 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:57:25 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:57:25 np0005548789.localdomain sudo[276315]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrnkwgbpnwlrxazzfkntqplmbuuachat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015044.8593662-2359-561156170260/AnsiballZ_command.py
Dec 06 09:57:25 np0005548789.localdomain sudo[276315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:25 np0005548789.localdomain python3.9[276317]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:25 np0005548789.localdomain sudo[276315]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:25 np0005548789.localdomain sudo[276426]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tagfuxkxzgsnvmcmxnqquxlzzoladdyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015045.5310707-2359-207106038639977/AnsiballZ_command.py
Dec 06 09:57:25 np0005548789.localdomain sudo[276426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:26 np0005548789.localdomain python3.9[276428]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:26 np0005548789.localdomain sudo[276426]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:26 np0005548789.localdomain sudo[276537]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yropkpiglepolelxahwyofxosmzeykhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015046.195115-2359-69373300427265/AnsiballZ_command.py
Dec 06 09:57:26 np0005548789.localdomain sudo[276537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:26 np0005548789.localdomain python3.9[276539]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:26 np0005548789.localdomain sudo[276537]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:27 np0005548789.localdomain sudo[276648]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gstxggqaxjjtcmeulpzhcqygzyfzfzcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015046.8355863-2359-33951507697632/AnsiballZ_command.py
Dec 06 09:57:27 np0005548789.localdomain sudo[276648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:27 np0005548789.localdomain python3.9[276650]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:27 np0005548789.localdomain sudo[276648]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:28 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:28.187 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:28 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:28.889 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:29 np0005548789.localdomain sshd[276669]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:57:30 np0005548789.localdomain sudo[276760]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmlfvmxwxspmwfsxqsvdeieiowytfubo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015049.9157925-2566-201949338074342/AnsiballZ_file.py
Dec 06 09:57:30 np0005548789.localdomain sudo[276760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:30 np0005548789.localdomain python3.9[276762]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:30 np0005548789.localdomain sudo[276760]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:30 np0005548789.localdomain sudo[276870]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sntljbgnfalgapwmgpuhzbdcuxggciui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015050.585407-2566-67827181672130/AnsiballZ_file.py
Dec 06 09:57:30 np0005548789.localdomain sudo[276870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:31 np0005548789.localdomain python3.9[276872]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:31 np0005548789.localdomain sudo[276870]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:31 np0005548789.localdomain sudo[276980]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuenjdmkdveqbibgbhjsyiazvxkenprw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015051.250069-2566-270205860459560/AnsiballZ_file.py
Dec 06 09:57:31 np0005548789.localdomain sudo[276980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:57:31 np0005548789.localdomain systemd[1]: tmp-crun.emWgcX.mount: Deactivated successfully.
Dec 06 09:57:31 np0005548789.localdomain podman[276982]: 2025-12-06 09:57:31.638487831 +0000 UTC m=+0.089811353 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 09:57:31 np0005548789.localdomain podman[276982]: 2025-12-06 09:57:31.646803835 +0000 UTC m=+0.098127367 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 09:57:31 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:57:31 np0005548789.localdomain python3.9[276983]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:31 np0005548789.localdomain sudo[276980]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:32 np0005548789.localdomain sudo[277108]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juadyucbsmpljzgdjibvhfqcdzmyjkzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015052.0447555-2632-21534152024001/AnsiballZ_file.py
Dec 06 09:57:32 np0005548789.localdomain sudo[277108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:32 np0005548789.localdomain python3.9[277110]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:32 np0005548789.localdomain sudo[277108]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:33 np0005548789.localdomain sudo[277218]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpgehefnaixriozjlvhhucsnqnrkkfrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015052.706644-2632-100855419352283/AnsiballZ_file.py
Dec 06 09:57:33 np0005548789.localdomain sudo[277218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:33 np0005548789.localdomain python3.9[277220]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:33 np0005548789.localdomain sudo[277218]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:33 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:33.227 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:33 np0005548789.localdomain sudo[277328]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-keughsarwngiisyikggbnvbhogxbglfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015053.3591383-2632-71502743512709/AnsiballZ_file.py
Dec 06 09:57:33 np0005548789.localdomain sudo[277328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:33 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1212 DF PROTO=TCP SPT=55404 DPT=9102 SEQ=3800488960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E4DF370000000001030307) 
Dec 06 09:57:33 np0005548789.localdomain python3.9[277330]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:33 np0005548789.localdomain sudo[277328]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:33 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:33.893 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:34 np0005548789.localdomain sudo[277438]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adydbitifbuwilfstqdmedpuesmtbhmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015054.0228393-2632-45652814796037/AnsiballZ_file.py
Dec 06 09:57:34 np0005548789.localdomain sudo[277438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:34 np0005548789.localdomain python3.9[277440]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:34 np0005548789.localdomain sudo[277438]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1213 DF PROTO=TCP SPT=55404 DPT=9102 SEQ=3800488960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E4E32F0000000001030307) 
Dec 06 09:57:34 np0005548789.localdomain sudo[277548]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdogxcjrhibkgadubfabrubfvpwvamgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015054.6295478-2632-140011228033274/AnsiballZ_file.py
Dec 06 09:57:34 np0005548789.localdomain sudo[277548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:35 np0005548789.localdomain python3.9[277550]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:35 np0005548789.localdomain sudo[277548]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:35 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48533 DF PROTO=TCP SPT=37038 DPT=9102 SEQ=3499426967 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E4E5EF0000000001030307) 
Dec 06 09:57:35 np0005548789.localdomain sudo[277658]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsmbfeoyvwtficifsevkfaguriwnhnzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015055.2614126-2632-75659494307700/AnsiballZ_file.py
Dec 06 09:57:35 np0005548789.localdomain sudo[277658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:57:35 np0005548789.localdomain sshd[277667]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:57:35 np0005548789.localdomain podman[277661]: 2025-12-06 09:57:35.670080047 +0000 UTC m=+0.082061506 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:57:35 np0005548789.localdomain podman[277661]: 2025-12-06 09:57:35.679986169 +0000 UTC m=+0.091967648 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:57:35 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 09:57:35 np0005548789.localdomain python3.9[277660]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:35 np0005548789.localdomain sudo[277658]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:36 np0005548789.localdomain sudo[277790]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wodinlxnnwupwneonlvyvuikhaqsrggi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015055.949264-2632-35074933674296/AnsiballZ_file.py
Dec 06 09:57:36 np0005548789.localdomain sudo[277790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:36 np0005548789.localdomain python3.9[277792]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:36 np0005548789.localdomain sudo[277790]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:36 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1214 DF PROTO=TCP SPT=55404 DPT=9102 SEQ=3800488960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E4EB2F0000000001030307) 
Dec 06 09:57:37 np0005548789.localdomain sshd[277667]: Received disconnect from 14.194.101.210 port 54986:11: Bye Bye [preauth]
Dec 06 09:57:37 np0005548789.localdomain sshd[277667]: Disconnected from authenticating user root 14.194.101.210 port 54986 [preauth]
Dec 06 09:57:37 np0005548789.localdomain sshd[277810]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:57:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11739 DF PROTO=TCP SPT=45218 DPT=9102 SEQ=596911704 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E4EFEF0000000001030307) 
Dec 06 09:57:37 np0005548789.localdomain sshd[277810]: Received disconnect from 64.227.102.57 port 35770:11: Bye Bye [preauth]
Dec 06 09:57:37 np0005548789.localdomain sshd[277810]: Disconnected from authenticating user root 64.227.102.57 port 35770 [preauth]
Dec 06 09:57:38 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:38.276 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:38 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:38.896 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:39 np0005548789.localdomain sudo[277812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:57:39 np0005548789.localdomain sudo[277812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:57:39 np0005548789.localdomain sudo[277812]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:39 np0005548789.localdomain sudo[277830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:57:39 np0005548789.localdomain sudo[277830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:57:39 np0005548789.localdomain sshd[276669]: error: kex_exchange_identification: read: Connection timed out
Dec 06 09:57:39 np0005548789.localdomain sshd[276669]: banner exchange: Connection from 123.160.164.187 port 42572: Connection timed out
Dec 06 09:57:40 np0005548789.localdomain sudo[277830]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1215 DF PROTO=TCP SPT=55404 DPT=9102 SEQ=3800488960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E4FAEF0000000001030307) 
Dec 06 09:57:41 np0005548789.localdomain sudo[277879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:57:41 np0005548789.localdomain sudo[277879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:57:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:57:41 np0005548789.localdomain sudo[277879]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:41 np0005548789.localdomain systemd[1]: tmp-crun.IhTb5W.mount: Deactivated successfully.
Dec 06 09:57:41 np0005548789.localdomain podman[277897]: 2025-12-06 09:57:41.269974656 +0000 UTC m=+0.086719259 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 09:57:41 np0005548789.localdomain podman[277897]: 2025-12-06 09:57:41.310360898 +0000 UTC m=+0.127105521 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:57:41 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:57:42 np0005548789.localdomain sudo[278007]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlrpymqdplwlxtocenpfvwywapejyita ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015061.9174435-2957-280220925116323/AnsiballZ_getent.py
Dec 06 09:57:42 np0005548789.localdomain sudo[278007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:57:42 np0005548789.localdomain systemd[1]: tmp-crun.CI4bZ5.mount: Deactivated successfully.
Dec 06 09:57:42 np0005548789.localdomain podman[278010]: 2025-12-06 09:57:42.430717587 +0000 UTC m=+0.091392600 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, managed_by=edpm_ansible, vcs-type=git, version=9.6, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7)
Dec 06 09:57:42 np0005548789.localdomain podman[278010]: 2025-12-06 09:57:42.472561375 +0000 UTC m=+0.133236428 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Dec 06 09:57:42 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:57:42 np0005548789.localdomain python3.9[278009]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 06 09:57:42 np0005548789.localdomain sudo[278007]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:43 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:43.313 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:43 np0005548789.localdomain sshd[278048]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:57:43 np0005548789.localdomain sshd[278048]: Accepted publickey for zuul from 192.168.122.30 port 38444 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 09:57:43 np0005548789.localdomain systemd-logind[766]: New session 61 of user zuul.
Dec 06 09:57:43 np0005548789.localdomain systemd[1]: Started Session 61 of User zuul.
Dec 06 09:57:43 np0005548789.localdomain sshd[278048]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 09:57:43 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:43.899 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:44 np0005548789.localdomain sshd[278051]: Received disconnect from 192.168.122.30 port 38444:11: disconnected by user
Dec 06 09:57:44 np0005548789.localdomain sshd[278051]: Disconnected from user zuul 192.168.122.30 port 38444
Dec 06 09:57:44 np0005548789.localdomain sshd[278048]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:57:44 np0005548789.localdomain systemd[1]: session-61.scope: Deactivated successfully.
Dec 06 09:57:44 np0005548789.localdomain systemd-logind[766]: Session 61 logged out. Waiting for processes to exit.
Dec 06 09:57:44 np0005548789.localdomain systemd-logind[766]: Removed session 61.
Dec 06 09:57:44 np0005548789.localdomain python3.9[278159]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:45 np0005548789.localdomain python3.9[278245]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015064.289732-3038-260918501980785/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:57:45 np0005548789.localdomain python3.9[278353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:45 np0005548789.localdomain podman[278354]: 2025-12-06 09:57:45.917943106 +0000 UTC m=+0.075859716 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 06 09:57:45 np0005548789.localdomain podman[278354]: 2025-12-06 09:57:45.936297557 +0000 UTC m=+0.094214197 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:57:45 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:57:46 np0005548789.localdomain python3.9[278427]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:57:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:57:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:57:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:57:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:57:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:57:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:57:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:57:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:57:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:57:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:57:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:57:46 np0005548789.localdomain python3.9[278535]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:57:47.288 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:57:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:57:47.289 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:57:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:57:47.290 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:57:47 np0005548789.localdomain python3.9[278621]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015066.535572-3038-74208196040910/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:48 np0005548789.localdomain python3.9[278729]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:48 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:48.335 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:48 np0005548789.localdomain python3.9[278815]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015067.6185415-3038-225793067926828/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=84cd402761cf817a5c030b63eb0a858a413df311 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:57:48 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:48.902 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:48 np0005548789.localdomain podman[278849]: 2025-12-06 09:57:48.949075085 +0000 UTC m=+0.102352291 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:57:48 np0005548789.localdomain podman[278849]: 2025-12-06 09:57:48.96350276 +0000 UTC m=+0.116780016 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:57:48 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:57:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1216 DF PROTO=TCP SPT=55404 DPT=9102 SEQ=3800488960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E51BEF0000000001030307) 
Dec 06 09:57:49 np0005548789.localdomain python3.9[278947]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:49 np0005548789.localdomain python3.9[279033]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015068.83889-3038-167063296716008/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:50 np0005548789.localdomain python3.9[279141]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:51 np0005548789.localdomain python3.9[279227]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015069.9751801-3038-96652130942322/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:52 np0005548789.localdomain sudo[279335]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svmcgmdjzpkekhnsqexhhqcyqgfkcsnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015071.8943832-3287-105051267565469/AnsiballZ_file.py
Dec 06 09:57:52 np0005548789.localdomain sudo[279335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:52 np0005548789.localdomain python3.9[279337]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:52 np0005548789.localdomain sudo[279335]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:52 np0005548789.localdomain sudo[279445]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwqsdbdmlavjjaxguoawxylvugwyrggc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015072.6611626-3311-56425492594877/AnsiballZ_copy.py
Dec 06 09:57:52 np0005548789.localdomain sudo[279445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:53 np0005548789.localdomain python3.9[279447]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:53 np0005548789.localdomain sudo[279445]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:53.338 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:53 np0005548789.localdomain sudo[279555]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oslywmxfjxecszzxbivpuxmauxwfsrci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015073.434961-3335-228955994063025/AnsiballZ_stat.py
Dec 06 09:57:53 np0005548789.localdomain sudo[279555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:53 np0005548789.localdomain python3.9[279557]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:57:53 np0005548789.localdomain sudo[279555]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:53 np0005548789.localdomain podman[241090]: time="2025-12-06T09:57:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:57:53 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:53.935 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:57:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150643 "" "Go-http-client/1.1"
Dec 06 09:57:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:57:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17727 "" "Go-http-client/1.1"
Dec 06 09:57:54 np0005548789.localdomain sudo[279667]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhqmdluaupduxqgdwyiiqqnxvunrjkej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015074.287689-3362-56003729399605/AnsiballZ_file.py
Dec 06 09:57:54 np0005548789.localdomain sudo[279667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:54 np0005548789.localdomain python3.9[279669]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:54 np0005548789.localdomain sudo[279667]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:57:55 np0005548789.localdomain systemd[1]: tmp-crun.SCOg4w.mount: Deactivated successfully.
Dec 06 09:57:55 np0005548789.localdomain podman[279769]: 2025-12-06 09:57:55.546634055 +0000 UTC m=+0.097168831 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Dec 06 09:57:55 np0005548789.localdomain podman[279769]: 2025-12-06 09:57:55.589126107 +0000 UTC m=+0.139660863 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:57:55 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:57:55 np0005548789.localdomain python3.9[279783]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:57:56 np0005548789.localdomain python3.9[279913]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:56 np0005548789.localdomain python3.9[279968]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:57 np0005548789.localdomain python3.9[280076]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:58 np0005548789.localdomain python3.9[280131]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:58.341 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:58 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:57:58.936 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:57:58 np0005548789.localdomain sudo[280239]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdethxxrvhxocixmiqhnnjfjurfxphfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015078.678827-3491-91648087479941/AnsiballZ_container_config_data.py
Dec 06 09:57:58 np0005548789.localdomain sudo[280239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:59 np0005548789.localdomain python3.9[280241]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 06 09:57:59 np0005548789.localdomain sudo[280239]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:59 np0005548789.localdomain sudo[280349]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czepjihwwtsjxlgammwjwgjqpirmldra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015079.5611634-3518-214943748549146/AnsiballZ_container_config_hash.py
Dec 06 09:57:59 np0005548789.localdomain sudo[280349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:00 np0005548789.localdomain python3.9[280351]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:58:00 np0005548789.localdomain sudo[280349]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:00.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:00.501 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:58:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:00.501 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:58:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:00.632 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:58:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:00.632 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:58:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:00.632 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 09:58:00 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:00.632 230888 DEBUG nova.objects.instance [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:58:00 np0005548789.localdomain sudo[280459]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxzeuldrsmovikvtnluwekeiddsgyzmm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765015080.5181422-3548-211600182597979/AnsiballZ_edpm_container_manage.py
Dec 06 09:58:00 np0005548789.localdomain sudo[280459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.070 230888 DEBUG nova.network.neutron [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.085 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.085 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.086 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.086 230888 DEBUG nova.compute.manager [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.087 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.104 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.105 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.105 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.105 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.106 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:58:01 np0005548789.localdomain python3[280461]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:58:01 np0005548789.localdomain python3[280461]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3",
                                                                    "Digest": "sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:31:10.62653219Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211779450,
                                                                    "VirtualSize": 1211779450,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:baa8e0bc73d6b505f07c40d4f69a464312cc41ae2045c7975dd4759c27721a22",
                                                                              "sha256:d0cde44181262e43c105085c32a5af158b232f2e2ce4fe4b50530d7cdc5126cd"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:53.072482982Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:02.761216507Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:03.785234187Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:17.194997182Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:24.14458279Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:30.048641643Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:09.707360362Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.208898452Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624465805Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624514176Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:18.661822382Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 06 09:58:01 np0005548789.localdomain sudo[280459]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.668 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.562s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.728 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.728 230888 DEBUG nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:58:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.908 230888 WARNING nova.virt.libvirt.driver [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.909 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11840MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.909 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.910 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:58:01 np0005548789.localdomain systemd[1]: tmp-crun.sPJdzD.mount: Deactivated successfully.
Dec 06 09:58:01 np0005548789.localdomain podman[280603]: 2025-12-06 09:58:01.925606168 +0000 UTC m=+0.086661957 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 06 09:58:01 np0005548789.localdomain podman[280603]: 2025-12-06 09:58:01.962116905 +0000 UTC m=+0.123172664 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 09:58:01 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.993 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.994 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:58:01 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:01.994 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:58:02 np0005548789.localdomain sudo[280671]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfosteuhzfjuhhooumexqnpckutuonzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015081.745493-3572-220023823316639/AnsiballZ_stat.py
Dec 06 09:58:02 np0005548789.localdomain sudo[280671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:02.030 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:58:02 np0005548789.localdomain python3.9[280674]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:58:02 np0005548789.localdomain sudo[280671]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:02.431 230888 DEBUG oslo_concurrency.processutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:58:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:02.438 230888 DEBUG nova.compute.provider_tree [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:58:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:02.456 230888 DEBUG nova.scheduler.client.report [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:58:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:02.458 230888 DEBUG nova.compute.resource_tracker [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:58:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:02.458 230888 DEBUG oslo_concurrency.lockutils [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:58:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:02.872 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:02 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:02.873 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:03 np0005548789.localdomain sudo[280806]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqvudhuhnozhmkyiakzryvuoxxzpqitl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015082.8821723-3608-242443639048575/AnsiballZ_container_config_data.py
Dec 06 09:58:03 np0005548789.localdomain sudo[280806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:03 np0005548789.localdomain python3.9[280808]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 06 09:58:03 np0005548789.localdomain sudo[280806]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:03.383 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:03.496 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:03 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6956 DF PROTO=TCP SPT=55472 DPT=9102 SEQ=2212927204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E554670000000001030307) 
Dec 06 09:58:03 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:03.939 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:04 np0005548789.localdomain sudo[280916]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivfrsoximzoekckhbomvgpxvzsnpfwen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015083.8521628-3635-19330927736306/AnsiballZ_container_config_hash.py
Dec 06 09:58:04 np0005548789.localdomain sudo[280916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:04 np0005548789.localdomain python3.9[280918]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:58:04 np0005548789.localdomain sudo[280916]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:04 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:04.495 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:04 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:04.523 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6957 DF PROTO=TCP SPT=55472 DPT=9102 SEQ=2212927204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E5586F0000000001030307) 
Dec 06 09:58:05 np0005548789.localdomain sudo[281026]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiqnotecrlvozjgsylgzsctrtwefpnys ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765015084.824279-3665-279492981132634/AnsiballZ_edpm_container_manage.py
Dec 06 09:58:05 np0005548789.localdomain sudo[281026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:05 np0005548789.localdomain python3[281028]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:58:05 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1217 DF PROTO=TCP SPT=55404 DPT=9102 SEQ=3800488960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E55BEF0000000001030307) 
Dec 06 09:58:05 np0005548789.localdomain python3[281028]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3",
                                                                    "Digest": "sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2025-12-01T06:31:10.62653219Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20251125",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1211779450,
                                                                    "VirtualSize": 1211779450,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/bb270959ea4f0d2c0dd791aa5a80a96b2d6621117349e00f19fca53fc0632a22/diff:/var/lib/containers/storage/overlay/11c5062d45c4d7c0ad6abaddd64ed9bdbf7963c4793402f2ed3e5264e255ad60/diff:/var/lib/containers/storage/overlay/ac70de19a933522ca2cf73df928823e8823ff6b4231733a8230c668e15d517e9/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/45b05c829d68772ce6f113ebe908af5bcf8533af84d5ff30fea8dfca06e71a2d/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",
                                                                              "sha256:d26dbee55abfd9d572bfbbd4b765c5624affd9ef117ad108fb34be41e199a619",
                                                                              "sha256:86c2cd3987225f8a9bf38cc88e9c24b56bdf4a194f2301186519b4a7571b0c92",
                                                                              "sha256:baa8e0bc73d6b505f07c40d4f69a464312cc41ae2045c7975dd4759c27721a22",
                                                                              "sha256:d0cde44181262e43c105085c32a5af158b232f2e2ce4fe4b50530d7cdc5126cd"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20251125",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223494528Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:36.223562059Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20251125\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-11-25T04:02:39.054452717Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025707917Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025744608Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025767729Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025791379Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.02581523Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.025867611Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:09:28.469442331Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:02.029095017Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:05.672474685Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.113425253Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:06.532320725Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.370061347Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:07.805172373Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.259306372Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:08.625948784Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.028304824Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.423316076Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:09.801219631Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.239187116Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:10.70996597Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.147342611Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:11.5739488Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.006975065Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:12.421255505Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.066694755Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.475695836Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:16.8971372Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:18.542651107Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622503041Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622561802Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622578342Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:20.622594423Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:10:22.080892529Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:15.092312074Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:53.218820537Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:12:56.858075591Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:17:53.072482982Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:02.761216507Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:18:03.785234187Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:17.194997182Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:19:24.14458279Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:29:30.048641643Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:fa2bb8efef6782c26ea7f1675eeb36dd",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:09.707360362Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.208898452Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624465805Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:10.624514176Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2025-12-01T06:31:18.661822382Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"fa2bb8efef6782c26ea7f1675eeb36dd\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 06 09:58:05 np0005548789.localdomain sudo[281026]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:58:05 np0005548789.localdomain podman[281092]: 2025-12-06 09:58:05.938961683 +0000 UTC m=+0.087268865 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:58:05 np0005548789.localdomain podman[281092]: 2025-12-06 09:58:05.952203712 +0000 UTC m=+0.100510914 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 09:58:05 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 09:58:06 np0005548789.localdomain sudo[281222]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpyetesofuxsdkcxwnqcewrxauyxexxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015086.0403442-3689-33268475398242/AnsiballZ_stat.py
Dec 06 09:58:06 np0005548789.localdomain sudo[281222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:06 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:06.501 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:06 np0005548789.localdomain python3.9[281224]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:58:06 np0005548789.localdomain sudo[281222]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:06 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6958 DF PROTO=TCP SPT=55472 DPT=9102 SEQ=2212927204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E5606F0000000001030307) 
Dec 06 09:58:07 np0005548789.localdomain sudo[281334]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uypwgadywzgqhnwvfoyccgkknmdicict ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015086.9882386-3716-67617986341557/AnsiballZ_file.py
Dec 06 09:58:07 np0005548789.localdomain sudo[281334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:07 np0005548789.localdomain python3.9[281336]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:07 np0005548789.localdomain sudo[281334]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:07 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:07.500 230888 DEBUG oslo_service.periodic_task [None req-f3bc090b-070e-4f32-9f6c-48ba00ae8240 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48534 DF PROTO=TCP SPT=37038 DPT=9102 SEQ=3499426967 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E563EF0000000001030307) 
Dec 06 09:58:07 np0005548789.localdomain sudo[281443]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbjehpfbezglomejjllnigfmrpiqwajs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015087.5055525-3716-125524085609437/AnsiballZ_copy.py
Dec 06 09:58:07 np0005548789.localdomain sudo[281443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:08 np0005548789.localdomain python3.9[281445]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765015087.5055525-3716-125524085609437/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:08 np0005548789.localdomain sudo[281443]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:08 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:08.421 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:08 np0005548789.localdomain sudo[281498]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwmyzioqbktbsgzerzlwjlwkqtlxufkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015087.5055525-3716-125524085609437/AnsiballZ_systemd.py
Dec 06 09:58:08 np0005548789.localdomain sudo[281498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:08 np0005548789.localdomain python3.9[281500]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:58:08 np0005548789.localdomain sudo[281498]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:08 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:08.941 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:10 np0005548789.localdomain python3.9[281610]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:58:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6959 DF PROTO=TCP SPT=55472 DPT=9102 SEQ=2212927204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E5702F0000000001030307) 
Dec 06 09:58:11 np0005548789.localdomain python3.9[281718]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:58:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:58:11 np0005548789.localdomain podman[281789]: 2025-12-06 09:58:11.927797812 +0000 UTC m=+0.086382257 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm)
Dec 06 09:58:11 np0005548789.localdomain podman[281789]: 2025-12-06 09:58:11.941115293 +0000 UTC m=+0.099699718 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 09:58:11 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:58:12 np0005548789.localdomain python3.9[281844]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:58:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:58:12 np0005548789.localdomain systemd[1]: tmp-crun.VDQoCU.mount: Deactivated successfully.
Dec 06 09:58:12 np0005548789.localdomain podman[281902]: 2025-12-06 09:58:12.920192748 +0000 UTC m=+0.084169789 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 09:58:12 np0005548789.localdomain podman[281902]: 2025-12-06 09:58:12.964368232 +0000 UTC m=+0.128345063 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, 
release=1755695350, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 09:58:12 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:58:13 np0005548789.localdomain sudo[281974]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rslmnxjdlxgsoslhwyaebjhovbqayskk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015092.675039-3885-18617364045936/AnsiballZ_podman_container.py
Dec 06 09:58:13 np0005548789.localdomain sudo[281974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:13 np0005548789.localdomain python3.9[281976]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 06 09:58:13 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:13.456 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:13 np0005548789.localdomain systemd-journald[47810]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 102.7 (342 of 333 items), suggesting rotation.
Dec 06 09:58:13 np0005548789.localdomain systemd-journald[47810]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 09:58:13 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:58:13 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:58:13 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:58:13 np0005548789.localdomain sudo[281974]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:13 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:13.944 230888 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:14 np0005548789.localdomain sudo[282107]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmxflhkrijttcktqgydhxlzlhrzjjnnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015093.8610709-3908-113970265549123/AnsiballZ_systemd.py
Dec 06 09:58:14 np0005548789.localdomain sudo[282107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:14 np0005548789.localdomain python3.9[282109]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:58:14 np0005548789.localdomain systemd[1]: Stopping nova_compute container...
Dec 06 09:58:14 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:14.620 230888 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Dec 06 09:58:16 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:16.398 230888 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Dec 06 09:58:16 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:16.400 230888 DEBUG oslo_concurrency.lockutils [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:58:16 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:16.400 230888 DEBUG oslo_concurrency.lockutils [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:58:16 np0005548789.localdomain nova_compute[230884]: 2025-12-06 09:58:16.401 230888 DEBUG oslo_concurrency.lockutils [None req-c4502e63-c574-4851-9fe1-84d6480c5d52 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:58:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:58:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:58:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:58:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:58:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:58:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:58:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:58:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:58:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:58:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:58:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:58:16 np0005548789.localdomain podman[282126]: 2025-12-06 09:58:16.690306444 +0000 UTC m=+0.105263371 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:58:16 np0005548789.localdomain podman[282126]: 2025-12-06 09:58:16.732257829 +0000 UTC m=+0.147214746 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 09:58:16 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:58:16 np0005548789.localdomain systemd[1]: libpod-6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8.scope: Deactivated successfully.
Dec 06 09:58:16 np0005548789.localdomain systemd[1]: libpod-6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8.scope: Consumed 20.370s CPU time.
Dec 06 09:58:16 np0005548789.localdomain virtqemud[203911]: End of file while reading data: Input/output error
Dec 06 09:58:16 np0005548789.localdomain podman[282113]: 2025-12-06 09:58:16.805400617 +0000 UTC m=+2.258915126 container died 6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 09:58:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8-userdata-shm.mount: Deactivated successfully.
Dec 06 09:58:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad-merged.mount: Deactivated successfully.
Dec 06 09:58:16 np0005548789.localdomain podman[282113]: 2025-12-06 09:58:16.968303936 +0000 UTC m=+2.421818415 container cleanup 6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:58:16 np0005548789.localdomain podman[282113]: nova_compute
Dec 06 09:58:17 np0005548789.localdomain podman[282176]: error opening file `/run/crun/6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8/status`: No such file or directory
Dec 06 09:58:17 np0005548789.localdomain podman[282163]: 2025-12-06 09:58:17.067968532 +0000 UTC m=+0.066473963 container cleanup 6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible)
Dec 06 09:58:17 np0005548789.localdomain podman[282163]: nova_compute
Dec 06 09:58:17 np0005548789.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 06 09:58:17 np0005548789.localdomain systemd[1]: Stopped nova_compute container.
Dec 06 09:58:17 np0005548789.localdomain systemd[1]: Starting nova_compute container...
Dec 06 09:58:17 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:58:17 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:17 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:17 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:17 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:17 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aea125146c60e1b0ee38b0c4ee8c70ba3c42600b7bfc70695e2bff0e11c0ad/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:17 np0005548789.localdomain podman[282178]: 2025-12-06 09:58:17.228162408 +0000 UTC m=+0.133638747 container init 6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:58:17 np0005548789.localdomain podman[282178]: 2025-12-06 09:58:17.23894211 +0000 UTC m=+0.144418469 container start 6674d58fdb9d90e78bfb85f434c919baa1836ad3e98a097c0a64c1152f7163c8 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, container_name=nova_compute, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:58:17 np0005548789.localdomain podman[282178]: nova_compute
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: + sudo -E kolla_set_configs
Dec 06 09:58:17 np0005548789.localdomain systemd[1]: Started nova_compute container.
Dec 06 09:58:17 np0005548789.localdomain sudo[282107]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Validating config file
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Copying service configuration files
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Deleting /etc/ceph
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Creating directory /etc/ceph
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /etc/ceph
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Writing out command to execute
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: ++ cat /run_command
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: + CMD=nova-compute
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: + ARGS=
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: + sudo kolla_copy_cacerts
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: + [[ ! -n '' ]]
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: + . kolla_extend_start
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: Running command: 'nova-compute'
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: + echo 'Running command: '\''nova-compute'\'''
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: + umask 0022
Dec 06 09:58:17 np0005548789.localdomain nova_compute[282193]: + exec nova-compute
Dec 06 09:58:18 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6960 DF PROTO=TCP SPT=55472 DPT=9102 SEQ=2212927204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E58FF00000000001030307) 
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.027 282197 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.028 282197 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.028 282197 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.028 282197 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.150 282197 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.172 282197 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.172 282197 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.647 282197 INFO nova.virt.driver [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.811 282197 INFO nova.compute.provider_config [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.820 282197 DEBUG oslo_concurrency.lockutils [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.821 282197 DEBUG oslo_concurrency.lockutils [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.821 282197 DEBUG oslo_concurrency.lockutils [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.821 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.821 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.821 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.822 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.822 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.822 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.822 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.822 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.822 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.823 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.823 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.823 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.823 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.823 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.823 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.823 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.824 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.824 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.824 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.824 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] console_host                   = np0005548789.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.824 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.824 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.824 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.825 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.825 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.825 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.825 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.825 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.825 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.825 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.826 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.826 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.826 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.826 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.826 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.826 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.826 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.827 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.827 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] host                           = np0005548789.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.827 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.827 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.827 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.827 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.828 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.828 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.828 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.828 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.828 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.828 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.828 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.829 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.829 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.829 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.829 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.829 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.829 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.829 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.830 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.830 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.830 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.830 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.830 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.830 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.830 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.831 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.831 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.831 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.831 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.831 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.831 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.831 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.832 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.832 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.832 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.832 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.832 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.832 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.833 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.833 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.833 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.833 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] my_block_storage_ip            = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.833 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] my_ip                          = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.833 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.833 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.834 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.834 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.834 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.834 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.834 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.834 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.834 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.835 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.835 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.835 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.835 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.835 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.835 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.835 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.836 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.836 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.836 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.836 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.836 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.836 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.836 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.837 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.837 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.837 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.837 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.837 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.837 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.837 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.838 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.838 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.838 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.838 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.838 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.838 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.838 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.839 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.839 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.839 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.839 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.839 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.839 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.839 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.839 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.840 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.840 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.840 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.840 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.840 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.840 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.840 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.841 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.841 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.841 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.841 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.841 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.841 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.841 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.842 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.842 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.842 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.842 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.842 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.842 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.843 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.843 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.843 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.843 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.843 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.843 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.844 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.844 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.844 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.844 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.844 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.844 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.844 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.845 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.845 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.845 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.845 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.845 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.845 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.845 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.846 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.846 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.846 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.846 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.846 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.846 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.846 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.847 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.847 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.847 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.847 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.847 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.847 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.847 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.848 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.848 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.848 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.848 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.848 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.848 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.848 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.849 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.849 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.849 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.849 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.849 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.849 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.849 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.850 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.850 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.850 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.850 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.850 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.850 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.850 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.851 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.851 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.851 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.851 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.851 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.851 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.851 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.852 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.852 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.852 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.852 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.852 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.852 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.852 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.853 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.853 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.853 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.853 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.853 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.853 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.854 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.854 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.854 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.854 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.854 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.854 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.854 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.854 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.855 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.855 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.855 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.855 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.855 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.855 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.856 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.856 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.856 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.856 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.856 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.856 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.856 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.857 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.857 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.857 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.857 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.857 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.857 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.857 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.858 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.858 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.858 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.858 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.858 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.858 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.858 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.858 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.859 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.859 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.859 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.859 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.859 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.859 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.860 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.860 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.860 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.860 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.860 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.860 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.861 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.861 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.861 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.861 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.861 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.862 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.862 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.862 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.862 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.862 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.862 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.863 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.863 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.863 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.863 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.863 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.863 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.863 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.864 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.864 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.864 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.864 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.864 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.864 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.865 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.865 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.865 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.865 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.865 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.865 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.865 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.866 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.866 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.866 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.866 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.866 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.866 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.866 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.867 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.867 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.867 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.867 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.867 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.867 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.867 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.868 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.868 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.868 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.868 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.868 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.868 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.869 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.869 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.869 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.869 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.869 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.869 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.869 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.870 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.870 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.870 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.870 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.870 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.870 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.871 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.871 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.871 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.871 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.871 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.871 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.872 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.872 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.872 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.872 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.872 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.873 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.873 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.873 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.873 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.873 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.873 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.874 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.874 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.874 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.874 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.874 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.875 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.875 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.875 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.875 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.875 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.875 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.876 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.876 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.876 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.876 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.876 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.876 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.877 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.877 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.877 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.877 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.877 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.877 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.878 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.878 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.878 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.878 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.878 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.878 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.879 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.879 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.879 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.879 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.879 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.879 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.880 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.880 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.880 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.880 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.880 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.881 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.881 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.881 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.881 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.881 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.881 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.882 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.882 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.882 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.882 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.882 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.882 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.883 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.883 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.883 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.883 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.883 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.883 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.884 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.884 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.884 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.884 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.884 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.884 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.884 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.885 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.885 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.885 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.885 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.885 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.885 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.886 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.886 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.886 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.886 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.886 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.887 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.887 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.887 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.887 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.887 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.887 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.888 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.888 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.888 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.888 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.888 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.888 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.888 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.889 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.889 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.889 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.889 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.889 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.889 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.889 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.890 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.890 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.890 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.890 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.890 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.890 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.890 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.891 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.891 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.891 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.891 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.891 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.891 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.891 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.892 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.892 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.892 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.892 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.892 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.892 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.893 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.893 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.893 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.893 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.893 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.893 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.893 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.893 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.894 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.894 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.894 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.894 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.894 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.894 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.895 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.895 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.895 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.895 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.895 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.895 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.896 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.896 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.896 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.896 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.896 282197 WARNING oslo_config.cfg [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: and ``live_migration_inbound_addr`` respectively.
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: ).  Its value may be silently ignored in the future.
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.896 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.897 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.897 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.897 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.897 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.897 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.897 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.897 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.898 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.898 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.898 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.898 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.898 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.898 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.898 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.899 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.899 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.899 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.900 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.rbd_secret_uuid        = 1939e851-b10c-5c3b-9bb7-8e7f380233e8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.900 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.900 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.901 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.901 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.901 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.901 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.902 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.902 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.902 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.903 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.903 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.903 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.904 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.905 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.905 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.905 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.905 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.905 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.905 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.905 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.906 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.906 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.906 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.906 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.906 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.906 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.906 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.907 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.907 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.907 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.907 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.907 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.907 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.908 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.908 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.908 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.908 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.908 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.908 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.908 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.909 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.909 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.909 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.909 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.909 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.909 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.909 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.909 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.910 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.910 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.910 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.910 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.910 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.910 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.910 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.911 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.911 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.911 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.911 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.911 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.911 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.911 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.912 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.912 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.912 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.912 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.912 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.912 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.912 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.913 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.913 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.913 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.913 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.913 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.913 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.913 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.914 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.914 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.914 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.914 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.914 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.914 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.914 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.915 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.915 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.915 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.915 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.915 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.915 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.916 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.916 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.916 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.916 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.916 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.916 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.916 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.917 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.917 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.917 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.917 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.917 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.917 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.917 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.917 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.918 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.918 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.918 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.918 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.918 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.918 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.919 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.919 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.919 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.919 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.919 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.919 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.919 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.920 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.920 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.920 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.920 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.920 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.920 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.921 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.921 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.921 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.921 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.921 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.921 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.921 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.922 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.922 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.922 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.922 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.922 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.922 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.922 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.923 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.923 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.923 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.923 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.923 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.923 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.924 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.924 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.924 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.924 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.924 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.924 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.924 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.925 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.925 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.925 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.925 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.925 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.925 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.925 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.926 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.926 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.926 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.926 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.926 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.926 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.926 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.927 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.927 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.927 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.927 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.927 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.927 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.927 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.928 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.928 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.928 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.928 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.928 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.928 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.928 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.929 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.929 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.929 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.929 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.929 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.929 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.929 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.930 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.930 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.930 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.930 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.930 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.930 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.930 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.930 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.931 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.931 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.931 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.931 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.931 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.931 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.931 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.932 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.932 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.932 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.932 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.932 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.932 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.932 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.932 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.933 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.933 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.933 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.933 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.933 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.933 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.933 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.934 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.934 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.934 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.934 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.934 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.934 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.934 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.935 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.935 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.935 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.935 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.935 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.935 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.936 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.936 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.936 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.936 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.936 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.937 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vnc.server_proxyclient_address = 192.168.122.107 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.937 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.937 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.937 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.938 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.938 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.938 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.938 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.938 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.938 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.939 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.939 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.939 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.939 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.939 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.939 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.940 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.940 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.940 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.940 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.940 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.940 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.940 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.941 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.941 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.941 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.941 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.941 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.941 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.941 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.942 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.942 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.942 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.942 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.942 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.942 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.942 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.942 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.943 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.943 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.943 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.943 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.943 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.943 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.944 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.944 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.944 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.944 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.944 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.944 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain systemd[1]: tmp-crun.pXzmmy.mount: Deactivated successfully.
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.944 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.945 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.945 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.945 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.945 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.945 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.945 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.945 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.946 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.946 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.946 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.946 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.946 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.946 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.946 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.946 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.947 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.947 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.947 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.947 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.947 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.947 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.947 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.948 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.948 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.948 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.948 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.948 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.948 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.949 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.949 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.949 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.949 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.949 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.949 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.949 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.950 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.950 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.950 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.950 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.950 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.950 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.950 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.951 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.951 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.951 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.951 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.951 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.951 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.951 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.952 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.952 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.952 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain podman[282227]: 2025-12-06 09:58:19.952053696 +0000 UTC m=+0.109775020 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.952 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.952 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.952 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.952 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.953 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.953 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.953 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.953 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.953 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.953 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.954 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.954 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.954 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.954 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.954 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.954 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.954 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.955 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.955 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.955 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.955 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.955 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.955 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.956 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.956 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.956 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.956 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.956 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.957 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.957 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.957 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.957 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.957 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.958 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.958 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.958 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.958 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.958 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.958 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.959 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.959 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.959 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.959 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.960 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.960 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.960 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.960 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.960 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.960 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.960 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.961 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.961 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.961 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.961 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.961 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.961 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.961 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.962 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.962 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.962 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.962 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.962 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.962 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.962 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.962 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.963 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.963 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.963 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.963 282197 DEBUG oslo_service.service [None req-26677738-55ed-441a-a4de-14131a4fde81 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.964 282197 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 06 09:58:19 np0005548789.localdomain podman[282227]: 2025-12-06 09:58:19.965269174 +0000 UTC m=+0.122990508 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:58:19 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.979 282197 INFO nova.virt.node [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Determined node identity 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from /var/lib/nova/compute_id
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.979 282197 DEBUG nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.980 282197 DEBUG nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.980 282197 DEBUG nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.980 282197 DEBUG nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.988 282197 DEBUG nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fefc5627730> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.989 282197 DEBUG nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fefc5627730> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.990 282197 INFO nova.virt.libvirt.driver [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Connection event '1' reason 'None'
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:19.994 282197 INFO nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Libvirt host capabilities <capabilities>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:   <host>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <uuid>0b20d7bd-1341-4912-afa7-eec4e2b0c648</uuid>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <cpu>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <arch>x86_64</arch>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <model>EPYC-Rome-v4</model>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <vendor>AMD</vendor>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <microcode version='16777317'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <signature family='23' model='49' stepping='0'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='x2apic'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='tsc-deadline'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='osxsave'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='hypervisor'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='tsc_adjust'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='spec-ctrl'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='stibp'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='arch-capabilities'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='ssbd'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='cmp_legacy'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='topoext'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='virt-ssbd'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='lbrv'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='tsc-scale'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='vmcb-clean'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='pause-filter'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='pfthreshold'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='svme-addr-chk'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='rdctl-no'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='skip-l1dfl-vmentry'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='mds-no'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <feature name='pschange-mc-no'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <pages unit='KiB' size='4'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <pages unit='KiB' size='2048'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <pages unit='KiB' size='1048576'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     </cpu>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <power_management>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <suspend_mem/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <suspend_disk/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <suspend_hybrid/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     </power_management>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <iommu support='no'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <migration_features>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <live/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <uri_transports>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:         <uri_transport>tcp</uri_transport>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:         <uri_transport>rdma</uri_transport>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       </uri_transports>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     </migration_features>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <topology>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <cells num='1'>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:         <cell id='0'>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:           <memory unit='KiB'>16116612</memory>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:           <pages unit='KiB' size='4'>4029153</pages>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:           <pages unit='KiB' size='2048'>0</pages>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:           <distances>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:             <sibling id='0' value='10'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:           </distances>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:           <cpus num='8'>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:           </cpus>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:         </cell>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       </cells>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     </topology>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <cache>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     </cache>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <secmodel>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <model>selinux</model>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <doi>0</doi>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     </secmodel>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <secmodel>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <model>dac</model>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <doi>0</doi>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     </secmodel>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:   </host>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:   <guest>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <os_type>hvm</os_type>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <arch name='i686'>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <wordsize>32</wordsize>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <domain type='qemu'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <domain type='kvm'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     </arch>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <features>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <pae/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <nonpae/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <acpi default='on' toggle='yes'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <apic default='on' toggle='no'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <cpuselection/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <deviceboot/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <disksnapshot default='on' toggle='no'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <externalSnapshot/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     </features>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:   </guest>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:   <guest>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <os_type>hvm</os_type>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <arch name='x86_64'>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <wordsize>64</wordsize>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <domain type='qemu'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <domain type='kvm'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     </arch>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     <features>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <acpi default='on' toggle='yes'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <apic default='on' toggle='no'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <cpuselection/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <deviceboot/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <disksnapshot default='on' toggle='no'/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:       <externalSnapshot/>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:     </features>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]:   </guest>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: </capabilities>
Dec 06 09:58:19 np0005548789.localdomain nova_compute[282193]: 
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.000 282197 DEBUG nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.003 282197 DEBUG nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: <domainCapabilities>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <domain>kvm</domain>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <arch>i686</arch>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <vcpu max='1024'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <iothreads supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <os supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <enum name='firmware'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <loader supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>rom</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pflash</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='readonly'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>yes</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>no</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='secure'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>no</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </loader>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </os>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <cpu>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>on</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>off</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='maximum' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='maximumMigratable'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>on</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>off</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='host-model' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <vendor>AMD</vendor>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='x2apic'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='stibp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='ssbd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='succor'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='ibrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='lbrv'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='custom' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cooperlake'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cooperlake-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cooperlake-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Dhyana-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Genoa'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amd-psfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='auto-ibrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='stibp-always-on'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amd-psfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='auto-ibrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='stibp-always-on'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Milan'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amd-psfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='stibp-always-on'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='GraniteRapids'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='prefetchiti'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='prefetchiti'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10-128'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10-256'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10-512'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='prefetchiti'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='KnightsMill'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512er'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512pf'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='KnightsMill-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512er'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512pf'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tbm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tbm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SierraForest'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cmpccxadd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SierraForest-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cmpccxadd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='athlon'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='athlon-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='core2duo'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='core2duo-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='coreduo'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='coreduo-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='n270'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='n270-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='phenom'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='phenom-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </cpu>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <memoryBacking supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <enum name='sourceType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>file</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>anonymous</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>memfd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </memoryBacking>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <devices>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <disk supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='diskDevice'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>disk</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>cdrom</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>floppy</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>lun</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='bus'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>fdc</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>scsi</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>usb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>sata</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-non-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </disk>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <graphics supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vnc</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>egl-headless</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>dbus</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </graphics>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <video supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='modelType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vga</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>cirrus</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>none</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>bochs</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>ramfb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </video>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <hostdev supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='mode'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>subsystem</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='startupPolicy'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>default</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>mandatory</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>requisite</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>optional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='subsysType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>usb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pci</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>scsi</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='capsType'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='pciBackend'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </hostdev>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <rng supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-non-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendModel'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>random</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>egd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>builtin</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </rng>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <filesystem supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='driverType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>path</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>handle</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtiofs</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </filesystem>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <tpm supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tpm-tis</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tpm-crb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendModel'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>emulator</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>external</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendVersion'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>2.0</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </tpm>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <redirdev supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='bus'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>usb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </redirdev>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <channel supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pty</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>unix</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </channel>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <crypto supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>qemu</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendModel'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>builtin</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </crypto>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <interface supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>default</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>passt</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </interface>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <panic supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>isa</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>hyperv</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </panic>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <console supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>null</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vc</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pty</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>dev</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>file</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pipe</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>stdio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>udp</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tcp</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>unix</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>qemu-vdagent</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>dbus</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </console>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </devices>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <features>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <gic supported='no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <vmcoreinfo supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <genid supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <backingStoreInput supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <backup supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <async-teardown supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <ps2 supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <sev supported='no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <sgx supported='no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <hyperv supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='features'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>relaxed</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vapic</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>spinlocks</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vpindex</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>runtime</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>synic</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>stimer</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>reset</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vendor_id</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>frequencies</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>reenlightenment</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tlbflush</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>ipi</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>avic</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>emsr_bitmap</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>xmm_input</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <defaults>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <spinlocks>4095</spinlocks>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <stimer_direct>on</stimer_direct>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </defaults>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </hyperv>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <launchSecurity supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='sectype'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tdx</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </launchSecurity>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </features>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: </domainCapabilities>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.006 282197 DEBUG nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: <domainCapabilities>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <domain>kvm</domain>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <arch>i686</arch>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <vcpu max='240'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <iothreads supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <os supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <enum name='firmware'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <loader supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>rom</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pflash</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='readonly'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>yes</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>no</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='secure'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>no</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </loader>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </os>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <cpu>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>on</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>off</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='maximum' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='maximumMigratable'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>on</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>off</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='host-model' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <vendor>AMD</vendor>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='x2apic'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='stibp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='ssbd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='succor'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='ibrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='lbrv'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='custom' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cooperlake'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cooperlake-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cooperlake-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Dhyana-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Genoa'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amd-psfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='auto-ibrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='stibp-always-on'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amd-psfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='auto-ibrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='stibp-always-on'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Milan'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amd-psfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='stibp-always-on'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='GraniteRapids'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='prefetchiti'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='prefetchiti'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10-128'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10-256'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10-512'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='prefetchiti'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='KnightsMill'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512er'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512pf'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='KnightsMill-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512er'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512pf'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tbm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tbm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SierraForest'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cmpccxadd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SierraForest-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cmpccxadd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='athlon'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='athlon-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='core2duo'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='core2duo-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='coreduo'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='coreduo-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='n270'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='n270-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='phenom'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='phenom-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </cpu>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <memoryBacking supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <enum name='sourceType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>file</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>anonymous</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>memfd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </memoryBacking>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <devices>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <disk supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='diskDevice'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>disk</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>cdrom</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>floppy</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>lun</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='bus'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>ide</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>fdc</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>scsi</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>usb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>sata</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-non-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </disk>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <graphics supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vnc</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>egl-headless</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>dbus</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </graphics>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <video supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='modelType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vga</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>cirrus</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>none</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>bochs</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>ramfb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </video>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <hostdev supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='mode'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>subsystem</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='startupPolicy'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>default</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>mandatory</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>requisite</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>optional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='subsysType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>usb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pci</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>scsi</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='capsType'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='pciBackend'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </hostdev>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <rng supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-non-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendModel'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>random</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>egd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>builtin</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </rng>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <filesystem supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='driverType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>path</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>handle</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtiofs</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </filesystem>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <tpm supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tpm-tis</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tpm-crb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendModel'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>emulator</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>external</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendVersion'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>2.0</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </tpm>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <redirdev supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='bus'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>usb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </redirdev>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <channel supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pty</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>unix</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </channel>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <crypto supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>qemu</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendModel'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>builtin</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </crypto>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <interface supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>default</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>passt</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </interface>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <panic supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>isa</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>hyperv</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </panic>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <console supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>null</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vc</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pty</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>dev</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>file</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pipe</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>stdio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>udp</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tcp</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>unix</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>qemu-vdagent</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>dbus</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </console>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </devices>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <features>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <gic supported='no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <vmcoreinfo supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <genid supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <backingStoreInput supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <backup supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <async-teardown supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <ps2 supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <sev supported='no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <sgx supported='no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <hyperv supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='features'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>relaxed</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vapic</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>spinlocks</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vpindex</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>runtime</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>synic</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>stimer</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>reset</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vendor_id</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>frequencies</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>reenlightenment</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tlbflush</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>ipi</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>avic</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>emsr_bitmap</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>xmm_input</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <defaults>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <spinlocks>4095</spinlocks>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <stimer_direct>on</stimer_direct>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </defaults>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </hyperv>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <launchSecurity supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='sectype'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tdx</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </launchSecurity>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </features>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: </domainCapabilities>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.032 282197 DEBUG nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.034 282197 DEBUG nova.virt.libvirt.volume.mount [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.037 282197 DEBUG nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: <domainCapabilities>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <domain>kvm</domain>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <arch>x86_64</arch>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <vcpu max='1024'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <iothreads supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <os supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <enum name='firmware'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>efi</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <loader supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>rom</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pflash</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='readonly'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>yes</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>no</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='secure'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>yes</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>no</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </loader>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </os>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <cpu>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>on</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>off</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='maximum' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='maximumMigratable'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>on</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>off</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='host-model' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <vendor>AMD</vendor>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='x2apic'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='stibp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='ssbd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='succor'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='ibrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='lbrv'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='custom' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cooperlake'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cooperlake-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cooperlake-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Dhyana-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Genoa'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amd-psfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='auto-ibrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='stibp-always-on'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amd-psfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='auto-ibrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='stibp-always-on'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Milan'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amd-psfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='stibp-always-on'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='GraniteRapids'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='prefetchiti'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='prefetchiti'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10-128'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10-256'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10-512'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='prefetchiti'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='KnightsMill'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512er'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512pf'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='KnightsMill-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512er'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512pf'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tbm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tbm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SierraForest'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cmpccxadd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SierraForest-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cmpccxadd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='athlon'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='athlon-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='core2duo'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='core2duo-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='coreduo'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='coreduo-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='n270'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='n270-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='phenom'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='phenom-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </cpu>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <memoryBacking supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <enum name='sourceType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>file</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>anonymous</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>memfd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </memoryBacking>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <devices>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <disk supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='diskDevice'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>disk</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>cdrom</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>floppy</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>lun</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='bus'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>fdc</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>scsi</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>usb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>sata</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-non-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </disk>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <graphics supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vnc</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>egl-headless</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>dbus</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </graphics>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <video supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='modelType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vga</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>cirrus</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>none</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>bochs</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>ramfb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </video>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <hostdev supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='mode'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>subsystem</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='startupPolicy'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>default</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>mandatory</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>requisite</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>optional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='subsysType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>usb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pci</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>scsi</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='capsType'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='pciBackend'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </hostdev>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <rng supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-non-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendModel'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>random</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>egd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>builtin</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </rng>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <filesystem supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='driverType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>path</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>handle</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtiofs</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </filesystem>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <tpm supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tpm-tis</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tpm-crb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendModel'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>emulator</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>external</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendVersion'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>2.0</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </tpm>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <redirdev supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='bus'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>usb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </redirdev>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <channel supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pty</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>unix</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </channel>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <crypto supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>qemu</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendModel'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>builtin</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </crypto>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <interface supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>default</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>passt</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </interface>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <panic supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>isa</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>hyperv</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </panic>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <console supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>null</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vc</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pty</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>dev</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>file</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pipe</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>stdio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>udp</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tcp</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>unix</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>qemu-vdagent</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>dbus</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </console>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </devices>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <features>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <gic supported='no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <vmcoreinfo supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <genid supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <backingStoreInput supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <backup supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <async-teardown supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <ps2 supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <sev supported='no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <sgx supported='no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <hyperv supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='features'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>relaxed</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vapic</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>spinlocks</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vpindex</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>runtime</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>synic</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>stimer</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>reset</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vendor_id</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>frequencies</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>reenlightenment</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tlbflush</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>ipi</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>avic</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>emsr_bitmap</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>xmm_input</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <defaults>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <spinlocks>4095</spinlocks>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <stimer_direct>on</stimer_direct>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </defaults>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </hyperv>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <launchSecurity supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='sectype'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tdx</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </launchSecurity>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </features>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: </domainCapabilities>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.090 282197 DEBUG nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: <domainCapabilities>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <domain>kvm</domain>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <arch>x86_64</arch>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <vcpu max='240'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <iothreads supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <os supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <enum name='firmware'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <loader supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>rom</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pflash</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='readonly'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>yes</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>no</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='secure'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>no</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </loader>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </os>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <cpu>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='host-passthrough' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='hostPassthroughMigratable'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>on</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>off</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='maximum' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='maximumMigratable'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>on</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>off</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='host-model' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <vendor>AMD</vendor>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='x2apic'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='hypervisor'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='stibp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='ssbd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='overflow-recov'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='succor'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='ibrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='lbrv'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='tsc-scale'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='pause-filter'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='pfthreshold'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <feature policy='disable' name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <mode name='custom' supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Broadwell-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cooperlake'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cooperlake-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Cooperlake-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Denverton-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Dhyana-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Genoa'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amd-psfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='auto-ibrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='stibp-always-on'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amd-psfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='auto-ibrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='stibp-always-on'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Milan'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Milan-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Milan-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amd-psfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='no-nested-data-bp'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='null-sel-clr-base'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='stibp-always-on'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-Rome-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='EPYC-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='GraniteRapids'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='prefetchiti'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='GraniteRapids-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='prefetchiti'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='GraniteRapids-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10-128'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10-256'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx10-512'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='prefetchiti'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Haswell-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v6'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Icelake-Server-v7'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='IvyBridge-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='KnightsMill'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512er'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512pf'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='KnightsMill-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4fmaps'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-4vnniw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512er'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512pf'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G4-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tbm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Opteron_G5-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fma4'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tbm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xop'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SapphireRapids-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='amx-tile'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-bf16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-fp16'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512-vpopcntdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bitalg'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vbmi2'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrc'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fzrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='la57'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='taa-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='tsx-ldtrk'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xfd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SierraForest'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cmpccxadd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='SierraForest-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ifma'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-ne-convert'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx-vnni-int8'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='bus-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cmpccxadd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fbsdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='fsrs'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ibrs-all'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mcdt-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pbrsb-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='psdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='sbdr-ssdp-no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='serialize'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vaes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='vpclmulqdq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Client-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='hle'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='rtm'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Skylake-Server-v5'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512bw'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512cd'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512dq'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512f'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='avx512vl'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='invpcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pcid'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='pku'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='mpx'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v2'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v3'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='core-capability'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='split-lock-detect'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='Snowridge-v4'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='cldemote'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='erms'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='gfni'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdir64b'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='movdiri'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='xsaves'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='athlon'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='athlon-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='core2duo'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='core2duo-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='coreduo'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='coreduo-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='n270'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='n270-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='ss'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='phenom'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <blockers model='phenom-v1'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnow'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <feature name='3dnowext'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </blockers>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </mode>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </cpu>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <memoryBacking supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <enum name='sourceType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>file</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>anonymous</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <value>memfd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </memoryBacking>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <devices>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <disk supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='diskDevice'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>disk</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>cdrom</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>floppy</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>lun</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='bus'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>ide</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>fdc</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>scsi</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>usb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>sata</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-non-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </disk>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <graphics supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vnc</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>egl-headless</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>dbus</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </graphics>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <video supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='modelType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vga</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>cirrus</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>none</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>bochs</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>ramfb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </video>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <hostdev supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='mode'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>subsystem</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='startupPolicy'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>default</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>mandatory</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>requisite</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>optional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='subsysType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>usb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pci</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>scsi</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='capsType'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='pciBackend'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </hostdev>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <rng supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtio-non-transitional</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendModel'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>random</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>egd</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>builtin</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </rng>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <filesystem supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='driverType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>path</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>handle</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>virtiofs</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </filesystem>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <tpm supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tpm-tis</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tpm-crb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendModel'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>emulator</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>external</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendVersion'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>2.0</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </tpm>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <redirdev supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='bus'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>usb</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </redirdev>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <channel supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pty</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>unix</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </channel>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <crypto supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>qemu</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendModel'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>builtin</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </crypto>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <interface supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='backendType'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>default</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>passt</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </interface>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <panic supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='model'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>isa</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>hyperv</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </panic>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <console supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='type'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>null</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vc</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pty</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>dev</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>file</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>pipe</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>stdio</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>udp</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tcp</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>unix</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>qemu-vdagent</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>dbus</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </console>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </devices>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   <features>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <gic supported='no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <vmcoreinfo supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <genid supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <backingStoreInput supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <backup supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <async-teardown supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <ps2 supported='yes'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <sev supported='no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <sgx supported='no'/>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <hyperv supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='features'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>relaxed</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vapic</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>spinlocks</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vpindex</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>runtime</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>synic</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>stimer</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>reset</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>vendor_id</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>frequencies</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>reenlightenment</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tlbflush</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>ipi</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>avic</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>emsr_bitmap</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>xmm_input</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <defaults>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <spinlocks>4095</spinlocks>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <stimer_direct>on</stimer_direct>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <tlbflush_direct>off</tlbflush_direct>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <tlbflush_extended>off</tlbflush_extended>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </defaults>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </hyperv>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     <launchSecurity supported='yes'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       <enum name='sectype'>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:         <value>tdx</value>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:       </enum>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:     </launchSecurity>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:   </features>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: </domainCapabilities>
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.147 282197 DEBUG nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.147 282197 INFO nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Secure Boot support detected
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.150 282197 INFO nova.virt.libvirt.driver [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.150 282197 INFO nova.virt.libvirt.driver [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.163 282197 DEBUG nova.virt.libvirt.driver [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.197 282197 INFO nova.virt.node [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Determined node identity 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from /var/lib/nova/compute_id
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.220 282197 DEBUG nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Verified node 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad matches my host np0005548789.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.267 282197 DEBUG nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.272 282197 DEBUG nova.virt.libvirt.vif [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T08:44:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005548789.localdomain',hostname='test',id=2,image_ref='e0d06706-da90-478a-9829-34b75a3ce049',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-06T08:44:43Z,launched_on='np0005548789.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005548789.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3d603431c0bb4967bafc7a0aa6108bfe',ramdisk_id='',reservation_id='r-02dpupig',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T08:44:43Z,user_data=None,user_id='ff0049f3313348bdb67886d170c1c765',uuid=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.273 282197 DEBUG nova.network.os_vif_util [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Converting VIF {"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.274 282197 DEBUG nova.network.os_vif_util [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.275 282197 DEBUG os_vif [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.319 282197 DEBUG ovsdbapp.backend.ovs_idl [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.319 282197 DEBUG ovsdbapp.backend.ovs_idl [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.319 282197 DEBUG ovsdbapp.backend.ovs_idl [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.320 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.320 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.320 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.321 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.322 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.325 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.342 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.342 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.342 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.343 282197 INFO oslo.privsep.daemon [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpg389lvne/privsep.sock']
Dec 06 09:58:20 np0005548789.localdomain sudo[282366]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uacsyzcnymjqwkhitinwxvlhuhhbsler ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015100.6509812-3935-251419629589052/AnsiballZ_podman_container.py
Dec 06 09:58:20 np0005548789.localdomain sudo[282366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.977 282197 INFO oslo.privsep.daemon [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Spawned new privsep daemon via rootwrap
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.873 282356 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.878 282356 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.882 282356 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec 06 09:58:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:20.882 282356 INFO oslo.privsep.daemon [-] privsep daemon running as pid 282356
Dec 06 09:58:21 np0005548789.localdomain python3.9[282368]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None 
preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.235 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.236 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap86fc0b7a-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.237 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap86fc0b7a-fb, col_values=(('external_ids', {'iface-id': '86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:64:77:f3', 'vm-uuid': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.238 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.238 282197 INFO os_vif [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb')
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.239 282197 DEBUG nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.243 282197 DEBUG nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.244 282197 INFO nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.334 282197 DEBUG oslo_concurrency.lockutils [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.335 282197 DEBUG oslo_concurrency.lockutils [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.335 282197 DEBUG oslo_concurrency.lockutils [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.335 282197 DEBUG nova.compute.resource_tracker [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.336 282197 DEBUG oslo_concurrency.processutils [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:58:21 np0005548789.localdomain systemd[1]: Started libpod-conmon-a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b.scope.
Dec 06 09:58:21 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:58:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ceb10c7340fd1e23819ce0ca67ef154f878e8bf4b0dc6dca13387cdceeb0c7ee/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ceb10c7340fd1e23819ce0ca67ef154f878e8bf4b0dc6dca13387cdceeb0c7ee/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ceb10c7340fd1e23819ce0ca67ef154f878e8bf4b0dc6dca13387cdceeb0c7ee/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:21 np0005548789.localdomain podman[282395]: 2025-12-06 09:58:21.509247798 +0000 UTC m=+0.163644224 container init a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init, config_id=edpm, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec 06 09:58:21 np0005548789.localdomain podman[282395]: 2025-12-06 09:58:21.521069232 +0000 UTC m=+0.175465648 container start a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:58:21 np0005548789.localdomain python3.9[282368]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Applying nova statedir ownership
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/b7ed0a2e-9350-4933-9334-4e5e08d3e6aa already 42436:42436
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/b7ed0a2e-9350-4933-9334-4e5e08d3e6aa to system_u:object_r:container_file_t:s0
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/console.log
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/55d01870b6a0ce0995b6b5844cf47638cdf46fbf
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-55d01870b6a0ce0995b6b5844cf47638cdf46fbf
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/20273498b7380904530133bcb3f720bd45f4f00b810dc4597d81d23acd8f9673
Dec 06 09:58:21 np0005548789.localdomain nova_compute_init[282434]: INFO:nova_statedir:Nova statedir ownership complete
Dec 06 09:58:21 np0005548789.localdomain systemd[1]: libpod-a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b.scope: Deactivated successfully.
Dec 06 09:58:21 np0005548789.localdomain podman[282435]: 2025-12-06 09:58:21.59356329 +0000 UTC m=+0.055397151 container died a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 09:58:21 np0005548789.localdomain podman[282446]: 2025-12-06 09:58:21.679663369 +0000 UTC m=+0.081038243 container cleanup a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Dec 06 09:58:21 np0005548789.localdomain systemd[1]: libpod-conmon-a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b.scope: Deactivated successfully.
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.773 282197 DEBUG oslo_concurrency.processutils [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:58:21 np0005548789.localdomain sudo[282366]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.871 282197 DEBUG nova.virt.libvirt.driver [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:58:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:21.872 282197 DEBUG nova.virt.libvirt.driver [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.072 282197 WARNING nova.virt.libvirt.driver [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.073 282197 DEBUG nova.compute.resource_tracker [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11849MB free_disk=41.83721923828125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.074 282197 DEBUG oslo_concurrency.lockutils [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.074 282197 DEBUG oslo_concurrency.lockutils [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.268 282197 DEBUG nova.compute.resource_tracker [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.268 282197 DEBUG nova.compute.resource_tracker [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.269 282197 DEBUG nova.compute.resource_tracker [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.344 282197 DEBUG nova.scheduler.client.report [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Refreshing inventories for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.394 282197 DEBUG nova.scheduler.client.report [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Updating ProviderTree inventory for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.394 282197 DEBUG nova.compute.provider_tree [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Updating inventory in ProviderTree for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 09:58:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ceb10c7340fd1e23819ce0ca67ef154f878e8bf4b0dc6dca13387cdceeb0c7ee-merged.mount: Deactivated successfully.
Dec 06 09:58:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a90088f7335f0424abe9208b181fba6d3fc6d1408325e575f0ba866a5d87ad9b-userdata-shm.mount: Deactivated successfully.
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.409 282197 DEBUG nova.scheduler.client.report [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Refreshing aggregate associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 09:58:22 np0005548789.localdomain sshd[263860]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:58:22 np0005548789.localdomain systemd[1]: session-60.scope: Deactivated successfully.
Dec 06 09:58:22 np0005548789.localdomain systemd[1]: session-60.scope: Consumed 1min 30.613s CPU time.
Dec 06 09:58:22 np0005548789.localdomain systemd-logind[766]: Session 60 logged out. Waiting for processes to exit.
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.434 282197 DEBUG nova.scheduler.client.report [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Refreshing trait associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, traits: HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AMD_SVM,HW_CPU_X86_FMA3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 09:58:22 np0005548789.localdomain systemd-logind[766]: Removed session 60.
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.470 282197 DEBUG oslo_concurrency.processutils [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.908 282197 DEBUG oslo_concurrency.processutils [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.914 282197 DEBUG nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.915 282197 INFO nova.virt.libvirt.host [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] kernel doesn't support AMD SEV
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.917 282197 DEBUG nova.compute.provider_tree [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.917 282197 DEBUG nova.virt.libvirt.driver [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 06 09:58:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:22.958 282197 DEBUG nova.scheduler.client.report [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:58:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:23.021 282197 DEBUG nova.compute.resource_tracker [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:58:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:23.022 282197 DEBUG oslo_concurrency.lockutils [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.948s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:58:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:23.022 282197 DEBUG nova.service [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 06 09:58:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:23.056 282197 DEBUG nova.service [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 06 09:58:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:23.056 282197 DEBUG nova.servicegroup.drivers.db [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] DB_Driver: join new ServiceGroup member np0005548789.localdomain to the compute group, service = <Service: host=np0005548789.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 06 09:58:23 np0005548789.localdomain podman[241090]: time="2025-12-06T09:58:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:58:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:58:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150643 "" "Go-http-client/1.1"
Dec 06 09:58:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:23.950 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:58:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17716 "" "Go-http-client/1.1"
Dec 06 09:58:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:25.324 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:58:25 np0005548789.localdomain podman[282512]: 2025-12-06 09:58:25.939536603 +0000 UTC m=+0.092496776 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 09:58:26 np0005548789.localdomain podman[282512]: 2025-12-06 09:58:26.050553521 +0000 UTC m=+0.203513664 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 06 09:58:26 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:58:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:28.952 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:30.327 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:32 np0005548789.localdomain sshd[282538]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:58:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:58:32 np0005548789.localdomain podman[282540]: 2025-12-06 09:58:32.923511523 +0000 UTC m=+0.082695194 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 09:58:32 np0005548789.localdomain podman[282540]: 2025-12-06 09:58:32.9285901 +0000 UTC m=+0.087773821 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:58:32 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:58:33 np0005548789.localdomain sshd[282558]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:58:33 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8251 DF PROTO=TCP SPT=50026 DPT=9102 SEQ=3864606856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E5C9970000000001030307) 
Dec 06 09:58:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:33.955 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:34 np0005548789.localdomain sshd[282538]: Received disconnect from 118.219.234.233 port 39528:11: Bye Bye [preauth]
Dec 06 09:58:34 np0005548789.localdomain sshd[282538]: Disconnected from authenticating user root 118.219.234.233 port 39528 [preauth]
Dec 06 09:58:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8252 DF PROTO=TCP SPT=50026 DPT=9102 SEQ=3864606856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E5CDAF0000000001030307) 
Dec 06 09:58:35 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6961 DF PROTO=TCP SPT=55472 DPT=9102 SEQ=2212927204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E5CFEF0000000001030307) 
Dec 06 09:58:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:35.328 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:36 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8253 DF PROTO=TCP SPT=50026 DPT=9102 SEQ=3864606856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E5D5AF0000000001030307) 
Dec 06 09:58:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:58:36 np0005548789.localdomain podman[282560]: 2025-12-06 09:58:36.920870564 +0000 UTC m=+0.082118956 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:58:36 np0005548789.localdomain podman[282560]: 2025-12-06 09:58:36.929898262 +0000 UTC m=+0.091146634 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:58:36 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 09:58:37 np0005548789.localdomain sshd[282558]: Received disconnect from 179.33.210.213 port 52690:11: Bye Bye [preauth]
Dec 06 09:58:37 np0005548789.localdomain sshd[282558]: Disconnected from authenticating user root 179.33.210.213 port 52690 [preauth]
Dec 06 09:58:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1218 DF PROTO=TCP SPT=55404 DPT=9102 SEQ=3800488960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E5D9EF0000000001030307) 
Dec 06 09:58:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:58:37.914 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 09:58:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:58:37.915 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 09:58:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:37.914 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:38.958 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:40.331 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8254 DF PROTO=TCP SPT=50026 DPT=9102 SEQ=3864606856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E5E56F0000000001030307) 
Dec 06 09:58:41 np0005548789.localdomain sudo[282583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:58:41 np0005548789.localdomain sudo[282583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:58:41 np0005548789.localdomain sudo[282583]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:41 np0005548789.localdomain sudo[282601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:58:41 np0005548789.localdomain sudo[282601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:58:42 np0005548789.localdomain sudo[282601]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:42 np0005548789.localdomain sshd[282651]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:58:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:58:42 np0005548789.localdomain sshd[282651]: Received disconnect from 64.227.102.57 port 34666:11: Bye Bye [preauth]
Dec 06 09:58:42 np0005548789.localdomain sshd[282651]: Disconnected from authenticating user root 64.227.102.57 port 34666 [preauth]
Dec 06 09:58:42 np0005548789.localdomain podman[282653]: 2025-12-06 09:58:42.937709867 +0000 UTC m=+0.088747201 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 09:58:42 np0005548789.localdomain podman[282653]: 2025-12-06 09:58:42.974379169 +0000 UTC m=+0.125416503 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 09:58:42 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:58:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:58:43 np0005548789.localdomain podman[282672]: 2025-12-06 09:58:43.091709211 +0000 UTC m=+0.075455680 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, architecture=x86_64, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 09:58:43 np0005548789.localdomain podman[282672]: 2025-12-06 09:58:43.106036164 +0000 UTC m=+0.089782613 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7)
Dec 06 09:58:43 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:58:43 np0005548789.localdomain sudo[282693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:58:43 np0005548789.localdomain sudo[282693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:58:43 np0005548789.localdomain sudo[282693]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:43.962 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:44 np0005548789.localdomain sshd[282711]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:58:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:45.333 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:45 np0005548789.localdomain sshd[282711]: Received disconnect from 154.113.10.34 port 49442:11: Bye Bye [preauth]
Dec 06 09:58:45 np0005548789.localdomain sshd[282711]: Disconnected from authenticating user root 154.113.10.34 port 49442 [preauth]
Dec 06 09:58:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:58:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:58:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:58:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:58:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:58:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:58:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:58:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:58:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:58:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:58:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:58:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:58:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:58:46 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:58:46.917 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:58:46 np0005548789.localdomain podman[282713]: 2025-12-06 09:58:46.931851239 +0000 UTC m=+0.093638452 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 09:58:46 np0005548789.localdomain podman[282713]: 2025-12-06 09:58:46.944145968 +0000 UTC m=+0.105933131 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 09:58:46 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:58:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:58:47.289 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:58:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:58:47.289 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:58:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:58:47.291 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:58:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:48.965 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8255 DF PROTO=TCP SPT=50026 DPT=9102 SEQ=3864606856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E605EF0000000001030307) 
Dec 06 09:58:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:50.334 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:58:50 np0005548789.localdomain podman[282731]: 2025-12-06 09:58:50.919389756 +0000 UTC m=+0.081553328 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:58:50 np0005548789.localdomain podman[282731]: 2025-12-06 09:58:50.933235954 +0000 UTC m=+0.095399516 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:58:50 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:58:53 np0005548789.localdomain podman[241090]: time="2025-12-06T09:58:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:58:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:58:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150643 "" "Go-http-client/1.1"
Dec 06 09:58:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:53.969 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:58:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17728 "" "Go-http-client/1.1"
Dec 06 09:58:54 np0005548789.localdomain sshd[282755]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:58:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:55.059 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:55.337 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:58:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:55.518 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Triggering sync for uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 06 09:58:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:55.519 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:58:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:55.519 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:58:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:55.519 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:58:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:55.567 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.048s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:58:56 np0005548789.localdomain sshd[282755]: Received disconnect from 14.194.101.210 port 41728:11: Bye Bye [preauth]
Dec 06 09:58:56 np0005548789.localdomain sshd[282755]: Disconnected from authenticating user root 14.194.101.210 port 41728 [preauth]
Dec 06 09:58:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:58:56 np0005548789.localdomain systemd[1]: tmp-crun.j9TwJW.mount: Deactivated successfully.
Dec 06 09:58:56 np0005548789.localdomain podman[282757]: 2025-12-06 09:58:56.453410266 +0000 UTC m=+0.088803003 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 09:58:56 np0005548789.localdomain podman[282757]: 2025-12-06 09:58:56.490132809 +0000 UTC m=+0.125525526 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2)
Dec 06 09:58:56 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:58:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:56.950 282197 DEBUG nova.compute.manager [None req-ac182712-08dd-46f8-8abb-4b803f552cb2 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:58:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:56.954 282197 INFO nova.compute.manager [None req-ac182712-08dd-46f8-8abb-4b803f552cb2 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Retrieving diagnostics
Dec 06 09:58:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:58:58.970 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:00.338 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:03.267 282197 DEBUG oslo_concurrency.lockutils [None req-9994761a-656d-44be-9816-3ec8b7c0a5d2 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Acquiring lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:59:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:03.268 282197 DEBUG oslo_concurrency.lockutils [None req-9994761a-656d-44be-9816-3ec8b7c0a5d2 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:59:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:03.268 282197 DEBUG nova.compute.manager [None req-9994761a-656d-44be-9816-3ec8b7c0a5d2 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:59:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:03.273 282197 DEBUG nova.compute.manager [None req-9994761a-656d-44be-9816-3ec8b7c0a5d2 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Dec 06 09:59:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:03.278 282197 DEBUG nova.objects.instance [None req-9994761a-656d-44be-9816-3ec8b7c0a5d2 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Lazy-loading 'flavor' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:59:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:03.323 282197 DEBUG nova.virt.libvirt.driver [None req-9994761a-656d-44be-9816-3ec8b7c0a5d2 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Dec 06 09:59:03 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46738 DF PROTO=TCP SPT=54848 DPT=9102 SEQ=227793754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E63EC70000000001030307) 
Dec 06 09:59:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:59:03 np0005548789.localdomain podman[282781]: 2025-12-06 09:59:03.911597285 +0000 UTC m=+0.079253018 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 09:59:03 np0005548789.localdomain podman[282781]: 2025-12-06 09:59:03.9422102 +0000 UTC m=+0.109866003 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 06 09:59:03 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:59:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:03.979 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46739 DF PROTO=TCP SPT=54848 DPT=9102 SEQ=227793754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E642EF0000000001030307) 
Dec 06 09:59:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:05.340 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:05 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8256 DF PROTO=TCP SPT=50026 DPT=9102 SEQ=3864606856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E645EF0000000001030307) 
Dec 06 09:59:05 np0005548789.localdomain kernel: device tap86fc0b7a-fb left promiscuous mode
Dec 06 09:59:05 np0005548789.localdomain NetworkManager[5973]: <info>  [1765015145.7730] device (tap86fc0b7a-fb): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Dec 06 09:59:05 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:59:05Z|00052|binding|INFO|Releasing lport 86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b from this chassis (sb_readonly=0)
Dec 06 09:59:05 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:59:05Z|00053|binding|INFO|Setting lport 86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b down in Southbound
Dec 06 09:59:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:05.780 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:05 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:59:05Z|00054|binding|INFO|Removing iface tap86fc0b7a-fb ovn-installed in OVS
Dec 06 09:59:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:05.782 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:05.791 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:05 np0005548789.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec 06 09:59:05 np0005548789.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 3min 49.447s CPU time.
Dec 06 09:59:05 np0005548789.localdomain systemd-machined[84444]: Machine qemu-1-instance-00000002 terminated.
Dec 06 09:59:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:05.929 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:77:f3 192.168.0.162'], port_security=['fa:16:3e:64:77:f3 192.168.0.162'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.162/24', 'neutron:device_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005548789.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-652b6bdc-40ce-45b7-8aa5-3bca79987993', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'neutron:revision_number': '7', 'neutron:security_group_ids': '65e67ecb-ffcf-41e6-8b8b-ed491f2580ec 7ce08e20-be94-4509-a371-aa5c036416af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7872d306-938e-4ee0-be61-57ba3983d747, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 09:59:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:05.932 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b in datapath 652b6bdc-40ce-45b7-8aa5-3bca79987993 unbound from our chassis
Dec 06 09:59:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:05.934 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 652b6bdc-40ce-45b7-8aa5-3bca79987993, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 09:59:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:05.935 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[39edd12a-91dc-4645-91d6-31a216bde723]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:05.936 160509 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993 namespace which is not needed anymore
Dec 06 09:59:06 np0005548789.localdomain systemd[1]: libpod-12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445.scope: Deactivated successfully.
Dec 06 09:59:06 np0005548789.localdomain podman[282833]: 2025-12-06 09:59:06.133520607 +0000 UTC m=+0.080613370 container died 12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:59:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:06.162 282197 DEBUG nova.compute.manager [req-d65108c9-0142-483f-8f47-53d9963f6f31 req-60ed4ced-25b4-415d-9320-569c668981a5 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Received event network-vif-unplugged-86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 09:59:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:06.164 282197 DEBUG oslo_concurrency.lockutils [req-d65108c9-0142-483f-8f47-53d9963f6f31 req-60ed4ced-25b4-415d-9320-569c668981a5 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:59:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:06.165 282197 DEBUG oslo_concurrency.lockutils [req-d65108c9-0142-483f-8f47-53d9963f6f31 req-60ed4ced-25b4-415d-9320-569c668981a5 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:59:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:06.165 282197 DEBUG oslo_concurrency.lockutils [req-d65108c9-0142-483f-8f47-53d9963f6f31 req-60ed4ced-25b4-415d-9320-569c668981a5 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:59:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:06.166 282197 DEBUG nova.compute.manager [req-d65108c9-0142-483f-8f47-53d9963f6f31 req-60ed4ced-25b4-415d-9320-569c668981a5 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] No waiting events found dispatching network-vif-unplugged-86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 06 09:59:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:06.166 282197 WARNING nova.compute.manager [req-d65108c9-0142-483f-8f47-53d9963f6f31 req-60ed4ced-25b4-415d-9320-569c668981a5 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Received unexpected event network-vif-unplugged-86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b for instance with vm_state active and task_state powering-off.
Dec 06 09:59:06 np0005548789.localdomain podman[282833]: 2025-12-06 09:59:06.283673623 +0000 UTC m=+0.230766356 container cleanup 12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Dec 06 09:59:06 np0005548789.localdomain podman[282847]: 2025-12-06 09:59:06.298420837 +0000 UTC m=+0.154844770 container cleanup 12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 06 09:59:06 np0005548789.localdomain systemd[1]: libpod-conmon-12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445.scope: Deactivated successfully.
Dec 06 09:59:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:06.345 282197 INFO nova.virt.libvirt.driver [None req-9994761a-656d-44be-9816-3ec8b7c0a5d2 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Instance shutdown successfully after 3 seconds.
Dec 06 09:59:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:06.352 282197 INFO nova.virt.libvirt.driver [-] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Instance destroyed successfully.
Dec 06 09:59:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:06.353 282197 DEBUG nova.objects.instance [None req-9994761a-656d-44be-9816-3ec8b7c0a5d2 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Lazy-loading 'numa_topology' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:59:06 np0005548789.localdomain podman[282864]: 2025-12-06 09:59:06.371904526 +0000 UTC m=+0.067027961 container remove 12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993, vcs-type=git, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Dec 06 09:59:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:06.377 282197 DEBUG nova.compute.manager [None req-9994761a-656d-44be-9816-3ec8b7c0a5d2 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:59:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:06.377 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[1b594e9c-e502-486a-b75e-4758075ccdba]: (4, ('Sat Dec  6 09:59:06 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993 (12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445)\n12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445\nSat Dec  6 09:59:06 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993 (12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445)\n12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:06.383 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[548e6120-0748-4c2a-bfa5-82a9a964071a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:06.388 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap652b6bdc-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:59:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:06.391 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:06 np0005548789.localdomain kernel: device tap652b6bdc-40 left promiscuous mode
Dec 06 09:59:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:06.399 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:06.403 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[f4182768-2be8-4d62-9cb4-9c92c4f47b77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:06.420 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[b7a661a2-56f3-444a-acde-c631a950e5b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:06.421 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[4ad10031-8146-4312-bfdc-f04ca7e6d6d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:06.435 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[13b2df6a-5855-449b-857e-61cbe3aa8012]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 710075, 'reachable_time': 38110, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282883, 'error': None, 'target': 'ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:06.445 160720 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 06 09:59:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:06.446 160720 DEBUG oslo.privsep.daemon [-] privsep: reply[cf46cea4-461b-4221-892f-87a4b245b8fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:06.475 282197 DEBUG oslo_concurrency.lockutils [None req-9994761a-656d-44be-9816-3ec8b7c0a5d2 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:59:06 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46740 DF PROTO=TCP SPT=54848 DPT=9102 SEQ=227793754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E64AF00000000001030307) 
Dec 06 09:59:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:59:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-31fbdb956fdb20faf0121dfd2c519c9e748cc292d5fc54ebad7f5d80f477ded1-merged.mount: Deactivated successfully.
Dec 06 09:59:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-12ba6c9101b1d507aed708fcb0f1f5958064f89edf86d2af4b3dde5856898445-userdata-shm.mount: Deactivated successfully.
Dec 06 09:59:07 np0005548789.localdomain systemd[1]: run-netns-ovnmeta\x2d652b6bdc\x2d40ce\x2d45b7\x2d8aa5\x2d3bca79987993.mount: Deactivated successfully.
Dec 06 09:59:07 np0005548789.localdomain podman[282885]: 2025-12-06 09:59:07.17831114 +0000 UTC m=+0.083335974 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:59:07 np0005548789.localdomain podman[282885]: 2025-12-06 09:59:07.186455202 +0000 UTC m=+0.091480066 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 09:59:07 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 09:59:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6962 DF PROTO=TCP SPT=55472 DPT=9102 SEQ=2212927204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E64DEF0000000001030307) 
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.911 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.912 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.913 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.913 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.914 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.915 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.915 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.916 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.917 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.919 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.919 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.919 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.920 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.921 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.922 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.922 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.923 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.924 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.924 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.925 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.926 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.927 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.928 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.929 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.929 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.930 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.930 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.931 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.932 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.932 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.933 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.933 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.934 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.934 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 09:59:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 09:59:07.935 12 DEBUG ceilometer.compute.pollsters [-] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000002, id=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.204 282197 DEBUG nova.compute.manager [req-b3db8e6d-df2a-4a07-89dc-5e53ac7d2c74 req-50397831-ef50-49d3-b98b-3969cce41c9f 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Received event network-vif-plugged-86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.205 282197 DEBUG oslo_concurrency.lockutils [req-b3db8e6d-df2a-4a07-89dc-5e53ac7d2c74 req-50397831-ef50-49d3-b98b-3969cce41c9f 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.205 282197 DEBUG oslo_concurrency.lockutils [req-b3db8e6d-df2a-4a07-89dc-5e53ac7d2c74 req-50397831-ef50-49d3-b98b-3969cce41c9f 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.205 282197 DEBUG oslo_concurrency.lockutils [req-b3db8e6d-df2a-4a07-89dc-5e53ac7d2c74 req-50397831-ef50-49d3-b98b-3969cce41c9f 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.206 282197 DEBUG nova.compute.manager [req-b3db8e6d-df2a-4a07-89dc-5e53ac7d2c74 req-50397831-ef50-49d3-b98b-3969cce41c9f 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] No waiting events found dispatching network-vif-plugged-86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.206 282197 WARNING nova.compute.manager [req-b3db8e6d-df2a-4a07-89dc-5e53ac7d2c74 req-50397831-ef50-49d3-b98b-3969cce41c9f 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Received unexpected event network-vif-plugged-86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b for instance with vm_state stopped and task_state None.
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.713 282197 DEBUG nova.compute.manager [None req-0127c97c-bea6-4504-bf33-a61bf9bd186a ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server [None req-0127c97c-bea6-4504-bf33-a61bf9bd186a ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server     raise self.value
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server     compute_utils.add_instance_fault_from_exc(context,
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server     raise self.value
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server     raise exception.InstanceInvalidState(
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Dec 06 09:59:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:08.740 282197 ERROR oslo_messaging.rpc.server 
Dec 06 09:59:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:09.008 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:10.341 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46741 DF PROTO=TCP SPT=54848 DPT=9102 SEQ=227793754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E65AAF0000000001030307) 
Dec 06 09:59:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:59:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:59:13 np0005548789.localdomain podman[282909]: 2025-12-06 09:59:13.913673015 +0000 UTC m=+0.069923410 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:59:13 np0005548789.localdomain podman[282909]: 2025-12-06 09:59:13.923192329 +0000 UTC m=+0.079405713 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:59:13 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:59:13 np0005548789.localdomain systemd[1]: tmp-crun.2ZLy1p.mount: Deactivated successfully.
Dec 06 09:59:13 np0005548789.localdomain podman[282908]: 2025-12-06 09:59:13.989006 +0000 UTC m=+0.146855354 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 09:59:14 np0005548789.localdomain podman[282908]: 2025-12-06 09:59:14.00519206 +0000 UTC m=+0.163041794 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, config_id=edpm, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 09:59:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:14.011 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:14 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:59:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:15.342 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:59:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:59:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:59:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:59:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:59:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:59:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:59:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:59:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:59:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:59:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:59:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:59:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:59:17 np0005548789.localdomain systemd[1]: tmp-crun.43vKQk.mount: Deactivated successfully.
Dec 06 09:59:17 np0005548789.localdomain podman[282947]: 2025-12-06 09:59:17.729519363 +0000 UTC m=+0.066088942 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 09:59:17 np0005548789.localdomain podman[282947]: 2025-12-06 09:59:17.771151627 +0000 UTC m=+0.107721176 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 09:59:17 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:59:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:19.053 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:19.213 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:19.213 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:19.214 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 09:59:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:19.214 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 09:59:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46742 DF PROTO=TCP SPT=54848 DPT=9102 SEQ=227793754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E67BEF0000000001030307) 
Dec 06 09:59:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:20.343 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:20.711 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:59:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:20.711 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:59:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:20.712 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 09:59:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:20.713 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.018 282197 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765015146.0170066, b7ed0a2e-9350-4933-9334-4e5e08d3e6aa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.019 282197 INFO nova.compute.manager [-] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] VM Stopped (Lifecycle Event)
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.150 282197 DEBUG nova.compute.manager [None req-e2983e19-cca8-48fc-8597-763cb6b84e91 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.153 282197 DEBUG nova.compute.manager [None req-e2983e19-cca8-48fc-8597-763cb6b84e91 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.700 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.730 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.730 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.731 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.732 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.732 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.733 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.733 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.734 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.734 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.734 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.750 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.750 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.751 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.751 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 09:59:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:21.752 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:59:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:59:21 np0005548789.localdomain sshd[275849]: fatal: Timeout before authentication for 45.78.222.162 port 35134
Dec 06 09:59:21 np0005548789.localdomain podman[282967]: 2025-12-06 09:59:21.922888704 +0000 UTC m=+0.087354637 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 09:59:21 np0005548789.localdomain podman[282967]: 2025-12-06 09:59:21.930142628 +0000 UTC m=+0.094608581 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 09:59:21 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:59:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:22.229 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:59:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:22.313 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:59:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:22.313 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:59:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:22.522 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:59:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:22.523 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=12294MB free_disk=41.837059020996094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 09:59:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:22.524 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:59:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:22.524 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:59:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:22.627 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 09:59:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:22.628 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 09:59:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:22.628 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 09:59:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:22.686 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:59:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:23.157 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:59:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:23.164 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 09:59:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:23.185 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 09:59:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:23.212 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 09:59:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:23.212 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:59:23 np0005548789.localdomain podman[241090]: time="2025-12-06T09:59:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:59:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:59:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148368 "" "Go-http-client/1.1"
Dec 06 09:59:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:59:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17247 "" "Go-http-client/1.1"
Dec 06 09:59:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:24.056 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:25.346 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:59:26 np0005548789.localdomain podman[283033]: 2025-12-06 09:59:26.919581206 +0000 UTC m=+0.081135716 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true)
Dec 06 09:59:26 np0005548789.localdomain podman[283033]: 2025-12-06 09:59:26.952868553 +0000 UTC m=+0.114423003 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 09:59:26 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.760 282197 DEBUG nova.compute.manager [None req-52434a90-0e32-4809-b1dd-44e10953cee5 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server [None req-52434a90-0e32-4809-b1dd-44e10953cee5 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server     raise self.value
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server     compute_utils.add_instance_fault_from_exc(context,
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server     raise self.value
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server     raise exception.InstanceInvalidState(
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Dec 06 09:59:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:28.784 282197 ERROR oslo_messaging.rpc.server 
Dec 06 09:59:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:29.093 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:30.349 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:33 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13457 DF PROTO=TCP SPT=54536 DPT=9102 SEQ=735695711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E6B3F60000000001030307) 
Dec 06 09:59:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:33.900 282197 DEBUG nova.objects.instance [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Lazy-loading 'flavor' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:59:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:33.926 282197 DEBUG oslo_concurrency.lockutils [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:59:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:33.927 282197 DEBUG oslo_concurrency.lockutils [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:59:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:33.927 282197 DEBUG nova.network.neutron [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 06 09:59:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:33.928 282197 DEBUG nova.objects.instance [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:59:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:34.138 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13458 DF PROTO=TCP SPT=54536 DPT=9102 SEQ=735695711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E6B7EF0000000001030307) 
Dec 06 09:59:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 09:59:34 np0005548789.localdomain podman[283059]: 2025-12-06 09:59:34.918060884 +0000 UTC m=+0.082181068 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 06 09:59:34 np0005548789.localdomain podman[283059]: 2025-12-06 09:59:34.956381537 +0000 UTC m=+0.120501791 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 06 09:59:34 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.351 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.674 282197 DEBUG nova.network.neutron [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.697 282197 DEBUG oslo_concurrency.lockutils [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.727 282197 INFO nova.virt.libvirt.driver [-] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Instance destroyed successfully.
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.727 282197 DEBUG nova.objects.instance [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Lazy-loading 'numa_topology' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:59:35 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46743 DF PROTO=TCP SPT=54848 DPT=9102 SEQ=227793754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E6BBEF0000000001030307) 
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.742 282197 DEBUG nova.objects.instance [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Lazy-loading 'resources' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.777 282197 DEBUG nova.virt.libvirt.vif [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T08:44:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005548789.localdomain',hostname='test',id=2,image_ref='e0d06706-da90-478a-9829-34b75a3ce049',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-06T08:44:43Z,launched_on='np0005548789.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005548789.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='3d603431c0bb4967bafc7a0aa6108bfe',ramdisk_id='',reservation_id='r-02dpupig',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='e0d06706-da90-478a-9829-34b75a3ce049',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T09:59:06Z,user_data=None,user_id='ff0049f3313348bdb67886d170c1c765',uuid=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.778 282197 DEBUG nova.network.os_vif_util [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Converting VIF {"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.779 282197 DEBUG nova.network.os_vif_util [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.780 282197 DEBUG os_vif [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.782 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.783 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap86fc0b7a-fb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.785 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.788 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.791 282197 INFO os_vif [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb')
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.795 282197 DEBUG nova.virt.libvirt.host [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.795 282197 INFO nova.virt.libvirt.host [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] UEFI support detected
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.803 282197 DEBUG nova.virt.libvirt.driver [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Start _get_guest_xml network_info=[{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=e0d06706-da90-478a-9829-34b75a3ce049,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vda', 'size': 0, 'device_type': 'disk', 'image_id': 'e0d06706-da90-478a-9829-34b75a3ce049'}], 'ephemerals': [{'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_format': None, 'device_name': '/dev/vdb', 'size': 1, 'device_type': 'disk'}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.808 282197 WARNING nova.virt.libvirt.driver [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.810 282197 DEBUG nova.virt.libvirt.host [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Searching host: 'np0005548789.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.811 282197 DEBUG nova.virt.libvirt.host [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.813 282197 DEBUG nova.virt.libvirt.host [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Searching host: 'np0005548789.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.814 282197 DEBUG nova.virt.libvirt.host [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.815 282197 DEBUG nova.virt.libvirt.driver [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.816 282197 DEBUG nova.virt.hardware [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-06T08:43:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='3b9dcd46-fa1b-4714-ba2b-665da2f67af6',id=2,is_public=True,memory_mb=512,name='m1.small',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=e0d06706-da90-478a-9829-34b75a3ce049,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.817 282197 DEBUG nova.virt.hardware [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.817 282197 DEBUG nova.virt.hardware [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.818 282197 DEBUG nova.virt.hardware [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.818 282197 DEBUG nova.virt.hardware [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.819 282197 DEBUG nova.virt.hardware [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.819 282197 DEBUG nova.virt.hardware [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.820 282197 DEBUG nova.virt.hardware [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.820 282197 DEBUG nova.virt.hardware [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.820 282197 DEBUG nova.virt.hardware [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.821 282197 DEBUG nova.virt.hardware [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.821 282197 DEBUG nova.objects.instance [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Lazy-loading 'vcpu_model' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.844 282197 DEBUG nova.privsep.utils [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 06 09:59:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:35.844 282197 DEBUG oslo_concurrency.processutils [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:59:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:59:35Z|00055|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.267 282197 DEBUG oslo_concurrency.processutils [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.268 282197 DEBUG oslo_concurrency.processutils [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.722 282197 DEBUG oslo_concurrency.processutils [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.724 282197 DEBUG nova.virt.libvirt.vif [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T08:44:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005548789.localdomain',hostname='test',id=2,image_ref='e0d06706-da90-478a-9829-34b75a3ce049',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-06T08:44:43Z,launched_on='np0005548789.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005548789.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='3d603431c0bb4967bafc7a0aa6108bfe',ramdisk_id='',reservation_id='r-02dpupig',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='e0d06706-da90-478a-9829-34b75a3ce049',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256
='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T09:59:06Z,user_data=None,user_id='ff0049f3313348bdb67886d170c1c765',uuid=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.725 282197 DEBUG nova.network.os_vif_util [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Converting VIF {"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.725 282197 DEBUG nova.network.os_vif_util [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.727 282197 DEBUG nova.objects.instance [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Lazy-loading 'pci_devices' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.744 282197 DEBUG nova.virt.libvirt.driver [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] End _get_guest_xml xml=<domain type="kvm">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   <uuid>b7ed0a2e-9350-4933-9334-4e5e08d3e6aa</uuid>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   <name>instance-00000002</name>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   <memory>524288</memory>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   <vcpu>1</vcpu>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   <metadata>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <nova:name>test</nova:name>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <nova:creationTime>2025-12-06 09:59:35</nova:creationTime>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <nova:flavor name="m1.small">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <nova:memory>512</nova:memory>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <nova:disk>1</nova:disk>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <nova:swap>0</nova:swap>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <nova:ephemeral>1</nova:ephemeral>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <nova:vcpus>1</nova:vcpus>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       </nova:flavor>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <nova:owner>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <nova:user uuid="ff0049f3313348bdb67886d170c1c765">admin</nova:user>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <nova:project uuid="3d603431c0bb4967bafc7a0aa6108bfe">admin</nova:project>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       </nova:owner>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <nova:root type="image" uuid="e0d06706-da90-478a-9829-34b75a3ce049"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <nova:ports>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <nova:port uuid="86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:           <nova:ip type="fixed" address="192.168.0.162" ipVersion="4"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         </nova:port>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       </nova:ports>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     </nova:instance>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   </metadata>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   <sysinfo type="smbios">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <system>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <entry name="manufacturer">RDO</entry>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <entry name="product">OpenStack Compute</entry>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <entry name="serial">b7ed0a2e-9350-4933-9334-4e5e08d3e6aa</entry>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <entry name="uuid">b7ed0a2e-9350-4933-9334-4e5e08d3e6aa</entry>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <entry name="family">Virtual Machine</entry>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     </system>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   </sysinfo>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   <os>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <type arch="x86_64" machine="pc-q35-rhel9.0.0">hvm</type>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <boot dev="hd"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <smbios mode="sysinfo"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   </os>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   <features>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <acpi/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <apic/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <vmcoreinfo/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   </features>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   <clock offset="utc">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <timer name="pit" tickpolicy="delay"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <timer name="hpet" present="no"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   </clock>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   <cpu mode="host-model" match="exact">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <topology sockets="1" cores="1" threads="1"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   </cpu>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   <devices>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <disk type="network" device="disk">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <driver type="raw" cache="none"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <source protocol="rbd" name="vms/b7ed0a2e-9350-4933-9334-4e5e08d3e6aa_disk">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <host name="172.18.0.103" port="6789"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <host name="172.18.0.105" port="6789"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <host name="172.18.0.104" port="6789"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       </source>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <auth username="openstack">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <secret type="ceph" uuid="1939e851-b10c-5c3b-9bb7-8e7f380233e8"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       </auth>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <target dev="vda" bus="virtio"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     </disk>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <disk type="network" device="disk">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <driver type="raw" cache="none"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <source protocol="rbd" name="vms/b7ed0a2e-9350-4933-9334-4e5e08d3e6aa_disk.eph0">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <host name="172.18.0.103" port="6789"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <host name="172.18.0.105" port="6789"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <host name="172.18.0.104" port="6789"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       </source>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <auth username="openstack">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:         <secret type="ceph" uuid="1939e851-b10c-5c3b-9bb7-8e7f380233e8"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       </auth>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <target dev="vdb" bus="virtio"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     </disk>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <interface type="ethernet">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <mac address="fa:16:3e:64:77:f3"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <model type="virtio"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <driver name="vhost" rx_queue_size="512"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <mtu size="1292"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <target dev="tap86fc0b7a-fb"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     </interface>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <serial type="pty">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <log file="/var/lib/nova/instances/b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/console.log" append="off"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     </serial>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <video>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <model type="virtio"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     </video>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <input type="tablet" bus="usb"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <input type="keyboard" bus="usb"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <rng model="virtio">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <backend model="random">/dev/urandom</backend>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     </rng>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="pci" model="pcie-root-port"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <controller type="usb" index="0"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     <memballoon model="virtio">
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:       <stats period="10"/>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:     </memballoon>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:   </devices>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: </domain>
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.745 282197 DEBUG nova.virt.libvirt.driver [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.746 282197 DEBUG nova.virt.libvirt.driver [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.747 282197 DEBUG nova.virt.libvirt.vif [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-06T08:44:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005548789.localdomain',hostname='test',id=2,image_ref='e0d06706-da90-478a-9829-34b75a3ce049',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-06T08:44:43Z,launched_on='np0005548789.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005548789.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='3d603431c0bb4967bafc7a0aa6108bfe',ramdisk_id='',reservation_id='r-02dpupig',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='e0d06706-da90-478a-9829-34b75a3ce049',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T09:59:06Z,user_data=None,user_id='ff0049f3313348bdb67886d170c1c765',uuid=b7ed0a2e-9350-4933-9334-4e5e08d3e6aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.747 282197 DEBUG nova.network.os_vif_util [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Converting VIF {"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.748 282197 DEBUG nova.network.os_vif_util [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.748 282197 DEBUG os_vif [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.749 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.749 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.749 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.752 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.752 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap86fc0b7a-fb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:59:36 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13459 DF PROTO=TCP SPT=54536 DPT=9102 SEQ=735695711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E6BFEF0000000001030307) 
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.752 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap86fc0b7a-fb, col_values=(('external_ids', {'iface-id': '86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:64:77:f3', 'vm-uuid': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.754 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.756 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.760 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.760 282197 INFO os_vif [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:77:f3,bridge_name='br-int',has_traffic_filtering=True,id=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b,network=Network(652b6bdc-40ce-45b7-8aa5-3bca79987993),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86fc0b7a-fb')
Dec 06 09:59:36 np0005548789.localdomain systemd[1]: Started libvirt secret daemon.
Dec 06 09:59:36 np0005548789.localdomain kernel: device tap86fc0b7a-fb entered promiscuous mode
Dec 06 09:59:36 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:59:36Z|00056|binding|INFO|Claiming lport 86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b for this chassis.
Dec 06 09:59:36 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:59:36Z|00057|binding|INFO|86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b: Claiming fa:16:3e:64:77:f3 192.168.0.162
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.871 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:36 np0005548789.localdomain NetworkManager[5973]: <info>  [1765015176.8739] manager: (tap86fc0b7a-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/16)
Dec 06 09:59:36 np0005548789.localdomain systemd-udevd[283151]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 09:59:36 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:59:36Z|00058|binding|INFO|Setting lport 86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b ovn-installed in OVS
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.888 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:36 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:59:36Z|00059|binding|INFO|Setting lport 86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b up in Southbound
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.889 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.889 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:36 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:36.887 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:77:f3 192.168.0.162'], port_security=['fa:16:3e:64:77:f3 192.168.0.162'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.162/24', 'neutron:device_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-652b6bdc-40ce-45b7-8aa5-3bca79987993', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'neutron:revision_number': '8', 'neutron:security_group_ids': '65e67ecb-ffcf-41e6-8b8b-ed491f2580ec 7ce08e20-be94-4509-a371-aa5c036416af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7872d306-938e-4ee0-be61-57ba3983d747, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 09:59:36 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:36.890 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b in datapath 652b6bdc-40ce-45b7-8aa5-3bca79987993 bound to our chassis
Dec 06 09:59:36 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:36.892 160509 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 652b6bdc-40ce-45b7-8aa5-3bca79987993
Dec 06 09:59:36 np0005548789.localdomain NetworkManager[5973]: <info>  [1765015176.8987] device (tap86fc0b7a-fb): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 06 09:59:36 np0005548789.localdomain NetworkManager[5973]: <info>  [1765015176.8994] device (tap86fc0b7a-fb): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Dec 06 09:59:36 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:36.901 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[4477694e-f633-4a6e-896d-8f816e3a3a80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:36 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:36.903 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap652b6bdc-41 in ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 06 09:59:36 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:36.904 160674 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap652b6bdc-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 06 09:59:36 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:36.905 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[93134fe5-4989-4f27-bbb8-bf0289360e0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:36 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:36.906 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[7a81297a-5cb9-4c25-9050-8d2de27fa905]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:36 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:36.919 160720 DEBUG oslo.privsep.daemon [-] privsep: reply[1e61e7b0-bc1a-44af-a307-cb90a96c3609]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.923 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:36.931 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:36 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:36.933 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[bb124a56-b3d0-435e-8e6a-9e2d1de501de]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:36 np0005548789.localdomain systemd-machined[84444]: New machine qemu-2-instance-00000002.
Dec 06 09:59:36 np0005548789.localdomain systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Dec 06 09:59:36 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:36.961 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[62dfe517-ba9a-440a-9031-c1104aeb992f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:36 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:36.968 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5b3ada8c-5467-496d-9efd-5fe02cec1b18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:36 np0005548789.localdomain NetworkManager[5973]: <info>  [1765015176.9704] manager: (tap652b6bdc-40): new Veth device (/org/freedesktop/NetworkManager/Devices/17)
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.001 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[8868444f-3a65-4b55-adc2-911de08d4d5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.006 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb230e0-83b3-437a-9fc0-7aded87f7305]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:37 np0005548789.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap652b6bdc-41: link becomes ready
Dec 06 09:59:37 np0005548789.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap652b6bdc-40: link becomes ready
Dec 06 09:59:37 np0005548789.localdomain NetworkManager[5973]: <info>  [1765015177.0270] device (tap652b6bdc-40): carrier: link connected
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.032 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[1065fe97-5b45-4138-be02-3423e131d2dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.052 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[75949468-8ef4-4f8b-859d-b99db8ac5f35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap652b6bdc-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:b4:a7:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1159520, 'reachable_time': 26990, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283188, 'error': None, 'target': 'ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.065 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[61f3dcf6-7a49-4e0c-ae38-43a51e2f640f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb4:a70c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1159520, 'tstamp': 1159520}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 283189, 'error': None, 'target': 'ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.065 282197 DEBUG nova.compute.manager [req-8684fbc5-7c31-402f-831f-167241e84181 req-802597e0-7f32-4c4c-9049-f6b1a959ea2d 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Received event network-vif-plugged-86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.066 282197 DEBUG oslo_concurrency.lockutils [req-8684fbc5-7c31-402f-831f-167241e84181 req-802597e0-7f32-4c4c-9049-f6b1a959ea2d 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.067 282197 DEBUG oslo_concurrency.lockutils [req-8684fbc5-7c31-402f-831f-167241e84181 req-802597e0-7f32-4c4c-9049-f6b1a959ea2d 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.067 282197 DEBUG oslo_concurrency.lockutils [req-8684fbc5-7c31-402f-831f-167241e84181 req-802597e0-7f32-4c4c-9049-f6b1a959ea2d 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.067 282197 DEBUG nova.compute.manager [req-8684fbc5-7c31-402f-831f-167241e84181 req-802597e0-7f32-4c4c-9049-f6b1a959ea2d 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] No waiting events found dispatching network-vif-plugged-86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.068 282197 WARNING nova.compute.manager [req-8684fbc5-7c31-402f-831f-167241e84181 req-802597e0-7f32-4c4c-9049-f6b1a959ea2d 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Received unexpected event network-vif-plugged-86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b for instance with vm_state stopped and task_state powering-on.
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.083 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[8d0d2403-0160-4bed-95b9-e7145865de01]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap652b6bdc-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:b4:a7:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1159520, 'reachable_time': 26990, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 283197, 'error': None, 'target': 'ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.113 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[676e856f-f794-4ae6-8736-df26f1af5038]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.179 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3d588d-7bce-46eb-b163-55de9d6d705f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.181 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap652b6bdc-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.182 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.183 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap652b6bdc-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.186 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:37 np0005548789.localdomain kernel: device tap652b6bdc-40 entered promiscuous mode
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.190 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.196 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap652b6bdc-40, col_values=(('external_ids', {'iface-id': '4fb81ffd-e198-4628-9bd0-0c0f0c89c33a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:59:37 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:59:37Z|00060|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.199 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.212 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.214 160509 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/652b6bdc-40ce-45b7-8aa5-3bca79987993.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/652b6bdc-40ce-45b7-8aa5-3bca79987993.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.216 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[33da2e55-6231-4c66-aa7e-a1dfd131a766]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.217 160509 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: global
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     log         /dev/log local0 debug
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     log-tag     haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     user        root
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     group       root
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     maxconn     1024
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     pidfile     /var/lib/neutron/external/pids/652b6bdc-40ce-45b7-8aa5-3bca79987993.pid.haproxy
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     daemon
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: defaults
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     log global
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     mode http
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     option httplog
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     option dontlognull
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     option http-server-close
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     option forwardfor
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     retries                 3
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout http-request    30s
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout connect         30s
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout client          32s
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout server          32s
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout http-keep-alive 30s
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: listen listener
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     bind 169.254.169.254:80
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     server metadata /var/lib/neutron/metadata_proxy
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:     http-request add-header X-OVN-Network-ID 652b6bdc-40ce-45b7-8aa5-3bca79987993
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 06 09:59:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:37.218 160509 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993', 'env', 'PROCESS_TAG=haproxy-652b6bdc-40ce-45b7-8aa5-3bca79987993', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/652b6bdc-40ce-45b7-8aa5-3bca79987993.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.330 282197 DEBUG nova.virt.driver [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Emitting event <LifecycleEvent: 1765015177.3299198, b7ed0a2e-9350-4933-9334-4e5e08d3e6aa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.330 282197 INFO nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] VM Resumed (Lifecycle Event)
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.354 282197 DEBUG nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.359 282197 DEBUG nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.360 282197 DEBUG nova.compute.manager [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.364 282197 INFO nova.virt.libvirt.driver [-] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Instance rebooted successfully.
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.364 282197 DEBUG nova.compute.manager [None req-ca3c1125-e182-4a1b-89a6-1f4388da9e90 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.398 282197 INFO nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] During sync_power_state the instance has a pending task (powering-on). Skip.
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.398 282197 DEBUG nova.virt.driver [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Emitting event <LifecycleEvent: 1765015177.3605745, b7ed0a2e-9350-4933-9334-4e5e08d3e6aa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.398 282197 INFO nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] VM Started (Lifecycle Event)
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.437 282197 DEBUG nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:59:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:37.441 282197 DEBUG nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 06 09:59:37 np0005548789.localdomain podman[283265]: 
Dec 06 09:59:37 np0005548789.localdomain podman[283265]: 2025-12-06 09:59:37.708853657 +0000 UTC m=+0.094238429 container create 09754e96bffe808c203c680a69a65deae010c5f97ae8e7bcaef645b11fa10ca7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 09:59:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 09:59:37 np0005548789.localdomain podman[283265]: 2025-12-06 09:59:37.66260197 +0000 UTC m=+0.047986792 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 06 09:59:37 np0005548789.localdomain systemd[1]: Started libpod-conmon-09754e96bffe808c203c680a69a65deae010c5f97ae8e7bcaef645b11fa10ca7.scope.
Dec 06 09:59:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8257 DF PROTO=TCP SPT=50026 DPT=9102 SEQ=3864606856 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E6C3EF0000000001030307) 
Dec 06 09:59:37 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 09:59:37 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c614c2452f63581ed05d0d387559645c496c93a80ca0ed66fe42b66557922bf7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 09:59:37 np0005548789.localdomain podman[283278]: 2025-12-06 09:59:37.845812735 +0000 UTC m=+0.103472935 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 09:59:37 np0005548789.localdomain podman[283265]: 2025-12-06 09:59:37.855784374 +0000 UTC m=+0.241169156 container init 09754e96bffe808c203c680a69a65deae010c5f97ae8e7bcaef645b11fa10ca7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:59:37 np0005548789.localdomain podman[283265]: 2025-12-06 09:59:37.866881996 +0000 UTC m=+0.252266778 container start 09754e96bffe808c203c680a69a65deae010c5f97ae8e7bcaef645b11fa10ca7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:59:37 np0005548789.localdomain podman[283278]: 2025-12-06 09:59:37.883096877 +0000 UTC m=+0.140757057 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 09:59:37 np0005548789.localdomain neutron-haproxy-ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993[283289]: [NOTICE]   (283303) : New worker (283305) forked
Dec 06 09:59:37 np0005548789.localdomain neutron-haproxy-ovnmeta-652b6bdc-40ce-45b7-8aa5-3bca79987993[283289]: [NOTICE]   (283303) : Loading success.
Dec 06 09:59:37 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 09:59:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:39.107 282197 DEBUG nova.compute.manager [req-67f2f013-ad7a-4a0d-93ae-076de7106831 req-c73f1e4d-3992-4560-aef4-596064f1b75c 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Received event network-vif-plugged-86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 09:59:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:39.108 282197 DEBUG oslo_concurrency.lockutils [req-67f2f013-ad7a-4a0d-93ae-076de7106831 req-c73f1e4d-3992-4560-aef4-596064f1b75c 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:59:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:39.109 282197 DEBUG oslo_concurrency.lockutils [req-67f2f013-ad7a-4a0d-93ae-076de7106831 req-c73f1e4d-3992-4560-aef4-596064f1b75c 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:59:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:39.109 282197 DEBUG oslo_concurrency.lockutils [req-67f2f013-ad7a-4a0d-93ae-076de7106831 req-c73f1e4d-3992-4560-aef4-596064f1b75c 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:59:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:39.110 282197 DEBUG nova.compute.manager [req-67f2f013-ad7a-4a0d-93ae-076de7106831 req-c73f1e4d-3992-4560-aef4-596064f1b75c 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] No waiting events found dispatching network-vif-plugged-86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 06 09:59:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:39.110 282197 WARNING nova.compute.manager [req-67f2f013-ad7a-4a0d-93ae-076de7106831 req-c73f1e4d-3992-4560-aef4-596064f1b75c 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Received unexpected event network-vif-plugged-86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b for instance with vm_state active and task_state None.
Dec 06 09:59:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:39.197 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:39 np0005548789.localdomain snmpd[67279]: IfIndex of an interface changed. Such interfaces will appear multiple times in IF-MIB.
Dec 06 09:59:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13460 DF PROTO=TCP SPT=54536 DPT=9102 SEQ=735695711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E6CFAF0000000001030307) 
Dec 06 09:59:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:41.756 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:43 np0005548789.localdomain sudo[283317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:59:43 np0005548789.localdomain sudo[283317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:59:43 np0005548789.localdomain sudo[283317]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:43 np0005548789.localdomain sudo[283335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 09:59:43 np0005548789.localdomain sudo[283335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:59:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:44.226 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:44 np0005548789.localdomain sudo[283335]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:44 np0005548789.localdomain sshd[283392]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:59:44 np0005548789.localdomain sudo[283384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:59:44 np0005548789.localdomain sudo[283384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:59:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 09:59:44 np0005548789.localdomain sudo[283384]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 09:59:44 np0005548789.localdomain sudo[283409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 09:59:44 np0005548789.localdomain sudo[283409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:59:44 np0005548789.localdomain podman[283404]: 2025-12-06 09:59:44.717832649 +0000 UTC m=+0.082831448 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 09:59:44 np0005548789.localdomain podman[283404]: 2025-12-06 09:59:44.734399721 +0000 UTC m=+0.099398570 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Dec 06 09:59:44 np0005548789.localdomain systemd[1]: tmp-crun.8mKM99.mount: Deactivated successfully.
Dec 06 09:59:44 np0005548789.localdomain podman[283405]: 2025-12-06 09:59:44.754288304 +0000 UTC m=+0.118663814 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:59:44 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 09:59:44 np0005548789.localdomain podman[283405]: 2025-12-06 09:59:44.792217686 +0000 UTC m=+0.156593226 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 09:59:44 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 09:59:45 np0005548789.localdomain sudo[283409]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:59:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 09:59:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:59:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:59:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:59:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 09:59:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:59:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 09:59:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:59:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   09:59:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 09:59:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 09:59:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:46.758 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:46 np0005548789.localdomain sshd[283476]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 09:59:47 np0005548789.localdomain sshd[283392]: Received disconnect from 45.78.222.162 port 55012:11: Bye Bye [preauth]
Dec 06 09:59:47 np0005548789.localdomain sshd[283392]: Disconnected from authenticating user root 45.78.222.162 port 55012 [preauth]
Dec 06 09:59:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:47.290 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:59:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:47.292 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:59:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:47.293 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:59:47 np0005548789.localdomain sshd[283476]: Received disconnect from 64.227.102.57 port 45702:11: Bye Bye [preauth]
Dec 06 09:59:47 np0005548789.localdomain sshd[283476]: Disconnected from authenticating user root 64.227.102.57 port 45702 [preauth]
Dec 06 09:59:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 09:59:47 np0005548789.localdomain podman[283478]: 2025-12-06 09:59:47.92251519 +0000 UTC m=+0.076916075 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:59:47 np0005548789.localdomain podman[283478]: 2025-12-06 09:59:47.940013511 +0000 UTC m=+0.094414416 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 09:59:47 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 09:59:48 np0005548789.localdomain sudo[283497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:59:48 np0005548789.localdomain sudo[283497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 09:59:48 np0005548789.localdomain sudo[283497]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:49 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13461 DF PROTO=TCP SPT=54536 DPT=9102 SEQ=735695711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E6EFEF0000000001030307) 
Dec 06 09:59:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:49.271 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:50 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T09:59:50Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:64:77:f3 192.168.0.162
Dec 06 09:59:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:51.806 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 09:59:52 np0005548789.localdomain podman[283515]: 2025-12-06 09:59:52.927304681 +0000 UTC m=+0.078757513 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:59:52 np0005548789.localdomain podman[283515]: 2025-12-06 09:59:52.96322085 +0000 UTC m=+0.114673702 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 09:59:52 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 09:59:53 np0005548789.localdomain podman[241090]: time="2025-12-06T09:59:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 09:59:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:59:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149555 "" "Go-http-client/1.1"
Dec 06 09:59:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:09:59:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17733 "" "Go-http-client/1.1"
Dec 06 09:59:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:54.302 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:55.914 282197 DEBUG nova.compute.manager [None req-8782af70-4f70-461c-ad40-bbf0accb7649 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 09:59:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:55.919 282197 INFO nova.compute.manager [None req-8782af70-4f70-461c-ad40-bbf0accb7649 ff0049f3313348bdb67886d170c1c765 3d603431c0bb4967bafc7a0aa6108bfe - - default default] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Retrieving diagnostics
Dec 06 09:59:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:56.090 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:56.093 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Dec 06 09:59:56 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:56 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:56 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:56 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:56 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:56 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:56 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:56.839 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 09:59:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.831 160637 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.832 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.7391040
Dec 06 09:59:57 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32776 [06/Dec/2025:09:59:56.089] listener listener/metadata 0/0/0/1742/1742 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.850 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.852 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:57 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32790 [06/Dec/2025:09:59:57.850] listener listener/metadata 0/0/0/24/24 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.874 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404  len: 297 time: 0.0227432
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.894 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.895 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.908 160637 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.908 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0138922
Dec 06 09:59:57 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32792 [06/Dec/2025:09:59:57.893] listener listener/metadata 0/0/0/15/15 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.919 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.920 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:57 np0005548789.localdomain podman[283539]: 2025-12-06 09:59:57.923503285 +0000 UTC m=+0.081852169 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.935 160637 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.936 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0155873
Dec 06 09:59:57 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32802 [06/Dec/2025:09:59:57.918] listener listener/metadata 0/0/0/17/17 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.940 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.940 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.955 160637 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 09:59:57 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32810 [06/Dec/2025:09:59:57.939] listener listener/metadata 0/0/0/16/16 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.956 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 143 time: 0.0151703
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.959 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.960 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.980 160637 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 09:59:57 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32820 [06/Dec/2025:09:59:57.959] listener listener/metadata 0/0/0/21/21 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.980 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 149 time: 0.0202470
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.984 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.985 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.998 160637 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 09:59:57 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32824 [06/Dec/2025:09:59:57.983] listener listener/metadata 0/0/0/15/15 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Dec 06 09:59:57 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:57.999 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 150 time: 0.0142181
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.002 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.003 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.015 160637 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.016 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 139 time: 0.0128081
Dec 06 09:59:58 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32828 [06/Dec/2025:09:59:58.002] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.019 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.020 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:58 np0005548789.localdomain podman[283539]: 2025-12-06 09:59:58.022378939 +0000 UTC m=+0.180727823 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.033 160637 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.034 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 139 time: 0.0140586
Dec 06 09:59:58 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32844 [06/Dec/2025:09:59:58.019] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Dec 06 09:59:58 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.038 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.039 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:58 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32858 [06/Dec/2025:09:59:58.037] listener listener/metadata 0/0/0/17/17 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.055 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0154448
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.061 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.062 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.080 160637 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 09:59:58 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32870 [06/Dec/2025:09:59:58.061] listener listener/metadata 0/0/0/19/19 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.080 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 155 time: 0.0182438
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.086 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.087 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.100 160637 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 09:59:58 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32872 [06/Dec/2025:09:59:58.085] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.101 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0138178
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.106 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.107 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.121 160637 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 09:59:58 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32878 [06/Dec/2025:09:59:58.106] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.121 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200  len: 143 time: 0.0143259
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.127 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.127 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.141 160637 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 09:59:58 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32886 [06/Dec/2025:09:59:58.126] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.141 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0139818
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.148 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.149 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.163 160637 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 09:59:58 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32890 [06/Dec/2025:09:59:58.148] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.164 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 139 time: 0.0147219
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.170 160637 DEBUG eventlet.wsgi.server [-] (160637) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.171 160637 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Accept: */*
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Connection: close
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Content-Type: text/plain
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: Host: 169.254.169.254
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: User-Agent: curl/7.84.0
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Forwarded-For: 192.168.0.162
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: X-Ovn-Network-Id: 652b6bdc-40ce-45b7-8aa5-3bca79987993 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.188 160637 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 06 09:59:58 np0005548789.localdomain haproxy-metadata-proxy-652b6bdc-40ce-45b7-8aa5-3bca79987993[283305]: 192.168.0.162:32906 [06/Dec/2025:09:59:58.170] listener listener/metadata 0/0/0/18/18 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Dec 06 09:59:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 09:59:58.189 160637 INFO eventlet.wsgi.server [-] 192.168.0.162,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0174277
Dec 06 09:59:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 09:59:59.339 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:01.879 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:03 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6026 DF PROTO=TCP SPT=43920 DPT=9102 SEQ=1837799133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E729270000000001030307) 
Dec 06 10:00:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:04.376 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6027 DF PROTO=TCP SPT=43920 DPT=9102 SEQ=1837799133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E72D2F0000000001030307) 
Dec 06 10:00:05 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13462 DF PROTO=TCP SPT=54536 DPT=9102 SEQ=735695711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E72FEF0000000001030307) 
Dec 06 10:00:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:00:05 np0005548789.localdomain podman[283565]: 2025-12-06 10:00:05.904919862 +0000 UTC m=+0.068230688 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 06 10:00:05 np0005548789.localdomain podman[283565]: 2025-12-06 10:00:05.914025544 +0000 UTC m=+0.077336380 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:00:05 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:00:06 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6028 DF PROTO=TCP SPT=43920 DPT=9102 SEQ=1837799133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E7352F0000000001030307) 
Dec 06 10:00:06 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:00:06Z|00061|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Dec 06 10:00:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:06.926 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46744 DF PROTO=TCP SPT=54848 DPT=9102 SEQ=227793754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E739EF0000000001030307) 
Dec 06 10:00:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:00:08 np0005548789.localdomain podman[283584]: 2025-12-06 10:00:08.916025581 +0000 UTC m=+0.078010821 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:00:08 np0005548789.localdomain podman[283584]: 2025-12-06 10:00:08.952200299 +0000 UTC m=+0.114185539 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:00:08 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:00:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.1 total, 600.0 interval
                                                          Cumulative writes: 5849 writes, 25K keys, 5849 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5849 writes, 797 syncs, 7.34 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 88 writes, 255 keys, 88 commit groups, 1.0 writes per commit group, ingest: 0.39 MB, 0.00 MB/s
                                                          Interval WAL: 88 writes, 37 syncs, 2.38 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:00:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:09.415 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6029 DF PROTO=TCP SPT=43920 DPT=9102 SEQ=1837799133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E744EF0000000001030307) 
Dec 06 10:00:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:11.962 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:12 np0005548789.localdomain sshd[283607]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:00:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.2 total, 600.0 interval
                                                          Cumulative writes: 4914 writes, 22K keys, 4914 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4914 writes, 686 syncs, 7.16 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 35 writes, 105 keys, 35 commit groups, 1.0 writes per commit group, ingest: 0.11 MB, 0.00 MB/s
                                                          Interval WAL: 35 writes, 17 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:00:14 np0005548789.localdomain sshd[283607]: Received disconnect from 118.219.234.233 port 41298:11: Bye Bye [preauth]
Dec 06 10:00:14 np0005548789.localdomain sshd[283607]: Disconnected from authenticating user root 118.219.234.233 port 41298 [preauth]
Dec 06 10:00:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 10:00:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 10:00:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:14.467 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:00:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:00:14 np0005548789.localdomain podman[283609]: 2025-12-06 10:00:14.931317049 +0000 UTC m=+0.082584662 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, build-date=2025-08-20T13:12:41, release=1755695350, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Dec 06 10:00:14 np0005548789.localdomain podman[283609]: 2025-12-06 10:00:14.946320442 +0000 UTC m=+0.097588135 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal)
Dec 06 10:00:14 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:00:15 np0005548789.localdomain systemd[1]: tmp-crun.CbqpIS.mount: Deactivated successfully.
Dec 06 10:00:15 np0005548789.localdomain podman[283610]: 2025-12-06 10:00:15.043944578 +0000 UTC m=+0.192856718 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 10:00:15 np0005548789.localdomain podman[283610]: 2025-12-06 10:00:15.056155965 +0000 UTC m=+0.205068115 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:00:15 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:00:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:00:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:00:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:00:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:00:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:00:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:00:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:00:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:00:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:00:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:00:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:00:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:00:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:17.007 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:00:18 np0005548789.localdomain podman[283648]: 2025-12-06 10:00:18.903550528 +0000 UTC m=+0.070574911 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:00:18 np0005548789.localdomain podman[283648]: 2025-12-06 10:00:18.916270751 +0000 UTC m=+0.083295134 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:00:18 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:00:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6030 DF PROTO=TCP SPT=43920 DPT=9102 SEQ=1837799133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E765EF0000000001030307) 
Dec 06 10:00:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:19.505 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:22.049 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:22 np0005548789.localdomain sshd[283667]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:00:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:23.175 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:23.177 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:23.252 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:23.253 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:00:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:23.253 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:00:23 np0005548789.localdomain sshd[283669]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:00:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:23.798 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:00:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:23.798 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:00:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:23.799 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:00:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:23.799 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:00:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:00:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:00:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:00:23 np0005548789.localdomain podman[283671]: 2025-12-06 10:00:23.917537574 +0000 UTC m=+0.079996782 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:00:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:00:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149555 "" "Go-http-client/1.1"
Dec 06 10:00:23 np0005548789.localdomain sshd[283667]: Received disconnect from 14.194.101.210 port 49292:11: Bye Bye [preauth]
Dec 06 10:00:23 np0005548789.localdomain sshd[283667]: Disconnected from authenticating user root 14.194.101.210 port 49292 [preauth]
Dec 06 10:00:23 np0005548789.localdomain podman[283671]: 2025-12-06 10:00:23.999495456 +0000 UTC m=+0.161954614 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:00:24 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:00:24 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:00:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17731 "" "Go-http-client/1.1"
Dec 06 10:00:24 np0005548789.localdomain sshd[283669]: Received disconnect from 154.113.10.34 port 42096:11: Bye Bye [preauth]
Dec 06 10:00:24 np0005548789.localdomain sshd[283669]: Disconnected from authenticating user root 154.113.10.34 port 42096 [preauth]
Dec 06 10:00:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:24.556 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.085 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.118 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.119 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.119 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.120 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.120 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.121 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.121 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.122 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.122 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.123 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.148 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.149 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.149 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.149 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.150 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.612 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.698 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.698 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.909 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.910 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=12019MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.911 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:00:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:25.911 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:00:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:26.150 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:00:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:26.151 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:00:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:26.151 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:00:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:26.201 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:00:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:26.636 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:00:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:26.644 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:00:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:26.688 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:00:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:26.716 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:00:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:26.716 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:00:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:27.087 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:00:28 np0005548789.localdomain systemd[1]: tmp-crun.JDWJJz.mount: Deactivated successfully.
Dec 06 10:00:28 np0005548789.localdomain podman[283742]: 2025-12-06 10:00:28.900390239 +0000 UTC m=+0.065641378 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 06 10:00:28 np0005548789.localdomain podman[283742]: 2025-12-06 10:00:28.962444976 +0000 UTC m=+0.127696085 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:00:28 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:00:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:29.558 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:32.123 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:33 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39150 DF PROTO=TCP SPT=53032 DPT=9102 SEQ=1349501459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E79E570000000001030307) 
Dec 06 10:00:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:34.560 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39151 DF PROTO=TCP SPT=53032 DPT=9102 SEQ=1349501459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E7A26F0000000001030307) 
Dec 06 10:00:35 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6031 DF PROTO=TCP SPT=43920 DPT=9102 SEQ=1837799133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E7A5EF0000000001030307) 
Dec 06 10:00:36 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39152 DF PROTO=TCP SPT=53032 DPT=9102 SEQ=1349501459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E7AA6F0000000001030307) 
Dec 06 10:00:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:00:36 np0005548789.localdomain podman[283765]: 2025-12-06 10:00:36.920854663 +0000 UTC m=+0.083233673 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:00:36 np0005548789.localdomain podman[283765]: 2025-12-06 10:00:36.952068727 +0000 UTC m=+0.114447737 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:00:36 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:00:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:37.178 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13463 DF PROTO=TCP SPT=54536 DPT=9102 SEQ=735695711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E7ADEF0000000001030307) 
Dec 06 10:00:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:39.581 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:00:39 np0005548789.localdomain systemd[1]: tmp-crun.hJbOKn.mount: Deactivated successfully.
Dec 06 10:00:39 np0005548789.localdomain podman[283783]: 2025-12-06 10:00:39.914631088 +0000 UTC m=+0.081430006 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:00:39 np0005548789.localdomain podman[283783]: 2025-12-06 10:00:39.924169722 +0000 UTC m=+0.090968600 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:00:39 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:00:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39153 DF PROTO=TCP SPT=53032 DPT=9102 SEQ=1349501459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E7BA2F0000000001030307) 
Dec 06 10:00:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:42.214 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:44.584 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:00:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:00:45 np0005548789.localdomain podman[283806]: 2025-12-06 10:00:45.917773939 +0000 UTC m=+0.083733118 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350)
Dec 06 10:00:45 np0005548789.localdomain podman[283806]: 2025-12-06 10:00:45.933291157 +0000 UTC m=+0.099249936 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, version=9.6, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=ubi9-minimal, architecture=x86_64, distribution-scope=public, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:00:45 np0005548789.localdomain systemd[1]: tmp-crun.8tQovD.mount: Deactivated successfully.
Dec 06 10:00:45 np0005548789.localdomain podman[283807]: 2025-12-06 10:00:45.970558578 +0000 UTC m=+0.130631255 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm)
Dec 06 10:00:45 np0005548789.localdomain podman[283807]: 2025-12-06 10:00:45.982095605 +0000 UTC m=+0.142168242 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:00:45 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:00:46 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:00:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:00:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:00:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:00:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:00:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:00:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:00:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:00:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:00:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:00:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:00:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:00:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:00:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:47.266 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:00:47.291 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:00:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:00:47.291 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:00:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:00:47.291 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:00:48 np0005548789.localdomain sudo[283845]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:00:48 np0005548789.localdomain sudo[283845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:00:48 np0005548789.localdomain sudo[283845]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:48 np0005548789.localdomain sudo[283863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:00:48 np0005548789.localdomain sudo[283863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:00:48 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39154 DF PROTO=TCP SPT=53032 DPT=9102 SEQ=1349501459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E7D9EF0000000001030307) 
Dec 06 10:00:49 np0005548789.localdomain sudo[283863]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:49.619 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:00:49 np0005548789.localdomain systemd[1]: tmp-crun.y6F8uA.mount: Deactivated successfully.
Dec 06 10:00:49 np0005548789.localdomain podman[283912]: 2025-12-06 10:00:49.945970836 +0000 UTC m=+0.100269869 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:00:49 np0005548789.localdomain podman[283912]: 2025-12-06 10:00:49.961155744 +0000 UTC m=+0.115454817 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:00:49 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:00:50 np0005548789.localdomain sshd[283931]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:00:51 np0005548789.localdomain sshd[283931]: Received disconnect from 64.227.102.57 port 40414:11: Bye Bye [preauth]
Dec 06 10:00:51 np0005548789.localdomain sshd[283931]: Disconnected from authenticating user root 64.227.102.57 port 40414 [preauth]
Dec 06 10:00:52 np0005548789.localdomain sudo[283933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:00:52 np0005548789.localdomain sudo[283933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:00:52 np0005548789.localdomain sudo[283933]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:52.299 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:00:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:00:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:00:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149555 "" "Go-http-client/1.1"
Dec 06 10:00:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:00:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17736 "" "Go-http-client/1.1"
Dec 06 10:00:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:54.621 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:00:54 np0005548789.localdomain podman[283951]: 2025-12-06 10:00:54.917006485 +0000 UTC m=+0.081554771 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:00:54 np0005548789.localdomain podman[283951]: 2025-12-06 10:00:54.926900571 +0000 UTC m=+0.091448807 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:00:54 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:00:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:57.345 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:00:59.624 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:00:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:00:59 np0005548789.localdomain podman[283975]: 2025-12-06 10:00:59.935946995 +0000 UTC m=+0.096347318 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:01:00 np0005548789.localdomain podman[283975]: 2025-12-06 10:01:00.00185382 +0000 UTC m=+0.162254103 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 10:01:00 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:01:01 np0005548789.localdomain CROND[284001]: (root) CMD (run-parts /etc/cron.hourly)
Dec 06 10:01:01 np0005548789.localdomain run-parts[284004]: (/etc/cron.hourly) starting 0anacron
Dec 06 10:01:01 np0005548789.localdomain run-parts[284010]: (/etc/cron.hourly) finished 0anacron
Dec 06 10:01:01 np0005548789.localdomain CROND[284000]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 06 10:01:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:02.401 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:03 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22833 DF PROTO=TCP SPT=52292 DPT=9102 SEQ=1660943468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E813860000000001030307) 
Dec 06 10:01:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:04.627 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:04 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22834 DF PROTO=TCP SPT=52292 DPT=9102 SEQ=1660943468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E817AF0000000001030307) 
Dec 06 10:01:05 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39155 DF PROTO=TCP SPT=53032 DPT=9102 SEQ=1349501459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E819EF0000000001030307) 
Dec 06 10:01:06 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22835 DF PROTO=TCP SPT=52292 DPT=9102 SEQ=1660943468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E81FB00000000001030307) 
Dec 06 10:01:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:07.447 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:01:07 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6032 DF PROTO=TCP SPT=43920 DPT=9102 SEQ=1837799133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E823EF0000000001030307) 
Dec 06 10:01:07 np0005548789.localdomain podman[284011]: 2025-12-06 10:01:07.910550802 +0000 UTC m=+0.070409625 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.911 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain podman[284011]: 2025-12-06 10:01:07.915133984 +0000 UTC m=+0.074992747 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.916 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '258a44cc-0198-4e0d-aa9d-e683f9c0f5e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:01:07.912407', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '7c2d8e8c-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.161738003, 'message_signature': '218ef46d3f8b5b1adca4585be53b3fd032ee0b13d5db33b80438104f618e5130'}]}, 'timestamp': '2025-12-06 10:01:07.916841', '_unique_id': '38b03ccf8e754a4f9275c59fe3754dc6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.918 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c05dbeb8-8783-4317-a66d-46d4d9fe6313', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:01:07.920004', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '7c2e1b86-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.161738003, 'message_signature': 'e1cc059dfbcf2dbda5dbd81dc37502002ff56ce05390c8159f9f584adafe5585'}]}, 'timestamp': '2025-12-06 10:01:07.920325', '_unique_id': 'fd2f61003f0044c8be74f976da5e1a62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.920 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.921 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.953 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.954 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc84c27e-e785-4019-bec4-9ecb35fc5741', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:01:07.921793', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c3353ee-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.171142004, 'message_signature': 'e75a594705bf3e98bbebba385732c12fb028a60f8767387d20e88fb12e161ca1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:01:07.921793', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c335ede-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.171142004, 'message_signature': '8dcae01d3ae4720f4c192e96f7b36dad358257978dfd77310b83f896923cb4a3'}]}, 'timestamp': '2025-12-06 10:01:07.954789', '_unique_id': '8727fa6882d041ef919bdcbd514202b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.955 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.956 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.956 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4f45fb8-de66-4fa1-93c8-01f1e2da95d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:01:07.956385', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '7c33a6f0-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.161738003, 'message_signature': '22dbe1f09cf79d61e4804913fa382f87b6b905b8a859ed99be277388e14abd1f'}]}, 'timestamp': '2025-12-06 10:01:07.956646', '_unique_id': '6e4f6b65901d4a4d888625877cee88ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.957 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.973 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ec405a9-9bcd-4760-8e65-4cf14da70755', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:01:07.957655', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '7c365436-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.222934014, 'message_signature': 'f5fe25f0a97990b5ed58da243ab42732494746e38d0953790f84c23538e36667'}]}, 'timestamp': '2025-12-06 10:01:07.974161', '_unique_id': '85bf1305c7254a8889c2717cbb1c71dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.974 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e96dfaf2-16c2-47bf-89ca-5862ce23c416', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:01:07.975344', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '7c368b2c-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.161738003, 'message_signature': '31490a084a3d1d0029c9e6c3619b08757223386c34f28b6c09fa5352d136549c'}]}, 'timestamp': '2025-12-06 10:01:07.975562', '_unique_id': 'd95017d72184418f94a3204624264dba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.976 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.976 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.976 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7d2c608-a94b-43b6-aea5-573a84894a00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:01:07.976602', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c36bc0a-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.171142004, 'message_signature': '5c05028a6669809bfe6fa9d08db579e2df435fdfaffed62db04667538cb39dda'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:01:07.976602', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c36c420-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.171142004, 'message_signature': 'eb060a96ea608b113031fe0cf588102812bd06ba6813e9909d945bef16251c9b'}]}, 'timestamp': '2025-12-06 10:01:07.977005', '_unique_id': '63835b498f714a458b725daae53ef434'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.977 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '977803ec-e483-463b-907d-f2ae1160d114', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:01:07.977985', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '7c36f224-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.161738003, 'message_signature': 'e41074f8b085de0fcc4b675c0abdf0e4f55f7300b24dec06b73dc51d2becc8c3'}]}, 'timestamp': '2025-12-06 10:01:07.978195', '_unique_id': '87521b1ec2e2452f9de42e014eef9e5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.978 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4338034a-9555-47fd-8eeb-a738b220558f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:01:07.979179', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '7c3720c8-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.161738003, 'message_signature': '01abf6026e8f7dad628f9285ee9b974534a69b42b71ce676559634b3ed72de4d'}]}, 'timestamp': '2025-12-06 10:01:07.979389', '_unique_id': 'ecbb7cae1df64de78d0596f472a91bd2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.979 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.980 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.980 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.980 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.980 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8c83166-16de-497a-86c2-b9180b910ce7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:01:07.980537', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c375598-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.171142004, 'message_signature': 'a8b9208f8f470b06e90281c7ec42162bd8339eefb4e346fd3c460d27c64c9402'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:01:07.980537', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c375d68-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.171142004, 'message_signature': '0bb702d7990f755920f394245e0e3c13376f486ee1af4a8a3be04e4b5415965d'}]}, 'timestamp': '2025-12-06 10:01:07.980929', '_unique_id': 'c2ad183fec0d49a8b295bf560776c8d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.981 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.982 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.982 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.982 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0df406af-c3d5-410d-8173-83926715636b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:01:07.982091', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c379364-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.171142004, 'message_signature': '2f5a8c024e7628e0ee59fbd6b5faf58d655d13e79ae66ca811b68b90f6247374'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:01:07.982091', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c379cc4-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.171142004, 'message_signature': '3b09e4f648a11454ef75c57b9d2e1009a85851bf985db4936eeecb003a876f1b'}]}, 'timestamp': '2025-12-06 10:01:07.982566', '_unique_id': '19f4ce6a674a45f3b40f4d972839aec5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.983 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.990 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.990 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4899d366-c016-4e18-bb47-12debfabc43a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:01:07.983723', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c38d8dc-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.233069796, 'message_signature': 'b27ba5a66df63be68303264a188a6d249a5445c0a0b9994114187733d4c568f1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:01:07.983723', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c38e084-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.233069796, 'message_signature': '6e0cd521f6ac0a1b939677a4f3f86f879bdfc561a3ec19a3fc53c19481a6bad2'}]}, 'timestamp': '2025-12-06 10:01:07.990853', '_unique_id': 'e21da0b3e8394cda918c99b9bd99e093'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.991 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fce535c4-a4ce-46e7-92e9-18e62254ca13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:01:07.991940', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '7c39134c-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.161738003, 'message_signature': '9f265208bfe4156d43391e13f0ef8ebdb8fba468357ec59f76dc951690302d27'}]}, 'timestamp': '2025-12-06 10:01:07.992152', '_unique_id': '214d15edb8fe4412b61de00fc0133448'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e6cbfbb-a981-4168-8793-906089c3d5ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:01:07.993282', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '7c39479a-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.161738003, 'message_signature': '378bd04092d04ba23a4c0a4a001a684cf7cca52e5a132e7a1d7be34e5dc545e6'}]}, 'timestamp': '2025-12-06 10:01:07.993491', '_unique_id': '5781091bd6fd4632b53952730e938604'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.994 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.994 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.994 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89490718-648d-41fb-b555-d0820b76c29c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:01:07.994663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c397e4a-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.171142004, 'message_signature': '7c77c99c2554394e226ebc600603a8da765e212f443fa58a166991595e5ad7f9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:01:07.994663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c398598-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.171142004, 'message_signature': '2aafd242e34f07fd3d9a96c543a1819502552bb9500cb3bbad0261696a838b33'}]}, 'timestamp': '2025-12-06 10:01:07.995065', '_unique_id': '402fe09a43324b4fa4cf5dba15df3e15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.995 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '086e0e08-1baa-4ac2-91b1-c62d3587cb9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:01:07.996050', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '7c39b3ec-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.161738003, 'message_signature': '965963769ae0c57eacd8d001051f8f37c85ad970da5fb50c9a689257a4a68e01'}]}, 'timestamp': '2025-12-06 10:01:07.996265', '_unique_id': '6e61b0f000d34aeb8b0943a2a7b10c92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.996 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f44b54da-1cdd-4984-9c52-5c964c83f225', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:01:07.997227', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '7c39e1c8-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.161738003, 'message_signature': 'ac0f84c4b99eb1849e702da2b72899f1baff7179024792496a1c66cea9fea687'}]}, 'timestamp': '2025-12-06 10:01:07.997438', '_unique_id': '47d5b35acd3d4c5985af2e79b9938453'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.998 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.998 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 11760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50307a0f-958d-4550-b21c-ff02bbd50f79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11760000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:01:07.998415', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '7c3a101c-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.222934014, 'message_signature': '4c76916e4c317cb7f7cc1d2ae8379b0eefc5191030432e6111fe410941fec034'}]}, 'timestamp': '2025-12-06 10:01:07.998617', '_unique_id': 'c3eb6bf6e4fc4d84825eb6c98c282f22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:07.999 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '528f5847-a6f1-4f29-ba44-e033ecd120d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:01:07.999575', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c3a3d62-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.171142004, 'message_signature': '91fb821197ecde32b6c4260ff8162193e20d2a3264531c10167ba58eb41edd98'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:01:07.999575', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c3a45d2-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.171142004, 'message_signature': 'c5f4d17585cdd705d52193bc022b2f1e6df1b8cf825909fcea021b794d49cdc0'}]}, 'timestamp': '2025-12-06 10:01:07.999987', '_unique_id': '2f081456da324ea4a0fcfe6f7c7de996'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.000 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcb9829c-8c9b-4557-bc62-2a77966dde46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:01:08.000966', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c3a73cc-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.233069796, 'message_signature': '1d85d4e8afb7f15a440576d9233a2e124abca719dbe6dffbcfafb7493bba799f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:01:08.000966', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c3a7b74-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.233069796, 'message_signature': '6341ba641e9c03787021d727108f1453868f09df3a77804854e9b6aabfbf5c2c'}]}, 'timestamp': '2025-12-06 10:01:08.001360', '_unique_id': '39899824bb7b4aa9b0bfc61be9be778f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.002 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.002 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.002 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2083999c-c44d-4804-8041-16e94c43e55d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:01:08.002364', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7c3aaa68-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.233069796, 'message_signature': '7d03e7738215526997dcdbe242e5860ea60c67106c7e2c43fcd9f2d126133fb5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:01:08.002364', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7c3ab1ac-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11686.233069796, 'message_signature': '607dd674feaaf3079d7fb824a5311c1289c4fa3d381cb1b8eff5c0f3c2b2d47b'}]}, 'timestamp': '2025-12-06 10:01:08.002747', '_unique_id': 'f501c860a9534fa889c6e57ca7463a69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:01:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:01:08.003 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:01:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:09.630 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:01:10 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22836 DF PROTO=TCP SPT=52292 DPT=9102 SEQ=1660943468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E82F6F0000000001030307) 
Dec 06 10:01:10 np0005548789.localdomain podman[284030]: 2025-12-06 10:01:10.921015374 +0000 UTC m=+0.082148568 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:01:10 np0005548789.localdomain podman[284030]: 2025-12-06 10:01:10.929743784 +0000 UTC m=+0.090877048 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:01:10 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:01:12 np0005548789.localdomain sshd[284053]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:01:12 np0005548789.localdomain sshd[284053]: Accepted publickey for zuul from 38.102.83.114 port 35214 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:01:12 np0005548789.localdomain systemd-logind[766]: New session 62 of user zuul.
Dec 06 10:01:12 np0005548789.localdomain systemd[1]: Started Session 62 of User zuul.
Dec 06 10:01:12 np0005548789.localdomain sshd[284053]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:01:12 np0005548789.localdomain sudo[284073]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzczrdkrtjotwedylgywxkheruvrhzth ; /usr/bin/python3
Dec 06 10:01:12 np0005548789.localdomain sudo[284073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:01:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:12.493 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:12 np0005548789.localdomain python3[284075]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 10:01:13 np0005548789.localdomain subscription-manager[284076]: Unregistered machine with identity: 49b9d3d6-359c-4738-9880-6751941cc8f8
Dec 06 10:01:13 np0005548789.localdomain systemd-journald[47810]: Field hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Dec 06 10:01:13 np0005548789.localdomain systemd-journald[47810]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 10:01:13 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 10:01:13 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 10:01:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:14.634 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:14 np0005548789.localdomain sudo[284073]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:01:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:01:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:01:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:01:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:01:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:01:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:01:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:01:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:01:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:01:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:01:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:01:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:01:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:01:16 np0005548789.localdomain systemd[1]: tmp-crun.PAJDiN.mount: Deactivated successfully.
Dec 06 10:01:16 np0005548789.localdomain podman[284079]: 2025-12-06 10:01:16.920860854 +0000 UTC m=+0.079426055 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Dec 06 10:01:16 np0005548789.localdomain podman[284079]: 2025-12-06 10:01:16.932961508 +0000 UTC m=+0.091526649 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, release=1755695350, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:01:16 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:01:16 np0005548789.localdomain podman[284080]: 2025-12-06 10:01:16.991119215 +0000 UTC m=+0.145238628 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:01:17 np0005548789.localdomain podman[284080]: 2025-12-06 10:01:17.005154398 +0000 UTC m=+0.159273821 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:01:17 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:01:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:17.539 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:19 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22837 DF PROTO=TCP SPT=52292 DPT=9102 SEQ=1660943468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E84FEF0000000001030307) 
Dec 06 10:01:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:19.635 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:01:20 np0005548789.localdomain podman[284119]: 2025-12-06 10:01:20.915835845 +0000 UTC m=+0.065098203 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Dec 06 10:01:20 np0005548789.localdomain podman[284119]: 2025-12-06 10:01:20.955273433 +0000 UTC m=+0.104535821 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible)
Dec 06 10:01:20 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:01:21 np0005548789.localdomain sshd[284139]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:01:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:22.576 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:01:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:01:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:01:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149555 "" "Go-http-client/1.1"
Dec 06 10:01:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:01:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17742 "" "Go-http-client/1.1"
Dec 06 10:01:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:24.638 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:25 np0005548789.localdomain sshd[284139]: Received disconnect from 179.33.210.213 port 55640:11: Bye Bye [preauth]
Dec 06 10:01:25 np0005548789.localdomain sshd[284139]: Disconnected from authenticating user root 179.33.210.213 port 55640 [preauth]
Dec 06 10:01:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:01:25 np0005548789.localdomain podman[284141]: 2025-12-06 10:01:25.138680934 +0000 UTC m=+0.079689892 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:01:25 np0005548789.localdomain podman[284141]: 2025-12-06 10:01:25.151372376 +0000 UTC m=+0.092381334 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:01:25 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:01:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:26.719 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:26.719 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:26.720 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:01:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:26.720 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:01:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:26.837 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:01:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:26.838 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:01:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:26.838 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:01:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:26.839 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.236 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.255 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.256 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.257 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.258 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.258 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.259 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.260 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.260 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.260 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.261 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.276 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.276 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.277 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.277 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.278 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.615 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.741 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.957 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:01:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:27.959 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:01:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:28.147 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:01:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:28.148 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=12006MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:01:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:28.148 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:01:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:28.149 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:01:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:28.228 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:01:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:28.228 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:01:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:28.228 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:01:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:28.303 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:01:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:28.709 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:01:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:28.714 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:01:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:28.736 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:01:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:28.737 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:01:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:28.738 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:01:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:29.641 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:01:30 np0005548789.localdomain systemd[1]: tmp-crun.kNGkvn.mount: Deactivated successfully.
Dec 06 10:01:30 np0005548789.localdomain podman[284208]: 2025-12-06 10:01:30.928608978 +0000 UTC m=+0.093198730 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:01:30 np0005548789.localdomain podman[284208]: 2025-12-06 10:01:30.988385045 +0000 UTC m=+0.152974836 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:01:31 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:01:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:32.661 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:33 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26396 DF PROTO=TCP SPT=38450 DPT=9102 SEQ=4212794354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E888B70000000001030307) 
Dec 06 10:01:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:34.643 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:34 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26397 DF PROTO=TCP SPT=38450 DPT=9102 SEQ=4212794354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E88CAF0000000001030307) 
Dec 06 10:01:35 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22838 DF PROTO=TCP SPT=52292 DPT=9102 SEQ=1660943468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E88FEF0000000001030307) 
Dec 06 10:01:36 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26398 DF PROTO=TCP SPT=38450 DPT=9102 SEQ=4212794354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E894AF0000000001030307) 
Dec 06 10:01:36 np0005548789.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 06 10:01:37 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39156 DF PROTO=TCP SPT=53032 DPT=9102 SEQ=1349501459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E897EF0000000001030307) 
Dec 06 10:01:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:37.703 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:01:38 np0005548789.localdomain podman[284234]: 2025-12-06 10:01:38.893574628 +0000 UTC m=+0.055224518 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 06 10:01:38 np0005548789.localdomain podman[284234]: 2025-12-06 10:01:38.902445581 +0000 UTC m=+0.064095531 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 06 10:01:38 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:01:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:39.646 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:40 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26399 DF PROTO=TCP SPT=38450 DPT=9102 SEQ=4212794354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E8A46F0000000001030307) 
Dec 06 10:01:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:01:41 np0005548789.localdomain podman[284252]: 2025-12-06 10:01:41.913325714 +0000 UTC m=+0.077301319 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:01:41 np0005548789.localdomain podman[284252]: 2025-12-06 10:01:41.950262595 +0000 UTC m=+0.114238180 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:01:41 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:01:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:42.753 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:44.650 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:46 np0005548789.localdomain sudo[284273]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:01:46 np0005548789.localdomain sudo[284273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:01:46 np0005548789.localdomain sudo[284273]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:46 np0005548789.localdomain sudo[284291]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- inventory --format=json-pretty --filter-for-batch
Dec 06 10:01:46 np0005548789.localdomain sudo[284291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:01:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:01:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:01:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:01:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:01:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:01:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:01:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:01:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:01:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:01:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:01:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:01:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:01:47 np0005548789.localdomain podman[284350]: 
Dec 06 10:01:47 np0005548789.localdomain podman[284350]: 2025-12-06 10:01:47.134669867 +0000 UTC m=+0.095450060 container create 3ed2c67f4a727043f3c5d7aec159250db8465307db8d8db8ed110a35ac357203 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_rhodes, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-type=git, version=7, GIT_BRANCH=main, release=1763362218, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, name=rhceph, RELEASE=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph)
Dec 06 10:01:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:01:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:01:47 np0005548789.localdomain systemd[1]: Started libpod-conmon-3ed2c67f4a727043f3c5d7aec159250db8465307db8d8db8ed110a35ac357203.scope.
Dec 06 10:01:47 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:01:47 np0005548789.localdomain podman[284350]: 2025-12-06 10:01:47.096337802 +0000 UTC m=+0.057118005 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:01:47 np0005548789.localdomain podman[284350]: 2025-12-06 10:01:47.196905719 +0000 UTC m=+0.157685892 container init 3ed2c67f4a727043f3c5d7aec159250db8465307db8d8db8ed110a35ac357203 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_rhodes, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, release=1763362218, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4)
Dec 06 10:01:47 np0005548789.localdomain podman[284350]: 2025-12-06 10:01:47.207171316 +0000 UTC m=+0.167951469 container start 3ed2c67f4a727043f3c5d7aec159250db8465307db8d8db8ed110a35ac357203 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_rhodes, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:01:47 np0005548789.localdomain podman[284350]: 2025-12-06 10:01:47.207375833 +0000 UTC m=+0.168156016 container attach 3ed2c67f4a727043f3c5d7aec159250db8465307db8d8db8ed110a35ac357203 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_rhodes, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, ceph=True, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-type=git, name=rhceph, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:01:47 np0005548789.localdomain cranky_rhodes[284370]: 167 167
Dec 06 10:01:47 np0005548789.localdomain systemd[1]: libpod-3ed2c67f4a727043f3c5d7aec159250db8465307db8d8db8ed110a35ac357203.scope: Deactivated successfully.
Dec 06 10:01:47 np0005548789.localdomain podman[284350]: 2025-12-06 10:01:47.210244281 +0000 UTC m=+0.171024454 container died 3ed2c67f4a727043f3c5d7aec159250db8465307db8d8db8ed110a35ac357203 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_rhodes, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_BRANCH=main, name=rhceph, version=7, release=1763362218, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph)
Dec 06 10:01:47 np0005548789.localdomain podman[284365]: 2025-12-06 10:01:47.253725314 +0000 UTC m=+0.078369022 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Dec 06 10:01:47 np0005548789.localdomain podman[284366]: 2025-12-06 10:01:47.288828719 +0000 UTC m=+0.117250913 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3)
Dec 06 10:01:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:01:47.291 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:01:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:01:47.291 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:01:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:01:47.292 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:01:47 np0005548789.localdomain podman[284392]: 2025-12-06 10:01:47.344491208 +0000 UTC m=+0.124157397 container remove 3ed2c67f4a727043f3c5d7aec159250db8465307db8d8db8ed110a35ac357203 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_rhodes, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, name=rhceph, vcs-type=git, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, architecture=x86_64, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:01:47 np0005548789.localdomain systemd[1]: libpod-conmon-3ed2c67f4a727043f3c5d7aec159250db8465307db8d8db8ed110a35ac357203.scope: Deactivated successfully.
Dec 06 10:01:47 np0005548789.localdomain podman[284366]: 2025-12-06 10:01:47.377638972 +0000 UTC m=+0.206061166 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:01:47 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:01:47 np0005548789.localdomain podman[284365]: 2025-12-06 10:01:47.394046858 +0000 UTC m=+0.218690546 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.buildah.version=1.33.7, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., config_id=edpm, vendor=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 10:01:47 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:01:47 np0005548789.localdomain podman[284432]: 
Dec 06 10:01:47 np0005548789.localdomain podman[284432]: 2025-12-06 10:01:47.514017074 +0000 UTC m=+0.062442089 container create 96ec1a47279a6e4babe7c1a9375b54f2f4d4bbf7b494b570b422b5bd442aa01d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_noyce, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, distribution-scope=public, name=rhceph, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, ceph=True, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-type=git, description=Red Hat Ceph Storage 7, architecture=x86_64)
Dec 06 10:01:47 np0005548789.localdomain systemd[1]: Started libpod-conmon-96ec1a47279a6e4babe7c1a9375b54f2f4d4bbf7b494b570b422b5bd442aa01d.scope.
Dec 06 10:01:47 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:01:47 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/809ccad5b5d736bb19e7356ddf27631debaa53da8e797dc65df204e66d710152/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 10:01:47 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/809ccad5b5d736bb19e7356ddf27631debaa53da8e797dc65df204e66d710152/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:01:47 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/809ccad5b5d736bb19e7356ddf27631debaa53da8e797dc65df204e66d710152/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:01:47 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/809ccad5b5d736bb19e7356ddf27631debaa53da8e797dc65df204e66d710152/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 10:01:47 np0005548789.localdomain podman[284432]: 2025-12-06 10:01:47.571789589 +0000 UTC m=+0.120214644 container init 96ec1a47279a6e4babe7c1a9375b54f2f4d4bbf7b494b570b422b5bd442aa01d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_noyce, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, ceph=True, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, name=rhceph, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main)
Dec 06 10:01:47 np0005548789.localdomain podman[284432]: 2025-12-06 10:01:47.582557032 +0000 UTC m=+0.130982077 container start 96ec1a47279a6e4babe7c1a9375b54f2f4d4bbf7b494b570b422b5bd442aa01d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_noyce, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, name=rhceph, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, release=1763362218, ceph=True, CEPH_POINT_RELEASE=)
Dec 06 10:01:47 np0005548789.localdomain podman[284432]: 2025-12-06 10:01:47.582879852 +0000 UTC m=+0.131304907 container attach 96ec1a47279a6e4babe7c1a9375b54f2f4d4bbf7b494b570b422b5bd442aa01d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_noyce, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, com.redhat.component=rhceph-container, name=rhceph, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, distribution-scope=public, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:01:47 np0005548789.localdomain podman[284432]: 2025-12-06 10:01:47.494110499 +0000 UTC m=+0.042535524 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:01:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:47.788 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-310621d5af1248ba667dc29e3bd966a624ee0a93c11980cc5e219990815d7a9c-merged.mount: Deactivated successfully.
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]: [
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:     {
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:         "available": false,
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:         "ceph_device": false,
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:         "lsm_data": {},
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:         "lvs": [],
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:         "path": "/dev/sr0",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:         "rejected_reasons": [
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "Insufficient space (<5GB)",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "Has a FileSystem"
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:         ],
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:         "sys_api": {
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "actuators": null,
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "device_nodes": "sr0",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "human_readable_size": "482.00 KB",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "id_bus": "ata",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "model": "QEMU DVD-ROM",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "nr_requests": "2",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "partitions": {},
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "path": "/dev/sr0",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "removable": "1",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "rev": "2.5+",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "ro": "0",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "rotational": "1",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "sas_address": "",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "sas_device_handle": "",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "scheduler_mode": "mq-deadline",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "sectors": 0,
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "sectorsize": "2048",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "size": 493568.0,
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "support_discard": "0",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "type": "disk",
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:             "vendor": "QEMU"
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:         }
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]:     }
Dec 06 10:01:48 np0005548789.localdomain strange_noyce[284447]: ]
Dec 06 10:01:48 np0005548789.localdomain systemd[1]: libpod-96ec1a47279a6e4babe7c1a9375b54f2f4d4bbf7b494b570b422b5bd442aa01d.scope: Deactivated successfully.
Dec 06 10:01:48 np0005548789.localdomain systemd[1]: libpod-96ec1a47279a6e4babe7c1a9375b54f2f4d4bbf7b494b570b422b5bd442aa01d.scope: Consumed 1.127s CPU time.
Dec 06 10:01:48 np0005548789.localdomain podman[284432]: 2025-12-06 10:01:48.66703335 +0000 UTC m=+1.215458435 container died 96ec1a47279a6e4babe7c1a9375b54f2f4d4bbf7b494b570b422b5bd442aa01d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_noyce, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, RELEASE=main, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph)
Dec 06 10:01:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-809ccad5b5d736bb19e7356ddf27631debaa53da8e797dc65df204e66d710152-merged.mount: Deactivated successfully.
Dec 06 10:01:48 np0005548789.localdomain podman[286393]: 2025-12-06 10:01:48.768716211 +0000 UTC m=+0.085669807 container remove 96ec1a47279a6e4babe7c1a9375b54f2f4d4bbf7b494b570b422b5bd442aa01d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_noyce, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, version=7, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:01:48 np0005548789.localdomain systemd[1]: libpod-conmon-96ec1a47279a6e4babe7c1a9375b54f2f4d4bbf7b494b570b422b5bd442aa01d.scope: Deactivated successfully.
Dec 06 10:01:48 np0005548789.localdomain sudo[284291]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:48 np0005548789.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:1e:1a:76 MACDST=fa:16:3e:66:7f:12 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.107 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26400 DF PROTO=TCP SPT=38450 DPT=9102 SEQ=4212794354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A52E8C3EF0000000001030307) 
Dec 06 10:01:49 np0005548789.localdomain sudo[286407]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:01:49 np0005548789.localdomain sudo[286407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:01:49 np0005548789.localdomain sudo[286407]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:49 np0005548789.localdomain sudo[286425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:01:49 np0005548789.localdomain sudo[286425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:01:49 np0005548789.localdomain sudo[286425]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:49.651 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:49 np0005548789.localdomain sudo[286443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:01:49 np0005548789.localdomain sudo[286443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:01:50 np0005548789.localdomain sudo[286443]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:50 np0005548789.localdomain sshd[286493]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:01:51 np0005548789.localdomain sudo[286495]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:01:51 np0005548789.localdomain sudo[286495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:01:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:01:51 np0005548789.localdomain sudo[286495]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:51 np0005548789.localdomain podman[286513]: 2025-12-06 10:01:51.200036182 +0000 UTC m=+0.093835769 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 10:01:51 np0005548789.localdomain podman[286513]: 2025-12-06 10:01:51.213405925 +0000 UTC m=+0.107205513 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd)
Dec 06 10:01:51 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:01:51 np0005548789.localdomain sshd[286493]: Received disconnect from 64.227.102.57 port 35760:11: Bye Bye [preauth]
Dec 06 10:01:51 np0005548789.localdomain sshd[286493]: Disconnected from authenticating user root 64.227.102.57 port 35760 [preauth]
Dec 06 10:01:51 np0005548789.localdomain sshd[286532]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:01:51 np0005548789.localdomain sshd[286534]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:01:52 np0005548789.localdomain sshd[286536]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:01:52 np0005548789.localdomain sshd[286536]: Accepted publickey for tripleo-admin from 192.168.122.11 port 44940 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:01:52 np0005548789.localdomain systemd-logind[766]: New session 63 of user tripleo-admin.
Dec 06 10:01:52 np0005548789.localdomain systemd[1]: Created slice User Slice of UID 1003.
Dec 06 10:01:52 np0005548789.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Dec 06 10:01:52 np0005548789.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Dec 06 10:01:52 np0005548789.localdomain systemd[1]: Starting User Manager for UID 1003...
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: Queued start job for default target Main User Target.
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: Created slice User Application Slice.
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: Reached target Paths.
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: Reached target Timers.
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: Starting D-Bus User Message Bus Socket...
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: Starting Create User's Volatile Files and Directories...
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: Listening on D-Bus User Message Bus Socket.
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: Reached target Sockets.
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: Finished Create User's Volatile Files and Directories.
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: Reached target Basic System.
Dec 06 10:01:52 np0005548789.localdomain systemd[1]: Started User Manager for UID 1003.
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: Reached target Main User Target.
Dec 06 10:01:52 np0005548789.localdomain systemd[286540]: Startup finished in 162ms.
Dec 06 10:01:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:52.838 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:52 np0005548789.localdomain systemd[1]: Started Session 63 of User tripleo-admin.
Dec 06 10:01:52 np0005548789.localdomain sshd[286536]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 06 10:01:53 np0005548789.localdomain sudo[286681]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwyoiqwgzarkbnhqycktmmprpihzuzma ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765015312.9391618-60059-143052692344022/AnsiballZ_blockinfile.py
Dec 06 10:01:53 np0005548789.localdomain sudo[286681]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 10:01:53 np0005548789.localdomain sshd[286534]: Received disconnect from 118.219.234.233 port 43064:11: Bye Bye [preauth]
Dec 06 10:01:53 np0005548789.localdomain sshd[286534]: Disconnected from authenticating user root 118.219.234.233 port 43064 [preauth]
Dec 06 10:01:53 np0005548789.localdomain sshd[286532]: Received disconnect from 14.194.101.210 port 43830:11: Bye Bye [preauth]
Dec 06 10:01:53 np0005548789.localdomain sshd[286532]: Disconnected from authenticating user root 14.194.101.210 port 43830 [preauth]
Dec 06 10:01:53 np0005548789.localdomain python3[286683]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"
                                                          # 100 ceph_dashboard (8443)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"
                                                          # 100 ceph_grafana (3100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"
                                                          # 100 ceph_prometheus (9092)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"
                                                          # 100 ceph_rgw (8080)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"
                                                          # 110 ceph_mon (6789, 3300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"
                                                          # 112 ceph_mds (6800-7300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"
                                                          # 113 ceph_mgr (6800-7300, 8444)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"
                                                          # 120 ceph_nfs (2049, 12049)
                                                          add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"
                                                          # 123 ceph_dashboard (9090, 9094, 9283)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"
                                                           insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 10:01:53 np0005548789.localdomain sudo[286681]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:01:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:01:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:01:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149555 "" "Go-http-client/1.1"
Dec 06 10:01:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:01:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17746 "" "Go-http-client/1.1"
Dec 06 10:01:53 np0005548789.localdomain sudo[286825]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucjjjbgphcztzhxxbnymgpmjubebjkhe ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765015313.5609236-60073-240986986801442/AnsiballZ_systemd.py
Dec 06 10:01:53 np0005548789.localdomain sudo[286825]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 10:01:54 np0005548789.localdomain python3[286827]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 10:01:54 np0005548789.localdomain systemd[1]: Stopping Netfilter Tables...
Dec 06 10:01:54 np0005548789.localdomain systemd[1]: nftables.service: Deactivated successfully.
Dec 06 10:01:54 np0005548789.localdomain systemd[1]: Stopped Netfilter Tables.
Dec 06 10:01:54 np0005548789.localdomain systemd[1]: Starting Netfilter Tables...
Dec 06 10:01:54 np0005548789.localdomain systemd[1]: Finished Netfilter Tables.
Dec 06 10:01:54 np0005548789.localdomain sudo[286825]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:54.653 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:01:55 np0005548789.localdomain podman[286851]: 2025-12-06 10:01:55.933971719 +0000 UTC m=+0.090658151 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:01:55 np0005548789.localdomain podman[286851]: 2025-12-06 10:01:55.942942706 +0000 UTC m=+0.099629138 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:01:55 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:01:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:57.894 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:01:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:01:59.657 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:00 np0005548789.localdomain sudo[286875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:00 np0005548789.localdomain sudo[286875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:00 np0005548789.localdomain sudo[286875]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:02:01 np0005548789.localdomain podman[286893]: 2025-12-06 10:02:01.925991526 +0000 UTC m=+0.083389627 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:02:01 np0005548789.localdomain podman[286893]: 2025-12-06 10:02:01.990178479 +0000 UTC m=+0.147576560 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 06 10:02:02 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:02:02 np0005548789.localdomain sudo[286917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:02 np0005548789.localdomain sudo[286917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:02 np0005548789.localdomain sudo[286917]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:02.934 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:03 np0005548789.localdomain sudo[286935]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:03 np0005548789.localdomain sudo[286935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:03 np0005548789.localdomain sudo[286935]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:04 np0005548789.localdomain sshd[286953]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:02:04 np0005548789.localdomain sudo[286955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:04 np0005548789.localdomain sudo[286955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:04 np0005548789.localdomain sudo[286955]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:04.659 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:05 np0005548789.localdomain sshd[286953]: Received disconnect from 154.113.10.34 port 55838:11: Bye Bye [preauth]
Dec 06 10:02:05 np0005548789.localdomain sshd[286953]: Disconnected from authenticating user root 154.113.10.34 port 55838 [preauth]
Dec 06 10:02:05 np0005548789.localdomain sudo[286973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:05 np0005548789.localdomain sudo[286973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:05 np0005548789.localdomain sudo[286973]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:07 np0005548789.localdomain sudo[286991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:07 np0005548789.localdomain sudo[286991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:07 np0005548789.localdomain sudo[286991]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:07.975 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:09 np0005548789.localdomain sshd[287009]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:02:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:09.662 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:02:09 np0005548789.localdomain podman[287011]: 2025-12-06 10:02:09.926699135 +0000 UTC m=+0.082165065 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec 06 10:02:09 np0005548789.localdomain podman[287011]: 2025-12-06 10:02:09.934971627 +0000 UTC m=+0.090437557 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:02:09 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:02:10 np0005548789.localdomain sudo[287030]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:02:10 np0005548789.localdomain sudo[287030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:10 np0005548789.localdomain sudo[287030]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:10 np0005548789.localdomain sudo[287048]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:02:10 np0005548789.localdomain sudo[287048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:11 np0005548789.localdomain sshd[287009]: Received disconnect from 45.78.222.162 port 52374:11: Bye Bye [preauth]
Dec 06 10:02:11 np0005548789.localdomain sshd[287009]: Disconnected from authenticating user root 45.78.222.162 port 52374 [preauth]
Dec 06 10:02:11 np0005548789.localdomain podman[287107]: 
Dec 06 10:02:11 np0005548789.localdomain podman[287107]: 2025-12-06 10:02:11.266404804 +0000 UTC m=+0.078799048 container create 94c2f988328a57cd55ef492179cdeb25f438a62c1071b04773f768324fe76e28 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_dirac, vcs-type=git, release=1763362218, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, version=7, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: Started libpod-conmon-94c2f988328a57cd55ef492179cdeb25f438a62c1071b04773f768324fe76e28.scope.
Dec 06 10:02:11 np0005548789.localdomain podman[287107]: 2025-12-06 10:02:11.232216181 +0000 UTC m=+0.044610455 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:02:11 np0005548789.localdomain podman[287107]: 2025-12-06 10:02:11.363653437 +0000 UTC m=+0.176047681 container init 94c2f988328a57cd55ef492179cdeb25f438a62c1071b04773f768324fe76e28 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_dirac, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:02:11 np0005548789.localdomain podman[287107]: 2025-12-06 10:02:11.385963844 +0000 UTC m=+0.198358088 container start 94c2f988328a57cd55ef492179cdeb25f438a62c1071b04773f768324fe76e28 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_dirac, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, ceph=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, distribution-scope=public, release=1763362218)
Dec 06 10:02:11 np0005548789.localdomain podman[287107]: 2025-12-06 10:02:11.387000816 +0000 UTC m=+0.199395100 container attach 94c2f988328a57cd55ef492179cdeb25f438a62c1071b04773f768324fe76e28 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_dirac, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, release=1763362218, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container)
Dec 06 10:02:11 np0005548789.localdomain hungry_dirac[287122]: 167 167
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: libpod-94c2f988328a57cd55ef492179cdeb25f438a62c1071b04773f768324fe76e28.scope: Deactivated successfully.
Dec 06 10:02:11 np0005548789.localdomain podman[287107]: 2025-12-06 10:02:11.392812221 +0000 UTC m=+0.205206495 container died 94c2f988328a57cd55ef492179cdeb25f438a62c1071b04773f768324fe76e28 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_dirac, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.expose-services=, version=7, io.buildah.version=1.41.4, distribution-scope=public, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph)
Dec 06 10:02:11 np0005548789.localdomain podman[287128]: 2025-12-06 10:02:11.50319405 +0000 UTC m=+0.095708926 container remove 94c2f988328a57cd55ef492179cdeb25f438a62c1071b04773f768324fe76e28 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_dirac, RELEASE=main, version=7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, distribution-scope=public, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, release=1763362218, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: libpod-conmon-94c2f988328a57cd55ef492179cdeb25f438a62c1071b04773f768324fe76e28.scope: Deactivated successfully.
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 10:02:11 np0005548789.localdomain systemd-sysv-generator[287172]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:02:11 np0005548789.localdomain systemd-rc-local-generator[287168]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: tmp-crun.jo1AJq.mount: Deactivated successfully.
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e2a60bd4c290137f9d1be689aa19aba87cbb2a2866b2c9ba1410b595ca8495b9-merged.mount: Deactivated successfully.
Dec 06 10:02:11 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 10:02:12 np0005548789.localdomain systemd-rc-local-generator[287208]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:02:12 np0005548789.localdomain systemd-sysv-generator[287211]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:02:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:02:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:12 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:02:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:02:12 np0005548789.localdomain systemd[1]: Starting Ceph mds.mds.np0005548789.vxwwsq for 1939e851-b10c-5c3b-9bb7-8e7f380233e8...
Dec 06 10:02:12 np0005548789.localdomain podman[287224]: 2025-12-06 10:02:12.395166549 +0000 UTC m=+0.093443122 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:02:12 np0005548789.localdomain podman[287224]: 2025-12-06 10:02:12.40470043 +0000 UTC m=+0.102977043 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:02:12 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:02:12 np0005548789.localdomain podman[287295]: 
Dec 06 10:02:12 np0005548789.localdomain podman[287295]: 2025-12-06 10:02:12.70820238 +0000 UTC m=+0.078517399 container create 96c037b833c1c97d0b308664988eb916ff49aa4e1f4308925de2d06b5c221fa3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mds-mds-np0005548789-vxwwsq, release=1763362218, distribution-scope=public, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, name=rhceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_BRANCH=main, version=7, ceph=True, architecture=x86_64, io.openshift.tags=rhceph ceph, RELEASE=main)
Dec 06 10:02:12 np0005548789.localdomain systemd[1]: tmp-crun.0mtcSA.mount: Deactivated successfully.
Dec 06 10:02:12 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d83fca0385576280c6ade8d0d4d93ad0d6e7af09ff9b2873234bbb7d6beeec03/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:02:12 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d83fca0385576280c6ade8d0d4d93ad0d6e7af09ff9b2873234bbb7d6beeec03/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:02:12 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d83fca0385576280c6ade8d0d4d93ad0d6e7af09ff9b2873234bbb7d6beeec03/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 10:02:12 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d83fca0385576280c6ade8d0d4d93ad0d6e7af09ff9b2873234bbb7d6beeec03/merged/var/lib/ceph/mds/ceph-mds.np0005548789.vxwwsq supports timestamps until 2038 (0x7fffffff)
Dec 06 10:02:12 np0005548789.localdomain podman[287295]: 2025-12-06 10:02:12.77003029 +0000 UTC m=+0.140345319 container init 96c037b833c1c97d0b308664988eb916ff49aa4e1f4308925de2d06b5c221fa3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mds-mds-np0005548789-vxwwsq, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, release=1763362218, io.openshift.tags=rhceph ceph, vcs-type=git, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:02:12 np0005548789.localdomain podman[287295]: 2025-12-06 10:02:12.675049249 +0000 UTC m=+0.045364298 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:02:12 np0005548789.localdomain podman[287295]: 2025-12-06 10:02:12.776977979 +0000 UTC m=+0.147293008 container start 96c037b833c1c97d0b308664988eb916ff49aa4e1f4308925de2d06b5c221fa3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mds-mds-np0005548789-vxwwsq, version=7, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, release=1763362218, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, ceph=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:02:12 np0005548789.localdomain bash[287295]: 96c037b833c1c97d0b308664988eb916ff49aa4e1f4308925de2d06b5c221fa3
Dec 06 10:02:12 np0005548789.localdomain systemd[1]: Started Ceph mds.mds.np0005548789.vxwwsq for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 10:02:12 np0005548789.localdomain sudo[287048]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:12 np0005548789.localdomain ceph-mds[287313]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 10:02:12 np0005548789.localdomain ceph-mds[287313]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mds, pid 2
Dec 06 10:02:12 np0005548789.localdomain ceph-mds[287313]: main not setting numa affinity
Dec 06 10:02:12 np0005548789.localdomain ceph-mds[287313]: pidfile_write: ignore empty --pid-file
Dec 06 10:02:12 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mds-mds-np0005548789-vxwwsq[287309]: starting mds.mds.np0005548789.vxwwsq at 
Dec 06 10:02:12 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Updating MDS map to version 7 from mon.0
Dec 06 10:02:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:13.026 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:13 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Updating MDS map to version 8 from mon.0
Dec 06 10:02:13 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Monitors have assigned me to become a standby.
Dec 06 10:02:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:14.664 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:14 np0005548789.localdomain sshd[284056]: Received disconnect from 38.102.83.114 port 35214:11: disconnected by user
Dec 06 10:02:14 np0005548789.localdomain sshd[284056]: Disconnected from user zuul 38.102.83.114 port 35214
Dec 06 10:02:14 np0005548789.localdomain sshd[284053]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:02:14 np0005548789.localdomain systemd[1]: session-62.scope: Deactivated successfully.
Dec 06 10:02:14 np0005548789.localdomain systemd-logind[766]: Session 62 logged out. Waiting for processes to exit.
Dec 06 10:02:14 np0005548789.localdomain systemd-logind[766]: Removed session 62.
Dec 06 10:02:15 np0005548789.localdomain sudo[287333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:15 np0005548789.localdomain sudo[287333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:15 np0005548789.localdomain sudo[287333]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:15 np0005548789.localdomain sudo[287351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:02:15 np0005548789.localdomain sudo[287351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:15 np0005548789.localdomain sudo[287351]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:15 np0005548789.localdomain sudo[287369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:02:15 np0005548789.localdomain sudo[287369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:02:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:02:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:02:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:02:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:02:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:02:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:02:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:02:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:02:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:02:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:02:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:02:16 np0005548789.localdomain systemd[1]: tmp-crun.hhBGZd.mount: Deactivated successfully.
Dec 06 10:02:16 np0005548789.localdomain podman[287458]: 2025-12-06 10:02:16.859565221 +0000 UTC m=+0.094228428 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, GIT_BRANCH=main, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, version=7)
Dec 06 10:02:16 np0005548789.localdomain podman[287458]: 2025-12-06 10:02:16.995313473 +0000 UTC m=+0.229976650 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, name=rhceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, CEPH_POINT_RELEASE=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64)
Dec 06 10:02:17 np0005548789.localdomain sudo[287369]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:02:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:02:17 np0005548789.localdomain podman[287541]: 2025-12-06 10:02:17.734532551 +0000 UTC m=+0.072316553 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:02:17 np0005548789.localdomain podman[287541]: 2025-12-06 10:02:17.745117647 +0000 UTC m=+0.082901639 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:02:17 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:02:17 np0005548789.localdomain podman[287540]: 2025-12-06 10:02:17.805443418 +0000 UTC m=+0.141494425 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public)
Dec 06 10:02:17 np0005548789.localdomain podman[287540]: 2025-12-06 10:02:17.820076633 +0000 UTC m=+0.156127640 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, release=1755695350, version=9.6, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container)
Dec 06 10:02:17 np0005548789.localdomain sudo[287573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:17 np0005548789.localdomain sudo[287573]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:17 np0005548789.localdomain sudo[287573]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:17 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:02:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:18.063 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:18 np0005548789.localdomain sudo[287598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:18 np0005548789.localdomain sudo[287598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:02:18 np0005548789.localdomain sudo[287598]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:19.668 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:02:21 np0005548789.localdomain systemd[1]: tmp-crun.5s7p4i.mount: Deactivated successfully.
Dec 06 10:02:21 np0005548789.localdomain podman[287616]: 2025-12-06 10:02:21.922740158 +0000 UTC m=+0.086470121 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd)
Dec 06 10:02:21 np0005548789.localdomain podman[287616]: 2025-12-06 10:02:21.937781915 +0000 UTC m=+0.101511878 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2)
Dec 06 10:02:21 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:02:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:23.125 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:02:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:02:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:02:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 151703 "" "Go-http-client/1.1"
Dec 06 10:02:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:02:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18230 "" "Go-http-client/1.1"
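The two access-log lines above record libpod REST calls against the podman API socket; the preceding info message ("received `last` parameter - overwriting `limit`") refers to the `last=0` query parameter in the first request. A minimal sketch of pulling those parameters apart with the Python standard library (the request target is copied verbatim from the log; no podman socket is touched):

```python
from urllib.parse import urlsplit, parse_qs

# Request target exactly as logged for the libpod list-containers call.
request_target = ("/v4.9.3/libpod/containers/json"
                  "?all=true&external=false&last=0&namespace=false&size=false&sync=false")

parts = urlsplit(request_target)
# parse_qs maps each key to a list of values; flatten to the first value.
params = {k: v[0] for k, v in parse_qs(parts.query).items()}

print(parts.path)       # the versioned libpod endpoint
print(params["last"])   # the parameter the "overwriting `limit`" message refers to
```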
Dec 06 10:02:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:24.672 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:26.196 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:26.197 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:26.216 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:26.217 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:02:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:26.217 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:02:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:02:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:26.861 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:02:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:26.861 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:02:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:26.861 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:02:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:26.862 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:02:26 np0005548789.localdomain podman[287635]: 2025-12-06 10:02:26.928494306 +0000 UTC m=+0.090453757 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:02:26 np0005548789.localdomain podman[287635]: 2025-12-06 10:02:26.962798693 +0000 UTC m=+0.124758174 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:02:26 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
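The node_exporter container above is launched with `--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service`, which restricts the systemd collector to matching units. A sketch of which unit names that pattern selects, assuming the collector anchors the expression at both ends (the sample unit names are illustrative, not taken from this host):

```python
import re

# Filter value from the node_exporter config_data above, anchored for a full match
# (an assumption about how the collector applies include patterns).
unit_include = re.compile(r"^(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service$")

units = ["edpm_nova_compute.service", "ovsdb-server.service",
         "virtqemud.service", "sshd.service", "rsyslog.service"]
matched = [u for u in units if unit_include.match(u)]
print(matched)  # everything except sshd.service
```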
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.288 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.314 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.315 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
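The `Updating instance_info_cache with network_info` line above carries the full VIF payload for instance b7ed0a2e. A small sketch of reading the fixed and floating addresses out of that structure, with the payload trimmed to the fields inspected (values copied from the log line):

```python
import json

# network_info from the info-cache refresh above, reduced to the inspected fields.
network_info = json.loads("""[{
  "id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b",
  "address": "fa:16:3e:64:77:f3",
  "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int",
    "subnets": [{"cidr": "192.168.0.0/24",
      "ips": [{"address": "192.168.0.162", "type": "fixed",
               "floating_ips": [{"address": "192.168.122.20", "type": "floating"}]}]}]},
  "type": "ovs", "devname": "tap86fc0b7a-fb"
}]""")

fixed = [ip["address"]
         for vif in network_info
         for subnet in vif["network"]["subnets"]
         for ip in subnet["ips"] if ip["type"] == "fixed"]
floating = [fip["address"]
            for vif in network_info
            for subnet in vif["network"]["subnets"]
            for ip in subnet["ips"]
            for fip in ip.get("floating_ips", [])]

print(fixed)     # fixed address on the private subnet
print(floating)  # floating address mapped onto it
```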
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.316 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.316 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.316 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.316 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.317 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.317 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.317 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.317 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.338 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.338 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.339 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.339 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.340 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.794 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
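The resource tracker shells out to `ceph df --format=json` above to size the RBD-backed disk pool. A sketch of deriving a free-space figure from such output; the payload here is a hypothetical, heavily trimmed stand-in with round numbers, not the actual response on this host:

```python
import json

# Hypothetical, trimmed stand-in for `ceph df --format=json` output;
# field names follow the command's "stats"/"pools" layout, values are invented.
sample = json.loads("""{
  "stats": {"total_bytes": 64424509440, "total_avail_bytes": 45097156608},
  "pools": [{"name": "vms", "stats": {"bytes_used": 2147483648}}]
}""")

GiB = 1024 ** 3
free_gb = sample["stats"]["total_avail_bytes"] / GiB  # bytes -> GiB
print(free_gb)
```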
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.876 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:02:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:27.877 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:02:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:28.114 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:02:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:28.116 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11984MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:02:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:28.116 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:02:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:28.117 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:02:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:28.169 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:28.199 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:02:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:28.199 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:02:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:28.200 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:02:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:28.241 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:02:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:28.714 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:02:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:28.722 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:02:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:28.738 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:02:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:28.741 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:02:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:28.741 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:02:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:29.675 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:02:32 np0005548789.localdomain systemd[1]: tmp-crun.LzSNNO.mount: Deactivated successfully.
Dec 06 10:02:32 np0005548789.localdomain podman[287700]: 2025-12-06 10:02:32.934121064 +0000 UTC m=+0.095935652 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:02:33 np0005548789.localdomain podman[287700]: 2025-12-06 10:02:33.021282517 +0000 UTC m=+0.183097105 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2)
Dec 06 10:02:33 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:02:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:33.171 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:34.678 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:38.221 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:39.680 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:02:40 np0005548789.localdomain podman[287723]: 2025-12-06 10:02:40.929324707 +0000 UTC m=+0.088226097 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 10:02:40 np0005548789.localdomain podman[287723]: 2025-12-06 10:02:40.963341515 +0000 UTC m=+0.122242925 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 06 10:02:40 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:02:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:02:42 np0005548789.localdomain podman[287740]: 2025-12-06 10:02:42.922356713 +0000 UTC m=+0.083840368 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:02:42 np0005548789.localdomain podman[287740]: 2025-12-06 10:02:42.93014038 +0000 UTC m=+0.091624075 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:02:42 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:02:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:43.266 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:44.683 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:02:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:02:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:02:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:02:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:02:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:02:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:02:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:02:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:02:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:02:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:02:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:02:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:02:47.292 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:02:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:02:47.292 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:02:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:02:47.293 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:02:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:02:47 np0005548789.localdomain podman[287764]: 2025-12-06 10:02:47.910830494 +0000 UTC m=+0.078135438 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 06 10:02:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:02:47 np0005548789.localdomain podman[287764]: 2025-12-06 10:02:47.92017674 +0000 UTC m=+0.087481624 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:02:47 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:02:48 np0005548789.localdomain podman[287783]: 2025-12-06 10:02:48.002897302 +0000 UTC m=+0.077020723 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Dec 06 10:02:48 np0005548789.localdomain podman[287783]: 2025-12-06 10:02:48.042124955 +0000 UTC m=+0.116248386 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, config_id=edpm, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.buildah.version=1.33.7, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:02:48 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:02:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:48.306 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:49.686 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:51 np0005548789.localdomain sshd[287803]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:02:52 np0005548789.localdomain sshd[287803]: Received disconnect from 64.227.102.57 port 43472:11: Bye Bye [preauth]
Dec 06 10:02:52 np0005548789.localdomain sshd[287803]: Disconnected from authenticating user root 64.227.102.57 port 43472 [preauth]
Dec 06 10:02:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:02:52 np0005548789.localdomain systemd[1]: tmp-crun.0tOSuq.mount: Deactivated successfully.
Dec 06 10:02:52 np0005548789.localdomain podman[287805]: 2025-12-06 10:02:52.626991154 +0000 UTC m=+0.095760476 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:02:52 np0005548789.localdomain podman[287805]: 2025-12-06 10:02:52.667470587 +0000 UTC m=+0.136239879 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:02:52 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:02:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:53.347 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:02:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:02:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:02:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 151703 "" "Go-http-client/1.1"
Dec 06 10:02:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:02:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18235 "" "Go-http-client/1.1"
Dec 06 10:02:54 np0005548789.localdomain sshd[286556]: Received disconnect from 192.168.122.11 port 44940:11: disconnected by user
Dec 06 10:02:54 np0005548789.localdomain sshd[286556]: Disconnected from user tripleo-admin 192.168.122.11 port 44940
Dec 06 10:02:54 np0005548789.localdomain sshd[286536]: pam_unix(sshd:session): session closed for user tripleo-admin
Dec 06 10:02:54 np0005548789.localdomain systemd[1]: session-63.scope: Deactivated successfully.
Dec 06 10:02:54 np0005548789.localdomain systemd[1]: session-63.scope: Consumed 1.198s CPU time.
Dec 06 10:02:54 np0005548789.localdomain systemd-logind[766]: Session 63 logged out. Waiting for processes to exit.
Dec 06 10:02:54 np0005548789.localdomain systemd-logind[766]: Removed session 63.
Dec 06 10:02:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:54.688 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:02:57 np0005548789.localdomain systemd[1]: tmp-crun.R1ySP1.mount: Deactivated successfully.
Dec 06 10:02:57 np0005548789.localdomain podman[287824]: 2025-12-06 10:02:57.923532959 +0000 UTC m=+0.084162538 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:02:57 np0005548789.localdomain podman[287824]: 2025-12-06 10:02:57.960142399 +0000 UTC m=+0.120771968 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:02:57 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:02:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:58.350 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:02:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:02:59.692 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:03.394 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:03:03 np0005548789.localdomain systemd[1]: tmp-crun.AKE2yp.mount: Deactivated successfully.
Dec 06 10:03:03 np0005548789.localdomain podman[287847]: 2025-12-06 10:03:03.936424896 +0000 UTC m=+0.097041865 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 10:03:03 np0005548789.localdomain podman[287847]: 2025-12-06 10:03:03.981235737 +0000 UTC m=+0.141852716 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:03:03 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:03:04 np0005548789.localdomain systemd[286540]: Activating special unit Exit the Session...
Dec 06 10:03:04 np0005548789.localdomain systemd[286540]: Stopped target Main User Target.
Dec 06 10:03:04 np0005548789.localdomain systemd[286540]: Stopped target Basic System.
Dec 06 10:03:04 np0005548789.localdomain systemd[286540]: Stopped target Paths.
Dec 06 10:03:04 np0005548789.localdomain systemd[286540]: Stopped target Sockets.
Dec 06 10:03:04 np0005548789.localdomain systemd[286540]: Stopped target Timers.
Dec 06 10:03:04 np0005548789.localdomain systemd[286540]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:03:04 np0005548789.localdomain systemd[286540]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 10:03:04 np0005548789.localdomain systemd[286540]: Closed D-Bus User Message Bus Socket.
Dec 06 10:03:04 np0005548789.localdomain systemd[286540]: Stopped Create User's Volatile Files and Directories.
Dec 06 10:03:04 np0005548789.localdomain systemd[286540]: Removed slice User Application Slice.
Dec 06 10:03:04 np0005548789.localdomain systemd[286540]: Reached target Shutdown.
Dec 06 10:03:04 np0005548789.localdomain systemd[286540]: Finished Exit the Session.
Dec 06 10:03:04 np0005548789.localdomain systemd[286540]: Reached target Exit the Session.
Dec 06 10:03:04 np0005548789.localdomain systemd[1]: Stopping User Manager for UID 1003...
Dec 06 10:03:04 np0005548789.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Dec 06 10:03:04 np0005548789.localdomain systemd[1]: Stopped User Manager for UID 1003.
Dec 06 10:03:04 np0005548789.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Dec 06 10:03:04 np0005548789.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Dec 06 10:03:04 np0005548789.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Dec 06 10:03:04 np0005548789.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Dec 06 10:03:04 np0005548789.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Dec 06 10:03:04 np0005548789.localdomain systemd[1]: user-1003.slice: Consumed 1.603s CPU time.
Dec 06 10:03:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:04.694 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.913 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.928 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.928 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13f6fdb2-456d-47ec-9330-a95c0ba65552', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:03:07.914836', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3b5e894-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.16420738, 'message_signature': '599b141497833b5385a273a06061ace21a76d5a54509ace503ca9d03fb5e461c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:03:07.914836', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3b5feec-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.16420738, 'message_signature': 'e8139afe6989ee3672d1f46d1dc4e53a3c49c115a978398e0d6b5925b3a57147'}]}, 'timestamp': '2025-12-06 10:03:07.929185', '_unique_id': '7439f0d9478b4000aaaf396437129d4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.930 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.962 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.962 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d3de8fa-e54a-4c6d-becc-584270e7a0e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:03:07.932093', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3bb1a08-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.181489178, 'message_signature': '004ac31b0766345c8583d0ab3368d5085e0fd39f4048bca3d16cae5fdb0604fd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:03:07.932093', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3bb2f34-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.181489178, 'message_signature': 'fddb347c98978c86506e3552fbc830319dec7092d83e62e3403d60fb6e5801af'}]}, 'timestamp': '2025-12-06 10:03:07.963210', '_unique_id': 'f6d1631778c04c6bbc5aecd99a184b94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.965 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.970 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9224698a-29b3-45aa-85e7-b711659659f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:03:07.966007', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'c3bc576a-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.215378872, 'message_signature': 'aa436643508d51a8c44e4551ea655fa6c807e094d4208d4d07771bbf10013ef5'}]}, 'timestamp': '2025-12-06 10:03:07.970835', '_unique_id': '88ee8194e2f94aab8fa0968137a9102d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.971 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.973 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.973 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.973 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f081c00a-bea2-40d3-9141-70657da5f795', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:03:07.973404', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3bcd0dc-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.181489178, 'message_signature': '90e213660852df2e2d80c5d8c7d53428cb3cf4838ed84211f0146dc6a41efcd9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:03:07.973404', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3bce5c2-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.181489178, 'message_signature': 'a1d40775d4352163497852ca3c1a690b722cf170b93fda57123336d14f2b4888'}]}, 'timestamp': '2025-12-06 10:03:07.974404', '_unique_id': '3c729fc6ddab49e9aba8e05e6d553e55'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.976 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.977 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.977 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7915351a-028c-49d5-8ab3-72a897c081dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:03:07.977099', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3bd60d8-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.16420738, 'message_signature': '9f521a7689f47ac55b931e3c3c5e5f939bf79e28ce0aa7698ffbad664832846e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:03:07.977099', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3bd741a-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.16420738, 'message_signature': '96073340772ea14d395cebdda50b366584f298d607009d8bb6396c77444887c9'}]}, 'timestamp': '2025-12-06 10:03:07.978049', '_unique_id': '00b50c79648549c89c590c135bc04202'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.979 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.980 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.980 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.980 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c32e7842-b93a-47ea-afe2-090c524dd337', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:03:07.980318', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3bdde5a-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.181489178, 'message_signature': '87d0ddf1ac92e75028e156906933d6e14c258b6dcf360910201ea2b848ac6fad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:03:07.980318', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3bdf0a2-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.181489178, 'message_signature': '22f0a766fed45840276729b3eb6d207dfa24a1aa0619fa3b272147934a36d50c'}]}, 'timestamp': '2025-12-06 10:03:07.981228', '_unique_id': '7920bc547c374c948c490e0741b61a03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.982 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:07.983 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.001 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 12400000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c98323f6-dd5b-433e-b4fe-dd0517e3c8e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12400000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:03:07.983510', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'c3c1184a-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.250542145, 'message_signature': '48088b4e6e73cedc944ed6e2ec84728dcadd5d3976add3ac35b36ec1f5011b73'}]}, 'timestamp': '2025-12-06 10:03:08.001999', '_unique_id': '0f352e46940842b4923e45d08a5d7d28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.003 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.004 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.004 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '230bf5b4-b2a1-41db-b0c8-7a4ca15e4970', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:03:08.004705', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'c3c199a0-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.215378872, 'message_signature': '33fbd59f4055943ceb38102ea5a7174f3a649ac23ccfe28f30f03c6993597296'}]}, 'timestamp': '2025-12-06 10:03:08.005250', '_unique_id': '99564e31fc5d478c86624c0efa551b60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.006 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.007 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.007 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e423e29b-20fb-4894-9dcd-1d619c9bda96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:03:08.007550', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'c3c207d2-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.250542145, 'message_signature': '431453fd96e57d3bee75b3c7f6d2b82352865861b04c04f925db9f9ded3ba1a6'}]}, 'timestamp': '2025-12-06 10:03:08.008083', '_unique_id': '8a5d455c2ee64345b11792ebf8395264'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.009 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.011 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.011 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55c85834-0b52-47fc-aee9-04bc76dba77f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:03:08.011491', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'c3c2a2f0-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.215378872, 'message_signature': '2616fb949d1b216a18df7d70faeb180a63cacd9a2a53abd04cfe151748eb1435'}]}, 'timestamp': '2025-12-06 10:03:08.012081', '_unique_id': 'b0c0fdf6f1ad46b7b8973b596ee1701d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.014 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.014 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd396d59-e155-49c2-8383-8f79dc236544', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:03:08.014381', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'c3c310f0-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.215378872, 'message_signature': 'ca0fa46a7088b2c1ec36d6f7cb5252f4985e265931f68abfcb65703e1945d36b'}]}, 'timestamp': '2025-12-06 10:03:08.014905', '_unique_id': '9a1beede942649f19c23a739f0ed12de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.017 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.017 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ca04f33-507c-4098-a09f-712ee608c7ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:03:08.017288', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'c3c3829c-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.215378872, 'message_signature': 'bbd4e0ae5b49c0c5e3ea19906a1aea6a2cf94ac6229a4fa883e62d9ba944f6cc'}]}, 'timestamp': '2025-12-06 10:03:08.017798', '_unique_id': '2a4b0035372f4a39a2faec4701878d3c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.018 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.020 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.020 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '141a82c5-a1aa-4e8d-bcc7-a3c55a940f81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:03:08.020298', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'c3c3f8e4-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.215378872, 'message_signature': '3e64785280cc4b922fabffd7c1de21d311b0c674c20a52b5f00cc078648368e4'}]}, 'timestamp': '2025-12-06 10:03:08.020752', '_unique_id': 'a329ef7c029c42409f07268f4885a6c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.022 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.022 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.022 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfbcb9f2-2ed0-4212-8bf0-f08fb2e55f83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:03:08.022155', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3c43d0e-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.181489178, 'message_signature': '10680dfb393f1265a7f7987ff4b45872f4b9debddc9d3dec4ae505c193222fe6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:03:08.022155', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3c44858-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.181489178, 'message_signature': 'a2b675ab7a2863f68740aaba4723caf84c4e201b10baa617f9d6c46ff709027e'}]}, 'timestamp': '2025-12-06 10:03:08.022719', '_unique_id': '2a4e1828e2f6468daf53357bff2514a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.024 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.024 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.024 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5436739a-0f65-4fdc-9365-8b4e94bae73e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:03:08.024353', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3c493ee-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.16420738, 'message_signature': '393f2506226437dce79de810726af5d94092df0e16d65f9e40cf174af7abb0b7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:03:08.024353', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3c49f24-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.16420738, 'message_signature': 'a10b5d59de1fa971380bdd28047d690fb770dcb8d2e9246b309a987a2c0387bc'}]}, 'timestamp': '2025-12-06 10:03:08.024944', '_unique_id': 'fe4dadd7612a4bdc82f970993bce9bc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.025 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.026 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.026 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.026 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1155e441-713d-45fe-8163-7fd9e8928681', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:03:08.026334', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3c4e0b0-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.181489178, 'message_signature': 'ffbc55d7d6924b3993b3559d084e7ef0fb935e59901694841d49b70bf1a2ea3b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:03:08.026334', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3c4ec04-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.181489178, 'message_signature': 'cec49a20ad55efe8a90a35129a5e1b0854c16a236914a2affd0f1dca097c1f41'}]}, 'timestamp': '2025-12-06 10:03:08.026911', '_unique_id': 'cfa26353ca6a48c282032cefa35a91a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.027 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.028 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.028 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.028 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb3afe6f-4774-4034-af17-a6a59e9074a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:03:08.028373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c3c53060-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.181489178, 'message_signature': 'a4548b992e6cf06ee32434d6c4265207933658591ff5faca57b5f5680d8139ec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:03:08.028373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c3c53c04-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.181489178, 'message_signature': '71cb780ae7bb0c8549f9a8948b6570a25d142b644e53e52085ff17287654dfe0'}]}, 'timestamp': '2025-12-06 10:03:08.028958', '_unique_id': '42b107ddd35d4f0ca675bb8fcce868e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.030 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.030 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf351536-999b-493e-94a5-13a7003370a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:03:08.030330', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'c3c57cbe-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.215378872, 'message_signature': '24da6cb5bc9014753a40c211f7ba99ddfbb1d7c54c641db5e15820a4b076b462'}]}, 'timestamp': '2025-12-06 10:03:08.030660', '_unique_id': 'deb04f4357ab4e5cbdcb6ca0f064ef9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.031 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.032 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.032 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.032 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac044595-a738-452a-8d31-55734b8513c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:03:08.032193', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'c3c5c50c-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.215378872, 'message_signature': '81dcc5ef76a517226a749719117ab46b95c757da7b30057f6a3427e22923ff3e'}]}, 'timestamp': '2025-12-06 10:03:08.032482', '_unique_id': 'cd277826f3ae46c89336627b10d18813'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.033 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '509229f2-452d-4de3-92c8-aaf7c1b5b192', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:03:08.034008', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'c3c60bfc-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.215378872, 'message_signature': '220da469df5371a0982ddd749b2c7c158d1815131859cc037fdb312722df66b4'}]}, 'timestamp': '2025-12-06 10:03:08.034296', '_unique_id': 'a7125c823c0c41ac9cfd46c275b624a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.035 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.035 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9124f985-a944-4423-90c7-4bfaf8fcfb28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:03:08.035646', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'c3c64e6e-d28a-11f0-aaf2-fa163e118844', 'monotonic_time': 11806.215378872, 'message_signature': '27a54efb36ef9767c0de17c31f631ff0b971a88e94ecd4a72723112bbe8f046d'}]}, 'timestamp': '2025-12-06 10:03:08.036001', '_unique_id': '154c0a29da6946a98082d0535517dc2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:03:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:03:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:03:08 np0005548789.localdomain sudo[287875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:08 np0005548789.localdomain sudo[287875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:08 np0005548789.localdomain sudo[287875]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:08.395 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:08 np0005548789.localdomain sudo[287893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:03:08 np0005548789.localdomain sudo[287893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:08 np0005548789.localdomain sudo[287893]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:08 np0005548789.localdomain sudo[287911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:03:08 np0005548789.localdomain sudo[287911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:09 np0005548789.localdomain sudo[287911]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:09.696 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:10 np0005548789.localdomain sshd[287961]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:03:11 np0005548789.localdomain sshd[287961]: Connection reset by authenticating user root 91.202.233.33 port 61570 [preauth]
Dec 06 10:03:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:03:11 np0005548789.localdomain sudo[287963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:11 np0005548789.localdomain sudo[287963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:11 np0005548789.localdomain sudo[287963]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:11 np0005548789.localdomain systemd[1]: tmp-crun.It7lNz.mount: Deactivated successfully.
Dec 06 10:03:11 np0005548789.localdomain podman[287979]: 2025-12-06 10:03:11.556925755 +0000 UTC m=+0.085736979 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:03:11 np0005548789.localdomain podman[287979]: 2025-12-06 10:03:11.562671887 +0000 UTC m=+0.091483181 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:03:11 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:03:11 np0005548789.localdomain sshd[288001]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:03:12 np0005548789.localdomain sudo[288003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:12 np0005548789.localdomain sudo[288003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:12 np0005548789.localdomain sudo[288003]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:12 np0005548789.localdomain sshd[288001]: Connection reset by authenticating user root 91.202.233.33 port 39278 [preauth]
Dec 06 10:03:12 np0005548789.localdomain sshd[288021]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:03:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:13.435 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:03:13 np0005548789.localdomain sshd[288021]: Invalid user oracle from 91.202.233.33 port 39292
Dec 06 10:03:13 np0005548789.localdomain podman[288023]: 2025-12-06 10:03:13.834123876 +0000 UTC m=+0.071867599 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:03:13 np0005548789.localdomain podman[288023]: 2025-12-06 10:03:13.840352383 +0000 UTC m=+0.078096086 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:03:13 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:03:14 np0005548789.localdomain sshd[288021]: Connection reset by invalid user oracle 91.202.233.33 port 39292 [preauth]
Dec 06 10:03:14 np0005548789.localdomain sshd[288047]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:03:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:14.699 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:15 np0005548789.localdomain sshd[288047]: Connection reset by authenticating user root 91.202.233.33 port 39304 [preauth]
Dec 06 10:03:15 np0005548789.localdomain sshd[288049]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:03:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:03:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:03:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:03:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:03:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:03:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:03:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:03:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:03:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:03:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:03:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:03:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:03:16 np0005548789.localdomain sshd[288049]: Connection reset by authenticating user root 91.202.233.33 port 39312 [preauth]
Dec 06 10:03:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:18.438 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:03:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:03:18 np0005548789.localdomain podman[288052]: 2025-12-06 10:03:18.944896823 +0000 UTC m=+0.092542194 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, version=9.6, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=)
Dec 06 10:03:19 np0005548789.localdomain podman[288053]: 2025-12-06 10:03:19.000875497 +0000 UTC m=+0.147194856 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:03:19 np0005548789.localdomain podman[288052]: 2025-12-06 10:03:19.017513305 +0000 UTC m=+0.165158706 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.tags=minimal rhel9, vcs-type=git)
Dec 06 10:03:19 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:03:19 np0005548789.localdomain podman[288053]: 2025-12-06 10:03:19.042261149 +0000 UTC m=+0.188580548 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:03:19 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:03:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:19.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:19.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:03:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:19.203 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:03:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:19.203 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:19.203 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:03:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:19.220 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:19.703 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:20.230 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:20 np0005548789.localdomain sshd[288092]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:03:22 np0005548789.localdomain sshd[288092]: Received disconnect from 14.194.101.210 port 37066:11: Bye Bye [preauth]
Dec 06 10:03:22 np0005548789.localdomain sshd[288092]: Disconnected from authenticating user root 14.194.101.210 port 37066 [preauth]
Dec 06 10:03:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:22.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:22.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:22.213 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:03:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:22.213 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:03:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:22.213 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:03:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:22.214 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:03:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:22.214 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:03:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:22.681 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:03:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:22.761 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:03:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:22.762 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:03:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:03:22 np0005548789.localdomain podman[288116]: 2025-12-06 10:03:22.945727472 +0000 UTC m=+0.104497363 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:03:22 np0005548789.localdomain podman[288116]: 2025-12-06 10:03:22.988316801 +0000 UTC m=+0.147086642 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:03:23 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:03:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:23.013 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:03:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:23.014 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11988MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:03:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:23.014 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:03:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:23.014 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:03:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:23.209 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:03:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:23.210 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:03:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:23.211 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:03:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:23.310 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing inventories for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:03:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:23.454 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Updating ProviderTree inventory for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:03:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:23.455 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Updating inventory in ProviderTree for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:03:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:23.472 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing aggregate associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:03:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:23.478 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:23.492 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing trait associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, traits: HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AMD_SVM,HW_CPU_X86_FMA3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:03:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:23.534 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:03:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:03:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:03:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:03:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 151703 "" "Go-http-client/1.1"
Dec 06 10:03:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:03:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18234 "" "Go-http-client/1.1"
Dec 06 10:03:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:24.032 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:03:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:24.039 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:03:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:24.059 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:03:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:24.061 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:03:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:24.062 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.047s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:03:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:24.730 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:25.062 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:25.063 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:25.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:25.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:03:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:25.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:03:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:25.613 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:03:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:25.614 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:03:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:25.614 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:03:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:25.615 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:03:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:26.054 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:03:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:26.080 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:03:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:26.081 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:03:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:26.081 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:26.082 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:26.082 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:26.083 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:03:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:28.510 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:03:28 np0005548789.localdomain podman[288156]: 2025-12-06 10:03:28.920849963 +0000 UTC m=+0.077468397 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:03:28 np0005548789.localdomain podman[288156]: 2025-12-06 10:03:28.929337681 +0000 UTC m=+0.085956115 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:03:28 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:03:28 np0005548789.localdomain sshd[288179]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:03:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:29.766 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:30 np0005548789.localdomain sshd[288179]: Received disconnect from 118.219.234.233 port 44830:11: Bye Bye [preauth]
Dec 06 10:03:30 np0005548789.localdomain sshd[288179]: Disconnected from authenticating user root 118.219.234.233 port 44830 [preauth]
Dec 06 10:03:33 np0005548789.localdomain sudo[288181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:03:33 np0005548789.localdomain sudo[288181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:33 np0005548789.localdomain sudo[288181]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:33 np0005548789.localdomain sudo[288199]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 10:03:33 np0005548789.localdomain sudo[288199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:33.563 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:33 np0005548789.localdomain sudo[288199]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:34 np0005548789.localdomain sudo[288238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:34 np0005548789.localdomain sudo[288238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:03:34 np0005548789.localdomain sudo[288238]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:34 np0005548789.localdomain podman[288256]: 2025-12-06 10:03:34.364568662 +0000 UTC m=+0.095253400 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:03:34 np0005548789.localdomain podman[288256]: 2025-12-06 10:03:34.414263967 +0000 UTC m=+0.144948765 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 06 10:03:34 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:03:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:34.808 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:34 np0005548789.localdomain sudo[288282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:34 np0005548789.localdomain sudo[288282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:34 np0005548789.localdomain sudo[288282]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:36 np0005548789.localdomain sudo[288300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:36 np0005548789.localdomain sudo[288300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:36 np0005548789.localdomain sudo[288300]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:38.566 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:39 np0005548789.localdomain sudo[288318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:03:39 np0005548789.localdomain sudo[288318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:39 np0005548789.localdomain sudo[288318]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:39 np0005548789.localdomain sudo[288336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:03:39 np0005548789.localdomain sudo[288336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:39.811 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:40 np0005548789.localdomain podman[288395]: 2025-12-06 10:03:40.336700177 +0000 UTC m=+0.081378350 container create 78ec4e6836e7745e82f90135969cf07095d81ccaa2e8338ecaa65bbec5056ef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_lederberg, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, release=1763362218, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: Started libpod-conmon-78ec4e6836e7745e82f90135969cf07095d81ccaa2e8338ecaa65bbec5056ef6.scope.
Dec 06 10:03:40 np0005548789.localdomain podman[288395]: 2025-12-06 10:03:40.302276586 +0000 UTC m=+0.046954779 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:03:40 np0005548789.localdomain podman[288395]: 2025-12-06 10:03:40.4355586 +0000 UTC m=+0.180236743 container init 78ec4e6836e7745e82f90135969cf07095d81ccaa2e8338ecaa65bbec5056ef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_lederberg, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:03:40 np0005548789.localdomain podman[288395]: 2025-12-06 10:03:40.447734257 +0000 UTC m=+0.192412410 container start 78ec4e6836e7745e82f90135969cf07095d81ccaa2e8338ecaa65bbec5056ef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_lederberg, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, RELEASE=main, architecture=x86_64, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, release=1763362218, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public)
Dec 06 10:03:40 np0005548789.localdomain podman[288395]: 2025-12-06 10:03:40.448078347 +0000 UTC m=+0.192756690 container attach 78ec4e6836e7745e82f90135969cf07095d81ccaa2e8338ecaa65bbec5056ef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_lederberg, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, ceph=True, architecture=x86_64, distribution-scope=public, name=rhceph, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:03:40 np0005548789.localdomain nostalgic_lederberg[288410]: 167 167
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: libpod-78ec4e6836e7745e82f90135969cf07095d81ccaa2e8338ecaa65bbec5056ef6.scope: Deactivated successfully.
Dec 06 10:03:40 np0005548789.localdomain podman[288395]: 2025-12-06 10:03:40.453420767 +0000 UTC m=+0.198098970 container died 78ec4e6836e7745e82f90135969cf07095d81ccaa2e8338ecaa65bbec5056ef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_lederberg, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, RELEASE=main, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.)
Dec 06 10:03:40 np0005548789.localdomain podman[288415]: 2025-12-06 10:03:40.566832741 +0000 UTC m=+0.098783092 container remove 78ec4e6836e7745e82f90135969cf07095d81ccaa2e8338ecaa65bbec5056ef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_lederberg, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, io.buildah.version=1.41.4, name=rhceph, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, version=7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: libpod-conmon-78ec4e6836e7745e82f90135969cf07095d81ccaa2e8338ecaa65bbec5056ef6.scope: Deactivated successfully.
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 10:03:40 np0005548789.localdomain systemd-rc-local-generator[288461]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:03:40 np0005548789.localdomain systemd-sysv-generator[288464]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-eecc5831ee4c8a69142491310ecb4219d97aece2676b0501a63112feb2a6e873-merged.mount: Deactivated successfully.
Dec 06 10:03:41 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 10:03:41 np0005548789.localdomain systemd-rc-local-generator[288499]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:03:41 np0005548789.localdomain systemd-sysv-generator[288503]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:03:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:03:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:41 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:03:41 np0005548789.localdomain systemd[1]: Starting Ceph mgr.np0005548789.mzhmje for 1939e851-b10c-5c3b-9bb7-8e7f380233e8...
Dec 06 10:03:41 np0005548789.localdomain podman[288563]: 2025-12-06 10:03:41.744537887 +0000 UTC m=+0.078929734 container create a5642346d3b7fe4cd6e5bbe87414b688ab8475bffb00a03636a048ebe7ffdc6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, release=1763362218, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-type=git, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, ceph=True, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, architecture=x86_64)
Dec 06 10:03:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:03:41 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abec372076723da270ee2433cd9138ae521a99d2e3c2e8c0743dbe456147a8f0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:03:41 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abec372076723da270ee2433cd9138ae521a99d2e3c2e8c0743dbe456147a8f0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:03:41 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abec372076723da270ee2433cd9138ae521a99d2e3c2e8c0743dbe456147a8f0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 10:03:41 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abec372076723da270ee2433cd9138ae521a99d2e3c2e8c0743dbe456147a8f0/merged/var/lib/ceph/mgr/ceph-np0005548789.mzhmje supports timestamps until 2038 (0x7fffffff)
Dec 06 10:03:41 np0005548789.localdomain podman[288563]: 2025-12-06 10:03:41.711905992 +0000 UTC m=+0.046297869 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:03:41 np0005548789.localdomain podman[288563]: 2025-12-06 10:03:41.815240827 +0000 UTC m=+0.149632684 container init a5642346d3b7fe4cd6e5bbe87414b688ab8475bffb00a03636a048ebe7ffdc6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje, GIT_CLEAN=True, RELEASE=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7)
Dec 06 10:03:41 np0005548789.localdomain podman[288563]: 2025-12-06 10:03:41.830672036 +0000 UTC m=+0.165063883 container start a5642346d3b7fe4cd6e5bbe87414b688ab8475bffb00a03636a048ebe7ffdc6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_BRANCH=main, architecture=x86_64, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, RELEASE=main, version=7, release=1763362218, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z)
Dec 06 10:03:41 np0005548789.localdomain bash[288563]: a5642346d3b7fe4cd6e5bbe87414b688ab8475bffb00a03636a048ebe7ffdc6d
Dec 06 10:03:41 np0005548789.localdomain systemd[1]: Started Ceph mgr.np0005548789.mzhmje for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 10:03:41 np0005548789.localdomain sudo[288336]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:41 np0005548789.localdomain ceph-mgr[288591]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 10:03:41 np0005548789.localdomain ceph-mgr[288591]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Dec 06 10:03:41 np0005548789.localdomain ceph-mgr[288591]: pidfile_write: ignore empty --pid-file
Dec 06 10:03:41 np0005548789.localdomain podman[288577]: 2025-12-06 10:03:41.896956107 +0000 UTC m=+0.112975351 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:03:41 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'alerts'
Dec 06 10:03:41 np0005548789.localdomain podman[288577]: 2025-12-06 10:03:41.926479703 +0000 UTC m=+0.142498957 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Dec 06 10:03:41 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:03:41 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 10:03:41 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'balancer'
Dec 06 10:03:42 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:41.997+0000 7f047f28a140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'cephadm'
Dec 06 10:03:42 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:42.065+0000 7f047f28a140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548789.localdomain systemd[1]: tmp-crun.ndY5AH.mount: Deactivated successfully.
Dec 06 10:03:42 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'crash'
Dec 06 10:03:42 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 10:03:42 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'dashboard'
Dec 06 10:03:42 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:42.856+0000 7f047f28a140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'devicehealth'
Dec 06 10:03:43 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'diskprediction_local'
Dec 06 10:03:43 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:43.421+0000 7f047f28a140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 06 10:03:43 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 06 10:03:43 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]:   from numpy import show_config as show_numpy_config
Dec 06 10:03:43 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:43.563+0000 7f047f28a140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'influx'
Dec 06 10:03:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:43.615 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:43 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'insights'
Dec 06 10:03:43 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:43.687+0000 7f047f28a140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'iostat'
Dec 06 10:03:43 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 10:03:43 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'k8sevents'
Dec 06 10:03:43 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:43.810+0000 7f047f28a140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'localpool'
Dec 06 10:03:44 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'mds_autoscaler'
Dec 06 10:03:44 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'mirroring'
Dec 06 10:03:44 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'nfs'
Dec 06 10:03:44 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'orchestrator'
Dec 06 10:03:44 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:44.575+0000 7f047f28a140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548789.localdomain sudo[288630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:44 np0005548789.localdomain sudo[288630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:03:44 np0005548789.localdomain sudo[288630]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:44 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'osd_perf_query'
Dec 06 10:03:44 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:44.729+0000 7f047f28a140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548789.localdomain podman[288647]: 2025-12-06 10:03:44.780150984 +0000 UTC m=+0.080584745 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:03:44 np0005548789.localdomain podman[288647]: 2025-12-06 10:03:44.789096578 +0000 UTC m=+0.089530309 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:03:44 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:03:44 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'osd_support'
Dec 06 10:03:44 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:44.804+0000 7f047f28a140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:44.815 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:44 np0005548789.localdomain sudo[288671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:03:44 np0005548789.localdomain sudo[288671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:44 np0005548789.localdomain sudo[288671]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:44 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'pg_autoscaler'
Dec 06 10:03:44 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:44.870+0000 7f047f28a140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548789.localdomain sudo[288689]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:03:44 np0005548789.localdomain sudo[288689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:44 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 10:03:44 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'progress'
Dec 06 10:03:44 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:44.939+0000 7f047f28a140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 10:03:45 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 10:03:45 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'prometheus'
Dec 06 10:03:45 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:45.000+0000 7f047f28a140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 10:03:45 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 10:03:45 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'rbd_support'
Dec 06 10:03:45 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:45.308+0000 7f047f28a140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 10:03:45 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 10:03:45 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'restful'
Dec 06 10:03:45 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:45.393+0000 7f047f28a140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 10:03:45 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'rgw'
Dec 06 10:03:45 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 10:03:45 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'rook'
Dec 06 10:03:45 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:45.735+0000 7f047f28a140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 10:03:45 np0005548789.localdomain podman[288778]: 2025-12-06 10:03:45.744992924 +0000 UTC m=+0.101703785 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, vendor=Red Hat, Inc., io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, ceph=True, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, name=rhceph, architecture=x86_64, io.openshift.expose-services=)
Dec 06 10:03:45 np0005548789.localdomain podman[288778]: 2025-12-06 10:03:45.821506449 +0000 UTC m=+0.178217310 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, release=1763362218, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4)
Dec 06 10:03:46 np0005548789.localdomain sudo[288689]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:46 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 10:03:46 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'selftest'
Dec 06 10:03:46 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:46.231+0000 7f047f28a140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 10:03:46 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 10:03:46 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'snap_schedule'
Dec 06 10:03:46 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:46.298+0000 7f047f28a140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 10:03:46 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'stats'
Dec 06 10:03:46 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'status'
Dec 06 10:03:46 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 10:03:46 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'telegraf'
Dec 06 10:03:46 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:46.520+0000 7f047f28a140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 10:03:46 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 10:03:46 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'telemetry'
Dec 06 10:03:46 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:46.583+0000 7f047f28a140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 10:03:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:03:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:03:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:03:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:03:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:03:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:03:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:03:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:03:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:03:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:03:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:03:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:03:46 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:46.750+0000 7f047f28a140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 10:03:46 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 10:03:46 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'test_orchestrator'
Dec 06 10:03:46 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 10:03:46 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'volumes'
Dec 06 10:03:46 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:46.917+0000 7f047f28a140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 10:03:46 np0005548789.localdomain sudo[288879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:46 np0005548789.localdomain sudo[288879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:46 np0005548789.localdomain sudo[288879]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:46 np0005548789.localdomain sshd[288897]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:03:47 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 10:03:47 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'zabbix'
Dec 06 10:03:47 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:47.113+0000 7f047f28a140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 10:03:47 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 10:03:47 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:03:47.173+0000 7f047f28a140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 10:03:47 np0005548789.localdomain ceph-mgr[288591]: ms_deliver_dispatch: unhandled message 0x56140ed131e0 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0
Dec 06 10:03:47 np0005548789.localdomain ceph-mgr[288591]: client.0 ms_handle_reset on v2:172.18.0.103:6800/3108124117
Dec 06 10:03:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:03:47.293 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:03:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:03:47.293 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:03:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:03:47.294 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:03:47 np0005548789.localdomain sudo[288899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:47 np0005548789.localdomain sudo[288899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:47 np0005548789.localdomain sudo[288899]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:48 np0005548789.localdomain sshd[288897]: Received disconnect from 154.113.10.34 port 53846:11: Bye Bye [preauth]
Dec 06 10:03:48 np0005548789.localdomain sshd[288897]: Disconnected from authenticating user root 154.113.10.34 port 53846 [preauth]
Dec 06 10:03:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:48.663 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:48 np0005548789.localdomain sudo[288917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:48 np0005548789.localdomain sudo[288917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:48 np0005548789.localdomain sudo[288917]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:49.819 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:03:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:03:49 np0005548789.localdomain podman[288936]: 2025-12-06 10:03:49.930725473 +0000 UTC m=+0.083869069 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:03:49 np0005548789.localdomain podman[288936]: 2025-12-06 10:03:49.944555521 +0000 UTC m=+0.097699097 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 06 10:03:49 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:03:50 np0005548789.localdomain podman[288935]: 2025-12-06 10:03:50.03633739 +0000 UTC m=+0.189690493 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Dec 06 10:03:50 np0005548789.localdomain podman[288935]: 2025-12-06 10:03:50.057205181 +0000 UTC m=+0.210558304 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:03:50 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:03:52 np0005548789.localdomain sudo[288974]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:52 np0005548789.localdomain sudo[288974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:52 np0005548789.localdomain sudo[288974]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:53 np0005548789.localdomain sudo[288992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:03:53 np0005548789.localdomain sudo[288992]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:03:53 np0005548789.localdomain sudo[288992]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:53 np0005548789.localdomain sudo[289011]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:03:53 np0005548789.localdomain sudo[289011]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:53 np0005548789.localdomain sudo[289011]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:53 np0005548789.localdomain podman[289010]: 2025-12-06 10:03:53.208824746 +0000 UTC m=+0.092346058 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:03:53 np0005548789.localdomain podman[289010]: 2025-12-06 10:03:53.252890732 +0000 UTC m=+0.136412034 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:03:53 np0005548789.localdomain sudo[289042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:03:53 np0005548789.localdomain sudo[289042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:53 np0005548789.localdomain sudo[289042]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:53 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:03:53 np0005548789.localdomain sudo[289063]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:03:53 np0005548789.localdomain sudo[289063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:53 np0005548789.localdomain sudo[289063]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:53 np0005548789.localdomain sudo[289081]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:03:53 np0005548789.localdomain sudo[289081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:53 np0005548789.localdomain sudo[289081]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:53 np0005548789.localdomain sudo[289115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:03:53 np0005548789.localdomain sudo[289115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:53 np0005548789.localdomain sudo[289115]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:53 np0005548789.localdomain sudo[289133]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:03:53 np0005548789.localdomain sudo[289133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:53 np0005548789.localdomain sudo[289133]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:53.707 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:53 np0005548789.localdomain sudo[289151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:03:53 np0005548789.localdomain sudo[289151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:53 np0005548789.localdomain sudo[289151]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:53 np0005548789.localdomain sudo[289169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:03:53 np0005548789.localdomain sudo[289169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:53 np0005548789.localdomain sudo[289169]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:53 np0005548789.localdomain sudo[289187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:03:53 np0005548789.localdomain sudo[289187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:53 np0005548789.localdomain sudo[289187]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:03:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:03:53 np0005548789.localdomain sudo[289205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:03:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:03:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153839 "" "Go-http-client/1.1"
Dec 06 10:03:53 np0005548789.localdomain sudo[289205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:53 np0005548789.localdomain sudo[289205]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:03:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18721 "" "Go-http-client/1.1"
Dec 06 10:03:54 np0005548789.localdomain sudo[289223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:03:54 np0005548789.localdomain sudo[289223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:54 np0005548789.localdomain sudo[289223]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:54 np0005548789.localdomain sudo[289241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:03:54 np0005548789.localdomain sudo[289241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:54 np0005548789.localdomain sudo[289241]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:54 np0005548789.localdomain sudo[289275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:03:54 np0005548789.localdomain sudo[289275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:54 np0005548789.localdomain sudo[289275]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:54 np0005548789.localdomain sudo[289293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:03:54 np0005548789.localdomain sudo[289293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:54 np0005548789.localdomain sudo[289293]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:54 np0005548789.localdomain sudo[289311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:03:54 np0005548789.localdomain sudo[289311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:54 np0005548789.localdomain sudo[289311]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:54 np0005548789.localdomain sudo[289329]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:03:54 np0005548789.localdomain sudo[289329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:54 np0005548789.localdomain sudo[289329]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:54 np0005548789.localdomain sudo[289347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:03:54 np0005548789.localdomain sudo[289347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:54 np0005548789.localdomain sudo[289347]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:54 np0005548789.localdomain sudo[289365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:03:54 np0005548789.localdomain sudo[289365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:54 np0005548789.localdomain sudo[289365]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:54 np0005548789.localdomain sudo[289383]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:03:54 np0005548789.localdomain sudo[289383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:54 np0005548789.localdomain sudo[289383]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:54 np0005548789.localdomain sudo[289401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:03:54 np0005548789.localdomain sudo[289401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:54 np0005548789.localdomain sudo[289401]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:54.821 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:54 np0005548789.localdomain sudo[289435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:03:54 np0005548789.localdomain sudo[289435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:54 np0005548789.localdomain sudo[289435]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:54 np0005548789.localdomain sudo[289453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:03:54 np0005548789.localdomain sudo[289453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:54 np0005548789.localdomain sudo[289453]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:55 np0005548789.localdomain sudo[289471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:03:55 np0005548789.localdomain sudo[289471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:55 np0005548789.localdomain sudo[289471]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:55 np0005548789.localdomain sudo[289489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:03:55 np0005548789.localdomain sudo[289489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:55 np0005548789.localdomain sudo[289489]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:55 np0005548789.localdomain sudo[289507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:03:55 np0005548789.localdomain sudo[289507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:55 np0005548789.localdomain sudo[289507]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:55 np0005548789.localdomain sudo[289525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:03:55 np0005548789.localdomain sudo[289525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:55 np0005548789.localdomain sudo[289525]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:55 np0005548789.localdomain sudo[289543]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:03:55 np0005548789.localdomain sudo[289543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:55 np0005548789.localdomain sudo[289543]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:55 np0005548789.localdomain sudo[289561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:03:55 np0005548789.localdomain sudo[289561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:55 np0005548789.localdomain sudo[289561]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:55 np0005548789.localdomain sudo[289595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:03:55 np0005548789.localdomain sudo[289595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:55 np0005548789.localdomain sudo[289595]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:55 np0005548789.localdomain sudo[289613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:03:55 np0005548789.localdomain sudo[289613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:55 np0005548789.localdomain sudo[289613]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:55 np0005548789.localdomain sudo[289631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:03:55 np0005548789.localdomain sudo[289631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:55 np0005548789.localdomain sudo[289631]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:56 np0005548789.localdomain sudo[289649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:56 np0005548789.localdomain sudo[289649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:03:56 np0005548789.localdomain sudo[289649]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:58.761 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:59 np0005548789.localdomain sshd[289667]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:03:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:03:59.853 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:03:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:03:59 np0005548789.localdomain sshd[289667]: Received disconnect from 64.227.102.57 port 36220:11: Bye Bye [preauth]
Dec 06 10:03:59 np0005548789.localdomain sshd[289667]: Disconnected from authenticating user root 64.227.102.57 port 36220 [preauth]
Dec 06 10:03:59 np0005548789.localdomain systemd[1]: tmp-crun.HSgR5V.mount: Deactivated successfully.
Dec 06 10:03:59 np0005548789.localdomain podman[289669]: 2025-12-06 10:03:59.958018188 +0000 UTC m=+0.089931832 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:03:59 np0005548789.localdomain podman[289669]: 2025-12-06 10:03:59.969211772 +0000 UTC m=+0.101125446 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:03:59 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:04:02 np0005548789.localdomain sudo[289692]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:02 np0005548789.localdomain sudo[289692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:02 np0005548789.localdomain sudo[289692]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:02 np0005548789.localdomain sudo[289710]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:02 np0005548789.localdomain sudo[289710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:02 np0005548789.localdomain ceph-mgr[288591]: ms_deliver_dispatch: unhandled message 0x56140ed131e0 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0
Dec 06 10:04:02 np0005548789.localdomain podman[289770]: 
Dec 06 10:04:02 np0005548789.localdomain podman[289770]: 2025-12-06 10:04:02.748072704 +0000 UTC m=+0.066257501 container create c0bc25dc24ef253950e6a7ebf32699ad905657f0b31ad1ed7e7973162177d4f2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_cohen, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.openshift.expose-services=)
Dec 06 10:04:02 np0005548789.localdomain systemd[1]: Started libpod-conmon-c0bc25dc24ef253950e6a7ebf32699ad905657f0b31ad1ed7e7973162177d4f2.scope.
Dec 06 10:04:02 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:02 np0005548789.localdomain podman[289770]: 2025-12-06 10:04:02.710785703 +0000 UTC m=+0.028970520 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:02 np0005548789.localdomain podman[289770]: 2025-12-06 10:04:02.816940647 +0000 UTC m=+0.135125444 container init c0bc25dc24ef253950e6a7ebf32699ad905657f0b31ad1ed7e7973162177d4f2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_cohen, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., distribution-scope=public, ceph=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:04:02 np0005548789.localdomain systemd[1]: tmp-crun.YgimfA.mount: Deactivated successfully.
Dec 06 10:04:02 np0005548789.localdomain podman[289770]: 2025-12-06 10:04:02.831847409 +0000 UTC m=+0.150032216 container start c0bc25dc24ef253950e6a7ebf32699ad905657f0b31ad1ed7e7973162177d4f2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_cohen, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.buildah.version=1.41.4, architecture=x86_64, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public)
Dec 06 10:04:02 np0005548789.localdomain podman[289770]: 2025-12-06 10:04:02.832103827 +0000 UTC m=+0.150288625 container attach c0bc25dc24ef253950e6a7ebf32699ad905657f0b31ad1ed7e7973162177d4f2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_cohen, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, ceph=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:04:02 np0005548789.localdomain determined_cohen[289785]: 167 167
Dec 06 10:04:02 np0005548789.localdomain systemd[1]: libpod-c0bc25dc24ef253950e6a7ebf32699ad905657f0b31ad1ed7e7973162177d4f2.scope: Deactivated successfully.
Dec 06 10:04:02 np0005548789.localdomain podman[289770]: 2025-12-06 10:04:02.837245401 +0000 UTC m=+0.155430198 container died c0bc25dc24ef253950e6a7ebf32699ad905657f0b31ad1ed7e7973162177d4f2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_cohen, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public)
Dec 06 10:04:02 np0005548789.localdomain podman[289790]: 2025-12-06 10:04:02.921753079 +0000 UTC m=+0.077119345 container remove c0bc25dc24ef253950e6a7ebf32699ad905657f0b31ad1ed7e7973162177d4f2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_cohen, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_BRANCH=main, release=1763362218, io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True, GIT_CLEAN=True)
Dec 06 10:04:02 np0005548789.localdomain systemd[1]: libpod-conmon-c0bc25dc24ef253950e6a7ebf32699ad905657f0b31ad1ed7e7973162177d4f2.scope: Deactivated successfully.
Dec 06 10:04:03 np0005548789.localdomain podman[289807]: 
Dec 06 10:04:03 np0005548789.localdomain podman[289807]: 2025-12-06 10:04:03.051888894 +0000 UTC m=+0.089843420 container create f9763bd91c3dcd6ba1263087541332bac1010aafc65607c6776265c92137da31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_fermi, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, name=rhceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: Started libpod-conmon-f9763bd91c3dcd6ba1263087541332bac1010aafc65607c6776265c92137da31.scope.
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:03 np0005548789.localdomain podman[289807]: 2025-12-06 10:04:03.014280341 +0000 UTC m=+0.052234897 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:03 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb20ee5941d595a49a700bf299e37984ecbc14e42076b5a9d79ac3c38c286077/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:03 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb20ee5941d595a49a700bf299e37984ecbc14e42076b5a9d79ac3c38c286077/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:03 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb20ee5941d595a49a700bf299e37984ecbc14e42076b5a9d79ac3c38c286077/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:03 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb20ee5941d595a49a700bf299e37984ecbc14e42076b5a9d79ac3c38c286077/merged/var/lib/ceph/mon/ceph-np0005548789 supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:03 np0005548789.localdomain podman[289807]: 2025-12-06 10:04:03.125405054 +0000 UTC m=+0.163359580 container init f9763bd91c3dcd6ba1263087541332bac1010aafc65607c6776265c92137da31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_fermi, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, name=rhceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.41.4)
Dec 06 10:04:03 np0005548789.localdomain podman[289807]: 2025-12-06 10:04:03.138813809 +0000 UTC m=+0.176768335 container start f9763bd91c3dcd6ba1263087541332bac1010aafc65607c6776265c92137da31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_fermi, GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, version=7, build-date=2025-11-26T19:44:28Z, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, release=1763362218, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:04:03 np0005548789.localdomain podman[289807]: 2025-12-06 10:04:03.139421738 +0000 UTC m=+0.177376264 container attach f9763bd91c3dcd6ba1263087541332bac1010aafc65607c6776265c92137da31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_fermi, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_CLEAN=True, release=1763362218, RELEASE=main, com.redhat.component=rhceph-container)
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: libpod-f9763bd91c3dcd6ba1263087541332bac1010aafc65607c6776265c92137da31.scope: Deactivated successfully.
Dec 06 10:04:03 np0005548789.localdomain podman[289807]: 2025-12-06 10:04:03.229706169 +0000 UTC m=+0.267660715 container died f9763bd91c3dcd6ba1263087541332bac1010aafc65607c6776265c92137da31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_fermi, release=1763362218, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-type=git, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 06 10:04:03 np0005548789.localdomain podman[289848]: 2025-12-06 10:04:03.320108444 +0000 UTC m=+0.081036470 container remove f9763bd91c3dcd6ba1263087541332bac1010aafc65607c6776265c92137da31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_fermi, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-11-26T19:44:28Z, ceph=True, vendor=Red Hat, Inc., version=7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, description=Red Hat Ceph Storage 7)
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: libpod-conmon-f9763bd91c3dcd6ba1263087541332bac1010aafc65607c6776265c92137da31.scope: Deactivated successfully.
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 10:04:03 np0005548789.localdomain systemd-sysv-generator[289891]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:04:03 np0005548789.localdomain systemd-rc-local-generator[289886]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-6a38f3cd14cdcfbedfc2d0daf615495e6a4e29fe311517b312a3c102feb8548d-merged.mount: Deactivated successfully.
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 10:04:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:03.786 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:03 np0005548789.localdomain systemd-sysv-generator[289935]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:04:03 np0005548789.localdomain systemd-rc-local-generator[289932]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:03 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:04:04 np0005548789.localdomain systemd[1]: Starting Ceph mon.np0005548789 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8...
Dec 06 10:04:04 np0005548789.localdomain podman[289992]: 
Dec 06 10:04:04 np0005548789.localdomain podman[289992]: 2025-12-06 10:04:04.465845656 +0000 UTC m=+0.077295970 container create 8db79eb988f6401c6530def208a6cc95f5f6889aff146370f866953a0dd24fb0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548789, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, distribution-scope=public, description=Red Hat Ceph Storage 7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, release=1763362218, architecture=x86_64, CEPH_POINT_RELEASE=, ceph=True, io.buildah.version=1.41.4, vcs-type=git)
Dec 06 10:04:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:04:04 np0005548789.localdomain systemd[1]: tmp-crun.kOrphq.mount: Deactivated successfully.
Dec 06 10:04:04 np0005548789.localdomain podman[289992]: 2025-12-06 10:04:04.42873075 +0000 UTC m=+0.040181094 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:04 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58b056c2dffd8d19854bf632a4b426ef2949193a3440de3f8d9216685a5e7198/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:04 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58b056c2dffd8d19854bf632a4b426ef2949193a3440de3f8d9216685a5e7198/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:04 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58b056c2dffd8d19854bf632a4b426ef2949193a3440de3f8d9216685a5e7198/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:04 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58b056c2dffd8d19854bf632a4b426ef2949193a3440de3f8d9216685a5e7198/merged/var/lib/ceph/mon/ceph-np0005548789 supports timestamps until 2038 (0x7fffffff)
Dec 06 10:04:04 np0005548789.localdomain podman[289992]: 2025-12-06 10:04:04.549854498 +0000 UTC m=+0.161304792 container init 8db79eb988f6401c6530def208a6cc95f5f6889aff146370f866953a0dd24fb0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548789, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, name=rhceph, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=)
Dec 06 10:04:04 np0005548789.localdomain podman[289992]: 2025-12-06 10:04:04.557483191 +0000 UTC m=+0.168933495 container start 8db79eb988f6401c6530def208a6cc95f5f6889aff146370f866953a0dd24fb0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548789, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, com.redhat.component=rhceph-container, distribution-scope=public, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, ceph=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=)
Dec 06 10:04:04 np0005548789.localdomain bash[289992]: 8db79eb988f6401c6530def208a6cc95f5f6889aff146370f866953a0dd24fb0
Dec 06 10:04:04 np0005548789.localdomain systemd[1]: Started Ceph mon.np0005548789 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 10:04:04 np0005548789.localdomain podman[290005]: 2025-12-06 10:04:04.603206359 +0000 UTC m=+0.097018635 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:04:04 np0005548789.localdomain sudo[289710]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: pidfile_write: ignore empty --pid-file
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: load: jerasure load: lrc 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: RocksDB version: 7.9.2
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Git sha 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: DB SUMMARY
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: DB Session ID:  ETDWGFPM6GCTACWNDM5G
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: CURRENT file:  CURRENT
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005548789/store.db dir, Total Num: 0, files: 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005548789/store.db: 000004.log size: 761 ; 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                         Options.error_if_exists: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                       Options.create_if_missing: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                                     Options.env: 0x55b170c829e0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                                      Options.fs: PosixFileSystem
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                                Options.info_log: 0x55b173030d20
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                              Options.statistics: (nil)
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                               Options.use_fsync: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                              Options.db_log_dir: 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                                 Options.wal_dir: 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                    Options.write_buffer_manager: 0x55b173041540
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                  Options.unordered_write: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                               Options.row_cache: None
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                              Options.wal_filter: None
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.two_write_queues: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.wal_compression: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.atomic_flush: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.max_background_jobs: 2
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.max_background_compactions: -1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.max_subcompactions: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.max_total_wal_size: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                          Options.max_open_files: -1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:       Options.compaction_readahead_size: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Compression algorithms supported:
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         kZSTD supported: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         kXpressCompression supported: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         kBZip2Compression supported: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         kLZ4Compression supported: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         kZlibCompression supported: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         kSnappyCompression supported: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005548789/store.db/MANIFEST-000005
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:           Options.merge_operator: 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:        Options.compaction_filter: None
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b173030980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x55b17302d350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:        Options.write_buffer_size: 33554432
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:  Options.max_write_buffer_number: 2
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:          Options.compression: NoCompression
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.num_levels: 7
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                   Options.table_properties_collectors: 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                           Options.bloom_locality: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                               Options.ttl: 2592000
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                       Options.enable_blob_files: false
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                           Options.min_blob_size: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005548789/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 8b48a877-4508-4eb4-a052-67f753f228b0
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444621137, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444623570, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 773, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 651, "raw_average_value_size": 130, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015444, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8b48a877-4508-4eb4-a052-67f753f228b0", "db_session_id": "ETDWGFPM6GCTACWNDM5G", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444623698, "job": 1, "event": "recovery_finished"}
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55b173054e00
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: DB pointer 0x55b17314a000
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789 does not exist in monmap, will attempt to join an existing cluster
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.84 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Sum      1/0    1.84 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x55b17302d350#2 capacity: 512.00 MB usage: 1.17 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 4.9e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(2,0.95 KB,0.000181794%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: using public_addr v2:172.18.0.107:0/0 -> [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0]
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: starting mon.np0005548789 rank -1 at public addrs [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] at bind addrs [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005548789 fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@-1(???) e0 preinit fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:04 np0005548789.localdomain podman[290005]: 2025-12-06 10:04:04.643191747 +0000 UTC m=+0.137004073 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@-1(synchronizing) e4 sync_obtain_latest_monmap
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@-1(synchronizing) e4 sync_obtain_latest_monmap obtained monmap e4
Dec 06 10:04:04 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:04:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:04.887 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@-1(synchronizing).mds e16 new map
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@-1(synchronizing).mds e16 print_map
                                                           e16
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        16
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2025-12-06T08:18:49.925523+0000
                                                           modified        2025-12-06T10:03:02.051468+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        87
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26356}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26356 members: 26356
                                                           [mds.mds.np0005548790.vhcezv{0:26356} state up:active seq 16 addr [v2:172.18.0.108:6808/1621657194,v1:172.18.0.108:6809/1621657194] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005548789.vxwwsq{-1:16884} state up:standby seq 1 addr [v2:172.18.0.107:6808/3033303281,v1:172.18.0.107:6809/3033303281] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005548788.erzujf{-1:16890} state up:standby seq 1 addr [v2:172.18.0.106:6808/309324236,v1:172.18.0.106:6809/309324236] compat {c=[1],r=[1],i=[17ff]}]
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@-1(synchronizing).osd e87 crush map has features 3314933000854323200, adjusting msgr requires
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@-1(synchronizing).osd e87 crush map has features 432629239337189376, adjusting msgr requires
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@-1(synchronizing).osd e87 crush map has features 432629239337189376, adjusting msgr requires
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@-1(synchronizing).osd e87 crush map has features 432629239337189376, adjusting msgr requires
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: pgmap v3859: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17040 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mgr", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Deploying daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17061 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548785.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Added label mon to host np0005548785.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: pgmap v3860: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17067 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548785.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Added label _admin to host np0005548785.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Deploying daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: pgmap v3861: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17079 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548786.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Added label mon to host np0005548786.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17085 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548786.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Added label _admin to host np0005548786.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Standby manager daemon np0005548788.yvwbqq started
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: pgmap v3862: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: mgrmap e11: np0005548785.vhqlsq(active, since 2h), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548786.mczynb
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17097 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548787.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Added label mon to host np0005548787.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Standby manager daemon np0005548789.mzhmje started
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: pgmap v3863: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17103 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548787.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Added label _admin to host np0005548787.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: mgrmap e12: np0005548785.vhqlsq(active, since 2h), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548786.mczynb, np0005548789.mzhmje
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17109 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548788.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Added label mon to host np0005548788.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Standby manager daemon np0005548790.kvkfyr started
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: pgmap v3864: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: mgrmap e13: np0005548785.vhqlsq(active, since 2h), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17115 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548788.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Added label _admin to host np0005548788.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: pgmap v3865: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17121 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548789.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Added label mon to host np0005548789.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17127 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548789.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Added label _admin to host np0005548789.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: pgmap v3866: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17133 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548790.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Added label mon to host np0005548790.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: pgmap v3867: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17139 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005548790.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Added label _admin to host np0005548790.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17145 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Saving service mon spec with placement label:mon
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: pgmap v3868: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='client.17151 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548788", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: pgmap v3869: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: Deploying daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: pgmap v3870: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:04 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3
Dec 06 10:04:05 np0005548789.localdomain sshd[290074]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:06 np0005548789.localdomain sshd[290076]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:08.834 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:08 np0005548789.localdomain ceph-mgr[288591]: ms_deliver_dispatch: unhandled message 0x56140ed12f20 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0
Dec 06 10:04:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:09.914 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:09 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@-1(probing) e4  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Dec 06 10:04:10 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@-1(probing) e4  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Dec 06 10:04:10 np0005548789.localdomain sshd[290076]: Received disconnect from 179.33.210.213 port 58616:11: Bye Bye [preauth]
Dec 06 10:04:10 np0005548789.localdomain sshd[290076]: Disconnected from authenticating user root 179.33.210.213 port 58616 [preauth]
Dec 06 10:04:10 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@-1(probing) e5  my rank is now 4 (was -1)
Dec 06 10:04:10 np0005548789.localdomain ceph-mon[290022]: log_channel(cluster) log [INF] : mon.np0005548789 calling monitor election
Dec 06 10:04:10 np0005548789.localdomain ceph-mon[290022]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1 
Dec 06 10:04:10 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(electing) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:12 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Dec 06 10:04:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:04:12 np0005548789.localdomain podman[290078]: 2025-12-06 10:04:12.928733591 +0000 UTC m=+0.087141682 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 10:04:12 np0005548789.localdomain podman[290078]: 2025-12-06 10:04:12.940242406 +0000 UTC m=+0.098650537 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true)
Dec 06 10:04:12 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:04:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:13.883 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(electing) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(peon) e5 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(peon) e5 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: Deploying daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548785"} : dispatch
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: mon.np0005548785 calling monitor election
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 calling monitor election
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: mon.np0005548786 calling monitor election
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: pgmap v3871: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: mon.np0005548790 calling monitor election
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: pgmap v3872: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: mon.np0005548785 is new leader, mons np0005548785,np0005548787,np0005548786,np0005548790 in quorum (ranks 0,1,2,3)
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: monmap epoch 4
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:13 np0005548789.localdomain ceph-mon[290022]: last_changed 2025-12-06T10:04:02.181213+0000
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: min_mon_release 18 (reef)
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: election_strategy: 1
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548785
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548786
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: osdmap e87: 6 total, 6 up, 6 in
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: mgrmap e13: np0005548785.vhqlsq(active, since 2h), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: overall HEALTH_OK
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: pgmap v3873: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: Deploying daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(peon) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: mgrc update_daemon_metadata mon.np0005548789 metadata {addrs=[v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005548789.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005548789.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: pgmap v3874: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548785"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: mon.np0005548785 calling monitor election
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 calling monitor election
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: mon.np0005548790 calling monitor election
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: mon.np0005548786 calling monitor election
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: pgmap v3875: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789 calling monitor election
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: pgmap v3876: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: mon.np0005548785 is new leader, mons np0005548785,np0005548787,np0005548786,np0005548790,np0005548789 in quorum (ranks 0,1,2,3,4)
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: monmap epoch 5
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: last_changed 2025-12-06T10:04:08.937568+0000
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: min_mon_release 18 (reef)
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: election_strategy: 1
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548785
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548786
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: osdmap e87: 6 total, 6 up, 6 in
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: mgrmap e13: np0005548785.vhqlsq(active, since 2h), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: overall HEALTH_OK
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(peon) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Dec 06 10:04:14 np0005548789.localdomain ceph-mgr[288591]: ms_deliver_dispatch: unhandled message 0x56140ed131e0 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: log_channel(cluster) log [INF] : mon.np0005548789 calling monitor election
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: paxos.4).electionLogic(22) init, last seen epoch 22
Dec 06 10:04:14 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:14 np0005548789.localdomain sudo[290096]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:04:14 np0005548789.localdomain sudo[290096]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:14 np0005548789.localdomain sudo[290096]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:14 np0005548789.localdomain sudo[290114]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:14 np0005548789.localdomain sudo[290114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:14 np0005548789.localdomain sudo[290114]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:14 np0005548789.localdomain sudo[290132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:04:14 np0005548789.localdomain sudo[290132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:04:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:14.959 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:14 np0005548789.localdomain podman[290150]: 2025-12-06 10:04:14.963364286 +0000 UTC m=+0.123518116 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:04:14 np0005548789.localdomain podman[290150]: 2025-12-06 10:04:14.967625801 +0000 UTC m=+0.127779571 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:04:14 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:04:15 np0005548789.localdomain podman[290244]: 2025-12-06 10:04:15.50714427 +0000 UTC m=+0.090473489 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_BRANCH=main, com.redhat.component=rhceph-container)
Dec 06 10:04:15 np0005548789.localdomain podman[290244]: 2025-12-06 10:04:15.580153074 +0000 UTC m=+0.163482273 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, release=1763362218, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main)
Dec 06 10:04:16 np0005548789.localdomain sudo[290132]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:04:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:04:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:04:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:04:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:04:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:04:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:04:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:04:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:04:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:04:17 np0005548789.localdomain sshd[290074]: ssh_dispatch_run_fatal: Connection from 123.160.164.187 port 60892: Connection timed out [preauth]
Dec 06 10:04:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:18.919 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: paxos.4).electionLogic(23) init, last seen epoch 23, mid-election, bumping
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(peon) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548785"} : dispatch
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: mon.np0005548785 calling monitor election
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: mon.np0005548786 calling monitor election
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: mon.np0005548790 calling monitor election
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 calling monitor election
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789 calling monitor election
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: from='client.17165 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548788", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: pgmap v3877: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: mon.np0005548788 calling monitor election
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: pgmap v3878: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: pgmap v3879: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: mon.np0005548785 is new leader, mons np0005548785,np0005548787,np0005548786,np0005548790,np0005548789,np0005548788 in quorum (ranks 0,1,2,3,4,5)
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: monmap epoch 6
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: last_changed 2025-12-06T10:04:14.235362+0000
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: min_mon_release 18 (reef)
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: election_strategy: 1
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548785
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548786
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: 5: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005548788
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: osdmap e87: 6 total, 6 up, 6 in
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: mgrmap e13: np0005548785.vhqlsq(active, since 2h), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: overall HEALTH_OK
Dec 06 10:04:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:19 np0005548789.localdomain sudo[290366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:19 np0005548789.localdomain sudo[290366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:19 np0005548789.localdomain sudo[290366]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:19 np0005548789.localdomain sudo[290384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:04:19 np0005548789.localdomain sudo[290384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:19.996 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:20 np0005548789.localdomain sudo[290384]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:20 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:20 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:20 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:20 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:20 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:20 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:20 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:20 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:20 np0005548789.localdomain sudo[290434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:04:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:04:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:04:20 np0005548789.localdomain sudo[290434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548789.localdomain sudo[290434]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548789.localdomain sudo[290464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:04:20 np0005548789.localdomain sudo[290464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548789.localdomain sudo[290464]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548789.localdomain podman[290452]: 2025-12-06 10:04:20.57637708 +0000 UTC m=+0.075727271 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm)
Dec 06 10:04:20 np0005548789.localdomain podman[290452]: 2025-12-06 10:04:20.587455211 +0000 UTC m=+0.086805422 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Dec 06 10:04:20 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:04:20 np0005548789.localdomain systemd[1]: tmp-crun.vSVj6z.mount: Deactivated successfully.
Dec 06 10:04:20 np0005548789.localdomain sudo[290499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:20 np0005548789.localdomain sudo[290499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548789.localdomain sudo[290499]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548789.localdomain podman[290451]: 2025-12-06 10:04:20.628399139 +0000 UTC m=+0.128457772 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Dec 06 10:04:20 np0005548789.localdomain podman[290451]: 2025-12-06 10:04:20.642049212 +0000 UTC m=+0.142107795 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, architecture=x86_64, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Dec 06 10:04:20 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:04:20 np0005548789.localdomain sudo[290527]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:20 np0005548789.localdomain sudo[290527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548789.localdomain sudo[290527]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548789.localdomain sudo[290545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:20 np0005548789.localdomain sudo[290545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548789.localdomain sudo[290545]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548789.localdomain sudo[290579]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:20 np0005548789.localdomain sudo[290579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548789.localdomain sudo[290579]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548789.localdomain sudo[290597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:20 np0005548789.localdomain sudo[290597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548789.localdomain sudo[290597]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:20 np0005548789.localdomain sudo[290615]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:04:20 np0005548789.localdomain sudo[290615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:20 np0005548789.localdomain sudo[290615]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548789.localdomain sudo[290633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:04:21 np0005548789.localdomain sudo[290633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548789.localdomain sudo[290633]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548789.localdomain sudo[290651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:04:21 np0005548789.localdomain sudo[290651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548789.localdomain sudo[290651]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548789.localdomain sudo[290669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:21 np0005548789.localdomain sudo[290669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548789.localdomain sudo[290669]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548789.localdomain sshd[290703]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:21 np0005548789.localdomain sudo[290687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:21 np0005548789.localdomain sudo[290687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548789.localdomain sudo[290687]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548789.localdomain sudo[290706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:21 np0005548789.localdomain sudo[290706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548789.localdomain sudo[290706]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548789.localdomain sudo[290740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:21 np0005548789.localdomain sudo[290740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548789.localdomain sudo[290740]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548789.localdomain sudo[290758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:21 np0005548789.localdomain sudo[290758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548789.localdomain sudo[290758]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:21 np0005548789.localdomain sudo[290777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:21 np0005548789.localdomain sudo[290777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:21 np0005548789.localdomain sudo[290777]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: Updating np0005548785.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: pgmap v3880: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='client.34103 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548788", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: Updating np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:22 np0005548789.localdomain sudo[290795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:04:22 np0005548789.localdomain sudo[290795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:22 np0005548789.localdomain sudo[290795]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:23.078 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:23 np0005548789.localdomain ceph-mon[290022]: from='client.17178 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548789", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:23 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:23 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:23 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:23.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:23.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:23 np0005548789.localdomain sshd[290703]: Connection reset by authenticating user root 45.140.17.124 port 21236 [preauth]
Dec 06 10:04:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:04:23 np0005548789.localdomain podman[290813]: 2025-12-06 10:04:23.515465329 +0000 UTC m=+0.087552696 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:04:23 np0005548789.localdomain podman[290813]: 2025-12-06 10:04:23.529275847 +0000 UTC m=+0.101363214 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:04:23 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:04:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:23.555 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:04:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:23.555 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:04:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:23.556 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:04:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:23.556 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:04:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:23.557 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:04:23 np0005548789.localdomain sshd[290832]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:04:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:04:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:23.968 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:04:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:04:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:24.014 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:04:24 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:04:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19225 "" "Go-http-client/1.1"
Dec 06 10:04:24 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548785 (monmap changed)...
Dec 06 10:04:24 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548785 on np0005548785.localdomain
Dec 06 10:04:24 np0005548789.localdomain ceph-mon[290022]: pgmap v3881: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:24 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:24 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:24 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548785.vhqlsq (monmap changed)...
Dec 06 10:04:24 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548785.vhqlsq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:24 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:04:24 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:24 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548785.vhqlsq on np0005548785.localdomain
Dec 06 10:04:24 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.107:0/2303863447' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:04:24 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:24.512 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:04:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:24.513 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:04:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:24.745 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:04:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:24.747 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11515MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:04:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:24.747 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:04:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:24.747 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:04:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:25.021 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:25 np0005548789.localdomain ceph-mon[290022]: from='client.17190 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548790", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:25 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548785 (monmap changed)...
Dec 06 10:04:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548785.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:25 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548785 on np0005548785.localdomain
Dec 06 10:04:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:25 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.103:0/3114499803' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:04:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:25.469 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:04:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:25.470 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:04:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:25.470 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:04:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:25.507 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:04:25 np0005548789.localdomain sshd[290832]: Invalid user 12345 from 45.140.17.124 port 53314
Dec 06 10:04:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:25.964 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:04:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:25.970 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:04:26 np0005548789.localdomain sshd[290832]: Connection reset by invalid user 12345 45.140.17.124 port 53314 [preauth]
Dec 06 10:04:26 np0005548789.localdomain sshd[290878]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:26 np0005548789.localdomain ceph-mon[290022]: pgmap v3882: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:26 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548786 (monmap changed)...
Dec 06 10:04:26 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548786 on np0005548786.localdomain
Dec 06 10:04:26 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.107:0/4040142409' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:04:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:26.467 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:04:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:26.470 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:04:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:26.470 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.723s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:04:27 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548786 (monmap changed)...
Dec 06 10:04:27 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548786 on np0005548786.localdomain
Dec 06 10:04:27 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.103:0/1185796831' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Dec 06 10:04:27 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:27 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:27 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548786.mczynb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:27 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:04:27 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:27.471 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:27.471 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:04:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:27.472 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:04:27 np0005548789.localdomain sshd[290878]: Invalid user postgres from 45.140.17.124 port 53336
Dec 06 10:04:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:27.914 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:04:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:27.914 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:04:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:27.915 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:04:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:27.915 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:04:28 np0005548789.localdomain sshd[290878]: Connection reset by invalid user postgres 45.140.17.124 port 53336 [preauth]
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(peon) e6 handle_command mon_command({"prefix": "mgr fail"} v 0)
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: log_channel(audit) log [INF] : from='client.? 172.18.0.103:0/899954398' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(peon).osd e87 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(peon).osd e87 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(peon).osd e88 e88: 6 total, 6 up, 6 in
Dec 06 10:04:28 np0005548789.localdomain sshd[290880]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:28 np0005548789.localdomain sshd[26320]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548789.localdomain sshd[26339]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548789.localdomain sshd[26301]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548789.localdomain sshd[26394]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548789.localdomain sshd[26282]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548789.localdomain systemd[1]: session-25.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548789.localdomain sshd[26222]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548789.localdomain systemd[1]: session-19.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548789.localdomain sshd[26205]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548789.localdomain sshd[26358]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548789.localdomain systemd[1]: session-20.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548789.localdomain sshd[26377]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548789.localdomain systemd[1]: session-16.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548789.localdomain systemd[1]: session-14.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548789.localdomain systemd[1]: session-23.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548789.localdomain systemd[1]: session-24.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548789.localdomain systemd[1]: session-21.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548789.localdomain systemd[1]: session-22.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Session 22 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Session 21 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Session 25 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Session 19 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Session 23 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Session 24 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Session 14 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Session 16 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Session 20 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548789.localdomain sshd[26244]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548789.localdomain systemd[1]: session-17.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548789.localdomain sshd[26263]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548789.localdomain systemd[1]: session-18.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Session 17 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Session 18 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548789.localdomain sshd[26413]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: pgmap v3883: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548786.mczynb (monmap changed)...
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548786.mczynb on np0005548786.localdomain
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.108:0/2694903603' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: from='mgr.14120 172.18.0.103:0/3317297540' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Removed session 25.
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: Activating manager daemon np0005548788.yvwbqq
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.103:0/899954398' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:04:28 np0005548789.localdomain ceph-mon[290022]: osdmap e88: 6 total, 6 up, 6 in
Dec 06 10:04:28 np0005548789.localdomain systemd[1]: session-26.scope: Deactivated successfully.
Dec 06 10:04:28 np0005548789.localdomain systemd[1]: session-26.scope: Consumed 3min 21.222s CPU time.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Removed session 19.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Session 26 logged out. Waiting for processes to exit.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Removed session 20.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Removed session 16.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Removed session 14.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Removed session 23.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Removed session 24.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Removed session 21.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Removed session 22.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Removed session 17.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Removed session 18.
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: Removed session 26.
Dec 06 10:04:28 np0005548789.localdomain sshd[290882]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:28 np0005548789.localdomain sshd[290882]: Accepted publickey for ceph-admin from 192.168.122.106 port 38744 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 10:04:28 np0005548789.localdomain systemd-logind[766]: New session 65 of user ceph-admin.
Dec 06 10:04:28 np0005548789.localdomain systemd[1]: Started Session 65 of User ceph-admin.
Dec 06 10:04:28 np0005548789.localdomain sshd[290882]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:04:28 np0005548789.localdomain sudo[290886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:29.003 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:29 np0005548789.localdomain sudo[290886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:29 np0005548789.localdomain sudo[290886]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:29 np0005548789.localdomain sudo[290904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:04:29 np0005548789.localdomain sudo[290904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: mgrmap e14: np0005548788.yvwbqq(active, starting, since 0.0774079s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548785"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548787.umwsra", "id": "np0005548787.umwsra"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548786.mczynb", "id": "np0005548786.mczynb"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: Manager daemon np0005548788.yvwbqq is now available
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/mirror_snapshot_schedule"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/mirror_snapshot_schedule"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/trash_purge_schedule"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/trash_purge_schedule"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.108:0/1147721137' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:04:29 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(peon).osd e88 _set_new_cache_sizes cache_size:1019645081 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:04:29 np0005548789.localdomain systemd[1]: tmp-crun.XROgqq.mount: Deactivated successfully.
Dec 06 10:04:29 np0005548789.localdomain podman[290993]: 2025-12-06 10:04:29.980224808 +0000 UTC m=+0.114806900 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, name=rhceph, vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main, GIT_CLEAN=True, ceph=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 06 10:04:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:30.027 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:30 np0005548789.localdomain podman[290993]: 2025-12-06 10:04:30.111348394 +0000 UTC m=+0.245930486 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, architecture=x86_64, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main)
Dec 06 10:04:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:04:30 np0005548789.localdomain sshd[290880]: Connection reset by authenticating user root 45.140.17.124 port 53352 [preauth]
Dec 06 10:04:30 np0005548789.localdomain podman[291027]: 2025-12-06 10:04:30.281883839 +0000 UTC m=+0.104989589 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:04:30 np0005548789.localdomain podman[291027]: 2025-12-06 10:04:30.290902714 +0000 UTC m=+0.114008494 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:04:30 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:04:30 np0005548789.localdomain sshd[291080]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:30 np0005548789.localdomain sudo[290904]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:31.011 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:04:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:31.025 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:04:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:31.025 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:04:31 np0005548789.localdomain ceph-mon[290022]: mgrmap e15: np0005548788.yvwbqq(active, since 1.22065s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:31 np0005548789.localdomain ceph-mon[290022]: [06/Dec/2025:10:04:30] ENGINE Bus STARTING
Dec 06 10:04:31 np0005548789.localdomain ceph-mon[290022]: [06/Dec/2025:10:04:30] ENGINE Serving on http://172.18.0.106:8765
Dec 06 10:04:31 np0005548789.localdomain ceph-mon[290022]: [06/Dec/2025:10:04:30] ENGINE Serving on https://172.18.0.106:7150
Dec 06 10:04:31 np0005548789.localdomain ceph-mon[290022]: [06/Dec/2025:10:04:30] ENGINE Bus STARTED
Dec 06 10:04:31 np0005548789.localdomain ceph-mon[290022]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:31 np0005548789.localdomain ceph-mon[290022]: [06/Dec/2025:10:04:30] ENGINE Client ('172.18.0.106', 43646) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:04:31 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.106:0/3456902707' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:04:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:31.026 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:31.027 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:31.028 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:31.029 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:31.029 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:31.030 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:04:31 np0005548789.localdomain sudo[291139]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:31 np0005548789.localdomain sudo[291139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:31 np0005548789.localdomain sudo[291139]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:31 np0005548789.localdomain sudo[291157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:04:31 np0005548789.localdomain sudo[291157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:31.736 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:31 np0005548789.localdomain sudo[291157]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548789.localdomain ceph-mon[290022]: mgrmap e16: np0005548788.yvwbqq(active, since 2s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:32 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.106:0/497756778' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:04:32 np0005548789.localdomain sshd[291080]: Connection reset by authenticating user root 45.140.17.124 port 53380 [preauth]
Dec 06 10:04:32 np0005548789.localdomain sudo[291207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:32 np0005548789.localdomain sudo[291207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:32 np0005548789.localdomain sudo[291207]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:32 np0005548789.localdomain sudo[291225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 10:04:32 np0005548789.localdomain sudo[291225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:32 np0005548789.localdomain sudo[291225]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548789.localdomain sudo[291262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:04:33 np0005548789.localdomain sudo[291262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548789.localdomain sudo[291262]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd/host:np0005548786", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd/host:np0005548786", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd/host:np0005548785", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd/host:np0005548785", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:04:33 np0005548789.localdomain sudo[291280]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:04:33 np0005548789.localdomain sudo[291280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548789.localdomain sudo[291280]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548789.localdomain sudo[291298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:33 np0005548789.localdomain sudo[291298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548789.localdomain sudo[291298]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548789.localdomain sudo[291316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:33 np0005548789.localdomain sudo[291316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548789.localdomain sudo[291316]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548789.localdomain sudo[291334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:33 np0005548789.localdomain sudo[291334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548789.localdomain sudo[291334]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548789.localdomain sudo[291368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:33 np0005548789.localdomain sudo[291368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548789.localdomain sudo[291368]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548789.localdomain sudo[291386]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:04:33 np0005548789.localdomain sudo[291386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548789.localdomain sudo[291386]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548789.localdomain sudo[291404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:04:33 np0005548789.localdomain sudo[291404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548789.localdomain sudo[291404]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548789.localdomain sudo[291422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:04:33 np0005548789.localdomain sudo[291422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548789.localdomain sudo[291422]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548789.localdomain sudo[291440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:04:33 np0005548789.localdomain sudo[291440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548789.localdomain sudo[291440]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548789.localdomain sudo[291458]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:33 np0005548789.localdomain sudo[291458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548789.localdomain sudo[291458]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548789.localdomain sudo[291476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:33 np0005548789.localdomain sudo[291476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548789.localdomain sudo[291476]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:33 np0005548789.localdomain sudo[291494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:33 np0005548789.localdomain sudo[291494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:33 np0005548789.localdomain sudo[291494]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:34.040 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:34 np0005548789.localdomain sudo[291528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:34 np0005548789.localdomain sudo[291528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548789.localdomain sudo[291528]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: Updating np0005548785.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: mgrmap e17: np0005548788.yvwbqq(active, since 4s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: Standby manager daemon np0005548785.vhqlsq started
Dec 06 10:04:34 np0005548789.localdomain sudo[291546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:04:34 np0005548789.localdomain sudo[291546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548789.localdomain sudo[291546]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548789.localdomain sudo[291564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:34 np0005548789.localdomain sudo[291564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548789.localdomain sudo[291564]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548789.localdomain sudo[291582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:04:34 np0005548789.localdomain sudo[291582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548789.localdomain sudo[291582]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548789.localdomain sudo[291600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:04:34 np0005548789.localdomain sudo[291600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548789.localdomain sudo[291600]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548789.localdomain sudo[291618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:04:34 np0005548789.localdomain sudo[291618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548789.localdomain sudo[291618]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548789.localdomain sudo[291636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:34 np0005548789.localdomain sudo[291636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548789.localdomain sudo[291636]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548789.localdomain sudo[291654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:04:34 np0005548789.localdomain sudo[291654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548789.localdomain sudo[291654]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(peon).osd e88 _set_new_cache_sizes cache_size:1020044797 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:04:34 np0005548789.localdomain sudo[291688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:04:34 np0005548789.localdomain sudo[291688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:04:34 np0005548789.localdomain sudo[291688]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548789.localdomain sudo[291707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:04:34 np0005548789.localdomain sudo[291707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548789.localdomain sudo[291707]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548789.localdomain systemd[1]: tmp-crun.IqDGuR.mount: Deactivated successfully.
Dec 06 10:04:34 np0005548789.localdomain podman[291706]: 2025-12-06 10:04:34.86937747 +0000 UTC m=+0.116102500 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 10:04:34 np0005548789.localdomain sudo[291735]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:34 np0005548789.localdomain sudo[291735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548789.localdomain sudo[291735]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:34 np0005548789.localdomain podman[291706]: 2025-12-06 10:04:34.942106415 +0000 UTC m=+0.188831425 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:04:34 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:04:34 np0005548789.localdomain sudo[291765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:04:34 np0005548789.localdomain sudo[291765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:34 np0005548789.localdomain sudo[291765]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:35.028 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:35 np0005548789.localdomain sudo[291783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:04:35 np0005548789.localdomain sudo[291783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548789.localdomain sudo[291783]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548789.localdomain sudo[291801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:04:35 np0005548789.localdomain sudo[291801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548789.localdomain sudo[291801]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: mgrmap e18: np0005548788.yvwbqq(active, since 5s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548785.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548789.localdomain sudo[291819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:35 np0005548789.localdomain sudo[291819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548789.localdomain sudo[291819]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548789.localdomain sudo[291837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:04:35 np0005548789.localdomain sudo[291837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548789.localdomain sudo[291837]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548789.localdomain sudo[291871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:04:35 np0005548789.localdomain sudo[291871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548789.localdomain sudo[291871]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548789.localdomain sudo[291889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:04:35 np0005548789.localdomain sudo[291889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548789.localdomain sudo[291889]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:35 np0005548789.localdomain sudo[291907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:35 np0005548789.localdomain sudo[291907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:35 np0005548789.localdomain sudo[291907]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:36 np0005548789.localdomain sudo[291926]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:04:36 np0005548789.localdomain sudo[291926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:36 np0005548789.localdomain sudo[291926]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: Updating np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:04:37 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:04:37 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:37 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:37 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:37 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:04:37 np0005548789.localdomain ceph-mon[290022]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Dec 06 10:04:37 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:37 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:37 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:37 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:38 np0005548789.localdomain sshd[291944]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:38 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:04:38 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:04:38 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:38 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:04:38 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:38 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:38 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:38 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:38 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:39.078 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:39 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:04:39 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:04:39 np0005548789.localdomain ceph-mon[290022]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 13 op/s
Dec 06 10:04:39 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:39 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.32:0/1383583435' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:04:39 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.32:0/1383583435' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:04:39 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:39 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:39 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:39 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:39 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:39 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(peon).osd e88 _set_new_cache_sizes cache_size:1020054486 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:04:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:40.029 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:40 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:04:40 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:04:40 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:40 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:40 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:04:40 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:41 np0005548789.localdomain ceph-mon[290022]: from='client.17340 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:04:41 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:04:41 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:04:41 np0005548789.localdomain ceph-mon[290022]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:04:41 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:41 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:41 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:04:41 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:42 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:04:42 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:04:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:04:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:04:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:43 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:04:43 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:04:43 np0005548789.localdomain ceph-mon[290022]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 06 10:04:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:04:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:04:43 np0005548789.localdomain podman[291946]: 2025-12-06 10:04:43.929947818 +0000 UTC m=+0.089767846 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:04:43 np0005548789.localdomain podman[291946]: 2025-12-06 10:04:43.934667567 +0000 UTC m=+0.094487585 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 10:04:43 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:04:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:44.115 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:44 np0005548789.localdomain ceph-mgr[288591]: ms_deliver_dispatch: unhandled message 0x56140ed131e0 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0
Dec 06 10:04:44 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@4(peon) e7  my rank is now 3 (was 4)
Dec 06 10:04:44 np0005548789.localdomain ceph-mgr[288591]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Dec 06 10:04:44 np0005548789.localdomain ceph-mgr[288591]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Dec 06 10:04:44 np0005548789.localdomain ceph-mgr[288591]: ms_deliver_dispatch: unhandled message 0x56140ed13080 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Dec 06 10:04:44 np0005548789.localdomain ceph-mon[290022]: log_channel(cluster) log [INF] : mon.np0005548789 calling monitor election
Dec 06 10:04:44 np0005548789.localdomain ceph-mon[290022]: paxos.3).electionLogic(26) init, last seen epoch 26
Dec 06 10:04:44 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:44 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:44 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:45.034 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:04:45 np0005548789.localdomain podman[291965]: 2025-12-06 10:04:45.924010125 +0000 UTC m=+0.086349637 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:04:45 np0005548789.localdomain podman[291965]: 2025-12-06 10:04:45.961248426 +0000 UTC m=+0.123587938 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:04:45 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:04:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:04:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:04:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:04:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:04:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:04:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:04:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:04:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:04:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:04:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:04:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:04:47.294 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:04:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:04:47.294 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:04:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:04:47.295 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:04:48 np0005548789.localdomain ceph-mds[287313]: mds.beacon.mds.np0005548789.vxwwsq missed beacon ack from the monitors
Dec 06 10:04:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:49.167 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: from='client.34161 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548785"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: Remove daemons mon.np0005548785
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "quorum_status"} : dispatch
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: Safe to remove mon.np0005548785: new quorum should be ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789', 'np0005548788'] (from ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789', 'np0005548788'])
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: Removing monitor np0005548785 from monmap...
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon rm", "name": "np0005548785"} : dispatch
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: Removing daemon mon.np0005548785 from np0005548785.localdomain -- ports []
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: mon.np0005548786 calling monitor election
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: mon.np0005548790 calling monitor election
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 calling monitor election
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789 calling monitor election
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 is new leader, mons np0005548787,np0005548786,np0005548790,np0005548789 in quorum (ranks 0,1,2,3)
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: monmap epoch 7
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: last_changed 2025-12-06T10:04:44.209099+0000
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: min_mon_release 18 (reef)
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: election_strategy: 1
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548786
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005548788
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: osdmap e88: 6 total, 6 up, 6 in
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: mgrmap e18: np0005548788.yvwbqq(active, since 20s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: Health check failed: 1/5 mons down, quorum np0005548787,np0005548786,np0005548790,np0005548789 (MON_DOWN)
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005548787,np0005548786,np0005548790,np0005548789
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005548787,np0005548786,np0005548790,np0005548789
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]:     mon.np0005548788 (rank 4) addr [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] is down (out of quorum)
Dec 06 10:04:49 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054726 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:04:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:50.035 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:50 np0005548789.localdomain sshd[291989]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:50 np0005548789.localdomain ceph-mon[290022]: log_channel(cluster) log [INF] : mon.np0005548789 calling monitor election
Dec 06 10:04:50 np0005548789.localdomain ceph-mon[290022]: paxos.3).electionLogic(29) init, last seen epoch 29, mid-election, bumping
Dec 06 10:04:50 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:50 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:50 np0005548789.localdomain sudo[291990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:50 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:04:50 np0005548789.localdomain sudo[291990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:50 np0005548789.localdomain sudo[291990]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:50 np0005548789.localdomain sudo[292009]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:50 np0005548789.localdomain sudo[292009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:04:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:04:50 np0005548789.localdomain podman[292042]: 2025-12-06 10:04:50.814347366 +0000 UTC m=+0.085372706 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 06 10:04:50 np0005548789.localdomain podman[292042]: 2025-12-06 10:04:50.830194179 +0000 UTC m=+0.101219509 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 10:04:50 np0005548789.localdomain podman[292056]: 
Dec 06 10:04:50 np0005548789.localdomain podman[292056]: 2025-12-06 10:04:50.842059645 +0000 UTC m=+0.081953779 container create b6383cc55f58a79b8aebd067c80c1d30f31c6a97a7c2ed9460b58c98269d7229 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_gauss, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_BRANCH=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:04:50 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:04:50 np0005548789.localdomain systemd[1]: Started libpod-conmon-b6383cc55f58a79b8aebd067c80c1d30f31c6a97a7c2ed9460b58c98269d7229.scope.
Dec 06 10:04:50 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:50 np0005548789.localdomain podman[292056]: 2025-12-06 10:04:50.812626541 +0000 UTC m=+0.052520755 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:50 np0005548789.localdomain podman[292043]: 2025-12-06 10:04:50.932694767 +0000 UTC m=+0.203092397 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:04:50 np0005548789.localdomain podman[292056]: 2025-12-06 10:04:50.947626441 +0000 UTC m=+0.187520615 container init b6383cc55f58a79b8aebd067c80c1d30f31c6a97a7c2ed9460b58c98269d7229 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_gauss, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, RELEASE=main, build-date=2025-11-26T19:44:28Z, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:04:50 np0005548789.localdomain podman[292056]: 2025-12-06 10:04:50.958747943 +0000 UTC m=+0.198642107 container start b6383cc55f58a79b8aebd067c80c1d30f31c6a97a7c2ed9460b58c98269d7229 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_gauss, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, version=7, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:04:50 np0005548789.localdomain podman[292056]: 2025-12-06 10:04:50.959032762 +0000 UTC m=+0.198926926 container attach b6383cc55f58a79b8aebd067c80c1d30f31c6a97a7c2ed9460b58c98269d7229 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_gauss, CEPH_POINT_RELEASE=, ceph=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.component=rhceph-container, release=1763362218, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, GIT_BRANCH=main, version=7, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7)
Dec 06 10:04:50 np0005548789.localdomain systemd[1]: libpod-b6383cc55f58a79b8aebd067c80c1d30f31c6a97a7c2ed9460b58c98269d7229.scope: Deactivated successfully.
Dec 06 10:04:50 np0005548789.localdomain sharp_gauss[292090]: 167 167
Dec 06 10:04:50 np0005548789.localdomain podman[292056]: 2025-12-06 10:04:50.96655198 +0000 UTC m=+0.206446144 container died b6383cc55f58a79b8aebd067c80c1d30f31c6a97a7c2ed9460b58c98269d7229 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_gauss, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=7, io.buildah.version=1.41.4, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, distribution-scope=public, RELEASE=main, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.expose-services=)
Dec 06 10:04:51 np0005548789.localdomain podman[292043]: 2025-12-06 10:04:51.021407519 +0000 UTC m=+0.291805139 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:04:51 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:04:51 np0005548789.localdomain podman[292104]: 2025-12-06 10:04:51.05425386 +0000 UTC m=+0.078183979 container remove b6383cc55f58a79b8aebd067c80c1d30f31c6a97a7c2ed9460b58c98269d7229 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_gauss, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.expose-services=, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, build-date=2025-11-26T19:44:28Z, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:04:51 np0005548789.localdomain systemd[1]: libpod-conmon-b6383cc55f58a79b8aebd067c80c1d30f31c6a97a7c2ed9460b58c98269d7229.scope: Deactivated successfully.
Dec 06 10:04:51 np0005548789.localdomain sudo[292009]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: mon.np0005548788 calling monitor election
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: from='client.34193 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548785.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: Removed label mon from host np0005548785.localdomain
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: mon.np0005548786 calling monitor election
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789 calling monitor election
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: mon.np0005548790 calling monitor election
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 calling monitor election
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 is new leader, mons np0005548787,np0005548786,np0005548790,np0005548789,np0005548788 in quorum (ranks 0,1,2,3,4)
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: monmap epoch 7
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: last_changed 2025-12-06T10:04:44.209099+0000
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: min_mon_release 18 (reef)
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: election_strategy: 1
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548786
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005548788
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: osdmap e88: 6 total, 6 up, 6 in
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: mgrmap e18: np0005548788.yvwbqq(active, since 22s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005548787,np0005548786,np0005548790,np0005548789)
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: Cluster is now healthy
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: overall HEALTH_OK
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:04:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:51 np0005548789.localdomain sudo[292121]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:51 np0005548789.localdomain sudo[292121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:51 np0005548789.localdomain sudo[292121]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:51 np0005548789.localdomain sudo[292139]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:51 np0005548789.localdomain sudo[292139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:51 np0005548789.localdomain podman[292174]: 
Dec 06 10:04:51 np0005548789.localdomain podman[292174]: 2025-12-06 10:04:51.73485313 +0000 UTC m=+0.077867959 container create aab6d0e942556f090ce342b6a7c6f60155a525c26da8e5e430a68d0eaff10eac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_lederberg, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.buildah.version=1.41.4, version=7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_BRANCH=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:04:51 np0005548789.localdomain systemd[1]: Started libpod-conmon-aab6d0e942556f090ce342b6a7c6f60155a525c26da8e5e430a68d0eaff10eac.scope.
Dec 06 10:04:51 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:51 np0005548789.localdomain podman[292174]: 2025-12-06 10:04:51.796544185 +0000 UTC m=+0.139559034 container init aab6d0e942556f090ce342b6a7c6f60155a525c26da8e5e430a68d0eaff10eac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_lederberg, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, version=7, ceph=True, io.openshift.tags=rhceph ceph, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc.)
Dec 06 10:04:51 np0005548789.localdomain podman[292174]: 2025-12-06 10:04:51.703848408 +0000 UTC m=+0.046863287 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:51 np0005548789.localdomain podman[292174]: 2025-12-06 10:04:51.806715887 +0000 UTC m=+0.149730736 container start aab6d0e942556f090ce342b6a7c6f60155a525c26da8e5e430a68d0eaff10eac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_lederberg, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, release=1763362218, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, version=7, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:04:51 np0005548789.localdomain podman[292174]: 2025-12-06 10:04:51.80711182 +0000 UTC m=+0.150126669 container attach aab6d0e942556f090ce342b6a7c6f60155a525c26da8e5e430a68d0eaff10eac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_lederberg, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vendor=Red Hat, Inc., RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, build-date=2025-11-26T19:44:28Z, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:04:51 np0005548789.localdomain priceless_lederberg[292189]: 167 167
Dec 06 10:04:51 np0005548789.localdomain systemd[1]: libpod-aab6d0e942556f090ce342b6a7c6f60155a525c26da8e5e430a68d0eaff10eac.scope: Deactivated successfully.
Dec 06 10:04:51 np0005548789.localdomain podman[292174]: 2025-12-06 10:04:51.809444114 +0000 UTC m=+0.152458963 container died aab6d0e942556f090ce342b6a7c6f60155a525c26da8e5e430a68d0eaff10eac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_lederberg, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., version=7, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True)
Dec 06 10:04:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-804bc0cf7b72699221b747076d5b7b86d5e8c5022904974dc1238458c891a736-merged.mount: Deactivated successfully.
Dec 06 10:04:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5bd68a17d9d521561017dc0a056d4164fb9192d99c40fc3c6ad21859d8ccb73d-merged.mount: Deactivated successfully.
Dec 06 10:04:51 np0005548789.localdomain podman[292194]: 2025-12-06 10:04:51.909596788 +0000 UTC m=+0.087718721 container remove aab6d0e942556f090ce342b6a7c6f60155a525c26da8e5e430a68d0eaff10eac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_lederberg, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, name=rhceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=7)
Dec 06 10:04:51 np0005548789.localdomain systemd[1]: libpod-conmon-aab6d0e942556f090ce342b6a7c6f60155a525c26da8e5e430a68d0eaff10eac.scope: Deactivated successfully.
Dec 06 10:04:52 np0005548789.localdomain sshd[291989]: Received disconnect from 43.163.93.82 port 47260:11:  [preauth]
Dec 06 10:04:52 np0005548789.localdomain sshd[291989]: Disconnected from authenticating user root 43.163.93.82 port 47260 [preauth]
Dec 06 10:04:52 np0005548789.localdomain sudo[292139]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:52 np0005548789.localdomain sudo[292218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:52 np0005548789.localdomain sudo[292218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:52 np0005548789.localdomain sudo[292218]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:52 np0005548789.localdomain sudo[292236]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:52 np0005548789.localdomain sudo[292236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:52 np0005548789.localdomain ceph-mon[290022]: from='client.26575 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548785.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:52 np0005548789.localdomain ceph-mon[290022]: Removed label mgr from host np0005548785.localdomain
Dec 06 10:04:52 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:04:52 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:04:52 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:52 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:04:52 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:52 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:52 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:52 np0005548789.localdomain podman[292271]: 
Dec 06 10:04:52 np0005548789.localdomain podman[292271]: 2025-12-06 10:04:52.78585355 +0000 UTC m=+0.082975971 container create 8082bc65610e3e53fac9a91e09859ce502e342a3711b26d36ed35ec9d1b8ebe2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_elion, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main)
Dec 06 10:04:52 np0005548789.localdomain systemd[1]: Started libpod-conmon-8082bc65610e3e53fac9a91e09859ce502e342a3711b26d36ed35ec9d1b8ebe2.scope.
Dec 06 10:04:52 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:52 np0005548789.localdomain podman[292271]: 2025-12-06 10:04:52.749833188 +0000 UTC m=+0.046955679 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:52 np0005548789.localdomain podman[292271]: 2025-12-06 10:04:52.858575925 +0000 UTC m=+0.155698346 container init 8082bc65610e3e53fac9a91e09859ce502e342a3711b26d36ed35ec9d1b8ebe2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_elion, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, version=7, io.openshift.expose-services=, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container)
Dec 06 10:04:52 np0005548789.localdomain podman[292271]: 2025-12-06 10:04:52.868128377 +0000 UTC m=+0.165250798 container start 8082bc65610e3e53fac9a91e09859ce502e342a3711b26d36ed35ec9d1b8ebe2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_elion, vcs-type=git, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:04:52 np0005548789.localdomain podman[292271]: 2025-12-06 10:04:52.868458297 +0000 UTC m=+0.165580728 container attach 8082bc65610e3e53fac9a91e09859ce502e342a3711b26d36ed35ec9d1b8ebe2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_elion, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, release=1763362218, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, ceph=True, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:04:52 np0005548789.localdomain youthful_elion[292286]: 167 167
Dec 06 10:04:52 np0005548789.localdomain systemd[1]: libpod-8082bc65610e3e53fac9a91e09859ce502e342a3711b26d36ed35ec9d1b8ebe2.scope: Deactivated successfully.
Dec 06 10:04:52 np0005548789.localdomain podman[292271]: 2025-12-06 10:04:52.874549 +0000 UTC m=+0.171671481 container died 8082bc65610e3e53fac9a91e09859ce502e342a3711b26d36ed35ec9d1b8ebe2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_elion, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.expose-services=, distribution-scope=public, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_BRANCH=main, ceph=True, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:04:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cec35d441eb029aee8c5824847ad289891d3696db27359c32a86aaa970b767e9-merged.mount: Deactivated successfully.
Dec 06 10:04:52 np0005548789.localdomain podman[292291]: 2025-12-06 10:04:52.989297887 +0000 UTC m=+0.097309114 container remove 8082bc65610e3e53fac9a91e09859ce502e342a3711b26d36ed35ec9d1b8ebe2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_elion, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, name=rhceph, GIT_BRANCH=main, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, RELEASE=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=7)
Dec 06 10:04:52 np0005548789.localdomain systemd[1]: libpod-conmon-8082bc65610e3e53fac9a91e09859ce502e342a3711b26d36ed35ec9d1b8ebe2.scope: Deactivated successfully.
Dec 06 10:04:53 np0005548789.localdomain sudo[292236]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:53 np0005548789.localdomain sudo[292316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:53 np0005548789.localdomain sudo[292316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:53 np0005548789.localdomain sudo[292316]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:53 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:04:53 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:04:53 np0005548789.localdomain ceph-mon[290022]: from='client.34189 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548785.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:04:53 np0005548789.localdomain ceph-mon[290022]: Removed label _admin from host np0005548785.localdomain
Dec 06 10:04:53 np0005548789.localdomain ceph-mon[290022]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:53 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:53 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:53 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:04:53 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:53 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:04:53 np0005548789.localdomain sudo[292334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:53 np0005548789.localdomain sudo[292334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:04:53 np0005548789.localdomain podman[292367]: 2025-12-06 10:04:53.856025947 +0000 UTC m=+0.096859211 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:04:53 np0005548789.localdomain podman[292367]: 2025-12-06 10:04:53.898319117 +0000 UTC m=+0.139152351 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Dec 06 10:04:53 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:04:53 np0005548789.localdomain podman[292375]: 2025-12-06 10:04:53.914032106 +0000 UTC m=+0.131285902 container create ebf42fe8296d28a77b1ca2d2ca4de8448ebce0959a1b1cdd730e7979adbcfd49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_tharp, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vcs-type=git, version=7, GIT_CLEAN=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64)
Dec 06 10:04:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:04:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:04:53 np0005548789.localdomain podman[292375]: 2025-12-06 10:04:53.881240456 +0000 UTC m=+0.098494262 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:54 np0005548789.localdomain systemd[1]: Started libpod-conmon-ebf42fe8296d28a77b1ca2d2ca4de8448ebce0959a1b1cdd730e7979adbcfd49.scope.
Dec 06 10:04:54 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:54 np0005548789.localdomain podman[292375]: 2025-12-06 10:04:54.048787176 +0000 UTC m=+0.266041042 container init ebf42fe8296d28a77b1ca2d2ca4de8448ebce0959a1b1cdd730e7979adbcfd49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_tharp, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:04:54 np0005548789.localdomain optimistic_tharp[292403]: 167 167
Dec 06 10:04:54 np0005548789.localdomain podman[292375]: 2025-12-06 10:04:54.057072369 +0000 UTC m=+0.274326235 container start ebf42fe8296d28a77b1ca2d2ca4de8448ebce0959a1b1cdd730e7979adbcfd49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_tharp, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, ceph=True, io.openshift.expose-services=, RELEASE=main, CEPH_POINT_RELEASE=)
Dec 06 10:04:54 np0005548789.localdomain podman[292375]: 2025-12-06 10:04:54.057386969 +0000 UTC m=+0.274640755 container attach ebf42fe8296d28a77b1ca2d2ca4de8448ebce0959a1b1cdd730e7979adbcfd49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_tharp, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, architecture=x86_64, GIT_BRANCH=main, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, RELEASE=main, io.openshift.tags=rhceph ceph, release=1763362218, ceph=True, version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:04:54 np0005548789.localdomain systemd[1]: libpod-ebf42fe8296d28a77b1ca2d2ca4de8448ebce0959a1b1cdd730e7979adbcfd49.scope: Deactivated successfully.
Dec 06 10:04:54 np0005548789.localdomain podman[292375]: 2025-12-06 10:04:54.059187475 +0000 UTC m=+0.276441301 container died ebf42fe8296d28a77b1ca2d2ca4de8448ebce0959a1b1cdd730e7979adbcfd49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_tharp, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vcs-type=git, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., release=1763362218, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:04:54 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:04:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157891 "" "Go-http-client/1.1"
Dec 06 10:04:54 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:04:54 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19519 "" "Go-http-client/1.1"
Dec 06 10:04:54 np0005548789.localdomain podman[292408]: 2025-12-06 10:04:54.163533043 +0000 UTC m=+0.093290378 container remove ebf42fe8296d28a77b1ca2d2ca4de8448ebce0959a1b1cdd730e7979adbcfd49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_tharp, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, ceph=True, RELEASE=main, name=rhceph, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, release=1763362218, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc.)
Dec 06 10:04:54 np0005548789.localdomain systemd[1]: libpod-conmon-ebf42fe8296d28a77b1ca2d2ca4de8448ebce0959a1b1cdd730e7979adbcfd49.scope: Deactivated successfully.
Dec 06 10:04:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:54.205 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:54 np0005548789.localdomain sudo[292334]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:54 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:04:54 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:04:54 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:54 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:54 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:54 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:04:54 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:04:54 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:54 np0005548789.localdomain sudo[292424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:54 np0005548789.localdomain sudo[292424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:54 np0005548789.localdomain sudo[292424]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:54 np0005548789.localdomain sudo[292442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:54 np0005548789.localdomain sudo[292442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:54 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:04:54 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-0945c963a2bef7b73a93aff65337751ac4be9b4623b6d9c58d9832793c21ad25-merged.mount: Deactivated successfully.
Dec 06 10:04:54 np0005548789.localdomain podman[292478]: 2025-12-06 10:04:54.919489431 +0000 UTC m=+0.063157942 container create ba921e336fee6edc024f52d8fdea366612287a7527acdb52d196de1f80200cf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_kirch, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, RELEASE=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, version=7, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Dec 06 10:04:54 np0005548789.localdomain systemd[1]: Started libpod-conmon-ba921e336fee6edc024f52d8fdea366612287a7527acdb52d196de1f80200cf2.scope.
Dec 06 10:04:54 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:54 np0005548789.localdomain podman[292478]: 2025-12-06 10:04:54.985992429 +0000 UTC m=+0.129660940 container init ba921e336fee6edc024f52d8fdea366612287a7527acdb52d196de1f80200cf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_kirch, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, release=1763362218, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:04:54 np0005548789.localdomain podman[292478]: 2025-12-06 10:04:54.889213102 +0000 UTC m=+0.032881603 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:54 np0005548789.localdomain podman[292478]: 2025-12-06 10:04:54.996491152 +0000 UTC m=+0.140159643 container start ba921e336fee6edc024f52d8fdea366612287a7527acdb52d196de1f80200cf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_kirch, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:04:54 np0005548789.localdomain podman[292478]: 2025-12-06 10:04:54.996708529 +0000 UTC m=+0.140377050 container attach ba921e336fee6edc024f52d8fdea366612287a7527acdb52d196de1f80200cf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_kirch, io.openshift.expose-services=, version=7, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:04:54 np0005548789.localdomain great_kirch[292494]: 167 167
Dec 06 10:04:54 np0005548789.localdomain systemd[1]: libpod-ba921e336fee6edc024f52d8fdea366612287a7527acdb52d196de1f80200cf2.scope: Deactivated successfully.
Dec 06 10:04:54 np0005548789.localdomain podman[292478]: 2025-12-06 10:04:54.999438305 +0000 UTC m=+0.143106836 container died ba921e336fee6edc024f52d8fdea366612287a7527acdb52d196de1f80200cf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_kirch, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, architecture=x86_64, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, version=7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:04:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:55.041 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:55 np0005548789.localdomain podman[292499]: 2025-12-06 10:04:55.085061999 +0000 UTC m=+0.071521108 container remove ba921e336fee6edc024f52d8fdea366612287a7527acdb52d196de1f80200cf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_kirch, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, release=1763362218, vcs-type=git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4)
Dec 06 10:04:55 np0005548789.localdomain systemd[1]: libpod-conmon-ba921e336fee6edc024f52d8fdea366612287a7527acdb52d196de1f80200cf2.scope: Deactivated successfully.
Dec 06 10:04:55 np0005548789.localdomain sudo[292442]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:55 np0005548789.localdomain sudo[292513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:55 np0005548789.localdomain sudo[292513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:55 np0005548789.localdomain sudo[292513]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:55 np0005548789.localdomain sudo[292531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:04:55 np0005548789.localdomain sudo[292531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:04:55 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:04:55 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:04:55 np0005548789.localdomain ceph-mon[290022]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:55 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:55 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:55 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:04:55 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:04:55 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:55 np0005548789.localdomain sshd[292549]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:04:55 np0005548789.localdomain podman[292568]: 
Dec 06 10:04:55 np0005548789.localdomain podman[292568]: 2025-12-06 10:04:55.809988224 +0000 UTC m=+0.077787186 container create 89998159898d054faf7ea20cf916cf5a8b4a1b67da02054bfbc0bb583f09d5b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_germain, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, name=rhceph, version=7)
Dec 06 10:04:55 np0005548789.localdomain systemd[1]: Started libpod-conmon-89998159898d054faf7ea20cf916cf5a8b4a1b67da02054bfbc0bb583f09d5b4.scope.
Dec 06 10:04:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a99cc2b8add5af5710e00540b9ec2e0bf298bd394ee07e52eee333272053d4ca-merged.mount: Deactivated successfully.
Dec 06 10:04:55 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:04:55 np0005548789.localdomain podman[292568]: 2025-12-06 10:04:55.874410806 +0000 UTC m=+0.142209768 container init 89998159898d054faf7ea20cf916cf5a8b4a1b67da02054bfbc0bb583f09d5b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_germain, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_BRANCH=main, architecture=x86_64, ceph=True, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7)
Dec 06 10:04:55 np0005548789.localdomain podman[292568]: 2025-12-06 10:04:55.778206487 +0000 UTC m=+0.046005509 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:04:55 np0005548789.localdomain podman[292568]: 2025-12-06 10:04:55.887327385 +0000 UTC m=+0.155126347 container start 89998159898d054faf7ea20cf916cf5a8b4a1b67da02054bfbc0bb583f09d5b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_germain, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, distribution-scope=public, GIT_CLEAN=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218)
Dec 06 10:04:55 np0005548789.localdomain podman[292568]: 2025-12-06 10:04:55.887554072 +0000 UTC m=+0.155353034 container attach 89998159898d054faf7ea20cf916cf5a8b4a1b67da02054bfbc0bb583f09d5b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_germain, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, ceph=True, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, name=rhceph, RELEASE=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:04:55 np0005548789.localdomain exciting_germain[292583]: 167 167
Dec 06 10:04:55 np0005548789.localdomain systemd[1]: libpod-89998159898d054faf7ea20cf916cf5a8b4a1b67da02054bfbc0bb583f09d5b4.scope: Deactivated successfully.
Dec 06 10:04:55 np0005548789.localdomain podman[292568]: 2025-12-06 10:04:55.889526385 +0000 UTC m=+0.157325377 container died 89998159898d054faf7ea20cf916cf5a8b4a1b67da02054bfbc0bb583f09d5b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_germain, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., name=rhceph, vcs-type=git, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, RELEASE=main)
Dec 06 10:04:55 np0005548789.localdomain podman[292590]: 2025-12-06 10:04:55.993039616 +0000 UTC m=+0.091697178 container remove 89998159898d054faf7ea20cf916cf5a8b4a1b67da02054bfbc0bb583f09d5b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_germain, io.buildah.version=1.41.4, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=7, release=1763362218, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, architecture=x86_64, RELEASE=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:04:55 np0005548789.localdomain systemd[1]: libpod-conmon-89998159898d054faf7ea20cf916cf5a8b4a1b67da02054bfbc0bb583f09d5b4.scope: Deactivated successfully.
Dec 06 10:04:56 np0005548789.localdomain sudo[292531]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:56 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:04:56 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:04:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:04:56 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4d7ea6e363a1e8d567b6eaeefcf1f0c98c4dfd00868b981e4f80fa4a2f4c63b2-merged.mount: Deactivated successfully.
Dec 06 10:04:57 np0005548789.localdomain sshd[292549]: Received disconnect from 14.194.101.210 port 45514:11: Bye Bye [preauth]
Dec 06 10:04:57 np0005548789.localdomain sshd[292549]: Disconnected from authenticating user root 14.194.101.210 port 45514 [preauth]
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:04:57.496864) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015497496973, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 11446, "num_deletes": 523, "total_data_size": 16360845, "memory_usage": 16981888, "flush_reason": "Manual Compaction"}
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015497576012, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 11186426, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 11451, "table_properties": {"data_size": 11133322, "index_size": 27526, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24517, "raw_key_size": 257404, "raw_average_key_size": 26, "raw_value_size": 10968763, "raw_average_value_size": 1120, "num_data_blocks": 1034, "num_entries": 9790, "num_filter_entries": 9790, "num_deletions": 522, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015444, "oldest_key_time": 1765015444, "file_creation_time": 1765015497, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8b48a877-4508-4eb4-a052-67f753f228b0", "db_session_id": "ETDWGFPM6GCTACWNDM5G", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 79223 microseconds, and 26042 cpu microseconds.
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:04:57.576090) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 11186426 bytes OK
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:04:57.576118) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:04:57.578088) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:04:57.578111) EVENT_LOG_v1 {"time_micros": 1765015497578105, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:04:57.578130) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 16284988, prev total WAL file size 16284988, number of live WAL files 2.
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:04:57.580708) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130303430' seq:72057594037927935, type:22 .. '7061786F73003130323932' seq:0, type:0; will stop at (end)
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10MB) 8(1887B)]
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015497580856, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 11188313, "oldest_snapshot_seqno": -1}
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 9271 keys, 11178507 bytes, temperature: kUnknown
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015497667266, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 11178507, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11126708, "index_size": 27506, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23237, "raw_key_size": 248828, "raw_average_key_size": 26, "raw_value_size": 10968821, "raw_average_value_size": 1183, "num_data_blocks": 1033, "num_entries": 9271, "num_filter_entries": 9271, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015444, "oldest_key_time": 0, "file_creation_time": 1765015497, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8b48a877-4508-4eb4-a052-67f753f228b0", "db_session_id": "ETDWGFPM6GCTACWNDM5G", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:04:57.667606) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 11178507 bytes
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:04:57.669367) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 129.3 rd, 129.2 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(10.7, 0.0 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 9795, records dropped: 524 output_compression: NoCompression
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:04:57.669397) EVENT_LOG_v1 {"time_micros": 1765015497669385, "job": 4, "event": "compaction_finished", "compaction_time_micros": 86504, "compaction_time_cpu_micros": 35981, "output_level": 6, "num_output_files": 1, "total_output_size": 11178507, "num_input_records": 9795, "num_output_records": 9271, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015497670933, "job": 4, "event": "table_file_deletion", "file_number": 14}
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015497670985, "job": 4, "event": "table_file_deletion", "file_number": 8}
Dec 06 10:04:57 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:04:57.580451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:04:58 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:04:58 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:04:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:04:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:04:59.231 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:04:59 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:04:59 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:04:59 np0005548789.localdomain ceph-mon[290022]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:04:59 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:59 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:04:59 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:04:59 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:04:59 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:04:59 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:00.045 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:00 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:05:00 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:05:00 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:00 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:00 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:00 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:00 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:05:00 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:00 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:00 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:05:00 np0005548789.localdomain ceph-mon[290022]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:05:00 np0005548789.localdomain podman[292607]: 2025-12-06 10:05:00.94416045 +0000 UTC m=+0.099308748 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:05:00 np0005548789.localdomain podman[292607]: 2025-12-06 10:05:00.955489319 +0000 UTC m=+0.110637637 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:05:00 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:05:01 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:01 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:01 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:05:01 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:05:01 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:05:01 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:01 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:05:01 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:01 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:02 np0005548789.localdomain ceph-mon[290022]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:03 np0005548789.localdomain sudo[292630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:05:03 np0005548789.localdomain sudo[292630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548789.localdomain sudo[292630]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:03 np0005548789.localdomain sudo[292648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:05:03 np0005548789.localdomain sudo[292648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548789.localdomain sudo[292648]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:03 np0005548789.localdomain sudo[292666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:03 np0005548789.localdomain sudo[292666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548789.localdomain sudo[292666]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:03 np0005548789.localdomain sudo[292684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:03 np0005548789.localdomain sudo[292684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548789.localdomain sudo[292684]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:03 np0005548789.localdomain sudo[292702]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:03 np0005548789.localdomain sudo[292702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548789.localdomain sudo[292702]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:03 np0005548789.localdomain sudo[292736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:03 np0005548789.localdomain sudo[292736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548789.localdomain sudo[292736]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:03 np0005548789.localdomain sudo[292754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:03 np0005548789.localdomain sudo[292754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548789.localdomain sudo[292754]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:03 np0005548789.localdomain sudo[292772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:05:03 np0005548789.localdomain sudo[292772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:03 np0005548789.localdomain sudo[292772]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548789.localdomain sudo[292790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:04 np0005548789.localdomain sudo[292790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548789.localdomain sudo[292790]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548789.localdomain sudo[292808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:04 np0005548789.localdomain sudo[292808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548789.localdomain sudo[292808]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548789.localdomain sudo[292826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:04 np0005548789.localdomain sudo[292826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548789.localdomain sudo[292826]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:04 np0005548789.localdomain sudo[292844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: Removing np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: Removing np0005548785.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: Removing np0005548785.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:04 np0005548789.localdomain sudo[292844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548789.localdomain sudo[292844]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:04.270 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:04 np0005548789.localdomain sudo[292862]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:04 np0005548789.localdomain sudo[292862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548789.localdomain sudo[292862]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548789.localdomain sudo[292896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:04 np0005548789.localdomain sudo[292896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548789.localdomain sudo[292896]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548789.localdomain sudo[292914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:04 np0005548789.localdomain sudo[292914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548789.localdomain sudo[292914]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548789.localdomain sudo[292932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:04 np0005548789.localdomain sudo[292932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:04 np0005548789.localdomain sudo[292932]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:04 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:05.048 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:05 np0005548789.localdomain sshd[292950]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: from='client.34194 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005548785.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: Added label _no_schedule to host np0005548785.localdomain
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005548785.localdomain
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:05 np0005548789.localdomain sshd[292950]: Received disconnect from 64.227.102.57 port 43140:11: Bye Bye [preauth]
Dec 06 10:05:05 np0005548789.localdomain sshd[292950]: Disconnected from authenticating user root 64.227.102.57 port 43140 [preauth]
Dec 06 10:05:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:05:05 np0005548789.localdomain podman[292952]: 2025-12-06 10:05:05.775190802 +0000 UTC m=+0.093760473 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:05:05 np0005548789.localdomain podman[292952]: 2025-12-06 10:05:05.823249605 +0000 UTC m=+0.141819306 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:05:05 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:05:06 np0005548789.localdomain ceph-mon[290022]: Removing daemon crash.np0005548785 from np0005548785.localdomain -- ports []
Dec 06 10:05:06 np0005548789.localdomain ceph-mon[290022]: from='client.34199 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005548785.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:05:06 np0005548789.localdomain sudo[292979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:05:06 np0005548789.localdomain sudo[292979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:06 np0005548789.localdomain sudo[292979]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:07 np0005548789.localdomain sudo[292997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:05:07 np0005548789.localdomain sudo[292997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:07 np0005548789.localdomain sudo[292997]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='client.26806 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005548785.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain"} : dispatch
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain"} : dispatch
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain"}]': finished
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth rm", "entity": "client.crash.np0005548785.localdomain"} : dispatch
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: Removed host np0005548785.localdomain
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: Removing key for client.crash.np0005548785.localdomain
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth rm", "entity": "client.crash.np0005548785.localdomain"} : dispatch
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005548785.localdomain"}]': finished
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.913 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.914 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.934 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b60881d7-0300-408d-8bcb-dc23d4010ae3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:05:07.914909', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '0b3d72c2-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.183417577, 'message_signature': '7a9c1a8ec781467465a8d39e26c7bed89f9a2be2d28e0567c896376f5ea850cd'}]}, 'timestamp': '2025-12-06 10:05:07.935302', '_unique_id': '94a26eab537246ed9683ea4c9b47d41c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.938 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.939 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.950 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.951 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49b31586-5b2c-49b8-a284-ff31daf04a9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:05:07.939846', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0b3ff150-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.189234415, 'message_signature': '7eab78081cab732a57eb01036fde85ecbfae4cc061c9d94c9cf1423a47f223b9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:05:07.939846', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0b400938-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.189234415, 'message_signature': 'a84a0a5f764fe817fd5e22570c6af1eec4dba9ba27d39b0f12997426b8e48537'}]}, 'timestamp': '2025-12-06 10:05:07.952117', '_unique_id': 'bb8b931dca4546f886c915802439de08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.953 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.954 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.954 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.955 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff5c7c6f-15a9-4734-91b8-feaf5bedc52b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:05:07.954939', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0b408a98-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.189234415, 'message_signature': 'dc933fb32434638c6d672f6736b46bf77b3f443f416be950ada2345dfb262dcf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:05:07.954939', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0b409ab0-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.189234415, 'message_signature': 'd3bf84551b41cc7459ea055b091c883a411e475cf328fc867105742db4ef7212'}]}, 'timestamp': '2025-12-06 10:05:07.955831', '_unique_id': '4fdf674ab7614789b237db8569d98baf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.956 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.958 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.961 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '810fb076-e1a0-408d-a2e0-33f766343f01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:05:07.958213', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '0b4198b6-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.207643981, 'message_signature': '7769664b1f732f5c999df54b0d526e7c4dc0c61c84de482e77f9efc1d930956d'}]}, 'timestamp': '2025-12-06 10:05:07.962375', '_unique_id': 'bcc1cd9acd184aa8a851143ed4c8bf32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.963 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.964 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.964 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.965 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a73fa79-20d5-434c-b161-027b4d3cb523', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:05:07.965035', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '0b421868-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.207643981, 'message_signature': 'bae77179e7032572752434b17cb5452058bc4c7baee3112c7c2e9ab908f8e1d6'}]}, 'timestamp': '2025-12-06 10:05:07.965749', '_unique_id': 'd3244f2115fa45df921a3ee34f935b23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.966 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.967 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.995 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.995 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8809fc10-309f-43f3-9003-670ce523ab23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:05:07.968092', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0b46b3c8-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.217481403, 'message_signature': 'e1a4c26d7199c90913aa8548cbd457ebc67e1f80adf37c5676ec7c839af301a9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:05:07.968092', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0b46cb4c-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.217481403, 'message_signature': '3157a57e9a7ccaccd45e366fbc31958b5ad028e367df8648abd91fbbfae9f8b3'}]}, 'timestamp': '2025-12-06 10:05:07.996383', '_unique_id': '9f5f581b39984f8f955e27bfed665188'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.999 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:05:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:07.999 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25d8d9cb-5080-4f25-87df-4c2720253048', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:05:07.999394', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '0b4757a6-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.207643981, 'message_signature': 'b78c6eeb25c5b931e3e887500883258c533434c0fa20087fd407cd3b382b0126'}]}, 'timestamp': '2025-12-06 10:05:08.000132', '_unique_id': '06d9000b047142aea97dc18a307a5dab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.003 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.003 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.004 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c611ed0-e7a5-410c-aa59-b6e0059bb150', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:05:08.003375', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0b47f31e-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.189234415, 'message_signature': '87ef39184a5ccac35941c61dc17a2787f5a2f7a8a2a197851efaa1d827efdb4d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:05:08.003375', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0b480c50-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.189234415, 'message_signature': 'f1536ff62b93301f5b4a1359ad5ff080a8bafa936f9c2c1faad017fa34588300'}]}, 'timestamp': '2025-12-06 10:05:08.004671', '_unique_id': 'a5de3d7cea544380844d6c11e5aa2840'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.006 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.007 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.007 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.008 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6004663b-4437-4dbd-9ac3-e6e8a39f6726', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:05:08.007903', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0b48a430-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.217481403, 'message_signature': '7b19b2cfd7289590c073132387b4dee23c12b26cbab007c5dd50a937b4584bed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:05:08.007903', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0b48bd30-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.217481403, 'message_signature': '9bd90fc4d5a75fbbe8248e91506184155da7404a10932da89a7208378757e55e'}]}, 'timestamp': '2025-12-06 10:05:08.009204', '_unique_id': 'f0c3453de41d4508a0c1c8c998700408'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.012 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.012 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8c313c2-6c4b-4d16-9a31-b0eeb09f96ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:05:08.012649', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '0b495f56-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.207643981, 'message_signature': '40347d1e6c5c67c6d80ce41e32cceb809ba3eb058faa484f9343465ac50f27f7'}]}, 'timestamp': '2025-12-06 10:05:08.013388', '_unique_id': '28d8745f240a422694393668954276e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.015 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.016 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbd774ac-6968-4222-8327-5f062e8e8698', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:05:08.015968', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '0b49db02-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.207643981, 'message_signature': 'c0655b7d126770ce66153d0c830ee764aaddbc171c7977d5011f4b16c28a4f05'}]}, 'timestamp': '2025-12-06 10:05:08.016451', '_unique_id': 'a4294e5151d847eebde3dc35a239733a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.018 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.018 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.019 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03823540-9df5-4049-8cc7-70812847024c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:05:08.018627', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0b4a4362-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.217481403, 'message_signature': '21ede74668d9454950e34f55621221b8ecbcc0cd205471e585f608c577c0a725'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:05:08.018627', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0b4a5352-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.217481403, 'message_signature': 'c00accf9da8ba208cce0d71f9cb04639f28331eae11ac01602262148045eb0df'}]}, 'timestamp': '2025-12-06 10:05:08.019498', '_unique_id': '8794eb034624482db942b105537cd2a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.021 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.021 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.022 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ce9acdd-17b1-4644-86a7-74407e7cfb4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:05:08.021629', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0b4ab82e-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.217481403, 'message_signature': 'c1483d2004f5b3878319882d22f2cd1a5ab86b5f7a1e5a26b2f9a327200c81b3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:05:08.021629', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0b4ac8c8-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.217481403, 'message_signature': 'eb83c71ee6eab426491a62d1c903ef3bc999d309c32ea3b6b9586db83638450e'}]}, 'timestamp': '2025-12-06 10:05:08.022499', '_unique_id': '43f937ab9f7a4dafa707ceb5e97e2cd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.024 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.024 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.025 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ed3742c-979c-4c5b-97fb-431175bbcf81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:05:08.024611', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0b4b2cb4-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.217481403, 'message_signature': '4a29779e35d71d91c644e7738f12f54a0b5cc5f53c60702b4ca481a497ced313'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:05:08.024611', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0b4b3fb0-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.217481403, 'message_signature': '9475d14c395db178de827117646c21b0e3d144c198c13dfb05541563fe186f3c'}]}, 'timestamp': '2025-12-06 10:05:08.025548', '_unique_id': '11396ae3468c40d3bcf74d267a60a510'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.028 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.028 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 13030000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ca591d5-4395-4451-b4d7-8232d4ec6278', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13030000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:05:08.028399', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '0b4bc12e-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.183417577, 'message_signature': 'ec13f51ed2510fe8c208785477cec605c2465dafa2ff9742e32c7283a5f1e119'}]}, 'timestamp': '2025-12-06 10:05:08.028908', '_unique_id': 'e751c279813c472489598bb3e142fb78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.030 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '454963c2-e32b-4470-b64b-4684c8d7896e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:05:08.031020', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '0b4c23b2-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.207643981, 'message_signature': 'c25a104bc593e2b4cd85fecd1b37ef05cbd1ade4de6ce7540569d7cb9654598e'}]}, 'timestamp': '2025-12-06 10:05:08.031331', '_unique_id': '0d407d273c3749c49b1205caa123af4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.031 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.032 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.032 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e50b25ec-8048-4c4a-94d6-e833f34fd044', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:05:08.032728', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '0b4c675a-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.207643981, 'message_signature': 'b3573de2c6c54d557855d72ed6c9133ff5fdd6030918bfbd980be2883bea9f81'}]}, 'timestamp': '2025-12-06 10:05:08.033060', '_unique_id': '09b5e65864584de5ab688b56d27cf6c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.033 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.034 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.034 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19784a1e-f5d7-42c9-95b8-58f61e6b66da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:05:08.034462', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '0b4ca9fe-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.207643981, 'message_signature': 'f0332ada12b99bd9b6a41d25cb54644a5ba71f209546e833c51097c9c39330e7'}]}, 'timestamp': '2025-12-06 10:05:08.034787', '_unique_id': '2a380c8ed5d44d8f8cedf000919987bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.036 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.036 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.036 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b7ced3a-c355-43cc-8ae5-c6f42d1b9454', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:05:08.036159', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0b4cec34-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.217481403, 'message_signature': 'd99452fda81c5d7604401e87a1569b1ce2703bc82ef62cfa0a6a2ed238c52b3b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:05:08.036159', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0b4cf6ac-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.217481403, 'message_signature': '4d2648c5d1351f8a329d91dab860c376df31824bd3a29214b76f2491ab480785'}]}, 'timestamp': '2025-12-06 10:05:08.036708', '_unique_id': '24c8df0ceef147a3ae368cff8597fea6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.037 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.038 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.038 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b317da36-c575-4e90-88d0-92d76049a8b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:05:08.038202', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '0b4d3cfc-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.207643981, 'message_signature': '7266d786bd9468884c39d9f9118f5f909bb6359817444787866ddb7d5942c53f'}]}, 'timestamp': '2025-12-06 10:05:08.038526', '_unique_id': '45d463d54a824ad3abe9ed33307f9e09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.039 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd324bf9a-a9fc-4e0b-bfc9-4f2d78dc7a34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:05:08.039911', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '0b4d7ec4-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 11926.207643981, 'message_signature': 'a2affc2359bc3e6a79d7f42917fa96b141292f53368226fb878894dc43cd5298'}]}, 'timestamp': '2025-12-06 10:05:08.040213', '_unique_id': 'fbc365b884d34f58b178ed7524b869e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:05:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:05:08.041 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:05:08 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548786 (monmap changed)...
Dec 06 10:05:08 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548786 on np0005548786.localdomain
Dec 06 10:05:08 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:08 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:08 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:05:08 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:05:08 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:09.310 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:09 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548786 (monmap changed)...
Dec 06 10:05:09 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548786 on np0005548786.localdomain
Dec 06 10:05:09 np0005548789.localdomain ceph-mon[290022]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:09 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:09 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:09 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548786.mczynb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:09 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:09 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:09 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:09 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548786.mczynb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:09 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:09 np0005548789.localdomain sshd[293015]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:05:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:10.051 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:10 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548786.mczynb (monmap changed)...
Dec 06 10:05:10 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548786.mczynb on np0005548786.localdomain
Dec 06 10:05:10 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:10 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:10 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:05:10 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:05:10 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:05:10 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:10 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:05:10 np0005548789.localdomain ceph-mon[290022]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:11 np0005548789.localdomain sshd[293015]: Received disconnect from 118.219.234.233 port 46596:11: Bye Bye [preauth]
Dec 06 10:05:11 np0005548789.localdomain sshd[293015]: Disconnected from authenticating user root 118.219.234.233 port 46596 [preauth]
Dec 06 10:05:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:12 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:05:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:12 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:05:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:13 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:05:13 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:05:13 np0005548789.localdomain ceph-mon[290022]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:14 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:05:14 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:05:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:05:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:14.314 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:14 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:05:14 np0005548789.localdomain podman[293017]: 2025-12-06 10:05:14.9335594 +0000 UTC m=+0.092333338 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 10:05:14 np0005548789.localdomain podman[293017]: 2025-12-06 10:05:14.96448723 +0000 UTC m=+0.123261158 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:05:14 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:05:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:15.052 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:15 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:05:15 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:05:15 np0005548789.localdomain ceph-mon[290022]: from='client.26610 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:15 np0005548789.localdomain ceph-mon[290022]: Saving service mon spec with placement label:mon
Dec 06 10:05:15 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:15 np0005548789.localdomain ceph-mon[290022]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:15 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:15 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:15 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:05:15 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:16 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:05:16 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:05:16 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:16 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:16 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:16 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:16 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:05:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:05:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:05:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:05:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:05:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:05:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:05:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:05:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:05:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:05:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:05:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:05:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:05:16 np0005548789.localdomain podman[293035]: 2025-12-06 10:05:16.90662905 +0000 UTC m=+0.071085405 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:05:16 np0005548789.localdomain podman[293035]: 2025-12-06 10:05:16.937502918 +0000 UTC m=+0.101959263 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:05:16 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:05:17 np0005548789.localdomain ceph-mgr[288591]: ms_deliver_dispatch: unhandled message 0x56140ed13080 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Dec 06 10:05:17 np0005548789.localdomain ceph-mon[290022]: log_channel(cluster) log [INF] : mon.np0005548789 calling monitor election
Dec 06 10:05:17 np0005548789.localdomain ceph-mon[290022]: paxos.3).electionLogic(32) init, last seen epoch 32
Dec 06 10:05:17 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:17 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:17 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:17 np0005548789.localdomain sshd[293058]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:05:18 np0005548789.localdomain sshd[293058]: Connection closed by authenticating user root 45.10.175.77 port 43950 [preauth]
Dec 06 10:05:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:19.347 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:20.056 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:05:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:05:21 np0005548789.localdomain systemd[1]: tmp-crun.LVWoyc.mount: Deactivated successfully.
Dec 06 10:05:21 np0005548789.localdomain podman[293060]: 2025-12-06 10:05:21.925848456 +0000 UTC m=+0.078924146 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, release=1755695350)
Dec 06 10:05:21 np0005548789.localdomain podman[293060]: 2025-12-06 10:05:21.934150711 +0000 UTC m=+0.087226371 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, vcs-type=git, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:05:21 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:05:21 np0005548789.localdomain podman[293061]: 2025-12-06 10:05:21.987577983 +0000 UTC m=+0.135840836 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:05:21 np0005548789.localdomain podman[293061]: 2025-12-06 10:05:21.995790355 +0000 UTC m=+0.144053148 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:05:22 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: from='client.26625 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548788"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: Remove daemons mon.np0005548788
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: Safe to remove mon.np0005548788: new quorum should be ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789'] (from ['np0005548787', 'np0005548786', 'np0005548790', 'np0005548789'])
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: Removing monitor np0005548788 from monmap...
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: Removing daemon mon.np0005548788 from np0005548788.localdomain -- ports []
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: mon.np0005548786 calling monitor election
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789 calling monitor election
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: mon.np0005548790 calling monitor election
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 calling monitor election
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 is new leader, mons np0005548787,np0005548790,np0005548789 in quorum (ranks 0,2,3)
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: overall HEALTH_OK
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 calling monitor election
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 is new leader, mons np0005548787,np0005548786,np0005548790,np0005548789 in quorum (ranks 0,1,2,3)
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: monmap epoch 8
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: last_changed 2025-12-06T10:05:17.086581+0000
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: min_mon_release 18 (reef)
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: election_strategy: 1
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548786
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: osdmap e88: 6 total, 6 up, 6 in
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: mgrmap e18: np0005548788.yvwbqq(active, since 53s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: overall HEALTH_OK
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:22 np0005548789.localdomain sudo[293099]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:22 np0005548789.localdomain sudo[293099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:22 np0005548789.localdomain sudo[293099]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:22 np0005548789.localdomain sudo[293117]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:22 np0005548789.localdomain sudo[293117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:22 np0005548789.localdomain podman[293151]: 
Dec 06 10:05:22 np0005548789.localdomain podman[293151]: 2025-12-06 10:05:22.962514581 +0000 UTC m=+0.091520743 container create 5918be0bbd711da785c6cf0858733beb40f7be7a66c90ddd07235e6563123205 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_tesla, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph)
Dec 06 10:05:23 np0005548789.localdomain podman[293151]: 2025-12-06 10:05:22.92047548 +0000 UTC m=+0.049481692 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:23 np0005548789.localdomain systemd[1]: Started libpod-conmon-5918be0bbd711da785c6cf0858733beb40f7be7a66c90ddd07235e6563123205.scope.
Dec 06 10:05:23 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:23 np0005548789.localdomain podman[293151]: 2025-12-06 10:05:23.055956372 +0000 UTC m=+0.184962564 container init 5918be0bbd711da785c6cf0858733beb40f7be7a66c90ddd07235e6563123205 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_tesla, io.openshift.tags=rhceph ceph, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., release=1763362218, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, version=7, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:05:23 np0005548789.localdomain podman[293151]: 2025-12-06 10:05:23.067995872 +0000 UTC m=+0.197002034 container start 5918be0bbd711da785c6cf0858733beb40f7be7a66c90ddd07235e6563123205 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_tesla, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.buildah.version=1.41.4, ceph=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:05:23 np0005548789.localdomain podman[293151]: 2025-12-06 10:05:23.068327823 +0000 UTC m=+0.197334025 container attach 5918be0bbd711da785c6cf0858733beb40f7be7a66c90ddd07235e6563123205 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_tesla, version=7, distribution-scope=public, build-date=2025-11-26T19:44:28Z, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, architecture=x86_64, vendor=Red Hat, Inc., GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:05:23 np0005548789.localdomain priceless_tesla[293166]: 167 167
Dec 06 10:05:23 np0005548789.localdomain systemd[1]: libpod-5918be0bbd711da785c6cf0858733beb40f7be7a66c90ddd07235e6563123205.scope: Deactivated successfully.
Dec 06 10:05:23 np0005548789.localdomain podman[293151]: 2025-12-06 10:05:23.0763826 +0000 UTC m=+0.205388762 container died 5918be0bbd711da785c6cf0858733beb40f7be7a66c90ddd07235e6563123205 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_tesla, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., release=1763362218, io.openshift.tags=rhceph ceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, ceph=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main)
Dec 06 10:05:23 np0005548789.localdomain podman[293171]: 2025-12-06 10:05:23.156900254 +0000 UTC m=+0.074324324 container remove 5918be0bbd711da785c6cf0858733beb40f7be7a66c90ddd07235e6563123205 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_tesla, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, name=rhceph, vcs-type=git, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_BRANCH=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=1763362218, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, version=7, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:05:23 np0005548789.localdomain systemd[1]: libpod-conmon-5918be0bbd711da785c6cf0858733beb40f7be7a66c90ddd07235e6563123205.scope: Deactivated successfully.
Dec 06 10:05:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:23.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:23 np0005548789.localdomain sudo[293117]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:23 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:23 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:05:23 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:23 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:23 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:05:23 np0005548789.localdomain ceph-mon[290022]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:23 np0005548789.localdomain sudo[293187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:23 np0005548789.localdomain sudo[293187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:23 np0005548789.localdomain sudo[293187]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:23 np0005548789.localdomain sudo[293205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:23 np0005548789.localdomain sudo[293205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:05:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:05:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:05:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:05:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-0a69c25864226b0bea7e77227e53807758885aa61e10a3dd69cff582964d9ab4-merged.mount: Deactivated successfully.
Dec 06 10:05:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:05:23 np0005548789.localdomain podman[293239]: 
Dec 06 10:05:24 np0005548789.localdomain podman[293239]: 2025-12-06 10:05:24.007058749 +0000 UTC m=+0.136267028 container create c095304d7f0d99f6728e18e9209c902b868a7fd533b6ff12eaf950f298e99e65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_lewin, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=rhceph-container)
Dec 06 10:05:24 np0005548789.localdomain podman[293239]: 2025-12-06 10:05:23.921608143 +0000 UTC m=+0.050816482 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:24 np0005548789.localdomain systemd[1]: Started libpod-conmon-c095304d7f0d99f6728e18e9209c902b868a7fd533b6ff12eaf950f298e99e65.scope.
Dec 06 10:05:24 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:05:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19215 "" "Go-http-client/1.1"
Dec 06 10:05:24 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:24 np0005548789.localdomain podman[293239]: 2025-12-06 10:05:24.077118702 +0000 UTC m=+0.206326971 container init c095304d7f0d99f6728e18e9209c902b868a7fd533b6ff12eaf950f298e99e65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_lewin, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, release=1763362218, io.buildah.version=1.41.4, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, distribution-scope=public)
Dec 06 10:05:24 np0005548789.localdomain busy_lewin[293266]: 167 167
Dec 06 10:05:24 np0005548789.localdomain podman[293239]: 2025-12-06 10:05:24.090033068 +0000 UTC m=+0.219241337 container start c095304d7f0d99f6728e18e9209c902b868a7fd533b6ff12eaf950f298e99e65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_lewin, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhceph ceph, release=1763362218, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, version=7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:05:24 np0005548789.localdomain systemd[1]: libpod-c095304d7f0d99f6728e18e9209c902b868a7fd533b6ff12eaf950f298e99e65.scope: Deactivated successfully.
Dec 06 10:05:24 np0005548789.localdomain podman[293239]: 2025-12-06 10:05:24.090696179 +0000 UTC m=+0.219904538 container attach c095304d7f0d99f6728e18e9209c902b868a7fd533b6ff12eaf950f298e99e65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_lewin, com.redhat.component=rhceph-container, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, ceph=True, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, RELEASE=main)
Dec 06 10:05:24 np0005548789.localdomain podman[293239]: 2025-12-06 10:05:24.093906808 +0000 UTC m=+0.223115167 container died c095304d7f0d99f6728e18e9209c902b868a7fd533b6ff12eaf950f298e99e65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_lewin, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, GIT_BRANCH=main, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=)
Dec 06 10:05:24 np0005548789.localdomain podman[293253]: 2025-12-06 10:05:24.098163168 +0000 UTC m=+0.101831590 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:05:24 np0005548789.localdomain podman[293253]: 2025-12-06 10:05:24.117313077 +0000 UTC m=+0.120981539 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:05:24 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:05:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:24.178 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:24.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:24 np0005548789.localdomain podman[293275]: 2025-12-06 10:05:24.206779916 +0000 UTC m=+0.104934695 container remove c095304d7f0d99f6728e18e9209c902b868a7fd533b6ff12eaf950f298e99e65 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_lewin, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, ceph=True, RELEASE=main, io.openshift.tags=rhceph ceph, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:05:24 np0005548789.localdomain systemd[1]: libpod-conmon-c095304d7f0d99f6728e18e9209c902b868a7fd533b6ff12eaf950f298e99e65.scope: Deactivated successfully.
Dec 06 10:05:24 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:24 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:24 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:05:24 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:24 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:05:24 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:05:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:24.349 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:24 np0005548789.localdomain sudo[293205]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:24 np0005548789.localdomain sudo[293301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:24 np0005548789.localdomain sudo[293301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:24 np0005548789.localdomain sudo[293301]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:24 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:24 np0005548789.localdomain sudo[293319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:24 np0005548789.localdomain sudo[293319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f78b92e4bb25f2cead10674e485ed40d7fb806b8e639da61ee5d91234c7cff2c-merged.mount: Deactivated successfully.
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.058 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:25 np0005548789.localdomain podman[293355]: 
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.208 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.209 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.210 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.210 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.211 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:05:25 np0005548789.localdomain podman[293355]: 2025-12-06 10:05:25.222051334 +0000 UTC m=+0.083881969 container create 8d07218d62a798c0e9ec89c9e6582d688371a72d8667dbdabcb3628a99cfb2e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_panini, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main)
Dec 06 10:05:25 np0005548789.localdomain systemd[1]: Started libpod-conmon-8d07218d62a798c0e9ec89c9e6582d688371a72d8667dbdabcb3628a99cfb2e0.scope.
Dec 06 10:05:25 np0005548789.localdomain podman[293355]: 2025-12-06 10:05:25.187264245 +0000 UTC m=+0.049094920 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:25 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:25 np0005548789.localdomain podman[293355]: 2025-12-06 10:05:25.31855859 +0000 UTC m=+0.180389215 container init 8d07218d62a798c0e9ec89c9e6582d688371a72d8667dbdabcb3628a99cfb2e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_panini, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, RELEASE=main, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, name=rhceph, version=7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 06 10:05:25 np0005548789.localdomain podman[293355]: 2025-12-06 10:05:25.341970069 +0000 UTC m=+0.203800714 container start 8d07218d62a798c0e9ec89c9e6582d688371a72d8667dbdabcb3628a99cfb2e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_panini, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, CEPH_POINT_RELEASE=, GIT_BRANCH=main, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:05:25 np0005548789.localdomain podman[293355]: 2025-12-06 10:05:25.342390082 +0000 UTC m=+0.204220757 container attach 8d07218d62a798c0e9ec89c9e6582d688371a72d8667dbdabcb3628a99cfb2e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_panini, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, io.openshift.expose-services=)
Dec 06 10:05:25 np0005548789.localdomain blissful_panini[293371]: 167 167
Dec 06 10:05:25 np0005548789.localdomain systemd[1]: libpod-8d07218d62a798c0e9ec89c9e6582d688371a72d8667dbdabcb3628a99cfb2e0.scope: Deactivated successfully.
Dec 06 10:05:25 np0005548789.localdomain podman[293355]: 2025-12-06 10:05:25.345566329 +0000 UTC m=+0.207396954 container died 8d07218d62a798c0e9ec89c9e6582d688371a72d8667dbdabcb3628a99cfb2e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_panini, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, version=7, vcs-type=git, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:05:25 np0005548789.localdomain podman[293378]: 2025-12-06 10:05:25.455779617 +0000 UTC m=+0.094622369 container remove 8d07218d62a798c0e9ec89c9e6582d688371a72d8667dbdabcb3628a99cfb2e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_panini, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, ceph=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, architecture=x86_64)
Dec 06 10:05:25 np0005548789.localdomain systemd[1]: libpod-conmon-8d07218d62a798c0e9ec89c9e6582d688371a72d8667dbdabcb3628a99cfb2e0.scope: Deactivated successfully.
Dec 06 10:05:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:25 np0005548789.localdomain ceph-mon[290022]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:05:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:25 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:05:25 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:05:25 np0005548789.localdomain sudo[293319]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:25 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:05:25 np0005548789.localdomain ceph-mon[290022]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3699539753' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.708 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:05:25 np0005548789.localdomain sudo[293421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:25 np0005548789.localdomain sudo[293421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:25 np0005548789.localdomain sudo[293421]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.793 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.794 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:05:25 np0005548789.localdomain sudo[293439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:25 np0005548789.localdomain sudo[293439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:25 np0005548789.localdomain systemd[1]: tmp-crun.O1FR7V.mount: Deactivated successfully.
Dec 06 10:05:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-61141d3f6fd5c2203c20226adbd423a0da21902efe75cfd4383c42cfb7f8103a-merged.mount: Deactivated successfully.
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.989 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.991 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11559MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.992 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:05:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:25.992 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:05:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:26.075 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:05:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:26.076 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:05:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:26.076 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:05:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:26.170 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:05:26 np0005548789.localdomain podman[293475]: 
Dec 06 10:05:26 np0005548789.localdomain podman[293475]: 2025-12-06 10:05:26.312263136 +0000 UTC m=+0.072743617 container create 03150583466d44ebe09e2a3504a1ab2a2d349dd181c41c3ca8b6f33964ff39f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_roentgen, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, name=rhceph, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_BRANCH=main, RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:05:26 np0005548789.localdomain systemd[1]: Started libpod-conmon-03150583466d44ebe09e2a3504a1ab2a2d349dd181c41c3ca8b6f33964ff39f9.scope.
Dec 06 10:05:26 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:26 np0005548789.localdomain podman[293475]: 2025-12-06 10:05:26.273891696 +0000 UTC m=+0.034372247 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:26 np0005548789.localdomain podman[293475]: 2025-12-06 10:05:26.377074227 +0000 UTC m=+0.137554688 container init 03150583466d44ebe09e2a3504a1ab2a2d349dd181c41c3ca8b6f33964ff39f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_roentgen, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1763362218, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, description=Red Hat Ceph Storage 7)
Dec 06 10:05:26 np0005548789.localdomain podman[293475]: 2025-12-06 10:05:26.385711562 +0000 UTC m=+0.146192023 container start 03150583466d44ebe09e2a3504a1ab2a2d349dd181c41c3ca8b6f33964ff39f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_roentgen, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7)
Dec 06 10:05:26 np0005548789.localdomain podman[293475]: 2025-12-06 10:05:26.386095784 +0000 UTC m=+0.146576235 container attach 03150583466d44ebe09e2a3504a1ab2a2d349dd181c41c3ca8b6f33964ff39f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_roentgen, ceph=True, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., name=rhceph, version=7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.41.4, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:05:26 np0005548789.localdomain cool_roentgen[293509]: 167 167
Dec 06 10:05:26 np0005548789.localdomain systemd[1]: libpod-03150583466d44ebe09e2a3504a1ab2a2d349dd181c41c3ca8b6f33964ff39f9.scope: Deactivated successfully.
Dec 06 10:05:26 np0005548789.localdomain podman[293475]: 2025-12-06 10:05:26.394586785 +0000 UTC m=+0.155067276 container died 03150583466d44ebe09e2a3504a1ab2a2d349dd181c41c3ca8b6f33964ff39f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_roentgen, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, RELEASE=main, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, name=rhceph, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:05:26 np0005548789.localdomain podman[293515]: 2025-12-06 10:05:26.48196141 +0000 UTC m=+0.081565268 container remove 03150583466d44ebe09e2a3504a1ab2a2d349dd181c41c3ca8b6f33964ff39f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_roentgen, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, architecture=x86_64, vcs-type=git, GIT_CLEAN=True, name=rhceph, RELEASE=main, vendor=Red Hat, Inc., ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z)
Dec 06 10:05:26 np0005548789.localdomain systemd[1]: libpod-conmon-03150583466d44ebe09e2a3504a1ab2a2d349dd181c41c3ca8b6f33964ff39f9.scope: Deactivated successfully.
Dec 06 10:05:26 np0005548789.localdomain sudo[293439]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:26.672 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.107:0/3699539753' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:26 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.107:0/2165954404' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:05:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:26.683 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:05:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:26.698 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:05:26 np0005548789.localdomain sudo[293531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:26.700 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:05:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:26.701 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:05:26 np0005548789.localdomain sudo[293531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:26 np0005548789.localdomain sudo[293531]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:26 np0005548789.localdomain sudo[293551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:26 np0005548789.localdomain sudo[293551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f735fd3a4ab614b24b165107e92fd20284281e4316d6ae58435ca9bed9328c06-merged.mount: Deactivated successfully.
Dec 06 10:05:27 np0005548789.localdomain podman[293585]: 
Dec 06 10:05:27 np0005548789.localdomain podman[293585]: 2025-12-06 10:05:27.287483173 +0000 UTC m=+0.085318183 container create e3942d85554a3517be780e1c43256bebb32749f7f5dd285453312c6ce63af0d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_perlman, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, io.buildah.version=1.41.4, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:05:27 np0005548789.localdomain systemd[1]: Started libpod-conmon-e3942d85554a3517be780e1c43256bebb32749f7f5dd285453312c6ce63af0d7.scope.
Dec 06 10:05:27 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:05:27 np0005548789.localdomain podman[293585]: 2025-12-06 10:05:27.255278203 +0000 UTC m=+0.053113213 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:05:27 np0005548789.localdomain podman[293585]: 2025-12-06 10:05:27.359782474 +0000 UTC m=+0.157617484 container init e3942d85554a3517be780e1c43256bebb32749f7f5dd285453312c6ce63af0d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_perlman, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-type=git, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., RELEASE=main)
Dec 06 10:05:27 np0005548789.localdomain podman[293585]: 2025-12-06 10:05:27.373390692 +0000 UTC m=+0.171225712 container start e3942d85554a3517be780e1c43256bebb32749f7f5dd285453312c6ce63af0d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_perlman, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vcs-type=git, distribution-scope=public, RELEASE=main, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, GIT_CLEAN=True, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:05:27 np0005548789.localdomain podman[293585]: 2025-12-06 10:05:27.373824405 +0000 UTC m=+0.171659455 container attach e3942d85554a3517be780e1c43256bebb32749f7f5dd285453312c6ce63af0d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_perlman, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, distribution-scope=public, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, name=rhceph, io.openshift.expose-services=)
Dec 06 10:05:27 np0005548789.localdomain objective_perlman[293600]: 167 167
Dec 06 10:05:27 np0005548789.localdomain systemd[1]: libpod-e3942d85554a3517be780e1c43256bebb32749f7f5dd285453312c6ce63af0d7.scope: Deactivated successfully.
Dec 06 10:05:27 np0005548789.localdomain podman[293585]: 2025-12-06 10:05:27.378551031 +0000 UTC m=+0.176386091 container died e3942d85554a3517be780e1c43256bebb32749f7f5dd285453312c6ce63af0d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_perlman, name=rhceph, GIT_CLEAN=True, vendor=Red Hat, Inc., version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, GIT_BRANCH=main, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:05:27 np0005548789.localdomain podman[293605]: 2025-12-06 10:05:27.487941172 +0000 UTC m=+0.100590342 container remove e3942d85554a3517be780e1c43256bebb32749f7f5dd285453312c6ce63af0d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_perlman, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, release=1763362218, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-type=git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, RELEASE=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:05:27 np0005548789.localdomain systemd[1]: libpod-conmon-e3942d85554a3517be780e1c43256bebb32749f7f5dd285453312c6ce63af0d7.scope: Deactivated successfully.
Dec 06 10:05:27 np0005548789.localdomain sudo[293551]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:27.702 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:27.703 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:05:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:27.704 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:05:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:27.923 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:05:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:27.924 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:05:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:27.924 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:05:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:27.925 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:05:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5feace6bb6912bc716bafe85ee0d4dae479d7611357a1c27e18a2dcbabd1eaf4-merged.mount: Deactivated successfully.
Dec 06 10:05:28 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:05:28 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:05:28 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:28 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:28 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:28 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:28 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:28 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.108:0/2317264595' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:05:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:28.295 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:05:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:28.435 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:05:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:28.436 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:05:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:28.437 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:28.438 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:28.438 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:28.438 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:05:28 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:05:28 np0005548789.localdomain ceph-mon[290022]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2669262318' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:05:29 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:05:29 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:05:29 np0005548789.localdomain ceph-mon[290022]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:29 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.108:0/2669262318' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:05:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:05:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:29.394 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:29 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:30 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:05:30 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:05:30 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:30 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:30 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:05:30 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:30.061 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:31 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:05:31 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:05:31 np0005548789.localdomain ceph-mon[290022]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:31 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:31 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:31 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:31 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:31 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:05:31 np0005548789.localdomain podman[293622]: 2025-12-06 10:05:31.931377283 +0000 UTC m=+0.089073807 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:05:31 np0005548789.localdomain podman[293622]: 2025-12-06 10:05:31.971406244 +0000 UTC m=+0.129102778 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:05:31 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:05:32 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:05:32 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:05:32 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.106:0/556966387' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:05:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:32 np0005548789.localdomain sudo[293647]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:32 np0005548789.localdomain sudo[293647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:32 np0005548789.localdomain sudo[293647]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:32 np0005548789.localdomain sudo[293665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:05:32 np0005548789.localdomain sudo[293665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:33 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:05:33 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:05:33 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.106:0/2556899461' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:05:33 np0005548789.localdomain ceph-mon[290022]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:33 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:33 np0005548789.localdomain sudo[293665]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:33 np0005548789.localdomain sshd[293715]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:05:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:34.395 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:34 np0005548789.localdomain sshd[293715]: Received disconnect from 154.113.10.34 port 47802:11: Bye Bye [preauth]
Dec 06 10:05:34 np0005548789.localdomain sshd[293715]: Disconnected from authenticating user root 154.113.10.34 port 47802 [preauth]
Dec 06 10:05:34 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:34 np0005548789.localdomain ceph-mon[290022]: from='client.26676 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005548788.localdomain:172.18.0.103", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:05:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:34 np0005548789.localdomain ceph-mon[290022]: Deploying daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:05:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:34 np0005548789.localdomain ceph-mon[290022]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:35.065 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:35 np0005548789.localdomain sudo[293717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:05:35 np0005548789.localdomain sudo[293717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:35 np0005548789.localdomain sudo[293717]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:35 np0005548789.localdomain sudo[293735]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:05:35 np0005548789.localdomain sudo[293735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:35 np0005548789.localdomain sudo[293735]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:35 np0005548789.localdomain sudo[293753]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:35 np0005548789.localdomain sudo[293753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:35 np0005548789.localdomain sudo[293753]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:35 np0005548789.localdomain sudo[293771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:35 np0005548789.localdomain sudo[293771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:35 np0005548789.localdomain sudo[293771]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:35 np0005548789.localdomain sudo[293789]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:35 np0005548789.localdomain sudo[293789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:35 np0005548789.localdomain sudo[293789]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:35 np0005548789.localdomain sudo[293823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:35 np0005548789.localdomain sudo[293823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:35 np0005548789.localdomain sudo[293823]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:35 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:35 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:05:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548789.localdomain sudo[293841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:35 np0005548789.localdomain sudo[293841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:35 np0005548789.localdomain sudo[293841]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:35 np0005548789.localdomain sudo[293859]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:05:35 np0005548789.localdomain sudo[293859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:05:35 np0005548789.localdomain sudo[293859]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:35 np0005548789.localdomain sudo[293883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:35 np0005548789.localdomain sudo[293883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:35 np0005548789.localdomain sudo[293883]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:35 np0005548789.localdomain podman[293877]: 2025-12-06 10:05:35.962052571 +0000 UTC m=+0.080274668 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:05:36 np0005548789.localdomain sudo[293913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:36 np0005548789.localdomain sudo[293913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548789.localdomain sudo[293913]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548789.localdomain podman[293877]: 2025-12-06 10:05:36.03006004 +0000 UTC m=+0.148282117 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible)
Dec 06 10:05:36 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:05:36 np0005548789.localdomain sudo[293938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:36 np0005548789.localdomain sudo[293938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548789.localdomain sudo[293938]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548789.localdomain sudo[293956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:36 np0005548789.localdomain sudo[293956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548789.localdomain sudo[293956]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548789.localdomain sudo[293974]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:36 np0005548789.localdomain sudo[293974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548789.localdomain sudo[293974]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548789.localdomain sudo[294008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:36 np0005548789.localdomain sudo[294008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548789.localdomain sudo[294008]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548789.localdomain sudo[294026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:36 np0005548789.localdomain sudo[294026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548789.localdomain sudo[294026]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548789.localdomain sudo[294044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:36 np0005548789.localdomain sudo[294044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:36 np0005548789.localdomain sudo[294044]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:36 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Dec 06 10:05:36 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Dec 06 10:05:37 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Dec 06 10:05:37 np0005548789.localdomain ceph-mgr[288591]: ms_deliver_dispatch: unhandled message 0x561418816000 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Dec 06 10:05:37 np0005548789.localdomain ceph-mon[290022]: log_channel(cluster) log [INF] : mon.np0005548789 calling monitor election
Dec 06 10:05:37 np0005548789.localdomain ceph-mon[290022]: paxos.3).electionLogic(38) init, last seen epoch 38
Dec 06 10:05:37 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:37 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:39.433 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:40.069 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: mon.np0005548786 calling monitor election
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789 calling monitor election
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 calling monitor election
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: mon.np0005548790 calling monitor election
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: mon.np0005548788 calling monitor election
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 is new leader, mons np0005548787,np0005548790,np0005548789 in quorum (ranks 0,2,3)
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: overall HEALTH_OK
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 calling monitor election
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 is new leader, mons np0005548787,np0005548786,np0005548790,np0005548789,np0005548788 in quorum (ranks 0,1,2,3,4)
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: monmap epoch 9
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: last_changed 2025-12-06T10:05:37.030029+0000
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: min_mon_release 18 (reef)
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: election_strategy: 1
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548786
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: 4: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: osdmap e88: 6 total, 6 up, 6 in
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: mgrmap e18: np0005548788.yvwbqq(active, since 73s), standbys: np0005548787.umwsra, np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: overall HEALTH_OK
Dec 06 10:05:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:42 np0005548789.localdomain sudo[294062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:05:42 np0005548789.localdomain sudo[294062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:42 np0005548789.localdomain sudo[294062]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:05:43 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.32:0/1008829953' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:05:43 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.32:0/1008829953' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:05:43 np0005548789.localdomain ceph-mon[290022]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:44.445 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:44 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548786 (monmap changed)...
Dec 06 10:05:44 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548786 on np0005548786.localdomain
Dec 06 10:05:44 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:44 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:44 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:44 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548786.mczynb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:44 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:44 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:44 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548786.mczynb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:44 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:45.072 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:45 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548786.mczynb (monmap changed)...
Dec 06 10:05:45 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548786.mczynb on np0005548786.localdomain
Dec 06 10:05:45 np0005548789.localdomain ceph-mon[290022]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:45 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:45 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:45 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:45 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:45 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:05:45 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:45 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:45 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:05:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:05:45 np0005548789.localdomain podman[294080]: 2025-12-06 10:05:45.925998311 +0000 UTC m=+0.085479958 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 06 10:05:45 np0005548789.localdomain podman[294080]: 2025-12-06 10:05:45.956504479 +0000 UTC m=+0.115986096 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 10:05:45 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:46 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:05:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:05:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:05:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:05:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:05:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:05:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:05:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:05:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:05:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:05:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:05:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:05:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:05:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:05:47.295 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:05:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:05:47.295 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:05:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:05:47.296 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:05:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:05:47 np0005548789.localdomain podman[294098]: 2025-12-06 10:05:47.725584179 +0000 UTC m=+0.068402572 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:05:47 np0005548789.localdomain podman[294098]: 2025-12-06 10:05:47.738240109 +0000 UTC m=+0.081058502 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:05:47 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:05:48 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:48 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:48 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:05:48 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:48 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:05:48 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:05:48 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.200:0/3950593935' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:05:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:05:49 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:49 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:05:49 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:05:49 np0005548789.localdomain ceph-mon[290022]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:49.481 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:49 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:50.075 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='client.44109 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: Reconfig service osd.default_drive_group
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:50 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e89 e89: 6 total, 6 up, 6 in
Dec 06 10:05:51 np0005548789.localdomain sshd[290882]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:05:51 np0005548789.localdomain systemd[1]: session-65.scope: Deactivated successfully.
Dec 06 10:05:51 np0005548789.localdomain systemd[1]: session-65.scope: Consumed 18.331s CPU time.
Dec 06 10:05:51 np0005548789.localdomain systemd-logind[766]: Session 65 logged out. Waiting for processes to exit.
Dec 06 10:05:51 np0005548789.localdomain systemd-logind[766]: Removed session 65.
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 ' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.17055 172.18.0.106:0/93848351' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.200:0/3205170338' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: Activating manager daemon np0005548787.umwsra
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: osdmap e89: 6 total, 6 up, 6 in
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: mgrmap e19: np0005548787.umwsra(active, starting, since 0.0519269s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548786"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr metadata", "who": "np0005548787.umwsra", "id": "np0005548787.umwsra"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr metadata", "who": "np0005548786.mczynb", "id": "np0005548786.mczynb"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: Manager daemon np0005548787.umwsra is now available
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain.devices.0"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain.devices.0"}]': finished
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain.devices.0"} : dispatch
Dec 06 10:05:51 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548785.localdomain.devices.0"}]': finished
Dec 06 10:05:51 np0005548789.localdomain sshd[294121]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:05:51 np0005548789.localdomain sshd[294121]: Accepted publickey for ceph-admin from 192.168.122.105 port 51844 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 10:05:51 np0005548789.localdomain systemd-logind[766]: New session 66 of user ceph-admin.
Dec 06 10:05:51 np0005548789.localdomain systemd[1]: Started Session 66 of User ceph-admin.
Dec 06 10:05:51 np0005548789.localdomain sshd[294121]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:05:51 np0005548789.localdomain sudo[294125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:51 np0005548789.localdomain sudo[294125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:51 np0005548789.localdomain sudo[294125]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:51 np0005548789.localdomain sudo[294143]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:05:52 np0005548789.localdomain sudo[294143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:05:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:05:52 np0005548789.localdomain podman[294162]: 2025-12-06 10:05:52.128159765 +0000 UTC m=+0.099521830 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Dec 06 10:05:52 np0005548789.localdomain systemd[1]: tmp-crun.7Z0KSX.mount: Deactivated successfully.
Dec 06 10:05:52 np0005548789.localdomain podman[294161]: 2025-12-06 10:05:52.179894954 +0000 UTC m=+0.154379915 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, vendor=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:05:52 np0005548789.localdomain podman[294162]: 2025-12-06 10:05:52.197693102 +0000 UTC m=+0.169055077 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Dec 06 10:05:52 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:05:52 np0005548789.localdomain podman[294161]: 2025-12-06 10:05:52.218102098 +0000 UTC m=+0.192587039 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:05:52 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:05:52 np0005548789.localdomain ceph-mon[290022]: removing stray HostCache host record np0005548785.localdomain.devices.0
Dec 06 10:05:52 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548787.umwsra/mirror_snapshot_schedule"} : dispatch
Dec 06 10:05:52 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548787.umwsra/trash_purge_schedule"} : dispatch
Dec 06 10:05:52 np0005548789.localdomain ceph-mon[290022]: mgrmap e20: np0005548787.umwsra(active, since 1.0797s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:05:52 np0005548789.localdomain podman[294271]: 2025-12-06 10:05:52.789861788 +0000 UTC m=+0.097146137 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, version=7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, release=1763362218, vendor=Red Hat, Inc., GIT_CLEAN=True, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main)
Dec 06 10:05:52 np0005548789.localdomain podman[294271]: 2025-12-06 10:05:52.899872138 +0000 UTC m=+0.207156517 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, RELEASE=main, version=7, vcs-type=git, distribution-scope=public, release=1763362218, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:05:53 np0005548789.localdomain systemd[1]: tmp-crun.wIj4F5.mount: Deactivated successfully.
Dec 06 10:05:53 np0005548789.localdomain ceph-mon[290022]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:53 np0005548789.localdomain ceph-mon[290022]: mgrmap e21: np0005548787.umwsra(active, since 2s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:05:53 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:53 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:53 np0005548789.localdomain sudo[294143]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:53 np0005548789.localdomain sudo[294392]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:53 np0005548789.localdomain sudo[294392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:53 np0005548789.localdomain sudo[294392]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:53 np0005548789.localdomain sudo[294410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:05:53 np0005548789.localdomain sudo[294410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:05:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:05:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:05:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:05:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:05:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19227 "" "Go-http-client/1.1"
Dec 06 10:05:54 np0005548789.localdomain sudo[294410]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:54.494 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: [06/Dec/2025:10:05:52] ENGINE Bus STARTING
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: [06/Dec/2025:10:05:52] ENGINE Serving on https://172.18.0.105:7150
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: [06/Dec/2025:10:05:52] ENGINE Client ('172.18.0.105', 55368) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: [06/Dec/2025:10:05:53] ENGINE Serving on http://172.18.0.105:8765
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: [06/Dec/2025:10:05:53] ENGINE Bus STARTED
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:54 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:54 np0005548789.localdomain sudo[294459]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:54 np0005548789.localdomain sudo[294459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:05:54 np0005548789.localdomain sudo[294459]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:54 np0005548789.localdomain sudo[294478]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 10:05:54 np0005548789.localdomain sudo[294478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:54 np0005548789.localdomain podman[294477]: 2025-12-06 10:05:54.811903393 +0000 UTC m=+0.103610095 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:05:54 np0005548789.localdomain podman[294477]: 2025-12-06 10:05:54.828427621 +0000 UTC m=+0.120134323 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:05:54 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:05:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:55.081 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:55 np0005548789.localdomain sudo[294478]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548789.localdomain sudo[294534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:05:55 np0005548789.localdomain sudo[294534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548789.localdomain sudo[294534]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548789.localdomain sudo[294552]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:05:55 np0005548789.localdomain sudo[294552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548789.localdomain sudo[294552]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548789.localdomain sudo[294570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:55 np0005548789.localdomain sudo[294570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548789.localdomain sudo[294570]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548789.localdomain sudo[294588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:55 np0005548789.localdomain sudo[294588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548789.localdomain sudo[294588]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548789.localdomain sudo[294606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:55 np0005548789.localdomain sudo[294606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548789.localdomain sudo[294606]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548789.localdomain sudo[294640]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:55 np0005548789.localdomain sudo[294640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548789.localdomain sudo[294640]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548789.localdomain sudo[294658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:05:55 np0005548789.localdomain sudo[294658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548789.localdomain sudo[294658]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:55 np0005548789.localdomain sudo[294676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:05:55 np0005548789.localdomain sudo[294676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:55 np0005548789.localdomain sudo[294676]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548789.localdomain sudo[294694]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:56 np0005548789.localdomain sudo[294694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548789.localdomain sudo[294694]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548789.localdomain sudo[294712]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config rm", "who": "osd/host:np0005548786", "name": "osd_memory_target"} : dispatch
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:05:56 np0005548789.localdomain sudo[294712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:56 np0005548789.localdomain ceph-mon[290022]: mgrmap e22: np0005548787.umwsra(active, since 4s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq
Dec 06 10:05:56 np0005548789.localdomain sudo[294712]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548789.localdomain sudo[294730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:56 np0005548789.localdomain sudo[294730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548789.localdomain sudo[294730]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548789.localdomain sudo[294748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:56 np0005548789.localdomain sudo[294748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548789.localdomain sudo[294748]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548789.localdomain sudo[294766]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:56 np0005548789.localdomain sudo[294766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548789.localdomain sudo[294766]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548789.localdomain sudo[294800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:56 np0005548789.localdomain sudo[294800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548789.localdomain sudo[294800]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548789.localdomain sudo[294818]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:05:56 np0005548789.localdomain sudo[294818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548789.localdomain sudo[294818]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548789.localdomain sudo[294836]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:56 np0005548789.localdomain sudo[294836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548789.localdomain sudo[294836]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548789.localdomain sudo[294854]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:05:56 np0005548789.localdomain sudo[294854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548789.localdomain sudo[294854]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548789.localdomain sudo[294872]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:05:56 np0005548789.localdomain sudo[294872]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548789.localdomain sudo[294872]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:56 np0005548789.localdomain sudo[294890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:05:56 np0005548789.localdomain sudo[294890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:56 np0005548789.localdomain sudo[294890]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548789.localdomain sudo[294908]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:57 np0005548789.localdomain sudo[294908]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548789.localdomain sudo[294908]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548789.localdomain sudo[294926]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:05:57 np0005548789.localdomain sudo[294926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548789.localdomain sudo[294926]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:57 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:57 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:57 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:57 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:05:57 np0005548789.localdomain ceph-mon[290022]: Standby manager daemon np0005548788.yvwbqq started
Dec 06 10:05:57 np0005548789.localdomain sudo[294960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:05:57 np0005548789.localdomain sudo[294960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548789.localdomain sudo[294960]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548789.localdomain sudo[294978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:05:57 np0005548789.localdomain sudo[294978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548789.localdomain sudo[294978]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548789.localdomain sudo[294996]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:57 np0005548789.localdomain sudo[294996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548789.localdomain sudo[294996]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548789.localdomain sudo[295014]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:57 np0005548789.localdomain sudo[295014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548789.localdomain sudo[295014]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548789.localdomain sudo[295032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:05:57 np0005548789.localdomain sudo[295032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548789.localdomain sudo[295032]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548789.localdomain sudo[295050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:05:57 np0005548789.localdomain sudo[295050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548789.localdomain sudo[295050]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548789.localdomain sudo[295068]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:05:57 np0005548789.localdomain sudo[295068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548789.localdomain sudo[295068]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548789.localdomain sudo[295086]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:05:57 np0005548789.localdomain sudo[295086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548789.localdomain sudo[295086]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:57 np0005548789.localdomain sudo[295120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:05:57 np0005548789.localdomain sudo[295120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:57 np0005548789.localdomain sudo[295120]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:58 np0005548789.localdomain sudo[295138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:05:58 np0005548789.localdomain sudo[295138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:58 np0005548789.localdomain sudo[295138]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:58 np0005548789.localdomain sudo[295156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548789.localdomain sudo[295156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:58 np0005548789.localdomain sudo[295156]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: mgrmap e23: np0005548787.umwsra(active, since 5s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:58 np0005548789.localdomain sudo[295174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:05:58 np0005548789.localdomain sudo[295174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:05:58 np0005548789.localdomain sudo[295174]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 0 B/s wr, 20 op/s
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:05:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:05:59.509 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.583406) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015559583457, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 2849, "num_deletes": 255, "total_data_size": 9200145, "memory_usage": 9797712, "flush_reason": "Manual Compaction"}
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015559627207, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 5563548, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11456, "largest_seqno": 14300, "table_properties": {"data_size": 5551678, "index_size": 7415, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 30447, "raw_average_key_size": 22, "raw_value_size": 5525769, "raw_average_value_size": 4129, "num_data_blocks": 320, "num_entries": 1338, "num_filter_entries": 1338, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015498, "oldest_key_time": 1765015498, "file_creation_time": 1765015559, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8b48a877-4508-4eb4-a052-67f753f228b0", "db_session_id": "ETDWGFPM6GCTACWNDM5G", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 43896 microseconds, and 11919 cpu microseconds.
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.627291) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 5563548 bytes OK
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.627329) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.629110) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.629143) EVENT_LOG_v1 {"time_micros": 1765015559629134, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.629170) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 9186391, prev total WAL file size 9202615, number of live WAL files 2.
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.631151) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. '7061786F73003130353433' seq:0, type:0; will stop at (end)
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(5433KB)], [15(10MB)]
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015559631211, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 16742055, "oldest_snapshot_seqno": -1}
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 10057 keys, 15396843 bytes, temperature: kUnknown
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015559753281, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 15396843, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15338725, "index_size": 31905, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25157, "raw_key_size": 267726, "raw_average_key_size": 26, "raw_value_size": 15166053, "raw_average_value_size": 1508, "num_data_blocks": 1223, "num_entries": 10057, "num_filter_entries": 10057, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015444, "oldest_key_time": 0, "file_creation_time": 1765015559, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8b48a877-4508-4eb4-a052-67f753f228b0", "db_session_id": "ETDWGFPM6GCTACWNDM5G", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.753874) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 15396843 bytes
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.756000) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 136.8 rd, 125.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.3, 10.7 +0.0 blob) out(14.7 +0.0 blob), read-write-amplify(5.8) write-amplify(2.8) OK, records in: 10609, records dropped: 552 output_compression: NoCompression
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.756035) EVENT_LOG_v1 {"time_micros": 1765015559756018, "job": 6, "event": "compaction_finished", "compaction_time_micros": 122377, "compaction_time_cpu_micros": 30183, "output_level": 6, "num_output_files": 1, "total_output_size": 15396843, "num_input_records": 10609, "num_output_records": 10057, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015559757520, "job": 6, "event": "table_file_deletion", "file_number": 17}
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015559759648, "job": 6, "event": "table_file_deletion", "file_number": 15}
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.630912) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.759815) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.759821) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.759824) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.759827) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:05:59 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:05:59.759830) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:00.330 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:00 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:06:00 np0005548789.localdomain ceph-mon[290022]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:06:00 np0005548789.localdomain ceph-mon[290022]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:06:00 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:00 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:00 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:00 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:00 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:00 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:00 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:00 np0005548789.localdomain sudo[295192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:00 np0005548789.localdomain sudo[295192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:00 np0005548789.localdomain sudo[295192]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:00 np0005548789.localdomain sudo[295210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:00 np0005548789.localdomain sudo[295210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:01 np0005548789.localdomain podman[295245]: 
Dec 06 10:06:01 np0005548789.localdomain podman[295245]: 2025-12-06 10:06:01.305467213 +0000 UTC m=+0.062206682 container create d2b7436b69f18c1ac5dddf9cc8daada30eb1dc07b9c72be25c6812486e98de2b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_greider, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=)
Dec 06 10:06:01 np0005548789.localdomain systemd[1]: Started libpod-conmon-d2b7436b69f18c1ac5dddf9cc8daada30eb1dc07b9c72be25c6812486e98de2b.scope.
Dec 06 10:06:01 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:06:01 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:06:01 np0005548789.localdomain ceph-mon[290022]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 14 op/s
Dec 06 10:06:01 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:01 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:01 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:01 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:01 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:01 np0005548789.localdomain podman[295245]: 2025-12-06 10:06:01.369918503 +0000 UTC m=+0.126658012 container init d2b7436b69f18c1ac5dddf9cc8daada30eb1dc07b9c72be25c6812486e98de2b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_greider, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, RELEASE=main, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True)
Dec 06 10:06:01 np0005548789.localdomain podman[295245]: 2025-12-06 10:06:01.380982593 +0000 UTC m=+0.137722142 container start d2b7436b69f18c1ac5dddf9cc8daada30eb1dc07b9c72be25c6812486e98de2b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_greider, distribution-scope=public, version=7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vcs-type=git, CEPH_POINT_RELEASE=, release=1763362218, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:06:01 np0005548789.localdomain podman[295245]: 2025-12-06 10:06:01.381364935 +0000 UTC m=+0.138104444 container attach d2b7436b69f18c1ac5dddf9cc8daada30eb1dc07b9c72be25c6812486e98de2b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_greider, release=1763362218, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.buildah.version=1.41.4, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, architecture=x86_64)
Dec 06 10:06:01 np0005548789.localdomain podman[295245]: 2025-12-06 10:06:01.285362135 +0000 UTC m=+0.042101694 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:01 np0005548789.localdomain sleepy_greider[295260]: 167 167
Dec 06 10:06:01 np0005548789.localdomain systemd[1]: libpod-d2b7436b69f18c1ac5dddf9cc8daada30eb1dc07b9c72be25c6812486e98de2b.scope: Deactivated successfully.
Dec 06 10:06:01 np0005548789.localdomain podman[295245]: 2025-12-06 10:06:01.386528964 +0000 UTC m=+0.143268513 container died d2b7436b69f18c1ac5dddf9cc8daada30eb1dc07b9c72be25c6812486e98de2b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_greider, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, release=1763362218, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z)
Dec 06 10:06:01 np0005548789.localdomain podman[295265]: 2025-12-06 10:06:01.47428667 +0000 UTC m=+0.080492994 container remove d2b7436b69f18c1ac5dddf9cc8daada30eb1dc07b9c72be25c6812486e98de2b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_greider, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_CLEAN=True, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:06:01 np0005548789.localdomain systemd[1]: libpod-conmon-d2b7436b69f18c1ac5dddf9cc8daada30eb1dc07b9c72be25c6812486e98de2b.scope: Deactivated successfully.
Dec 06 10:06:01 np0005548789.localdomain sudo[295210]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:01 np0005548789.localdomain sudo[295280]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:01 np0005548789.localdomain sudo[295280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:01 np0005548789.localdomain sudo[295280]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:01 np0005548789.localdomain sudo[295298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:01 np0005548789.localdomain sudo[295298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:06:02 np0005548789.localdomain sshd[291944]: ssh_dispatch_run_fatal: Connection from 45.78.222.162 port 43020: Connection timed out [preauth]
Dec 06 10:06:02 np0005548789.localdomain podman[295332]: 2025-12-06 10:06:02.147828178 +0000 UTC m=+0.082336842 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:06:02 np0005548789.localdomain podman[295332]: 2025-12-06 10:06:02.161988313 +0000 UTC m=+0.096496987 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:06:02 np0005548789.localdomain podman[295340]: 
Dec 06 10:06:02 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:06:02 np0005548789.localdomain podman[295340]: 2025-12-06 10:06:02.176280892 +0000 UTC m=+0.091845673 container create 27728b71d424cec212ea7f06f18a262351af4d75463fd3c5baa0c399e67e5fd8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, io.openshift.expose-services=, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_CLEAN=True, vcs-type=git, build-date=2025-11-26T19:44:28Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:06:02 np0005548789.localdomain systemd[1]: Started libpod-conmon-27728b71d424cec212ea7f06f18a262351af4d75463fd3c5baa0c399e67e5fd8.scope.
Dec 06 10:06:02 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:02 np0005548789.localdomain podman[295340]: 2025-12-06 10:06:02.236062279 +0000 UTC m=+0.151627060 container init 27728b71d424cec212ea7f06f18a262351af4d75463fd3c5baa0c399e67e5fd8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, version=7, ceph=True, vcs-type=git, name=rhceph, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7)
Dec 06 10:06:02 np0005548789.localdomain podman[295340]: 2025-12-06 10:06:02.144561517 +0000 UTC m=+0.060126338 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:02 np0005548789.localdomain podman[295340]: 2025-12-06 10:06:02.243948621 +0000 UTC m=+0.159513382 container start 27728b71d424cec212ea7f06f18a262351af4d75463fd3c5baa0c399e67e5fd8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, io.openshift.expose-services=, vcs-type=git, GIT_CLEAN=True, version=7, architecture=x86_64, ceph=True, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:06:02 np0005548789.localdomain podman[295340]: 2025-12-06 10:06:02.244146957 +0000 UTC m=+0.159711728 container attach 27728b71d424cec212ea7f06f18a262351af4d75463fd3c5baa0c399e67e5fd8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, description=Red Hat Ceph Storage 7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, ceph=True, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:06:02 np0005548789.localdomain trusting_cerf[295372]: 167 167
Dec 06 10:06:02 np0005548789.localdomain systemd[1]: libpod-27728b71d424cec212ea7f06f18a262351af4d75463fd3c5baa0c399e67e5fd8.scope: Deactivated successfully.
Dec 06 10:06:02 np0005548789.localdomain podman[295340]: 2025-12-06 10:06:02.247212131 +0000 UTC m=+0.162776952 container died 27728b71d424cec212ea7f06f18a262351af4d75463fd3c5baa0c399e67e5fd8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, release=1763362218, vcs-type=git, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:06:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-35e8464fa50cf9a0c483b6f77be42c86b5d1684205d337dab60ed9d5f56d512a-merged.mount: Deactivated successfully.
Dec 06 10:06:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-b495a4f05e0dc37ec094e7c8bf98335362978e7cce241a47835b5dfb87962f36-merged.mount: Deactivated successfully.
Dec 06 10:06:02 np0005548789.localdomain podman[295377]: 2025-12-06 10:06:02.341716466 +0000 UTC m=+0.082146276 container remove 27728b71d424cec212ea7f06f18a262351af4d75463fd3c5baa0c399e67e5fd8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.component=rhceph-container, release=1763362218, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_BRANCH=main, GIT_CLEAN=True)
Dec 06 10:06:02 np0005548789.localdomain systemd[1]: libpod-conmon-27728b71d424cec212ea7f06f18a262351af4d75463fd3c5baa0c399e67e5fd8.scope: Deactivated successfully.
Dec 06 10:06:02 np0005548789.localdomain sudo[295298]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:02 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:06:02 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:06:02 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:02 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:02 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:02 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:06:02 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:02 np0005548789.localdomain sudo[295399]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:02 np0005548789.localdomain sudo[295399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:02 np0005548789.localdomain sudo[295399]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:02 np0005548789.localdomain sudo[295417]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:02 np0005548789.localdomain sudo[295417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:03 np0005548789.localdomain podman[295451]: 
Dec 06 10:06:03 np0005548789.localdomain podman[295451]: 2025-12-06 10:06:03.182029728 +0000 UTC m=+0.082378393 container create 424b5e056fec469cf17d237fee5d6f0e6b98151fe7205ceb8269f649a2060307 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_einstein, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, ceph=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:06:03 np0005548789.localdomain systemd[1]: Started libpod-conmon-424b5e056fec469cf17d237fee5d6f0e6b98151fe7205ceb8269f649a2060307.scope.
Dec 06 10:06:03 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:03 np0005548789.localdomain podman[295451]: 2025-12-06 10:06:03.148687763 +0000 UTC m=+0.049036478 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:03 np0005548789.localdomain podman[295451]: 2025-12-06 10:06:03.262995626 +0000 UTC m=+0.163344261 container init 424b5e056fec469cf17d237fee5d6f0e6b98151fe7205ceb8269f649a2060307 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_einstein, architecture=x86_64, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, version=7, GIT_CLEAN=True, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Dec 06 10:06:03 np0005548789.localdomain podman[295451]: 2025-12-06 10:06:03.272668493 +0000 UTC m=+0.173017198 container start 424b5e056fec469cf17d237fee5d6f0e6b98151fe7205ceb8269f649a2060307 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_einstein, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, release=1763362218, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, ceph=True, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:06:03 np0005548789.localdomain reverent_einstein[295466]: 167 167
Dec 06 10:06:03 np0005548789.localdomain podman[295451]: 2025-12-06 10:06:03.274871661 +0000 UTC m=+0.175220286 container attach 424b5e056fec469cf17d237fee5d6f0e6b98151fe7205ceb8269f649a2060307 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_einstein, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, name=rhceph, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, ceph=True, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:06:03 np0005548789.localdomain systemd[1]: libpod-424b5e056fec469cf17d237fee5d6f0e6b98151fe7205ceb8269f649a2060307.scope: Deactivated successfully.
Dec 06 10:06:03 np0005548789.localdomain podman[295451]: 2025-12-06 10:06:03.279259695 +0000 UTC m=+0.179608400 container died 424b5e056fec469cf17d237fee5d6f0e6b98151fe7205ceb8269f649a2060307 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_einstein, GIT_BRANCH=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, release=1763362218, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, CEPH_POINT_RELEASE=)
Dec 06 10:06:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-8f520eb1d6688811ab96f45b6932462450546e4d717702c788f390db174d8ebf-merged.mount: Deactivated successfully.
Dec 06 10:06:03 np0005548789.localdomain podman[295471]: 2025-12-06 10:06:03.376002068 +0000 UTC m=+0.085798288 container remove 424b5e056fec469cf17d237fee5d6f0e6b98151fe7205ceb8269f649a2060307 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_einstein, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_CLEAN=True, name=rhceph, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, version=7, RELEASE=main, io.openshift.expose-services=, distribution-scope=public, GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:06:03 np0005548789.localdomain systemd[1]: libpod-conmon-424b5e056fec469cf17d237fee5d6f0e6b98151fe7205ceb8269f649a2060307.scope: Deactivated successfully.
Dec 06 10:06:03 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:06:03 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:06:03 np0005548789.localdomain ceph-mon[290022]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Dec 06 10:06:03 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:03 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:03 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:03 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:03 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:06:03 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:03 np0005548789.localdomain sudo[295417]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:03 np0005548789.localdomain sudo[295494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:03 np0005548789.localdomain sudo[295494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:03 np0005548789.localdomain sudo[295494]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:03 np0005548789.localdomain sudo[295512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:03 np0005548789.localdomain sudo[295512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:04 np0005548789.localdomain podman[295546]: 
Dec 06 10:06:04 np0005548789.localdomain podman[295546]: 2025-12-06 10:06:04.245244869 +0000 UTC m=+0.083295080 container create 39fa33ad0c258ccb4f1f30dee83495adefc8cb07665c59cd57bd8ec282ac60da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_matsumoto, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, version=7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218)
Dec 06 10:06:04 np0005548789.localdomain systemd[1]: Started libpod-conmon-39fa33ad0c258ccb4f1f30dee83495adefc8cb07665c59cd57bd8ec282ac60da.scope.
Dec 06 10:06:04 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:04 np0005548789.localdomain podman[295546]: 2025-12-06 10:06:04.212626848 +0000 UTC m=+0.050677109 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:04 np0005548789.localdomain podman[295546]: 2025-12-06 10:06:04.314120806 +0000 UTC m=+0.152171027 container init 39fa33ad0c258ccb4f1f30dee83495adefc8cb07665c59cd57bd8ec282ac60da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_matsumoto, version=7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, release=1763362218, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vcs-type=git, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.)
Dec 06 10:06:04 np0005548789.localdomain podman[295546]: 2025-12-06 10:06:04.324198795 +0000 UTC m=+0.162249016 container start 39fa33ad0c258ccb4f1f30dee83495adefc8cb07665c59cd57bd8ec282ac60da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_matsumoto, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, version=7, architecture=x86_64, io.buildah.version=1.41.4, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, ceph=True, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:06:04 np0005548789.localdomain podman[295546]: 2025-12-06 10:06:04.324497715 +0000 UTC m=+0.162547926 container attach 39fa33ad0c258ccb4f1f30dee83495adefc8cb07665c59cd57bd8ec282ac60da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_matsumoto, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7)
Dec 06 10:06:04 np0005548789.localdomain elegant_matsumoto[295561]: 167 167
Dec 06 10:06:04 np0005548789.localdomain systemd[1]: libpod-39fa33ad0c258ccb4f1f30dee83495adefc8cb07665c59cd57bd8ec282ac60da.scope: Deactivated successfully.
Dec 06 10:06:04 np0005548789.localdomain podman[295546]: 2025-12-06 10:06:04.328332593 +0000 UTC m=+0.166382874 container died 39fa33ad0c258ccb4f1f30dee83495adefc8cb07665c59cd57bd8ec282ac60da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_matsumoto, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, version=7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, RELEASE=main)
Dec 06 10:06:04 np0005548789.localdomain podman[295566]: 2025-12-06 10:06:04.429179652 +0000 UTC m=+0.088023146 container remove 39fa33ad0c258ccb4f1f30dee83495adefc8cb07665c59cd57bd8ec282ac60da (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_matsumoto, io.openshift.tags=rhceph ceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., ceph=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_BRANCH=main, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218)
Dec 06 10:06:04 np0005548789.localdomain systemd[1]: libpod-conmon-39fa33ad0c258ccb4f1f30dee83495adefc8cb07665c59cd57bd8ec282ac60da.scope: Deactivated successfully.
Dec 06 10:06:04 np0005548789.localdomain sudo[295512]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:04.532 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: from='client.44134 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:06:04 np0005548789.localdomain sudo[295583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:04 np0005548789.localdomain sudo[295583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:04 np0005548789.localdomain sudo[295583]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:04 np0005548789.localdomain sudo[295601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:04 np0005548789.localdomain sudo[295601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:04 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:05 np0005548789.localdomain podman[295637]: 
Dec 06 10:06:05 np0005548789.localdomain podman[295637]: 2025-12-06 10:06:05.023896066 +0000 UTC m=+0.052614358 container create 2afb96b3c5e2b359b811b8ec902c3934fae9f86e8f85c96316b51b589a22ae5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_carver, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, ceph=True, release=1763362218, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7)
Dec 06 10:06:05 np0005548789.localdomain systemd[1]: Started libpod-conmon-2afb96b3c5e2b359b811b8ec902c3934fae9f86e8f85c96316b51b589a22ae5c.scope.
Dec 06 10:06:05 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:05.085 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:05 np0005548789.localdomain podman[295637]: 2025-12-06 10:06:05.087321895 +0000 UTC m=+0.116040227 container init 2afb96b3c5e2b359b811b8ec902c3934fae9f86e8f85c96316b51b589a22ae5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_carver, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, name=rhceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main)
Dec 06 10:06:05 np0005548789.localdomain podman[295637]: 2025-12-06 10:06:04.99699106 +0000 UTC m=+0.025709452 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:05 np0005548789.localdomain podman[295637]: 2025-12-06 10:06:05.097357833 +0000 UTC m=+0.126076165 container start 2afb96b3c5e2b359b811b8ec902c3934fae9f86e8f85c96316b51b589a22ae5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_carver, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1763362218, GIT_CLEAN=True, ceph=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container)
Dec 06 10:06:05 np0005548789.localdomain podman[295637]: 2025-12-06 10:06:05.097705295 +0000 UTC m=+0.126423677 container attach 2afb96b3c5e2b359b811b8ec902c3934fae9f86e8f85c96316b51b589a22ae5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_carver, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, architecture=x86_64, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, name=rhceph, CEPH_POINT_RELEASE=, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:06:05 np0005548789.localdomain hungry_carver[295652]: 167 167
Dec 06 10:06:05 np0005548789.localdomain systemd[1]: libpod-2afb96b3c5e2b359b811b8ec902c3934fae9f86e8f85c96316b51b589a22ae5c.scope: Deactivated successfully.
Dec 06 10:06:05 np0005548789.localdomain podman[295637]: 2025-12-06 10:06:05.100121998 +0000 UTC m=+0.128840350 container died 2afb96b3c5e2b359b811b8ec902c3934fae9f86e8f85c96316b51b589a22ae5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_carver, io.openshift.tags=rhceph ceph, ceph=True, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-type=git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, CEPH_POINT_RELEASE=)
Dec 06 10:06:05 np0005548789.localdomain podman[295657]: 2025-12-06 10:06:05.194325203 +0000 UTC m=+0.084703913 container remove 2afb96b3c5e2b359b811b8ec902c3934fae9f86e8f85c96316b51b589a22ae5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_carver, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., ceph=True, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, release=1763362218, io.buildah.version=1.41.4, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, architecture=x86_64, vcs-type=git)
Dec 06 10:06:05 np0005548789.localdomain systemd[1]: libpod-conmon-2afb96b3c5e2b359b811b8ec902c3934fae9f86e8f85c96316b51b589a22ae5c.scope: Deactivated successfully.
Dec 06 10:06:05 np0005548789.localdomain sudo[295601]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-59db1e8215f0fe5400974701a833d93b40b3de2c44c8e0932413596395e39162-merged.mount: Deactivated successfully.
Dec 06 10:06:06 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:06 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:06 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:06:06 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:06 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:06 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:06:06 np0005548789.localdomain ceph-mon[290022]: from='client.26785 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:06 np0005548789.localdomain ceph-mon[290022]: Saving service mon spec with placement label:mon
Dec 06 10:06:06 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:06 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:06 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:06 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:06:06 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:06:06 np0005548789.localdomain systemd[1]: tmp-crun.FX5wOE.mount: Deactivated successfully.
Dec 06 10:06:06 np0005548789.localdomain podman[295673]: 2025-12-06 10:06:06.910919262 +0000 UTC m=+0.072450857 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true)
Dec 06 10:06:07 np0005548789.localdomain podman[295673]: 2025-12-06 10:06:07.01824654 +0000 UTC m=+0.179778105 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 10:06:07 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:06:07 np0005548789.localdomain ceph-mon[290022]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:06:07 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:06:07 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:06:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:06:07 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:08 np0005548789.localdomain ceph-mon[290022]: from='client.44149 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548788", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:06:08 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:06:08 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:06:08 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:08 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:08 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:08 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:08 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:06:08 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:09 np0005548789.localdomain ceph-mon[290022]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:06:09 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:06:09 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:06:09 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.200:0/2211595861' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:06:09 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:09 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:09 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:09 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:09 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:09.585 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:09 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:10.087 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:10 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:06:10 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:06:10 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:10 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:10 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:10 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:10 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:11 np0005548789.localdomain sudo[295698]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:06:11 np0005548789.localdomain sudo[295698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:11 np0005548789.localdomain sudo[295698]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:11 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:06:11 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:06:11 np0005548789.localdomain ceph-mon[290022]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:11 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:11 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:11 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:11 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:06:11 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:11 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:11 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:06:12 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548786 (monmap changed)...
Dec 06 10:06:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:12 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548786 on np0005548786.localdomain
Dec 06 10:06:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:12 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:13 np0005548789.localdomain ceph-mon[290022]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:13 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:06:13 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:06:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:13 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:13 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.200:0/2329222999' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Dec 06 10:06:14 np0005548789.localdomain sudo[295716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:14 np0005548789.localdomain sudo[295716]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:14 np0005548789.localdomain sudo[295716]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:14 np0005548789.localdomain sudo[295734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:14 np0005548789.localdomain sudo[295734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:14 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:06:14 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:06:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:14 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:14.586 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:14 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:14 np0005548789.localdomain podman[295768]: 
Dec 06 10:06:14 np0005548789.localdomain podman[295768]: 2025-12-06 10:06:14.756930201 +0000 UTC m=+0.079988930 container create e790297f1d12ae2c0d0be942b8e0d538a7152314fb9c74e0b299aed536ed7e60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_blackwell, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, distribution-scope=public, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, version=7)
Dec 06 10:06:14 np0005548789.localdomain systemd[1]: Started libpod-conmon-e790297f1d12ae2c0d0be942b8e0d538a7152314fb9c74e0b299aed536ed7e60.scope.
Dec 06 10:06:14 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:14 np0005548789.localdomain podman[295768]: 2025-12-06 10:06:14.725328409 +0000 UTC m=+0.048387178 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:14 np0005548789.localdomain podman[295768]: 2025-12-06 10:06:14.83732259 +0000 UTC m=+0.160381289 container init e790297f1d12ae2c0d0be942b8e0d538a7152314fb9c74e0b299aed536ed7e60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_blackwell, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, CEPH_POINT_RELEASE=, version=7, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, vcs-type=git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:06:14 np0005548789.localdomain podman[295768]: 2025-12-06 10:06:14.848573196 +0000 UTC m=+0.171631935 container start e790297f1d12ae2c0d0be942b8e0d538a7152314fb9c74e0b299aed536ed7e60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_blackwell, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, architecture=x86_64, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, distribution-scope=public)
Dec 06 10:06:14 np0005548789.localdomain podman[295768]: 2025-12-06 10:06:14.848912016 +0000 UTC m=+0.171970735 container attach e790297f1d12ae2c0d0be942b8e0d538a7152314fb9c74e0b299aed536ed7e60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_blackwell, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, ceph=True, architecture=x86_64, vcs-type=git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_CLEAN=True)
Dec 06 10:06:14 np0005548789.localdomain clever_blackwell[295783]: 167 167
Dec 06 10:06:14 np0005548789.localdomain systemd[1]: libpod-e790297f1d12ae2c0d0be942b8e0d538a7152314fb9c74e0b299aed536ed7e60.scope: Deactivated successfully.
Dec 06 10:06:14 np0005548789.localdomain podman[295768]: 2025-12-06 10:06:14.854104066 +0000 UTC m=+0.177162805 container died e790297f1d12ae2c0d0be942b8e0d538a7152314fb9c74e0b299aed536ed7e60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_blackwell, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=, release=1763362218, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, version=7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:06:14 np0005548789.localdomain podman[295788]: 2025-12-06 10:06:14.964719876 +0000 UTC m=+0.100334495 container remove e790297f1d12ae2c0d0be942b8e0d538a7152314fb9c74e0b299aed536ed7e60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_blackwell, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, ceph=True, RELEASE=main, release=1763362218, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=)
Dec 06 10:06:14 np0005548789.localdomain systemd[1]: libpod-conmon-e790297f1d12ae2c0d0be942b8e0d538a7152314fb9c74e0b299aed536ed7e60.scope: Deactivated successfully.
Dec 06 10:06:15 np0005548789.localdomain sudo[295734]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:15.092 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:15 np0005548789.localdomain sshd[295805]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:06:15 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:06:15 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:06:15 np0005548789.localdomain ceph-mon[290022]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:15 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:15 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:15 np0005548789.localdomain systemd[1]: tmp-crun.Wd2UIq.mount: Deactivated successfully.
Dec 06 10:06:15 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-1f909b356b073b3bb180e8c5d856c28d9d626a91a2582c8acd167fdd174544bc-merged.mount: Deactivated successfully.
Dec 06 10:06:15 np0005548789.localdomain sshd[295805]: Received disconnect from 64.227.102.57 port 42570:11: Bye Bye [preauth]
Dec 06 10:06:15 np0005548789.localdomain sshd[295805]: Disconnected from authenticating user root 64.227.102.57 port 42570 [preauth]
Dec 06 10:06:16 np0005548789.localdomain ceph-mgr[288591]: ms_deliver_dispatch: unhandled message 0x56140ed13600 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Dec 06 10:06:16 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@3(peon) e10  my rank is now 2 (was 3)
Dec 06 10:06:16 np0005548789.localdomain ceph-mgr[288591]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0
Dec 06 10:06:16 np0005548789.localdomain ceph-mgr[288591]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0
Dec 06 10:06:16 np0005548789.localdomain ceph-mgr[288591]: ms_deliver_dispatch: unhandled message 0x56140ed131e0 mon_map magic: 0 from mon.2 v2:172.18.0.107:3300/0
Dec 06 10:06:16 np0005548789.localdomain ceph-mon[290022]: log_channel(cluster) log [INF] : mon.np0005548789 calling monitor election
Dec 06 10:06:16 np0005548789.localdomain ceph-mon[290022]: paxos.2).electionLogic(44) init, last seen epoch 44
Dec 06 10:06:16 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@2(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:06:16 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@2(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:06:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:06:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:06:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:06:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:06:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:06:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:06:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:06:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:06:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:06:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:06:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:06:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:06:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:06:16 np0005548789.localdomain podman[295807]: 2025-12-06 10:06:16.935343059 +0000 UTC m=+0.089308915 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:06:16 np0005548789.localdomain podman[295807]: 2025-12-06 10:06:16.969312683 +0000 UTC m=+0.123278529 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:06:16 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:06:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:06:17 np0005548789.localdomain podman[295825]: 2025-12-06 10:06:17.917077988 +0000 UTC m=+0.079856596 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:06:17 np0005548789.localdomain podman[295825]: 2025-12-06 10:06:17.925808056 +0000 UTC m=+0.088586704 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:06:17 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@2(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@2(peon) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: from='client.26906 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548786"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: Remove daemons mon.np0005548786
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: Safe to remove mon.np0005548786: new quorum should be ['np0005548787', 'np0005548790', 'np0005548789', 'np0005548788'] (from ['np0005548787', 'np0005548790', 'np0005548789', 'np0005548788'])
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: Removing monitor np0005548786 from monmap...
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: Removing daemon mon.np0005548786 from np0005548786.localdomain -- ports []
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789 calling monitor election
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: mon.np0005548790 calling monitor election
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: mon.np0005548788 calling monitor election
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 calling monitor election
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: mon.np0005548787 is new leader, mons np0005548787,np0005548790,np0005548789,np0005548788 in quorum (ranks 0,1,2,3)
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: monmap epoch 10
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: last_changed 2025-12-06T10:06:16.211793+0000
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: min_mon_release 18 (reef)
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: election_strategy: 1
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: 2: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: 3: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: osdmap e89: 6 total, 6 up, 6 in
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: mgrmap e23: np0005548787.umwsra(active, since 26s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:06:18 np0005548789.localdomain ceph-mon[290022]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:06:19 np0005548789.localdomain sudo[295849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:06:19 np0005548789.localdomain sudo[295849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548789.localdomain sudo[295849]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:19 np0005548789.localdomain sudo[295867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:06:19 np0005548789.localdomain sudo[295867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548789.localdomain sudo[295867]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:19 np0005548789.localdomain ceph-mon[290022]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:06:19 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:19 np0005548789.localdomain sudo[295885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:19 np0005548789.localdomain sudo[295885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548789.localdomain sudo[295885]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:19 np0005548789.localdomain sudo[295903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:19 np0005548789.localdomain sudo[295903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548789.localdomain sudo[295903]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:19 np0005548789.localdomain sudo[295921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:19 np0005548789.localdomain sudo[295921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:19.624 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:19 np0005548789.localdomain sudo[295921]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:19 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:19 np0005548789.localdomain sudo[295955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:19 np0005548789.localdomain sudo[295955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548789.localdomain sudo[295955]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:19 np0005548789.localdomain sudo[295973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:19 np0005548789.localdomain sudo[295973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548789.localdomain sudo[295973]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:19 np0005548789.localdomain sudo[295991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:06:19 np0005548789.localdomain sudo[295991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:19 np0005548789.localdomain sudo[295991]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548789.localdomain sudo[296009]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:06:20 np0005548789.localdomain sudo[296009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548789.localdomain sudo[296009]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548789.localdomain sudo[296027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:06:20 np0005548789.localdomain sudo[296027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548789.localdomain sudo[296027]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:20.098 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:20 np0005548789.localdomain sudo[296045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:20 np0005548789.localdomain sudo[296045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548789.localdomain sudo[296045]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548789.localdomain sudo[296063]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:20 np0005548789.localdomain sudo[296063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548789.localdomain sudo[296063]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548789.localdomain sudo[296081]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:20 np0005548789.localdomain sudo[296081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548789.localdomain sudo[296081]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548789.localdomain ceph-mon[290022]: from='client.44175 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548786.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:20 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:06:20 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:06:20 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:06:20 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:06:20 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:06:20 np0005548789.localdomain ceph-mon[290022]: Removed label mon from host np0005548786.localdomain
Dec 06 10:06:20 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:20 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:20 np0005548789.localdomain sudo[296115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:20 np0005548789.localdomain sudo[296115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548789.localdomain sudo[296115]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548789.localdomain sudo[296133]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:20 np0005548789.localdomain sudo[296133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548789.localdomain sudo[296133]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:20 np0005548789.localdomain sudo[296151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:20 np0005548789.localdomain sudo[296151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:20 np0005548789.localdomain sudo[296151]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.041340) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015581041435, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 1243, "num_deletes": 256, "total_data_size": 2170248, "memory_usage": 2212160, "flush_reason": "Manual Compaction"}
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015581059393, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 1158430, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14305, "largest_seqno": 15543, "table_properties": {"data_size": 1152840, "index_size": 2869, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 13681, "raw_average_key_size": 20, "raw_value_size": 1140793, "raw_average_value_size": 1731, "num_data_blocks": 120, "num_entries": 659, "num_filter_entries": 659, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015559, "oldest_key_time": 1765015559, "file_creation_time": 1765015581, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8b48a877-4508-4eb4-a052-67f753f228b0", "db_session_id": "ETDWGFPM6GCTACWNDM5G", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 18090 microseconds, and 4560 cpu microseconds.
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.059446) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 1158430 bytes OK
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.059469) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.061182) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.061203) EVENT_LOG_v1 {"time_micros": 1765015581061197, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.061226) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 2163775, prev total WAL file size 2163775, number of live WAL files 2.
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.061975) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303330' seq:72057594037927935, type:22 .. '6B760031323837' seq:0, type:0; will stop at (end)
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(1131KB)], [18(14MB)]
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015581062055, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 16555273, "oldest_snapshot_seqno": -1}
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 10182 keys, 15590708 bytes, temperature: kUnknown
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015581144671, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 15590708, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15531949, "index_size": 32226, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25477, "raw_key_size": 272518, "raw_average_key_size": 26, "raw_value_size": 15357091, "raw_average_value_size": 1508, "num_data_blocks": 1220, "num_entries": 10182, "num_filter_entries": 10182, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015444, "oldest_key_time": 0, "file_creation_time": 1765015581, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8b48a877-4508-4eb4-a052-67f753f228b0", "db_session_id": "ETDWGFPM6GCTACWNDM5G", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.145057) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 15590708 bytes
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.147232) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 200.0 rd, 188.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 14.7 +0.0 blob) out(14.9 +0.0 blob), read-write-amplify(27.7) write-amplify(13.5) OK, records in: 10716, records dropped: 534 output_compression: NoCompression
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.147264) EVENT_LOG_v1 {"time_micros": 1765015581147250, "job": 8, "event": "compaction_finished", "compaction_time_micros": 82759, "compaction_time_cpu_micros": 44371, "output_level": 6, "num_output_files": 1, "total_output_size": 15590708, "num_input_records": 10716, "num_output_records": 10182, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015581147548, "job": 8, "event": "table_file_deletion", "file_number": 20}
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015581149955, "job": 8, "event": "table_file_deletion", "file_number": 18}
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.061862) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.149990) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.149996) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.149998) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.150001) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:21.150004) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: from='client.34289 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548786.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: Removed label mgr from host np0005548786.localdomain
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:21 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:22 np0005548789.localdomain ceph-mon[290022]: Removing daemon mgr.np0005548786.mczynb from np0005548786.localdomain -- ports [8765]
Dec 06 10:06:22 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:06:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:06:22 np0005548789.localdomain systemd[1]: tmp-crun.Nv1yZ4.mount: Deactivated successfully.
Dec 06 10:06:22 np0005548789.localdomain podman[296170]: 2025-12-06 10:06:22.949849648 +0000 UTC m=+0.099534659 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:06:22 np0005548789.localdomain podman[296170]: 2025-12-06 10:06:22.963376403 +0000 UTC m=+0.113061394 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:06:22 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:06:23 np0005548789.localdomain podman[296169]: 2025-12-06 10:06:23.044723813 +0000 UTC m=+0.196159489 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7)
Dec 06 10:06:23 np0005548789.localdomain podman[296169]: 2025-12-06 10:06:23.061664884 +0000 UTC m=+0.213100530 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, version=9.6)
Dec 06 10:06:23 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:06:23 np0005548789.localdomain sudo[296206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:06:23 np0005548789.localdomain sudo[296206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:23 np0005548789.localdomain sudo[296206]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:23 np0005548789.localdomain ceph-mon[290022]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:23 np0005548789.localdomain ceph-mon[290022]: from='client.44183 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548786.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:23 np0005548789.localdomain ceph-mon[290022]: Removed label _admin from host np0005548786.localdomain
Dec 06 10:06:23 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth rm", "entity": "mgr.np0005548786.mczynb"} : dispatch
Dec 06 10:06:23 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005548786.mczynb"}]': finished
Dec 06 10:06:23 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:23 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:23 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:06:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:06:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:06:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:06:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:06:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:06:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19219 "" "Go-http-client/1.1"
Dec 06 10:06:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:24.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:24.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:24 np0005548789.localdomain ceph-mon[290022]: Removing key for mgr.np0005548786.mczynb
Dec 06 10:06:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:24.658 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:24 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:25.097 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:25.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:25.202 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:06:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:25.202 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:06:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:25.202 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:06:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:25.203 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:06:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:25.203 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:06:25 np0005548789.localdomain sudo[296224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:06:25 np0005548789.localdomain sudo[296224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:06:25 np0005548789.localdomain sudo[296224]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:25 np0005548789.localdomain podman[296243]: 2025-12-06 10:06:25.351983558 +0000 UTC m=+0.085739836 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:06:25 np0005548789.localdomain podman[296243]: 2025-12-06 10:06:25.363407013 +0000 UTC m=+0.097163271 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:06:25 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:06:25 np0005548789.localdomain ceph-mon[290022]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:06:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:06:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:25 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:25.692 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:06:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:25.786 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:06:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:25.786 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:06:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:25.989 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:06:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:25.991 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11546MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:06:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:25.991 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:06:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:25.992 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:06:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:26.106 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:06:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:26.107 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:06:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:26.107 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:06:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:26.159 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:06:26 np0005548789.localdomain ceph-mon[290022]: Removing np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:26 np0005548789.localdomain ceph-mon[290022]: Removing np0005548786.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:06:26 np0005548789.localdomain ceph-mon[290022]: Removing np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:06:26 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548786 (monmap changed)...
Dec 06 10:06:26 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548786 on np0005548786.localdomain
Dec 06 10:06:26 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.107:0/544071182' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:26 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:26 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@2(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:06:26 np0005548789.localdomain ceph-mon[290022]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1605363813' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:26.617 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:06:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:26.625 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:06:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:26.644 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:06:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:26.647 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:06:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:26.647 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:06:27 np0005548789.localdomain ceph-mon[290022]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:27 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:06:27 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:06:27 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:27 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.107:0/1605363813' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:27 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:27 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:27 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:27 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:27 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:27.643 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:27.644 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:27.644 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:06:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:27.645 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:06:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:27.947 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:06:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:27.948 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:06:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:27.948 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:06:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:27.948 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:06:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:28.292 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:06:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:28.315 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:06:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:28.315 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:06:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:28.316 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:28.317 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:28.317 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:28.317 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:06:28 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:06:28 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:06:28 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:28 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:28 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:06:28 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:28 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:28 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:06:28 np0005548789.localdomain ceph-mon[290022]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:29.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:29 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:29.689 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:29 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:06:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:29 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:06:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:06:29 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:29 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.108:0/308530152' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:30.099 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:30 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:06:30 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:06:30 np0005548789.localdomain ceph-mon[290022]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:30 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.108:0/3072124168' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:30 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:30 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:30 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:06:30 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:31 np0005548789.localdomain sshd[296303]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:06:32 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:06:32 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:06:32 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.106:0/406415283' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:06:32 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:32.177 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:06:32 np0005548789.localdomain podman[296305]: 2025-12-06 10:06:32.90160308 +0000 UTC m=+0.067962851 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:06:32 np0005548789.localdomain podman[296305]: 2025-12-06 10:06:32.910360944 +0000 UTC m=+0.076720725 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:06:32 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:06:33 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:06:33 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:06:33 np0005548789.localdomain ceph-mon[290022]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:33 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.106:0/3079514338' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:06:33 np0005548789.localdomain sshd[296303]: Received disconnect from 14.194.101.210 port 40170:11: Bye Bye [preauth]
Dec 06 10:06:33 np0005548789.localdomain sshd[296303]: Disconnected from authenticating user root 14.194.101.210 port 40170 [preauth]
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:34 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:34.735 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:35 np0005548789.localdomain sudo[296328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:35 np0005548789.localdomain sudo[296328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:35 np0005548789.localdomain sudo[296328]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:35.104 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:35 np0005548789.localdomain sudo[296346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:35 np0005548789.localdomain sudo[296346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:35 np0005548789.localdomain ceph-mon[290022]: from='client.34370 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005548786.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:35 np0005548789.localdomain ceph-mon[290022]: Added label _no_schedule to host np0005548786.localdomain
Dec 06 10:06:35 np0005548789.localdomain ceph-mon[290022]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005548786.localdomain
Dec 06 10:06:35 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:06:35 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:06:35 np0005548789.localdomain ceph-mon[290022]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:35 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:35 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:35 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:35 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:35 np0005548789.localdomain podman[296380]: 
Dec 06 10:06:35 np0005548789.localdomain podman[296380]: 2025-12-06 10:06:35.571489249 +0000 UTC m=+0.079783016 container create 7acbf198423f655343a8463600bc90307233376140e2f24c9876a660ff4d6c9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_cannon, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, RELEASE=main, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z)
Dec 06 10:06:35 np0005548789.localdomain systemd[1]: Started libpod-conmon-7acbf198423f655343a8463600bc90307233376140e2f24c9876a660ff4d6c9e.scope.
Dec 06 10:06:35 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:35 np0005548789.localdomain podman[296380]: 2025-12-06 10:06:35.632340703 +0000 UTC m=+0.140634500 container init 7acbf198423f655343a8463600bc90307233376140e2f24c9876a660ff4d6c9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_cannon, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, distribution-scope=public, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, release=1763362218, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_BRANCH=main, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7)
Dec 06 10:06:35 np0005548789.localdomain podman[296380]: 2025-12-06 10:06:35.541028291 +0000 UTC m=+0.049322118 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:35 np0005548789.localdomain systemd[1]: tmp-crun.kfwi4w.mount: Deactivated successfully.
Dec 06 10:06:35 np0005548789.localdomain podman[296380]: 2025-12-06 10:06:35.644515711 +0000 UTC m=+0.152809478 container start 7acbf198423f655343a8463600bc90307233376140e2f24c9876a660ff4d6c9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_cannon, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, name=rhceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:06:35 np0005548789.localdomain podman[296380]: 2025-12-06 10:06:35.644741537 +0000 UTC m=+0.153035394 container attach 7acbf198423f655343a8463600bc90307233376140e2f24c9876a660ff4d6c9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_cannon, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, release=1763362218, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, architecture=x86_64, version=7, io.openshift.expose-services=)
Dec 06 10:06:35 np0005548789.localdomain beautiful_cannon[296395]: 167 167
Dec 06 10:06:35 np0005548789.localdomain systemd[1]: libpod-7acbf198423f655343a8463600bc90307233376140e2f24c9876a660ff4d6c9e.scope: Deactivated successfully.
Dec 06 10:06:35 np0005548789.localdomain podman[296380]: 2025-12-06 10:06:35.64815002 +0000 UTC m=+0.156443847 container died 7acbf198423f655343a8463600bc90307233376140e2f24c9876a660ff4d6c9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_cannon, io.openshift.tags=rhceph ceph, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, release=1763362218, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, io.buildah.version=1.41.4, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Dec 06 10:06:35 np0005548789.localdomain podman[296400]: 2025-12-06 10:06:35.750047891 +0000 UTC m=+0.092302763 container remove 7acbf198423f655343a8463600bc90307233376140e2f24c9876a660ff4d6c9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_cannon, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, description=Red Hat Ceph Storage 7)
Dec 06 10:06:35 np0005548789.localdomain systemd[1]: libpod-conmon-7acbf198423f655343a8463600bc90307233376140e2f24c9876a660ff4d6c9e.scope: Deactivated successfully.
Dec 06 10:06:35 np0005548789.localdomain sudo[296346]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:35 np0005548789.localdomain sudo[296416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:35 np0005548789.localdomain sudo[296416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:35 np0005548789.localdomain sudo[296416]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:36 np0005548789.localdomain sudo[296434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:36 np0005548789.localdomain sudo[296434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:36 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:06:36 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:06:36 np0005548789.localdomain ceph-mon[290022]: from='client.44219 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005548786.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:06:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:06:36 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:36 np0005548789.localdomain podman[296470]: 
Dec 06 10:06:36 np0005548789.localdomain podman[296470]: 2025-12-06 10:06:36.48350542 +0000 UTC m=+0.076506997 container create 9c2ed5345c63f305eb307d7337e26b651042829d16ea7dd849ab319eae129ee8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hertz, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, architecture=x86_64, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vcs-type=git, release=1763362218, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, name=rhceph, ceph=True, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:06:36 np0005548789.localdomain systemd[1]: Started libpod-conmon-9c2ed5345c63f305eb307d7337e26b651042829d16ea7dd849ab319eae129ee8.scope.
Dec 06 10:06:36 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:36 np0005548789.localdomain podman[296470]: 2025-12-06 10:06:36.456139745 +0000 UTC m=+0.049141352 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:36 np0005548789.localdomain podman[296470]: 2025-12-06 10:06:36.557669166 +0000 UTC m=+0.150670743 container init 9c2ed5345c63f305eb307d7337e26b651042829d16ea7dd849ab319eae129ee8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hertz, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, RELEASE=main, architecture=x86_64, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:06:36 np0005548789.localdomain podman[296470]: 2025-12-06 10:06:36.568945716 +0000 UTC m=+0.161947263 container start 9c2ed5345c63f305eb307d7337e26b651042829d16ea7dd849ab319eae129ee8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hertz, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_BRANCH=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, release=1763362218, com.redhat.component=rhceph-container)
Dec 06 10:06:36 np0005548789.localdomain podman[296470]: 2025-12-06 10:06:36.569294327 +0000 UTC m=+0.162295904 container attach 9c2ed5345c63f305eb307d7337e26b651042829d16ea7dd849ab319eae129ee8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hertz, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, architecture=x86_64, com.redhat.component=rhceph-container, name=rhceph, ceph=True, version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4)
Dec 06 10:06:36 np0005548789.localdomain laughing_hertz[296485]: 167 167
Dec 06 10:06:36 np0005548789.localdomain systemd[1]: libpod-9c2ed5345c63f305eb307d7337e26b651042829d16ea7dd849ab319eae129ee8.scope: Deactivated successfully.
Dec 06 10:06:36 np0005548789.localdomain podman[296470]: 2025-12-06 10:06:36.57273028 +0000 UTC m=+0.165732027 container died 9c2ed5345c63f305eb307d7337e26b651042829d16ea7dd849ab319eae129ee8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hertz, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, name=rhceph, architecture=x86_64, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, release=1763362218, vendor=Red Hat, Inc., ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:06:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-252214fefd9f3b08647c332f2c01ba82bae23c6bd44d8548be826b87712bb741-merged.mount: Deactivated successfully.
Dec 06 10:06:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3f50b70123f50032802f0b24776f76228a9f8438e9eaf8f09694e1551019184e-merged.mount: Deactivated successfully.
Dec 06 10:06:36 np0005548789.localdomain podman[296490]: 2025-12-06 10:06:36.662732133 +0000 UTC m=+0.082677253 container remove 9c2ed5345c63f305eb307d7337e26b651042829d16ea7dd849ab319eae129ee8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hertz, vendor=Red Hat, Inc., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, RELEASE=main, ceph=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, name=rhceph, CEPH_POINT_RELEASE=, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, io.buildah.version=1.41.4)
Dec 06 10:06:36 np0005548789.localdomain systemd[1]: libpod-conmon-9c2ed5345c63f305eb307d7337e26b651042829d16ea7dd849ab319eae129ee8.scope: Deactivated successfully.
Dec 06 10:06:36 np0005548789.localdomain sudo[296434]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:36 np0005548789.localdomain sudo[296513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:36 np0005548789.localdomain sudo[296513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:36 np0005548789.localdomain sudo[296513]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:37 np0005548789.localdomain sudo[296531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:37 np0005548789.localdomain sudo[296531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:06:37 np0005548789.localdomain podman[296549]: 2025-12-06 10:06:37.157658282 +0000 UTC m=+0.090051866 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 06 10:06:37 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:06:37 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:06:37 np0005548789.localdomain ceph-mon[290022]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:37 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:37 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain"} : dispatch
Dec 06 10:06:37 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain"}]': finished
Dec 06 10:06:37 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:37 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:37 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:06:37 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:37 np0005548789.localdomain podman[296549]: 2025-12-06 10:06:37.260286995 +0000 UTC m=+0.192680589 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:06:37 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:06:37 np0005548789.localdomain podman[296590]: 
Dec 06 10:06:37 np0005548789.localdomain podman[296590]: 2025-12-06 10:06:37.505005082 +0000 UTC m=+0.055545266 container create 75c31ed3aeea2d333dd05ffdc15716aead2ea96a7458de6b34153b81816aceae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_aryabhata, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, release=1763362218, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, ceph=True, RELEASE=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, architecture=x86_64, version=7, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:06:37 np0005548789.localdomain systemd[1]: Started libpod-conmon-75c31ed3aeea2d333dd05ffdc15716aead2ea96a7458de6b34153b81816aceae.scope.
Dec 06 10:06:37 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:37 np0005548789.localdomain podman[296590]: 2025-12-06 10:06:37.572392363 +0000 UTC m=+0.122932507 container init 75c31ed3aeea2d333dd05ffdc15716aead2ea96a7458de6b34153b81816aceae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_aryabhata, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_CLEAN=True, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:06:37 np0005548789.localdomain podman[296590]: 2025-12-06 10:06:37.476534634 +0000 UTC m=+0.027074778 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:37 np0005548789.localdomain podman[296590]: 2025-12-06 10:06:37.583760175 +0000 UTC m=+0.134300279 container start 75c31ed3aeea2d333dd05ffdc15716aead2ea96a7458de6b34153b81816aceae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_aryabhata, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-type=git, version=7, GIT_BRANCH=main, RELEASE=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, CEPH_POINT_RELEASE=, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public)
Dec 06 10:06:37 np0005548789.localdomain podman[296590]: 2025-12-06 10:06:37.58392323 +0000 UTC m=+0.134463414 container attach 75c31ed3aeea2d333dd05ffdc15716aead2ea96a7458de6b34153b81816aceae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_aryabhata, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.expose-services=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, RELEASE=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.openshift.tags=rhceph ceph)
Dec 06 10:06:37 np0005548789.localdomain loving_aryabhata[296605]: 167 167
Dec 06 10:06:37 np0005548789.localdomain systemd[1]: libpod-75c31ed3aeea2d333dd05ffdc15716aead2ea96a7458de6b34153b81816aceae.scope: Deactivated successfully.
Dec 06 10:06:37 np0005548789.localdomain podman[296590]: 2025-12-06 10:06:37.587071065 +0000 UTC m=+0.137611279 container died 75c31ed3aeea2d333dd05ffdc15716aead2ea96a7458de6b34153b81816aceae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_aryabhata, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, ceph=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Dec 06 10:06:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-bc37b8670e214a7c485435781f98089c5d147ee00813179213a5c932f625f18c-merged.mount: Deactivated successfully.
Dec 06 10:06:37 np0005548789.localdomain podman[296610]: 2025-12-06 10:06:37.687246395 +0000 UTC m=+0.083362524 container remove 75c31ed3aeea2d333dd05ffdc15716aead2ea96a7458de6b34153b81816aceae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_aryabhata, io.buildah.version=1.41.4, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=1763362218, architecture=x86_64, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:06:37 np0005548789.localdomain systemd[1]: libpod-conmon-75c31ed3aeea2d333dd05ffdc15716aead2ea96a7458de6b34153b81816aceae.scope: Deactivated successfully.
Dec 06 10:06:37 np0005548789.localdomain sudo[296531]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:37 np0005548789.localdomain sudo[296633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:37 np0005548789.localdomain sudo[296633]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:37 np0005548789.localdomain sudo[296633]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:38 np0005548789.localdomain sudo[296651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:38 np0005548789.localdomain sudo[296651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:38 np0005548789.localdomain ceph-mon[290022]: from='client.26868 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005548786.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:38 np0005548789.localdomain ceph-mon[290022]: Removed host np0005548786.localdomain
Dec 06 10:06:38 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:06:38 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:06:38 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:38 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:38 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:06:38 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:38 np0005548789.localdomain podman[296685]: 
Dec 06 10:06:38 np0005548789.localdomain podman[296685]: 2025-12-06 10:06:38.469993389 +0000 UTC m=+0.081085525 container create 0859935a88f3f5b52c014ec0400d08fd34e06d89cbcce056ce2b8d384b8a8f9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_antonelli, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:06:38 np0005548789.localdomain systemd[1]: Started libpod-conmon-0859935a88f3f5b52c014ec0400d08fd34e06d89cbcce056ce2b8d384b8a8f9c.scope.
Dec 06 10:06:38 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:38 np0005548789.localdomain podman[296685]: 2025-12-06 10:06:38.526512694 +0000 UTC m=+0.137604840 container init 0859935a88f3f5b52c014ec0400d08fd34e06d89cbcce056ce2b8d384b8a8f9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_antonelli, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, version=7, build-date=2025-11-26T19:44:28Z, distribution-scope=public, name=rhceph, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:06:38 np0005548789.localdomain podman[296685]: 2025-12-06 10:06:38.536181395 +0000 UTC m=+0.147273541 container start 0859935a88f3f5b52c014ec0400d08fd34e06d89cbcce056ce2b8d384b8a8f9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_antonelli, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_BRANCH=main, ceph=True, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:06:38 np0005548789.localdomain podman[296685]: 2025-12-06 10:06:38.536458343 +0000 UTC m=+0.147550499 container attach 0859935a88f3f5b52c014ec0400d08fd34e06d89cbcce056ce2b8d384b8a8f9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_antonelli, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:06:38 np0005548789.localdomain sleepy_antonelli[296699]: 167 167
Dec 06 10:06:38 np0005548789.localdomain systemd[1]: libpod-0859935a88f3f5b52c014ec0400d08fd34e06d89cbcce056ce2b8d384b8a8f9c.scope: Deactivated successfully.
Dec 06 10:06:38 np0005548789.localdomain podman[296685]: 2025-12-06 10:06:38.439637075 +0000 UTC m=+0.050729261 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:38 np0005548789.localdomain podman[296685]: 2025-12-06 10:06:38.54033897 +0000 UTC m=+0.151431156 container died 0859935a88f3f5b52c014ec0400d08fd34e06d89cbcce056ce2b8d384b8a8f9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_antonelli, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, distribution-scope=public, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:06:38 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-9ee284d2c4681523b33fd0049a275ef040599e2e7234f36d380e2e79a5c11755-merged.mount: Deactivated successfully.
Dec 06 10:06:38 np0005548789.localdomain podman[296705]: 2025-12-06 10:06:38.651530942 +0000 UTC m=+0.098031186 container remove 0859935a88f3f5b52c014ec0400d08fd34e06d89cbcce056ce2b8d384b8a8f9c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_antonelli, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, release=1763362218, architecture=x86_64, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:06:38 np0005548789.localdomain systemd[1]: libpod-conmon-0859935a88f3f5b52c014ec0400d08fd34e06d89cbcce056ce2b8d384b8a8f9c.scope: Deactivated successfully.
Dec 06 10:06:38 np0005548789.localdomain sshd[296719]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:06:38 np0005548789.localdomain sudo[296651]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:38 np0005548789.localdomain sshd[296719]: Accepted publickey for tripleo-admin from 192.168.122.11 port 38310 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:06:38 np0005548789.localdomain systemd-logind[766]: New session 67 of user tripleo-admin.
Dec 06 10:06:38 np0005548789.localdomain systemd[1]: Created slice User Slice of UID 1003.
Dec 06 10:06:38 np0005548789.localdomain sudo[296723]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:38 np0005548789.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Dec 06 10:06:38 np0005548789.localdomain sudo[296723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:38 np0005548789.localdomain sudo[296723]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:38 np0005548789.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Dec 06 10:06:38 np0005548789.localdomain systemd[1]: Starting User Manager for UID 1003...
Dec 06 10:06:38 np0005548789.localdomain systemd[296743]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 06 10:06:38 np0005548789.localdomain sudo[296744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:38 np0005548789.localdomain sudo[296744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:39 np0005548789.localdomain systemd[296743]: Queued start job for default target Main User Target.
Dec 06 10:06:39 np0005548789.localdomain systemd[296743]: Created slice User Application Slice.
Dec 06 10:06:39 np0005548789.localdomain systemd[296743]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:06:39 np0005548789.localdomain systemd[296743]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 10:06:39 np0005548789.localdomain systemd[296743]: Reached target Paths.
Dec 06 10:06:39 np0005548789.localdomain systemd[296743]: Reached target Timers.
Dec 06 10:06:39 np0005548789.localdomain systemd[296743]: Starting D-Bus User Message Bus Socket...
Dec 06 10:06:39 np0005548789.localdomain systemd[296743]: Starting Create User's Volatile Files and Directories...
Dec 06 10:06:39 np0005548789.localdomain systemd[296743]: Listening on D-Bus User Message Bus Socket.
Dec 06 10:06:39 np0005548789.localdomain systemd[296743]: Reached target Sockets.
Dec 06 10:06:39 np0005548789.localdomain systemd[296743]: Finished Create User's Volatile Files and Directories.
Dec 06 10:06:39 np0005548789.localdomain systemd[296743]: Reached target Basic System.
Dec 06 10:06:39 np0005548789.localdomain systemd[1]: Started User Manager for UID 1003.
Dec 06 10:06:39 np0005548789.localdomain systemd[296743]: Reached target Main User Target.
Dec 06 10:06:39 np0005548789.localdomain systemd[296743]: Startup finished in 164ms.
Dec 06 10:06:39 np0005548789.localdomain systemd[1]: Started Session 67 of User tripleo-admin.
Dec 06 10:06:39 np0005548789.localdomain sshd[296719]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Dec 06 10:06:39 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:06:39 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:06:39 np0005548789.localdomain ceph-mon[290022]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:39 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:39 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:39 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:39 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:39 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:39 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.32:0/1005735917' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:06:39 np0005548789.localdomain ceph-mon[290022]: from='client.? 172.18.0.32:0/1005735917' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:06:39 np0005548789.localdomain podman[296868]: 
Dec 06 10:06:39 np0005548789.localdomain podman[296868]: 2025-12-06 10:06:39.564277205 +0000 UTC m=+0.086235180 container create d196ebc78888e7951ba278087dd8889e53ed189cfebf386fb454bde91eb13be0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_cori, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, distribution-scope=public, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, description=Red Hat Ceph Storage 7, RELEASE=main, version=7)
Dec 06 10:06:39 np0005548789.localdomain systemd[1]: Started libpod-conmon-d196ebc78888e7951ba278087dd8889e53ed189cfebf386fb454bde91eb13be0.scope.
Dec 06 10:06:39 np0005548789.localdomain podman[296868]: 2025-12-06 10:06:39.528309191 +0000 UTC m=+0.050267186 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:39 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:39 np0005548789.localdomain podman[296868]: 2025-12-06 10:06:39.650178455 +0000 UTC m=+0.172136440 container init d196ebc78888e7951ba278087dd8889e53ed189cfebf386fb454bde91eb13be0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_cori, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.tags=rhceph ceph, name=rhceph, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:06:39 np0005548789.localdomain podman[296868]: 2025-12-06 10:06:39.663742263 +0000 UTC m=+0.185700248 container start d196ebc78888e7951ba278087dd8889e53ed189cfebf386fb454bde91eb13be0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_cori, GIT_BRANCH=main, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, name=rhceph, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public)
Dec 06 10:06:39 np0005548789.localdomain podman[296868]: 2025-12-06 10:06:39.664115755 +0000 UTC m=+0.186073740 container attach d196ebc78888e7951ba278087dd8889e53ed189cfebf386fb454bde91eb13be0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_cori, release=1763362218, CEPH_POINT_RELEASE=, GIT_CLEAN=True, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, RELEASE=main, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:06:39 np0005548789.localdomain amazing_cori[296920]: 167 167
Dec 06 10:06:39 np0005548789.localdomain systemd[1]: libpod-d196ebc78888e7951ba278087dd8889e53ed189cfebf386fb454bde91eb13be0.scope: Deactivated successfully.
Dec 06 10:06:39 np0005548789.localdomain podman[296868]: 2025-12-06 10:06:39.669259369 +0000 UTC m=+0.191217374 container died d196ebc78888e7951ba278087dd8889e53ed189cfebf386fb454bde91eb13be0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_cori, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, architecture=x86_64, io.buildah.version=1.41.4, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph)
Dec 06 10:06:39 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:39 np0005548789.localdomain sudo[296938]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqgetsyhjsxaipzwyxjqswcieuauldjq ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765015599.2226965-61181-101629605349701/AnsiballZ_lineinfile.py
Dec 06 10:06:39 np0005548789.localdomain sudo[296938]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 10:06:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:39.771 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:39 np0005548789.localdomain podman[296940]: 2025-12-06 10:06:39.804342052 +0000 UTC m=+0.121833104 container remove d196ebc78888e7951ba278087dd8889e53ed189cfebf386fb454bde91eb13be0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_cori, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=1763362218, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph)
Dec 06 10:06:39 np0005548789.localdomain systemd[1]: libpod-conmon-d196ebc78888e7951ba278087dd8889e53ed189cfebf386fb454bde91eb13be0.scope: Deactivated successfully.
Dec 06 10:06:39 np0005548789.localdomain sudo[296744]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:39 np0005548789.localdomain python3[296949]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line=    - ip_netmask: 172.18.0.104/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 10:06:39 np0005548789.localdomain sudo[296938]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:40 np0005548789.localdomain sudo[296960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:40 np0005548789.localdomain sudo[296960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:40 np0005548789.localdomain sudo[296960]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:40 np0005548789.localdomain sudo[296995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:40 np0005548789.localdomain sudo[296995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:40.109 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:40 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:06:40 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:06:40 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:40 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:40 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:06:40 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:06:40 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:40 np0005548789.localdomain podman[297124]: 
Dec 06 10:06:40 np0005548789.localdomain podman[297124]: 2025-12-06 10:06:40.584065075 +0000 UTC m=+0.077146527 container create 63c9e658dd6d40cb9cf151b4bf28dd978ea740f5e8ebe5c01b412e86c1f74540 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_hodgkin, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, version=7, release=1763362218, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:06:40 np0005548789.localdomain sudo[297167]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yivpirrkillhzabtyqletpsvseecnusd ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765015600.2009828-61197-80587068021358/AnsiballZ_command.py
Dec 06 10:06:40 np0005548789.localdomain systemd[1]: Started libpod-conmon-63c9e658dd6d40cb9cf151b4bf28dd978ea740f5e8ebe5c01b412e86c1f74540.scope.
Dec 06 10:06:40 np0005548789.localdomain systemd[1]: tmp-crun.6bMIOE.mount: Deactivated successfully.
Dec 06 10:06:40 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-07655610074ce032d6afad17c811578a09539b32e1d7c3f1db53f383da4f24d5-merged.mount: Deactivated successfully.
Dec 06 10:06:40 np0005548789.localdomain sudo[297167]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 10:06:40 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:06:40 np0005548789.localdomain podman[297124]: 2025-12-06 10:06:40.546687449 +0000 UTC m=+0.039768891 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:06:40 np0005548789.localdomain podman[297124]: 2025-12-06 10:06:40.657074186 +0000 UTC m=+0.150155638 container init 63c9e658dd6d40cb9cf151b4bf28dd978ea740f5e8ebe5c01b412e86c1f74540 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_hodgkin, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, architecture=x86_64, com.redhat.component=rhceph-container, version=7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:06:40 np0005548789.localdomain systemd[1]: tmp-crun.0lWVCy.mount: Deactivated successfully.
Dec 06 10:06:40 np0005548789.localdomain podman[297124]: 2025-12-06 10:06:40.669018646 +0000 UTC m=+0.162100068 container start 63c9e658dd6d40cb9cf151b4bf28dd978ea740f5e8ebe5c01b412e86c1f74540 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_hodgkin, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc.)
Dec 06 10:06:40 np0005548789.localdomain podman[297124]: 2025-12-06 10:06:40.669833241 +0000 UTC m=+0.162914693 container attach 63c9e658dd6d40cb9cf151b4bf28dd978ea740f5e8ebe5c01b412e86c1f74540 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_hodgkin, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, architecture=x86_64, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public)
Dec 06 10:06:40 np0005548789.localdomain clever_hodgkin[297172]: 167 167
Dec 06 10:06:40 np0005548789.localdomain systemd[1]: libpod-63c9e658dd6d40cb9cf151b4bf28dd978ea740f5e8ebe5c01b412e86c1f74540.scope: Deactivated successfully.
Dec 06 10:06:40 np0005548789.localdomain podman[297124]: 2025-12-06 10:06:40.675675967 +0000 UTC m=+0.168757449 container died 63c9e658dd6d40cb9cf151b4bf28dd978ea740f5e8ebe5c01b412e86c1f74540 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_hodgkin, ceph=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, release=1763362218, vcs-type=git)
Dec 06 10:06:40 np0005548789.localdomain python3[297175]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.104/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 10:06:40 np0005548789.localdomain sudo[297167]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:40 np0005548789.localdomain podman[297178]: 2025-12-06 10:06:40.784437846 +0000 UTC m=+0.094906002 container remove 63c9e658dd6d40cb9cf151b4bf28dd978ea740f5e8ebe5c01b412e86c1f74540 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_hodgkin, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, ceph=True, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, build-date=2025-11-26T19:44:28Z, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:06:40 np0005548789.localdomain systemd[1]: libpod-conmon-63c9e658dd6d40cb9cf151b4bf28dd978ea740f5e8ebe5c01b412e86c1f74540.scope: Deactivated successfully.
Dec 06 10:06:40 np0005548789.localdomain sudo[296995]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.048354) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015601048426, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1047, "num_deletes": 258, "total_data_size": 1528014, "memory_usage": 1554672, "flush_reason": "Manual Compaction"}
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015601057877, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 889561, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15549, "largest_seqno": 16590, "table_properties": {"data_size": 884640, "index_size": 2328, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12862, "raw_average_key_size": 21, "raw_value_size": 873892, "raw_average_value_size": 1442, "num_data_blocks": 100, "num_entries": 606, "num_filter_entries": 606, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015581, "oldest_key_time": 1765015581, "file_creation_time": 1765015601, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8b48a877-4508-4eb4-a052-67f753f228b0", "db_session_id": "ETDWGFPM6GCTACWNDM5G", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 9563 microseconds, and 3484 cpu microseconds.
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.057926) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 889561 bytes OK
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.057947) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.059741) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.059782) EVENT_LOG_v1 {"time_micros": 1765015601059776, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.059809) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 1522395, prev total WAL file size 1522719, number of live WAL files 2.
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.060441) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353135' seq:72057594037927935, type:22 .. '6C6F676D0033373638' seq:0, type:0; will stop at (end)
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(868KB)], [21(14MB)]
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015601060501, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 16480269, "oldest_snapshot_seqno": -1}
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10244 keys, 16342357 bytes, temperature: kUnknown
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015601165159, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 16342357, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16281772, "index_size": 33860, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25669, "raw_key_size": 275366, "raw_average_key_size": 26, "raw_value_size": 16104502, "raw_average_value_size": 1572, "num_data_blocks": 1290, "num_entries": 10244, "num_filter_entries": 10244, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015444, "oldest_key_time": 0, "file_creation_time": 1765015601, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8b48a877-4508-4eb4-a052-67f753f228b0", "db_session_id": "ETDWGFPM6GCTACWNDM5G", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.165536) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 16342357 bytes
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.167908) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 157.3 rd, 156.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 14.9 +0.0 blob) out(15.6 +0.0 blob), read-write-amplify(36.9) write-amplify(18.4) OK, records in: 10788, records dropped: 544 output_compression: NoCompression
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.167943) EVENT_LOG_v1 {"time_micros": 1765015601167928, "job": 10, "event": "compaction_finished", "compaction_time_micros": 104778, "compaction_time_cpu_micros": 46973, "output_level": 6, "num_output_files": 1, "total_output_size": 16342357, "num_input_records": 10788, "num_output_records": 10244, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015601168204, "job": 10, "event": "table_file_deletion", "file_number": 23}
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015601170519, "job": 10, "event": "table_file_deletion", "file_number": 21}
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.060371) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.170618) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.170626) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.170630) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.170634) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: rocksdb: (Original Log Time 2025/12/06-10:06:41.170638) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:06:41 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:41 np0005548789.localdomain sudo[297334]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlrfstwswdojzeombkifvntooeqijwnj ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765015600.9399917-61208-114998855006748/AnsiballZ_command.py
Dec 06 10:06:41 np0005548789.localdomain sudo[297334]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Dec 06 10:06:41 np0005548789.localdomain python3[297336]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.104 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 10:06:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c76e2a4d7f790ac67491a32c3668c14c81f35f2fe10b54013ab303d6cc47425d-merged.mount: Deactivated successfully.
Dec 06 10:06:42 np0005548789.localdomain ceph-mon[290022]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:06:42 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:06:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:06:42 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:43 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:06:43 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:06:43 np0005548789.localdomain ceph-mon[290022]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:06:43 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:43 np0005548789.localdomain sudo[297334]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:44 np0005548789.localdomain ceph-mon[290022]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:06:44 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:06:44 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:44 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:44 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:06:44 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:44 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:06:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:44.805 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:45.112 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:45 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:06:45 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:06:45 np0005548789.localdomain ceph-mon[290022]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:45 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:45 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:45 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:45 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:06:45 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:06:45 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:46 np0005548789.localdomain sudo[297355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:06:46 np0005548789.localdomain sudo[297355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:46 np0005548789.localdomain sudo[297355]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:46 np0005548789.localdomain ceph-mon[290022]: from='client.44243 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:06:46 np0005548789.localdomain ceph-mon[290022]: Saving service mon spec with placement label:mon
Dec 06 10:06:46 np0005548789.localdomain ceph-mon[290022]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:06:46 np0005548789.localdomain ceph-mon[290022]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:06:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:06:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:06:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:46 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:06:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:06:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:06:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:06:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:06:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:06:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:06:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:06:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:06:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:06:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:06:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:06:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:06:47 np0005548789.localdomain ceph-mon[290022]: from='client.44251 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548789", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:06:47 np0005548789.localdomain ceph-mon[290022]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:06:47 np0005548789.localdomain ceph-mon[290022]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:06:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:06:47.296 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:06:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:06:47.296 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:06:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:06:47.298 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:06:47 np0005548789.localdomain ceph-mgr[288591]: ms_deliver_dispatch: unhandled message 0x56140ed13080 mon_map magic: 0 from mon.2 v2:172.18.0.107:3300/0
Dec 06 10:06:47 np0005548789.localdomain ceph-mon[290022]: mon.np0005548789@2(peon) e11  removed from monmap, suicide.
Dec 06 10:06:47 np0005548789.localdomain sudo[297373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:47 np0005548789.localdomain sudo[297373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:06:47 np0005548789.localdomain sudo[297373]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:47 np0005548789.localdomain podman[297384]: 2025-12-06 10:06:47.633404436 +0000 UTC m=+0.068828176 container died 8db79eb988f6401c6530def208a6cc95f5f6889aff146370f866953a0dd24fb0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548789, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=1763362218, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, version=7)
Dec 06 10:06:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-58b056c2dffd8d19854bf632a4b426ef2949193a3440de3f8d9216685a5e7198-merged.mount: Deactivated successfully.
Dec 06 10:06:47 np0005548789.localdomain podman[297384]: 2025-12-06 10:06:47.678547986 +0000 UTC m=+0.113971656 container remove 8db79eb988f6401c6530def208a6cc95f5f6889aff146370f866953a0dd24fb0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548789, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, name=rhceph, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, ceph=True, RELEASE=main, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:06:47 np0005548789.localdomain sudo[297404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 rm-daemon --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 --name mon.np0005548789 --force
Dec 06 10:06:47 np0005548789.localdomain sudo[297404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:47 np0005548789.localdomain systemd[1]: tmp-crun.DbiYf8.mount: Deactivated successfully.
Dec 06 10:06:47 np0005548789.localdomain podman[297400]: 2025-12-06 10:06:47.728214454 +0000 UTC m=+0.101144150 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:06:47 np0005548789.localdomain podman[297400]: 2025-12-06 10:06:47.738095572 +0000 UTC m=+0.111025288 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 06 10:06:47 np0005548789.localdomain ceph-mgr[288591]: --2- 172.18.0.107:0/2196335751 >> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] conn(0x56140edb3000 0x56140ecb9600 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Dec 06 10:06:47 np0005548789.localdomain ceph-mgr[288591]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 06 10:06:47 np0005548789.localdomain ceph-mgr[288591]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 06 10:06:47 np0005548789.localdomain ceph-mgr[288591]: ms_deliver_dispatch: unhandled message 0x56140ed12f20 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Dec 06 10:06:47 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:06:48 np0005548789.localdomain podman[297490]: 2025-12-06 10:06:48.196691565 +0000 UTC m=+0.089327723 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:06:48 np0005548789.localdomain podman[297490]: 2025-12-06 10:06:48.203843441 +0000 UTC m=+0.096479609 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8@mon.np0005548789.service: Deactivated successfully.
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: Stopped Ceph mon.np0005548789 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8@mon.np0005548789.service: Consumed 7.232s CPU time.
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 10:06:48 np0005548789.localdomain systemd-rc-local-generator[297585]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:06:48 np0005548789.localdomain systemd-sysv-generator[297588]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:06:48 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:06:48 np0005548789.localdomain sudo[297404]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:49 np0005548789.localdomain sudo[297594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:06:49 np0005548789.localdomain sudo[297594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:49 np0005548789.localdomain sudo[297594]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:49 np0005548789.localdomain sudo[297612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:06:49 np0005548789.localdomain sudo[297612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:49 np0005548789.localdomain sshd[297666]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:06:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:49.833 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:49 np0005548789.localdomain podman[297700]: 2025-12-06 10:06:49.930514169 +0000 UTC m=+0.144031372 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, vcs-type=git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, release=1763362218, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:06:50 np0005548789.localdomain podman[297700]: 2025-12-06 10:06:50.034241095 +0000 UTC m=+0.247758278 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, architecture=x86_64, release=1763362218, com.redhat.component=rhceph-container, version=7, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:06:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:50.117 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:50 np0005548789.localdomain sudo[297612]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:50 np0005548789.localdomain sshd[297666]: Received disconnect from 118.219.234.233 port 48368:11: Bye Bye [preauth]
Dec 06 10:06:50 np0005548789.localdomain sshd[297666]: Disconnected from authenticating user root 118.219.234.233 port 48368 [preauth]
Dec 06 10:06:52 np0005548789.localdomain sudo[297802]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:06:52 np0005548789.localdomain sudo[297802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:52 np0005548789.localdomain sudo[297802]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:52 np0005548789.localdomain sudo[297820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:06:52 np0005548789.localdomain sudo[297820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:52 np0005548789.localdomain sudo[297820]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:52 np0005548789.localdomain sudo[297838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:52 np0005548789.localdomain sudo[297838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:52 np0005548789.localdomain sudo[297838]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:52 np0005548789.localdomain sudo[297856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:53 np0005548789.localdomain sudo[297856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:06:53 np0005548789.localdomain sudo[297856]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548789.localdomain sudo[297880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:53 np0005548789.localdomain sudo[297880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:06:53 np0005548789.localdomain sudo[297880]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548789.localdomain podman[297874]: 2025-12-06 10:06:53.126917569 +0000 UTC m=+0.099379616 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:06:53 np0005548789.localdomain podman[297874]: 2025-12-06 10:06:53.164286596 +0000 UTC m=+0.136748643 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:06:53 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:06:53 np0005548789.localdomain podman[297907]: 2025-12-06 10:06:53.21950869 +0000 UTC m=+0.095169710 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41)
Dec 06 10:06:53 np0005548789.localdomain podman[297907]: 2025-12-06 10:06:53.235215444 +0000 UTC m=+0.110876454 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Dec 06 10:06:53 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:06:53 np0005548789.localdomain sudo[297940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:53 np0005548789.localdomain sudo[297940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548789.localdomain sudo[297940]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548789.localdomain sudo[297966]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:06:53 np0005548789.localdomain sudo[297966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548789.localdomain sudo[297966]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548789.localdomain sudo[297984]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:06:53 np0005548789.localdomain sudo[297984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548789.localdomain sudo[297984]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548789.localdomain sudo[298002]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:06:53 np0005548789.localdomain sudo[298002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548789.localdomain sudo[298002]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548789.localdomain sudo[298020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:06:53 np0005548789.localdomain sudo[298020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548789.localdomain sudo[298020]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548789.localdomain sudo[298039]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:53 np0005548789.localdomain sudo[298039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548789.localdomain sudo[298039]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548789.localdomain sudo[298057]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:06:53 np0005548789.localdomain sudo[298057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548789.localdomain sudo[298057]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548789.localdomain sudo[298075]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:53 np0005548789.localdomain sudo[298075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548789.localdomain sudo[298075]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:06:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:06:53 np0005548789.localdomain sudo[298109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:53 np0005548789.localdomain sudo[298109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:53 np0005548789.localdomain sudo[298109]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:06:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153839 "" "Go-http-client/1.1"
Dec 06 10:06:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:06:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18728 "" "Go-http-client/1.1"
Dec 06 10:06:54 np0005548789.localdomain sudo[298127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:06:54 np0005548789.localdomain sudo[298127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:54 np0005548789.localdomain sudo[298127]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:54 np0005548789.localdomain sudo[298145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:06:54 np0005548789.localdomain sudo[298145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:54 np0005548789.localdomain sudo[298145]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:54 np0005548789.localdomain sudo[298163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:06:54 np0005548789.localdomain sudo[298163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:06:54 np0005548789.localdomain sudo[298163]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:54.864 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:55.117 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:06:55 np0005548789.localdomain sshd[298181]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:06:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:06:55 np0005548789.localdomain podman[298183]: 2025-12-06 10:06:55.932110088 +0000 UTC m=+0.091183240 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 06 10:06:55 np0005548789.localdomain podman[298183]: 2025-12-06 10:06:55.970387811 +0000 UTC m=+0.129460943 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:06:55 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:06:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:06:59.888 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:00.120 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:00 np0005548789.localdomain sshd[298181]: Received disconnect from 179.33.210.213 port 60474:11: Bye Bye [preauth]
Dec 06 10:07:00 np0005548789.localdomain sshd[298181]: Disconnected from authenticating user root 179.33.210.213 port 60474 [preauth]
Dec 06 10:07:00 np0005548789.localdomain sudo[298202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:07:00 np0005548789.localdomain sudo[298202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:00 np0005548789.localdomain sudo[298202]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:00 np0005548789.localdomain sudo[298220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:00 np0005548789.localdomain sudo[298220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:01 np0005548789.localdomain podman[298280]: 
Dec 06 10:07:01 np0005548789.localdomain podman[298280]: 2025-12-06 10:07:01.167789021 +0000 UTC m=+0.064172516 container create 468c4f70a3ff9fbd27850c17b3e27676a0d21c5a99b901d4860f38945e28a73e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_mccarthy, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True)
Dec 06 10:07:01 np0005548789.localdomain sudo[298293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:07:01 np0005548789.localdomain systemd[1]: Started libpod-conmon-468c4f70a3ff9fbd27850c17b3e27676a0d21c5a99b901d4860f38945e28a73e.scope.
Dec 06 10:07:01 np0005548789.localdomain sudo[298293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:01 np0005548789.localdomain sudo[298293]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:01 np0005548789.localdomain podman[298280]: 2025-12-06 10:07:01.132427574 +0000 UTC m=+0.028811099 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:07:01 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:07:01 np0005548789.localdomain podman[298280]: 2025-12-06 10:07:01.250511084 +0000 UTC m=+0.146894579 container init 468c4f70a3ff9fbd27850c17b3e27676a0d21c5a99b901d4860f38945e28a73e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_mccarthy, architecture=x86_64, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, io.buildah.version=1.41.4, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7)
Dec 06 10:07:01 np0005548789.localdomain podman[298280]: 2025-12-06 10:07:01.265989271 +0000 UTC m=+0.162372756 container start 468c4f70a3ff9fbd27850c17b3e27676a0d21c5a99b901d4860f38945e28a73e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_mccarthy, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, architecture=x86_64, vcs-type=git, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:07:01 np0005548789.localdomain podman[298280]: 2025-12-06 10:07:01.266266209 +0000 UTC m=+0.162649764 container attach 468c4f70a3ff9fbd27850c17b3e27676a0d21c5a99b901d4860f38945e28a73e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_mccarthy, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, name=rhceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=)
Dec 06 10:07:01 np0005548789.localdomain friendly_mccarthy[298312]: 167 167
Dec 06 10:07:01 np0005548789.localdomain systemd[1]: libpod-468c4f70a3ff9fbd27850c17b3e27676a0d21c5a99b901d4860f38945e28a73e.scope: Deactivated successfully.
Dec 06 10:07:01 np0005548789.localdomain podman[298280]: 2025-12-06 10:07:01.270090614 +0000 UTC m=+0.166474169 container died 468c4f70a3ff9fbd27850c17b3e27676a0d21c5a99b901d4860f38945e28a73e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_mccarthy, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, ceph=True, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:07:01 np0005548789.localdomain sudo[298316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:01 np0005548789.localdomain sudo[298316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:01 np0005548789.localdomain podman[298334]: 2025-12-06 10:07:01.387959767 +0000 UTC m=+0.105497011 container remove 468c4f70a3ff9fbd27850c17b3e27676a0d21c5a99b901d4860f38945e28a73e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_mccarthy, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=1763362218, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-type=git, distribution-scope=public, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:07:01 np0005548789.localdomain systemd[1]: libpod-conmon-468c4f70a3ff9fbd27850c17b3e27676a0d21c5a99b901d4860f38945e28a73e.scope: Deactivated successfully.
Dec 06 10:07:01 np0005548789.localdomain podman[298352]: 
Dec 06 10:07:01 np0005548789.localdomain podman[298352]: 2025-12-06 10:07:01.478040332 +0000 UTC m=+0.063873646 container create e7f2a5d8f5039835ad07dc73e30ae6a9c876fbf00071afec023234b27507e663 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_panini, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, ceph=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, name=rhceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:07:01 np0005548789.localdomain systemd[1]: Started libpod-conmon-e7f2a5d8f5039835ad07dc73e30ae6a9c876fbf00071afec023234b27507e663.scope.
Dec 06 10:07:01 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:07:01 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74214f1039b49a14641ac13d0ecf4dc532c465ec1c075256312d635b09f82e6b/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Dec 06 10:07:01 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74214f1039b49a14641ac13d0ecf4dc532c465ec1c075256312d635b09f82e6b/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 10:07:01 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74214f1039b49a14641ac13d0ecf4dc532c465ec1c075256312d635b09f82e6b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:07:01 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74214f1039b49a14641ac13d0ecf4dc532c465ec1c075256312d635b09f82e6b/merged/var/lib/ceph/mon/ceph-np0005548789 supports timestamps until 2038 (0x7fffffff)
Dec 06 10:07:01 np0005548789.localdomain podman[298352]: 2025-12-06 10:07:01.540798104 +0000 UTC m=+0.126631388 container init e7f2a5d8f5039835ad07dc73e30ae6a9c876fbf00071afec023234b27507e663 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_panini, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, RELEASE=main, build-date=2025-11-26T19:44:28Z, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:07:01 np0005548789.localdomain podman[298352]: 2025-12-06 10:07:01.443538532 +0000 UTC m=+0.029371856 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:07:01 np0005548789.localdomain podman[298352]: 2025-12-06 10:07:01.548101244 +0000 UTC m=+0.133934498 container start e7f2a5d8f5039835ad07dc73e30ae6a9c876fbf00071afec023234b27507e663 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_panini, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, ceph=True)
Dec 06 10:07:01 np0005548789.localdomain podman[298352]: 2025-12-06 10:07:01.548442225 +0000 UTC m=+0.134275559 container attach e7f2a5d8f5039835ad07dc73e30ae6a9c876fbf00071afec023234b27507e663 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_panini, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_BRANCH=main, distribution-scope=public, release=1763362218, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:07:01 np0005548789.localdomain systemd[1]: libpod-e7f2a5d8f5039835ad07dc73e30ae6a9c876fbf00071afec023234b27507e663.scope: Deactivated successfully.
Dec 06 10:07:01 np0005548789.localdomain podman[298352]: 2025-12-06 10:07:01.654553903 +0000 UTC m=+0.240387207 container died e7f2a5d8f5039835ad07dc73e30ae6a9c876fbf00071afec023234b27507e663 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_panini, io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, distribution-scope=public, vcs-type=git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, com.redhat.component=rhceph-container)
Dec 06 10:07:01 np0005548789.localdomain podman[298406]: 2025-12-06 10:07:01.756813486 +0000 UTC m=+0.087960842 container remove e7f2a5d8f5039835ad07dc73e30ae6a9c876fbf00071afec023234b27507e663 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_panini, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=1763362218, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:07:01 np0005548789.localdomain systemd[1]: libpod-conmon-e7f2a5d8f5039835ad07dc73e30ae6a9c876fbf00071afec023234b27507e663.scope: Deactivated successfully.
Dec 06 10:07:01 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 10:07:01 np0005548789.localdomain systemd-sysv-generator[298451]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:07:01 np0005548789.localdomain systemd-rc-local-generator[298448]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3def008430c718dad15f8d9856f3d5283bd470351db279e370c4b8357a48142c-merged.mount: Deactivated successfully.
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: Reloading.
Dec 06 10:07:02 np0005548789.localdomain systemd-sysv-generator[298493]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:07:02 np0005548789.localdomain systemd-rc-local-generator[298487]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 06 10:07:02 np0005548789.localdomain systemd[1]: Starting Ceph mon.np0005548789 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8...
Dec 06 10:07:02 np0005548789.localdomain podman[298552]: 
Dec 06 10:07:02 np0005548789.localdomain podman[298552]: 2025-12-06 10:07:02.991216765 +0000 UTC m=+0.073943420 container create fc31a9b04a3a29a488005d8205bdc8f4100d6f62af8b9293fa7a15910a34b090 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548789, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, vcs-type=git, io.openshift.tags=rhceph ceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, GIT_CLEAN=True, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218)
Dec 06 10:07:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:07:03 np0005548789.localdomain systemd[1]: tmp-crun.LQ7hUr.mount: Deactivated successfully.
Dec 06 10:07:03 np0005548789.localdomain podman[298552]: 2025-12-06 10:07:02.966414588 +0000 UTC m=+0.049141303 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:07:03 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14bfadbe6baaced7cb4e86831519444fa904b127541fff2989a7149ffd9fa7d8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:07:03 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14bfadbe6baaced7cb4e86831519444fa904b127541fff2989a7149ffd9fa7d8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:07:03 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14bfadbe6baaced7cb4e86831519444fa904b127541fff2989a7149ffd9fa7d8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 10:07:03 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14bfadbe6baaced7cb4e86831519444fa904b127541fff2989a7149ffd9fa7d8/merged/var/lib/ceph/mon/ceph-np0005548789 supports timestamps until 2038 (0x7fffffff)
Dec 06 10:07:03 np0005548789.localdomain podman[298552]: 2025-12-06 10:07:03.07732159 +0000 UTC m=+0.160048275 container init fc31a9b04a3a29a488005d8205bdc8f4100d6f62af8b9293fa7a15910a34b090 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548789, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, ceph=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, release=1763362218, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:07:03 np0005548789.localdomain podman[298552]: 2025-12-06 10:07:03.084068844 +0000 UTC m=+0.166795519 container start fc31a9b04a3a29a488005d8205bdc8f4100d6f62af8b9293fa7a15910a34b090 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mon-np0005548789, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, CEPH_POINT_RELEASE=, architecture=x86_64, RELEASE=main, release=1763362218, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True)
Dec 06 10:07:03 np0005548789.localdomain bash[298552]: fc31a9b04a3a29a488005d8205bdc8f4100d6f62af8b9293fa7a15910a34b090
Dec 06 10:07:03 np0005548789.localdomain systemd[1]: Started Ceph mon.np0005548789 for 1939e851-b10c-5c3b-9bb7-8e7f380233e8.
Dec 06 10:07:03 np0005548789.localdomain podman[298565]: 2025-12-06 10:07:03.121583315 +0000 UTC m=+0.083680054 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:07:03 np0005548789.localdomain sudo[298220]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pidfile_write: ignore empty --pid-file
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: load: jerasure load: lrc 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: RocksDB version: 7.9.2
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Git sha 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: DB SUMMARY
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: DB Session ID:  JMBO5KX1IJCJ8FWC64EX
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: CURRENT file:  CURRENT
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005548789/store.db dir, Total Num: 0, files: 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005548789/store.db: 000004.log size: 761 ; 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                         Options.error_if_exists: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                       Options.create_if_missing: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                                     Options.env: 0x55b607e7d9e0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                                      Options.fs: PosixFileSystem
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                                Options.info_log: 0x55b608f2cd20
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                              Options.statistics: (nil)
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                               Options.use_fsync: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                              Options.db_log_dir: 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                                 Options.wal_dir: 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                    Options.write_buffer_manager: 0x55b608f3d540
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                  Options.unordered_write: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                               Options.row_cache: None
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                              Options.wal_filter: None
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.two_write_queues: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.wal_compression: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.atomic_flush: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.max_background_jobs: 2
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.max_background_compactions: -1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.max_subcompactions: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.max_total_wal_size: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                          Options.max_open_files: -1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:       Options.compaction_readahead_size: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Compression algorithms supported:
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         kZSTD supported: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         kXpressCompression supported: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         kBZip2Compression supported: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         kLZ4Compression supported: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         kZlibCompression supported: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         kSnappyCompression supported: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005548789/store.db/MANIFEST-000005
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:           Options.merge_operator: 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:        Options.compaction_filter: None
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b608f2c980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x55b608f29350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:        Options.write_buffer_size: 33554432
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:  Options.max_write_buffer_number: 2
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:          Options.compression: NoCompression
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.num_levels: 7
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                   Options.table_properties_collectors: 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                           Options.bloom_locality: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                               Options.ttl: 2592000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                       Options.enable_blob_files: false
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                           Options.min_blob_size: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005548789/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623136019, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623138512, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 773, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 651, "raw_average_value_size": 130, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623138622, "job": 1, "event": "recovery_finished"}
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55b608f50e00
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: DB pointer 0x55b609046000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789 does not exist in monmap, will attempt to join an existing cluster
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.84 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Sum      1/0    1.84 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x55b608f29350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.8e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: using public_addr v2:172.18.0.104:0/0 -> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: starting mon.np0005548789 rank -1 at public addrs [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] at bind addrs [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005548789 fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(???) e0 preinit fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:03 np0005548789.localdomain podman[298565]: 2025-12-06 10:07:03.155637181 +0000 UTC m=+0.117733890 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(synchronizing) e11 sync_obtain_latest_monmap
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(synchronizing) e11 sync_obtain_latest_monmap obtained monmap e11
Dec 06 10:07:03 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:07:03 np0005548789.localdomain podman[298636]: 
Dec 06 10:07:03 np0005548789.localdomain podman[298636]: 2025-12-06 10:07:03.234683633 +0000 UTC m=+0.064480784 container create 5b2578171d9c75a479c1559e942a76336a44dc660c216cd71d2b9264fd37055d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_colden, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7)
Dec 06 10:07:03 np0005548789.localdomain systemd[1]: Started libpod-conmon-5b2578171d9c75a479c1559e942a76336a44dc660c216cd71d2b9264fd37055d.scope.
Dec 06 10:07:03 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:07:03 np0005548789.localdomain podman[298636]: 2025-12-06 10:07:03.297888389 +0000 UTC m=+0.127685540 container init 5b2578171d9c75a479c1559e942a76336a44dc660c216cd71d2b9264fd37055d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_colden, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, architecture=x86_64, RELEASE=main, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, release=1763362218, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z)
Dec 06 10:07:03 np0005548789.localdomain systemd[1]: tmp-crun.6w0dbN.mount: Deactivated successfully.
Dec 06 10:07:03 np0005548789.localdomain systemd[1]: libpod-5b2578171d9c75a479c1559e942a76336a44dc660c216cd71d2b9264fd37055d.scope: Deactivated successfully.
Dec 06 10:07:03 np0005548789.localdomain great_colden[298651]: 167 167
Dec 06 10:07:03 np0005548789.localdomain podman[298636]: 2025-12-06 10:07:03.217203517 +0000 UTC m=+0.047000688 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:07:03 np0005548789.localdomain podman[298636]: 2025-12-06 10:07:03.318849491 +0000 UTC m=+0.148646652 container start 5b2578171d9c75a479c1559e942a76336a44dc660c216cd71d2b9264fd37055d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_colden, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_CLEAN=True, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:07:03 np0005548789.localdomain podman[298636]: 2025-12-06 10:07:03.319023927 +0000 UTC m=+0.148821108 container attach 5b2578171d9c75a479c1559e942a76336a44dc660c216cd71d2b9264fd37055d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_colden, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, release=1763362218, name=rhceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, RELEASE=main, com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:07:03 np0005548789.localdomain podman[298636]: 2025-12-06 10:07:03.32013119 +0000 UTC m=+0.149928341 container died 5b2578171d9c75a479c1559e942a76336a44dc660c216cd71d2b9264fd37055d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_colden, GIT_BRANCH=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:07:03 np0005548789.localdomain podman[298656]: 2025-12-06 10:07:03.406927966 +0000 UTC m=+0.080493308 container remove 5b2578171d9c75a479c1559e942a76336a44dc660c216cd71d2b9264fd37055d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_colden, io.buildah.version=1.41.4, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, GIT_BRANCH=main, architecture=x86_64, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:07:03 np0005548789.localdomain systemd[1]: libpod-conmon-5b2578171d9c75a479c1559e942a76336a44dc660c216cd71d2b9264fd37055d.scope: Deactivated successfully.
Dec 06 10:07:03 np0005548789.localdomain sudo[298316]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:03 np0005548789.localdomain sudo[298671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(synchronizing).mds e16 new map
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(synchronizing).mds e16 print_map
                                                           e16
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        16
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2025-12-06T08:18:49.925523+0000
                                                           modified        2025-12-06T10:03:02.051468+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        87
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26356}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26356 members: 26356
                                                           [mds.mds.np0005548790.vhcezv{0:26356} state up:active seq 16 addr [v2:172.18.0.108:6808/1621657194,v1:172.18.0.108:6809/1621657194] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005548789.vxwwsq{-1:16884} state up:standby seq 1 addr [v2:172.18.0.107:6808/3033303281,v1:172.18.0.107:6809/3033303281] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005548788.erzujf{-1:16890} state up:standby seq 1 addr [v2:172.18.0.106:6808/309324236,v1:172.18.0.106:6809/309324236] compat {c=[1],r=[1],i=[17ff]}]
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(synchronizing).osd e89 crush map has features 3314933000854323200, adjusting msgr requires
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(synchronizing).osd e89 crush map has features 432629239337189376, adjusting msgr requires
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(synchronizing).osd e89 crush map has features 432629239337189376, adjusting msgr requires
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(synchronizing).osd e89 crush map has features 432629239337189376, adjusting msgr requires
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:07:03.570689) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623570797, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 10565, "num_deletes": 254, "total_data_size": 13575200, "memory_usage": 14243736, "flush_reason": "Manual Compaction"}
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.44134 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.26785 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Saving service mon spec with placement label:mon
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.44149 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548788", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/2211595861' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain sudo[298671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548786 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548786 on np0005548786.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/2329222999' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain sudo[298671]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.26906 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548786"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Remove daemons mon.np0005548786
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Safe to remove mon.np0005548786: new quorum should be ['np0005548787', 'np0005548790', 'np0005548789', 'np0005548788'] (from ['np0005548787', 'np0005548790', 'np0005548789', 'np0005548788'])
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Removing monitor np0005548786 from monmap...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Removing daemon mon.np0005548786 from np0005548786.localdomain -- ports []
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789 calling monitor election
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548790 calling monitor election
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548788 calling monitor election
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548787 calling monitor election
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548787 is new leader, mons np0005548787,np0005548790,np0005548789,np0005548788 in quorum (ranks 0,1,2,3)
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: monmap epoch 10
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: last_changed 2025-12-06T10:06:16.211793+0000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: min_mon_release 18 (reef)
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: election_strategy: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: 2: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005548789
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: 3: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: osdmap e89: 6 total, 6 up, 6 in
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mgrmap e23: np0005548787.umwsra(active, since 26s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.44175 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548786.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548786.localdomain:/etc/ceph/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Removed label mon from host np0005548786.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.34289 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548786.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Removed label mgr from host np0005548786.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Removing daemon mgr.np0005548786.mczynb from np0005548786.localdomain -- ports [8765]
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.44183 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548786.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Removed label _admin from host np0005548786.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth rm", "entity": "mgr.np0005548786.mczynb"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005548786.mczynb"}]': finished
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Removing key for mgr.np0005548786.mczynb
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548786.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Removing np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Removing np0005548786.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Removing np0005548786.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548786 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548786 on np0005548786.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/544071182' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1605363813' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/308530152' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3072124168' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/406415283' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3079514338' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.34370 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005548786.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Added label _no_schedule to host np0005548786.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005548786.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.44219 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005548786.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain"}]': finished
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.26868 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005548786.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Removed host np0005548786.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1005735917' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1005735917' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.44243 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Saving service mon spec with placement label:mon
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.44251 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548789", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548790 calling monitor election
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548788 calling monitor election
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: monmap epoch 11
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: last_changed 2025-12-06T10:06:47.518948+0000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: min_mon_release 18 (reef)
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: election_strategy: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548790 calling monitor election
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: osdmap e89: 6 total, 6 up, 6 in
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mgrmap e23: np0005548787.umwsra(active, since 61s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Health check failed: 1/3 mons down, quorum np0005548787,np0005548790 (MON_DOWN)
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548787 calling monitor election
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548787 is new leader, mons np0005548787,np0005548790,np0005548788 in quorum (ranks 0,1,2)
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: monmap epoch 11
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: last_changed 2025-12-06T10:06:47.518948+0000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: min_mon_release 18 (reef)
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: election_strategy: 1
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: osdmap e89: 6 total, 6 up, 6 in
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mgrmap e23: np0005548787.umwsra(active, since 61s), standbys: np0005548786.mczynb, np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005548787,np0005548790)
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='client.44258 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005548789.localdomain:172.18.0.104", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Deploying daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623639395, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 13470744, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 10570, "table_properties": {"data_size": 13410806, "index_size": 32440, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26437, "raw_key_size": 282537, "raw_average_key_size": 26, "raw_value_size": 13230942, "raw_average_value_size": 1253, "num_data_blocks": 1228, "num_entries": 10554, "num_filter_entries": 10554, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 1765015623, "file_creation_time": 1765015623, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 68757 microseconds, and 18247 cpu microseconds.
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(synchronizing).paxosservice(auth 1..38) refresh upgraded, format 0 -> 3
Dec 06 10:07:03 np0005548789.localdomain sudo[298690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:07:03.639451) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 13470744 bytes OK
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:07:03.639474) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:07:03.641197) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:07:03.641214) EVENT_LOG_v1 {"time_micros": 1765015623641210, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:07:03.641232) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 13501231, prev total WAL file size 13501231, number of live WAL files 2.
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:07:03 np0005548789.localdomain sudo[298690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:07:03.642973) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. '7061786F73003130373934' seq:0, type:0; will stop at (end)
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(12MB) 8(1887B)]
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623643079, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 13472631, "oldest_snapshot_seqno": -1}
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 10305 keys, 13467468 bytes, temperature: kUnknown
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623735710, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 13467468, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13408207, "index_size": 32408, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25797, "raw_key_size": 277795, "raw_average_key_size": 26, "raw_value_size": 13231593, "raw_average_value_size": 1283, "num_data_blocks": 1227, "num_entries": 10305, "num_filter_entries": 10305, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765015623, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:07:03.736138) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 13467468 bytes
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:07:03.737994) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 145.2 rd, 145.1 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(12.8, 0.0 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10559, records dropped: 254 output_compression: NoCompression
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:07:03.738023) EVENT_LOG_v1 {"time_micros": 1765015623738011, "job": 4, "event": "compaction_finished", "compaction_time_micros": 92803, "compaction_time_cpu_micros": 37440, "output_level": 6, "num_output_files": 1, "total_output_size": 13467468, "num_input_records": 10559, "num_output_records": 10305, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623740084, "job": 4, "event": "table_file_deletion", "file_number": 14}
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015623740155, "job": 4, "event": "table_file_deletion", "file_number": 8}
Dec 06 10:07:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:07:03.642884) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:07:04 np0005548789.localdomain podman[298725]: 
Dec 06 10:07:04 np0005548789.localdomain podman[298725]: 2025-12-06 10:07:04.043788763 +0000 UTC m=+0.090157019 container create 1c47cdca4720e7c1521a521e400d75678e3cc3373bfa8298b0d876aaf9b1dcfe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_bell, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat Ceph Storage 7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218)
Dec 06 10:07:04 np0005548789.localdomain systemd[1]: Started libpod-conmon-1c47cdca4720e7c1521a521e400d75678e3cc3373bfa8298b0d876aaf9b1dcfe.scope.
Dec 06 10:07:04 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:07:04 np0005548789.localdomain podman[298725]: 2025-12-06 10:07:04.004262412 +0000 UTC m=+0.050630708 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:07:04 np0005548789.localdomain podman[298725]: 2025-12-06 10:07:04.119234887 +0000 UTC m=+0.165603133 container init 1c47cdca4720e7c1521a521e400d75678e3cc3373bfa8298b0d876aaf9b1dcfe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_bell, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, ceph=True, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:07:04 np0005548789.localdomain podman[298725]: 2025-12-06 10:07:04.127719943 +0000 UTC m=+0.174088159 container start 1c47cdca4720e7c1521a521e400d75678e3cc3373bfa8298b0d876aaf9b1dcfe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_bell, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_BRANCH=main, ceph=True, version=7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_CLEAN=True, release=1763362218, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Dec 06 10:07:04 np0005548789.localdomain podman[298725]: 2025-12-06 10:07:04.127991841 +0000 UTC m=+0.174360097 container attach 1c47cdca4720e7c1521a521e400d75678e3cc3373bfa8298b0d876aaf9b1dcfe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_bell, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, release=1763362218, version=7, architecture=x86_64, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph)
Dec 06 10:07:04 np0005548789.localdomain awesome_bell[298741]: 167 167
Dec 06 10:07:04 np0005548789.localdomain systemd[1]: libpod-1c47cdca4720e7c1521a521e400d75678e3cc3373bfa8298b0d876aaf9b1dcfe.scope: Deactivated successfully.
Dec 06 10:07:04 np0005548789.localdomain podman[298725]: 2025-12-06 10:07:04.132520418 +0000 UTC m=+0.178888644 container died 1c47cdca4720e7c1521a521e400d75678e3cc3373bfa8298b0d876aaf9b1dcfe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_bell, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, distribution-scope=public, vendor=Red Hat, Inc., ceph=True, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main)
Dec 06 10:07:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-1e459a916f5587573c12b07987f77c2114d8738bd5da4dde2c80fb1f51c79aef-merged.mount: Deactivated successfully.
Dec 06 10:07:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-709a1348378cc1ce32c956b4ca6e0317ba8edd21e8a24e892fd2a6acf0761bf1-merged.mount: Deactivated successfully.
Dec 06 10:07:04 np0005548789.localdomain podman[298746]: 2025-12-06 10:07:04.243804203 +0000 UTC m=+0.097723747 container remove 1c47cdca4720e7c1521a521e400d75678e3cc3373bfa8298b0d876aaf9b1dcfe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_bell, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container)
Dec 06 10:07:04 np0005548789.localdomain systemd[1]: libpod-conmon-1c47cdca4720e7c1521a521e400d75678e3cc3373bfa8298b0d876aaf9b1dcfe.scope: Deactivated successfully.
Dec 06 10:07:04 np0005548789.localdomain sudo[298690]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:04 np0005548789.localdomain sudo[298770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:07:04 np0005548789.localdomain sudo[298770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:04 np0005548789.localdomain sudo[298770]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:04 np0005548789.localdomain sudo[298788]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:04 np0005548789.localdomain sudo[298788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:04.919 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:05 np0005548789.localdomain podman[298822]: 
Dec 06 10:07:05 np0005548789.localdomain podman[298822]: 2025-12-06 10:07:05.036747795 +0000 UTC m=+0.064088853 container create 1068ab8665323a1dd198fb433dc2a5f9d3b9f76d18377601e5540405d926a74f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_gould, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, ceph=True, vcs-type=git, architecture=x86_64)
Dec 06 10:07:05 np0005548789.localdomain systemd[1]: Started libpod-conmon-1068ab8665323a1dd198fb433dc2a5f9d3b9f76d18377601e5540405d926a74f.scope.
Dec 06 10:07:05 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:07:05 np0005548789.localdomain podman[298822]: 2025-12-06 10:07:05.101409184 +0000 UTC m=+0.128750222 container init 1068ab8665323a1dd198fb433dc2a5f9d3b9f76d18377601e5540405d926a74f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_gould, ceph=True, RELEASE=main, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, version=7, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z)
Dec 06 10:07:05 np0005548789.localdomain podman[298822]: 2025-12-06 10:07:05.003699668 +0000 UTC m=+0.031040766 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:07:05 np0005548789.localdomain podman[298822]: 2025-12-06 10:07:05.108163087 +0000 UTC m=+0.135504145 container start 1068ab8665323a1dd198fb433dc2a5f9d3b9f76d18377601e5540405d926a74f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_gould, distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, RELEASE=main)
Dec 06 10:07:05 np0005548789.localdomain podman[298822]: 2025-12-06 10:07:05.108400124 +0000 UTC m=+0.135741182 container attach 1068ab8665323a1dd198fb433dc2a5f9d3b9f76d18377601e5540405d926a74f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_gould, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, RELEASE=main, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True)
Dec 06 10:07:05 np0005548789.localdomain reverent_gould[298837]: 167 167
Dec 06 10:07:05 np0005548789.localdomain systemd[1]: libpod-1068ab8665323a1dd198fb433dc2a5f9d3b9f76d18377601e5540405d926a74f.scope: Deactivated successfully.
Dec 06 10:07:05 np0005548789.localdomain podman[298822]: 2025-12-06 10:07:05.112296822 +0000 UTC m=+0.139637910 container died 1068ab8665323a1dd198fb433dc2a5f9d3b9f76d18377601e5540405d926a74f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_gould, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, vcs-type=git, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:07:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:05.123 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:05 np0005548789.localdomain podman[298842]: 2025-12-06 10:07:05.207147811 +0000 UTC m=+0.089729775 container remove 1068ab8665323a1dd198fb433dc2a5f9d3b9f76d18377601e5540405d926a74f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_gould, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, version=7, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, ceph=True, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, GIT_BRANCH=main)
Dec 06 10:07:05 np0005548789.localdomain systemd[1]: libpod-conmon-1068ab8665323a1dd198fb433dc2a5f9d3b9f76d18377601e5540405d926a74f.scope: Deactivated successfully.
Dec 06 10:07:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-925a159f2cffa5eaea442bbe0bb15122915e04eeb6941a7d1c3e124eab1b8933-merged.mount: Deactivated successfully.
Dec 06 10:07:05 np0005548789.localdomain sudo[298788]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:05 np0005548789.localdomain sudo[298865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:07:05 np0005548789.localdomain sudo[298865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:05 np0005548789.localdomain sudo[298865]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:05 np0005548789.localdomain sudo[298883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:05 np0005548789.localdomain sudo[298883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:05 np0005548789.localdomain podman[298917]: 
Dec 06 10:07:05 np0005548789.localdomain podman[298917]: 2025-12-06 10:07:05.972265915 +0000 UTC m=+0.082544300 container create e06626dc7aa398a0e3bc4102d673ae36d62c5d9e4117ad83b9e912fc8a7084f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_thompson, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64)
Dec 06 10:07:06 np0005548789.localdomain systemd[1]: Started libpod-conmon-e06626dc7aa398a0e3bc4102d673ae36d62c5d9e4117ad83b9e912fc8a7084f1.scope.
Dec 06 10:07:06 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:07:06 np0005548789.localdomain podman[298917]: 2025-12-06 10:07:05.937781385 +0000 UTC m=+0.048059820 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:07:06 np0005548789.localdomain podman[298917]: 2025-12-06 10:07:06.040525382 +0000 UTC m=+0.150803767 container init e06626dc7aa398a0e3bc4102d673ae36d62c5d9e4117ad83b9e912fc8a7084f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_thompson, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, io.openshift.tags=rhceph ceph, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-type=git, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.expose-services=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:07:06 np0005548789.localdomain podman[298917]: 2025-12-06 10:07:06.049157792 +0000 UTC m=+0.159436187 container start e06626dc7aa398a0e3bc4102d673ae36d62c5d9e4117ad83b9e912fc8a7084f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_thompson, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.openshift.expose-services=, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.buildah.version=1.41.4, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=)
Dec 06 10:07:06 np0005548789.localdomain podman[298917]: 2025-12-06 10:07:06.050146311 +0000 UTC m=+0.160424746 container attach e06626dc7aa398a0e3bc4102d673ae36d62c5d9e4117ad83b9e912fc8a7084f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_thompson, release=1763362218, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.41.4, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7)
Dec 06 10:07:06 np0005548789.localdomain epic_thompson[298932]: 167 167
Dec 06 10:07:06 np0005548789.localdomain systemd[1]: libpod-e06626dc7aa398a0e3bc4102d673ae36d62c5d9e4117ad83b9e912fc8a7084f1.scope: Deactivated successfully.
Dec 06 10:07:06 np0005548789.localdomain podman[298917]: 2025-12-06 10:07:06.055887405 +0000 UTC m=+0.166165820 container died e06626dc7aa398a0e3bc4102d673ae36d62c5d9e4117ad83b9e912fc8a7084f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_thompson, version=7, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218)
Dec 06 10:07:06 np0005548789.localdomain podman[298937]: 2025-12-06 10:07:06.15557043 +0000 UTC m=+0.087259832 container remove e06626dc7aa398a0e3bc4102d673ae36d62c5d9e4117ad83b9e912fc8a7084f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_thompson, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, distribution-scope=public, architecture=x86_64, RELEASE=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.41.4, release=1763362218)
Dec 06 10:07:06 np0005548789.localdomain systemd[1]: libpod-conmon-e06626dc7aa398a0e3bc4102d673ae36d62c5d9e4117ad83b9e912fc8a7084f1.scope: Deactivated successfully.
Dec 06 10:07:06 np0005548789.localdomain sudo[298883]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-8f4cd1ebd77010274d9e0d70325ae0673bb0ef5291b04d390f9cb0e3289de6b9-merged.mount: Deactivated successfully.
Dec 06 10:07:06 np0005548789.localdomain sshd[298955]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:07:06 np0005548789.localdomain sudo[298956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:07:06 np0005548789.localdomain sudo[298956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:06 np0005548789.localdomain sudo[298956]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:06 np0005548789.localdomain sudo[298975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:06 np0005548789.localdomain sudo[298975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:06 np0005548789.localdomain podman[299010]: 
Dec 06 10:07:06 np0005548789.localdomain podman[299010]: 2025-12-06 10:07:06.885822132 +0000 UTC m=+0.079568070 container create c2af45cce646b2f8578525332813b9dfbd806d797ea58cdc3129a408110c8212 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_brahmagupta, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, ceph=True, name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True)
Dec 06 10:07:06 np0005548789.localdomain systemd[1]: Started libpod-conmon-c2af45cce646b2f8578525332813b9dfbd806d797ea58cdc3129a408110c8212.scope.
Dec 06 10:07:06 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:07:06 np0005548789.localdomain podman[299010]: 2025-12-06 10:07:06.85291184 +0000 UTC m=+0.046657808 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:07:06 np0005548789.localdomain podman[299010]: 2025-12-06 10:07:06.960115771 +0000 UTC m=+0.153861739 container init c2af45cce646b2f8578525332813b9dfbd806d797ea58cdc3129a408110c8212 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_brahmagupta, RELEASE=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=7, vendor=Red Hat, Inc., release=1763362218, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:07:06 np0005548789.localdomain podman[299010]: 2025-12-06 10:07:06.969901697 +0000 UTC m=+0.163647645 container start c2af45cce646b2f8578525332813b9dfbd806d797ea58cdc3129a408110c8212 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_brahmagupta, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, GIT_BRANCH=main, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph)
Dec 06 10:07:06 np0005548789.localdomain podman[299010]: 2025-12-06 10:07:06.970157144 +0000 UTC m=+0.163903082 container attach c2af45cce646b2f8578525332813b9dfbd806d797ea58cdc3129a408110c8212 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_brahmagupta, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, version=7, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, RELEASE=main)
Dec 06 10:07:06 np0005548789.localdomain intelligent_brahmagupta[299025]: 167 167
Dec 06 10:07:06 np0005548789.localdomain systemd[1]: libpod-c2af45cce646b2f8578525332813b9dfbd806d797ea58cdc3129a408110c8212.scope: Deactivated successfully.
Dec 06 10:07:06 np0005548789.localdomain podman[299010]: 2025-12-06 10:07:06.97433261 +0000 UTC m=+0.168078628 container died c2af45cce646b2f8578525332813b9dfbd806d797ea58cdc3129a408110c8212 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_brahmagupta, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, version=7)
Dec 06 10:07:07 np0005548789.localdomain podman[299030]: 2025-12-06 10:07:07.082224202 +0000 UTC m=+0.092576651 container remove c2af45cce646b2f8578525332813b9dfbd806d797ea58cdc3129a408110c8212 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_brahmagupta, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_BRANCH=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, distribution-scope=public, RELEASE=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:07:07 np0005548789.localdomain systemd[1]: libpod-conmon-c2af45cce646b2f8578525332813b9dfbd806d797ea58cdc3129a408110c8212.scope: Deactivated successfully.
Dec 06 10:07:07 np0005548789.localdomain sudo[298975]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-fc6f69aced3e7f8dff087557f0f70186d9a2b12da92e58b124dcaec055a2c15c-merged.mount: Deactivated successfully.
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:07:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.913 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.914 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.923 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:07 np0005548789.localdomain podman[299047]: 2025-12-06 10:07:07.92378358 +0000 UTC m=+0.081380304 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47aeece0-e4c0-483a-a016-12f020f41712', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:07:07.914468', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '52c2699a-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.163835124, 'message_signature': '9df2c86bcc3963e5ecc8e84c9f1bb45ef3fc5312858997808d1e3a3f9736530b'}]}, 'timestamp': '2025-12-06 10:07:07.924928', '_unique_id': '3b0641f2f2774f6eadba0bbcc7715f69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.927 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.929 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.929 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.930 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4aa5fa3d-9693-49ec-8a21-f32332cb98fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:07:07.930110', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '52c34e78-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.163835124, 'message_signature': 'ce15bfe8ac74efae40d5b297edec05adbc83cd7e69eb2ad13fc0d0d1947cff70'}]}, 'timestamp': '2025-12-06 10:07:07.930647', '_unique_id': '221deea63e47400e9590223fe0df1485'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.931 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.933 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.933 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19c61693-4e31-4577-96a6-7289da114e23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:07:07.933184', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '52c3c574-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.163835124, 'message_signature': '7b6cf1328bc355f16e1bba6275f291bf892e5b0d56b7d8a92d01efb3fcac1486'}]}, 'timestamp': '2025-12-06 10:07:07.933663', '_unique_id': 'e4f92ff7b3b24711b8908678f557de1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.934 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.935 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:07:07 np0005548789.localdomain podman[299047]: 2025-12-06 10:07:07.958046013 +0000 UTC m=+0.115642767 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.959 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.960 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5b786b6-7792-4235-82c6-e47d33a57f57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:07:07.936013', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52c7d93e-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.185389074, 'message_signature': 'f4de788b1886a1e94d1d95a60f2c5e7a74b413bffb229f9214729656802fffc1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:07:07.936013', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52c7ecd0-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.185389074, 'message_signature': 'efff69699c1eaf2ff8269a76c2504737b43b52142b99a4faf4d83efc233a2a03'}]}, 'timestamp': '2025-12-06 10:07:07.960898', '_unique_id': '2cc6d37ad82a469e9669463a43d8eacb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.962 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.963 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.963 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '366c9c85-0239-4f74-af23-38c37ed100ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:07:07.963895', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '52c875c4-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.163835124, 'message_signature': 'f2c890f52661e91c8c7cabf35442a6381ecda47b15a55f16874e905d65b50c28'}]}, 'timestamp': '2025-12-06 10:07:07.964474', '_unique_id': '48f021bf456e4817ab25b8d1a942c6be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.965 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.966 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.966 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '181a0bc5-31e8-4967-8914-d1e99c6dd6bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:07:07.966715', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '52c8e5d6-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.163835124, 'message_signature': '41bcfad1c70daca735e12623cff0867492f976420d7a8d731e8fb966f142071c'}]}, 'timestamp': '2025-12-06 10:07:07.967266', '_unique_id': '653bf89300534705b8392a2f9811a0f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.968 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.969 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:07:07 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.978 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.979 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69e8e7ea-9f9a-419b-a8f4-45a4ca85c848', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:07:07.969384', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52cabc80-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.21875784, 'message_signature': '3b1acce951daeb8f2df45fdb37c669f4b6e8784839305f96ea0e71c45d21aeb2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:07:07.969384', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52cad44a-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.21875784, 'message_signature': '787994e7d589db1c16d97fec03494b1aa7832f643ccbb2937eb29e0be9944251'}]}, 'timestamp': '2025-12-06 10:07:07.980006', '_unique_id': '66fdd1e49dda442c8d446ec1d8659a09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.981 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.983 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.983 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b76ba004-8bec-4940-ad6c-2d772732ed07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:07:07.983186', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '52cb6860-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.163835124, 'message_signature': '4a9228bf33b64cb4a02fd747e2cb6a9feaede9180858a850c8d22a4b222dccf6'}]}, 'timestamp': '2025-12-06 10:07:07.983799', '_unique_id': 'ef572655a1d940ad9b07c774bd294a91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.984 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.986 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.986 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.986 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.987 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9747ef9b-6cc9-47d2-8179-8c723eb84c1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:07:07.986692', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52cbf230-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.185389074, 'message_signature': '544b904a258ed0f47d778c3e8ba7e9ccbf5ad56b617eb9fb8c0ce4a1830e00bd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:07:07.986692', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52cc05b8-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.185389074, 'message_signature': '552e1901d2ea4d7b71ca8c5616ce7576f997590aee4f84cd768e1906d1a84c22'}]}, 'timestamp': '2025-12-06 10:07:07.987805', '_unique_id': '1b3eae1e797842719f7fc35956c1a9ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.989 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.990 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.990 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.991 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd65cf7b0-1c58-4a60-8988-60c94180c835', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:07:07.990807', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52cc9384-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.185389074, 'message_signature': 'a57e63e5bcd451e8438a6042df4b2205753ef145f5de862e3a4b8b65465b64cb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:07:07.990807', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52cca946-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.185389074, 'message_signature': '36c731f72f04ca2876d8316b6e24c69c31b9feb9d0a6e79988321146d4c1e23d'}]}, 'timestamp': '2025-12-06 10:07:07.992185', '_unique_id': '25757193fe4a4e298a53984d0f654000'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:07.994 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e48e2965-5612-4722-896e-9bb8a3f84e2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:07:07.994980', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '52cf05b0-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.25626355, 'message_signature': 'fd1a18b72fb96db055035bc7713555f59f97d23098e40bfa64bd1fe85ed62056'}]}, 'timestamp': '2025-12-06 10:07:08.007283', '_unique_id': '82f35eb76d5840feb6532773f863c9ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.008 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.008 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.008 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.008 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bf035a5-de72-45ac-868b-f63d1cff6898', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:07:08.008552', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52cf4048-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.185389074, 'message_signature': '664176e37d12d22f2c44bc873ee2bc0fc4b61b4867a68538a53d254183b2f5cc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:07:08.008552', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52cf4890-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.185389074, 'message_signature': '01e06cb25b31774a357500f488d2410b6aa9f461dc7aa5af9fc029d3ef72f98f'}]}, 'timestamp': '2025-12-06 10:07:08.008966', '_unique_id': '7c66c6e5ee7944769d3b55ae14b21215'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.009 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de83d252-486e-430d-a008-3f58dbc5cb58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:07:08.009968', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '52cf7798-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.163835124, 'message_signature': '7fe4f6e1f0b8ac5ac2071b024875808265d82accd4c3e90930d7e9a78c5786c3'}]}, 'timestamp': '2025-12-06 10:07:08.010182', '_unique_id': 'c91fcc958f6b49a9988e3a3666fd9389'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 13670000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e8c6139-6268-4de2-bd97-3b63156d2cb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13670000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:07:08.011240', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '52cfa920-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.25626355, 'message_signature': '68e736bf6745c7355e638f3829b03f6bf6eab0a874f7d10f3d9b42a31ec0562d'}]}, 'timestamp': '2025-12-06 10:07:08.011445', '_unique_id': '6212b64a1ad4491fb1574aab97e4beb1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.012 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.012 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.012 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5072cef7-c845-496f-a522-a21854996f5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:07:08.012564', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52cfdcec-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.185389074, 'message_signature': '27d404b395208f8e877a8ed987cdc8ad9d69372ad430fc0f5215f5da8b462806'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:07:08.012564', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52cfe5f2-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.185389074, 'message_signature': 'a518218deb226d4ae0711f5988166ffbe189305d9458b65fd72801d087f37846'}]}, 'timestamp': '2025-12-06 10:07:08.012997', '_unique_id': 'ffebc53351d5460a8f409eedc70486d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.013 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08bdcd33-17f6-401d-8a4a-f459a1a2e34f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:07:08.014001', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52d01504-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.21875784, 'message_signature': '5fd38e117d64e7affdf65b24021e75eed25af34d4630ead6a2fc4bffe0bac8fb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:07:08.014001', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52d01c7a-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.21875784, 'message_signature': 'a23372ac6a336bf7026dcc3f60cdce7420823c85b087b4a54ffd3f068f5f4070'}]}, 'timestamp': '2025-12-06 10:07:08.014389', '_unique_id': '0fa6aad46944445480fdc614ef3c0cfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.015 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.015 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.015 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.015 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c736b2d-ae7f-438e-afa3-d2b5c0414910', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:07:08.015498', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52d04f6a-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.185389074, 'message_signature': '6c457674122fbdfc6856651d05c19f3ea638951eb7632f1df738915d15da0c61'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:07:08.015498', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52d05758-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.185389074, 'message_signature': '9c25934eaf7ac0f78990159a7f822e57121e6c8d3370b4c800f3a6f0e502ec9e'}]}, 'timestamp': '2025-12-06 10:07:08.015897', '_unique_id': 'eb4c72d13593417ca28dafb68450d6fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.016 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9294a527-e638-4bbb-9fd7-7a7a4f0a11bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:07:08.016883', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '52d085ac-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.163835124, 'message_signature': '1c0f2e7d629ed50e88425a4277fbf380b8df158e8996488ca43d293173325742'}]}, 'timestamp': '2025-12-06 10:07:08.017097', '_unique_id': '43f90849c27640dd895276aa66782ac5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ce4dd8d-f8a6-4fd1-87ac-db8156d93ea1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:07:08.018068', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '52d0b3d8-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.163835124, 'message_signature': '5c4745602151b45dafbf4342c4ab953d584e21266576a62041485fc393539af9'}]}, 'timestamp': '2025-12-06 10:07:08.018278', '_unique_id': 'b4cbf195fcec4c3bbd88f964d3b48d06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.018 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.019 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.019 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.019 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39ec4d80-0b26-440b-96e4-8389f7669a88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:07:08.019288', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '52d0e376-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.21875784, 'message_signature': '7e815b6366e8b3c25972e2176d3a8ba766c034a1f4342b8e3dd63ce838ec0bc2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:07:08.019288', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '52d0eb00-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.21875784, 'message_signature': 'f63d245b5934bf0908f1a305a7258ebdd07abd195b70b3541714fcdc547486b7'}]}, 'timestamp': '2025-12-06 10:07:08.019676', '_unique_id': '24b7aec310e246d19b7d594c1c8ac6e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.020 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a9aa897-6a87-444c-8c35-0ea4886203f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:07:08.020671', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '52d11a8a-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12046.163835124, 'message_signature': '9fe26e5f08f52558c860a3ac1f5ac0ebfe4cb9b842e026119194719f03230108'}]}, 'timestamp': '2025-12-06 10:07:08.020909', '_unique_id': 'e27de6cf164846758c4e8f74a4a59b2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:07:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:07:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:07:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:09.957 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:10.126 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548790.vhcezv", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:07:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:11 np0005548789.localdomain sudo[299072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:07:11 np0005548789.localdomain sudo[299072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:11 np0005548789.localdomain sudo[299072]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:12 np0005548789.localdomain sudo[299090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:07:12 np0005548789.localdomain sudo[299090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:12 np0005548789.localdomain systemd[1]: tmp-crun.vYInN6.mount: Deactivated successfully.
Dec 06 10:07:12 np0005548789.localdomain podman[299181]: 2025-12-06 10:07:12.85781303 +0000 UTC m=+0.102067398 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, release=1763362218, vendor=Red Hat, Inc., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, ceph=True, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:07:12 np0005548789.localdomain podman[299181]: 2025-12-06 10:07:12.958085932 +0000 UTC m=+0.202340270 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, CEPH_POINT_RELEASE=)
Dec 06 10:07:13 np0005548789.localdomain sudo[299090]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:13 np0005548789.localdomain sudo[299301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:07:13 np0005548789.localdomain sudo[299301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:13 np0005548789.localdomain sudo[299301]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:13 np0005548789.localdomain sudo[299319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:07:13 np0005548789.localdomain sudo[299319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:14 np0005548789.localdomain sudo[299319]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:14 np0005548789.localdomain sudo[299369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:07:14 np0005548789.localdomain sudo[299369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:14 np0005548789.localdomain sudo[299369]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:14.993 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:15.129 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/325418580' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:07:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:16 np0005548789.localdomain sudo[299387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:07:16 np0005548789.localdomain sudo[299387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:07:16 np0005548789.localdomain sudo[299387]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:07:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:07:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:07:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:07:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:07:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:07:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:07:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:07:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:07:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:07:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:07:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='client.44270 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: Reconfig service osd.default_drive_group
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: pgmap v48: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' 
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:07:17 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:07:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:07:17 np0005548789.localdomain sshd[294121]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:07:17 np0005548789.localdomain systemd[1]: session-66.scope: Deactivated successfully.
Dec 06 10:07:17 np0005548789.localdomain systemd[1]: session-66.scope: Consumed 28.576s CPU time.
Dec 06 10:07:17 np0005548789.localdomain systemd-logind[766]: Session 66 logged out. Waiting for processes to exit.
Dec 06 10:07:17 np0005548789.localdomain systemd-logind[766]: Removed session 66.
Dec 06 10:07:17 np0005548789.localdomain podman[299405]: 2025-12-06 10:07:17.941265102 +0000 UTC m=+0.098001465 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:07:17 np0005548789.localdomain podman[299405]: 2025-12-06 10:07:17.951260824 +0000 UTC m=+0.107997247 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:07:17 np0005548789.localdomain sshd[299424]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:07:17 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:07:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:07:18 np0005548789.localdomain systemd[1]: tmp-crun.d5zoP1.mount: Deactivated successfully.
Dec 06 10:07:18 np0005548789.localdomain podman[299426]: 2025-12-06 10:07:18.948401661 +0000 UTC m=+0.108569284 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:07:18 np0005548789.localdomain podman[299426]: 2025-12-06 10:07:18.957746202 +0000 UTC m=+0.117913825 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:07:18 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:07:19 np0005548789.localdomain sshd[299424]: Received disconnect from 154.113.10.34 port 50174:11: Bye Bye [preauth]
Dec 06 10:07:19 np0005548789.localdomain sshd[299424]: Disconnected from authenticating user root 154.113.10.34 port 50174 [preauth]
Dec 06 10:07:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:20.026 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:20.132 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:23 np0005548789.localdomain sshd[299449]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:07:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:07:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:07:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:07:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:07:23 np0005548789.localdomain systemd[1]: tmp-crun.ggeiID.mount: Deactivated successfully.
Dec 06 10:07:23 np0005548789.localdomain podman[299451]: 2025-12-06 10:07:23.942999576 +0000 UTC m=+0.102150911 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Dec 06 10:07:23 np0005548789.localdomain podman[299452]: 2025-12-06 10:07:23.995325622 +0000 UTC m=+0.152396644 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:07:24 np0005548789.localdomain podman[299451]: 2025-12-06 10:07:24.024051349 +0000 UTC m=+0.183202714 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.buildah.version=1.33.7)
Dec 06 10:07:24 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:07:24 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:07:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:07:24 np0005548789.localdomain podman[299452]: 2025-12-06 10:07:24.080312825 +0000 UTC m=+0.237383797 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:07:24 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:07:24 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:07:24 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19210 "" "Go-http-client/1.1"
Dec 06 10:07:24 np0005548789.localdomain sshd[299449]: Received disconnect from 64.227.102.57 port 43244:11: Bye Bye [preauth]
Dec 06 10:07:24 np0005548789.localdomain sshd[299449]: Disconnected from authenticating user root 64.227.102.57 port 43244 [preauth]
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.062 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.135 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.210 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.211 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.211 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.211 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.212 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:07:25 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(probing) e11 handle_auth_request failed to assign global_id
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.670 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.739 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.739 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.967 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.969 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11511MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.970 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:07:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:25.970 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:07:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:26.067 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:07:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:26.068 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:07:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:26.068 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:07:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:26.108 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:07:26 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(probing) e11 handle_auth_request failed to assign global_id
Dec 06 10:07:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:26.552 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:07:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:26.560 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:07:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:26.585 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:07:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:26.588 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:07:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:26.589 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:07:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:07:26 np0005548789.localdomain podman[299529]: 2025-12-06 10:07:26.924113647 +0000 UTC m=+0.081839748 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:07:26 np0005548789.localdomain podman[299529]: 2025-12-06 10:07:26.94214079 +0000 UTC m=+0.099866881 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Dec 06 10:07:26 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:07:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:27.590 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:27.591 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:27.591 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:27.591 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:07:28 np0005548789.localdomain systemd[1]: Stopping User Manager for UID 1002...
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Activating special unit Exit the Session...
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Removed slice User Background Tasks Slice.
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Stopped target Main User Target.
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Stopped target Basic System.
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Stopped target Paths.
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Stopped target Sockets.
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Stopped target Timers.
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Closed D-Bus User Message Bus Socket.
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Stopped Create User's Volatile Files and Directories.
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Removed slice User Application Slice.
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Reached target Shutdown.
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Finished Exit the Session.
Dec 06 10:07:28 np0005548789.localdomain systemd[26209]: Reached target Exit the Session.
Dec 06 10:07:28 np0005548789.localdomain systemd[1]: user@1002.service: Deactivated successfully.
Dec 06 10:07:28 np0005548789.localdomain systemd[1]: Stopped User Manager for UID 1002.
Dec 06 10:07:28 np0005548789.localdomain systemd[1]: user@1002.service: Consumed 13.160s CPU time, read 188.0K from disk, written 7.0K to disk.
Dec 06 10:07:28 np0005548789.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1002...
Dec 06 10:07:28 np0005548789.localdomain systemd[1]: run-user-1002.mount: Deactivated successfully.
Dec 06 10:07:28 np0005548789.localdomain systemd[1]: user-runtime-dir@1002.service: Deactivated successfully.
Dec 06 10:07:28 np0005548789.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1002.
Dec 06 10:07:28 np0005548789.localdomain systemd[1]: Removed slice User Slice of UID 1002.
Dec 06 10:07:28 np0005548789.localdomain systemd[1]: user-1002.slice: Consumed 4min 24.672s CPU time.
Dec 06 10:07:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:28.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:28.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:07:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:28.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:07:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:30.010 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:07:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:30.012 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:07:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:30.012 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:07:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:30.012 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:07:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:30.103 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:30.138 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:31.014 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:07:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:31.033 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:07:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:31.033 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:07:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:31.034 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:31.035 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:31.035 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(synchronizing).osd e89 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(synchronizing).osd e89 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(synchronizing).osd e90 e90: 6 total, 6 up, 6 in
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: from='mgr.24103 172.18.0.105:0/2222976075' entity='mgr.np0005548787.umwsra' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/2080000025' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: Activating manager daemon np0005548786.mczynb
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: osdmap e90: 6 total, 6 up, 6 in
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: mgrmap e24: np0005548786.mczynb(active, starting, since 0.0565768s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: Standby manager daemon np0005548787.umwsra started
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: mgrmap e25: np0005548786.mczynb(active, starting, since 5s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3985914868' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1488514553' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3750572853' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:33 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1577274021' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:07:33 np0005548789.localdomain systemd[1]: tmp-crun.XdrC4i.mount: Deactivated successfully.
Dec 06 10:07:33 np0005548789.localdomain podman[299549]: 2025-12-06 10:07:33.928378479 +0000 UTC m=+0.088562131 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:07:33 np0005548789.localdomain podman[299549]: 2025-12-06 10:07:33.96524644 +0000 UTC m=+0.125430072 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:07:33 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:07:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:35.139 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:35.146 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:07:38 np0005548789.localdomain podman[299572]: 2025-12-06 10:07:38.920319554 +0000 UTC m=+0.077329982 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:07:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(probing) e11 handle_auth_request failed to assign global_id
Dec 06 10:07:39 np0005548789.localdomain podman[299572]: 2025-12-06 10:07:39.014831313 +0000 UTC m=+0.171841731 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 06 10:07:39 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:07:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:40.147 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:07:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:40.149 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:07:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:40.150 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:07:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:40.150 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:07:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:40.184 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:40.185 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:07:43 np0005548789.localdomain sshd[296776]: Received disconnect from 192.168.122.11 port 38310:11: disconnected by user
Dec 06 10:07:43 np0005548789.localdomain sshd[296776]: Disconnected from user tripleo-admin 192.168.122.11 port 38310
Dec 06 10:07:43 np0005548789.localdomain sshd[296719]: pam_unix(sshd:session): session closed for user tripleo-admin
Dec 06 10:07:43 np0005548789.localdomain systemd[1]: session-67.scope: Deactivated successfully.
Dec 06 10:07:43 np0005548789.localdomain systemd[1]: session-67.scope: Consumed 1.754s CPU time.
Dec 06 10:07:43 np0005548789.localdomain systemd-logind[766]: Session 67 logged out. Waiting for processes to exit.
Dec 06 10:07:43 np0005548789.localdomain systemd-logind[766]: Removed session 67.
Dec 06 10:07:43 np0005548789.localdomain sshd[298955]: Received disconnect from 45.78.222.162 port 59030:11: Bye Bye [preauth]
Dec 06 10:07:43 np0005548789.localdomain sshd[298955]: Disconnected from authenticating user root 45.78.222.162 port 59030 [preauth]
Dec 06 10:07:43 np0005548789.localdomain ceph-mgr[288591]: ms_deliver_dispatch: unhandled message 0x56140ed13600 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Dec 06 10:07:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@-1(probing) e12  my rank is now 3 (was -1)
Dec 06 10:07:43 np0005548789.localdomain ceph-mon[298582]: log_channel(cluster) log [INF] : mon.np0005548789 calling monitor election
Dec 06 10:07:43 np0005548789.localdomain ceph-mon[298582]: paxos.3).electionLogic(0) init, first boot, initializing epoch at 1 
Dec 06 10:07:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:07:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:45.186 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:07:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:45.189 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:07:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:45.189 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:07:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:45.189 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:07:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:45.235 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:45.236 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:07:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:07:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:07:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:07:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:07:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:07:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:07:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:07:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:07:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:07:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:07:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:07:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:07:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:07:47.297 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:07:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:07:47.298 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:07:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:07:47.299 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:07:48 np0005548789.localdomain ceph-mds[287313]: mds.beacon.mds.np0005548789.vxwwsq missed beacon ack from the monitors
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2638595726' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3442107100' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3222977501' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3222977501' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: mgrc update_daemon_metadata mon.np0005548789 metadata {addrs=[v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005548789.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005548789.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548787 calling monitor election
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548788 calling monitor election
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548790 calling monitor election
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789 calling monitor election
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548787 is new leader, mons np0005548787,np0005548790,np0005548788,np0005548789 in quorum (ranks 0,1,2,3)
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: monmap epoch 12
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: last_changed 2025-12-06T10:07:43.610976+0000
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: min_mon_release 18 (reef)
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: election_strategy: 1
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548787
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: 3: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: osdmap e90: 6 total, 6 up, 6 in
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: mgrmap e25: np0005548786.mczynb(active, starting, since 30s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:07:48 np0005548789.localdomain ceph-mon[298582]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:07:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:07:48 np0005548789.localdomain podman[299595]: 2025-12-06 10:07:48.923936718 +0000 UTC m=+0.084450876 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 06 10:07:48 np0005548789.localdomain podman[299595]: 2025-12-06 10:07:48.932052283 +0000 UTC m=+0.092566471 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:07:48 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:07:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:07:49 np0005548789.localdomain podman[299614]: 2025-12-06 10:07:49.921547329 +0000 UTC m=+0.083548559 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:07:49 np0005548789.localdomain systemd[1]: tmp-crun.E1j6X4.mount: Deactivated successfully.
Dec 06 10:07:49 np0005548789.localdomain podman[299614]: 2025-12-06 10:07:49.953611666 +0000 UTC m=+0.115612906 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:07:49 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:07:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:50.236 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:07:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:50.238 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:07:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:50.238 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:07:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:50.238 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:07:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:50.275 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:50.276 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:07:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).osd e90 _set_new_cache_sizes cache_size:1019699809 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:07:53 np0005548789.localdomain systemd[1]: Stopping User Manager for UID 1003...
Dec 06 10:07:53 np0005548789.localdomain systemd[296743]: Activating special unit Exit the Session...
Dec 06 10:07:53 np0005548789.localdomain systemd[296743]: Stopped target Main User Target.
Dec 06 10:07:53 np0005548789.localdomain systemd[296743]: Stopped target Basic System.
Dec 06 10:07:53 np0005548789.localdomain systemd[296743]: Stopped target Paths.
Dec 06 10:07:53 np0005548789.localdomain systemd[296743]: Stopped target Sockets.
Dec 06 10:07:53 np0005548789.localdomain systemd[296743]: Stopped target Timers.
Dec 06 10:07:53 np0005548789.localdomain systemd[296743]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:07:53 np0005548789.localdomain systemd[296743]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 10:07:53 np0005548789.localdomain systemd[296743]: Closed D-Bus User Message Bus Socket.
Dec 06 10:07:53 np0005548789.localdomain systemd[296743]: Stopped Create User's Volatile Files and Directories.
Dec 06 10:07:53 np0005548789.localdomain systemd[296743]: Removed slice User Application Slice.
Dec 06 10:07:53 np0005548789.localdomain systemd[296743]: Reached target Shutdown.
Dec 06 10:07:53 np0005548789.localdomain systemd[296743]: Finished Exit the Session.
Dec 06 10:07:53 np0005548789.localdomain systemd[296743]: Reached target Exit the Session.
Dec 06 10:07:53 np0005548789.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Dec 06 10:07:53 np0005548789.localdomain systemd[1]: Stopped User Manager for UID 1003.
Dec 06 10:07:53 np0005548789.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Dec 06 10:07:53 np0005548789.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Dec 06 10:07:53 np0005548789.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Dec 06 10:07:53 np0005548789.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Dec 06 10:07:53 np0005548789.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Dec 06 10:07:53 np0005548789.localdomain systemd[1]: user-1003.slice: Consumed 2.288s CPU time.
Dec 06 10:07:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:07:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:07:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:07:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:07:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:07:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19220 "" "Go-http-client/1.1"
Dec 06 10:07:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:07:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:07:54 np0005548789.localdomain systemd[1]: tmp-crun.rzP3aA.mount: Deactivated successfully.
Dec 06 10:07:54 np0005548789.localdomain podman[299638]: 2025-12-06 10:07:54.927718992 +0000 UTC m=+0.087527559 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm)
Dec 06 10:07:54 np0005548789.localdomain podman[299638]: 2025-12-06 10:07:54.964185262 +0000 UTC m=+0.123993759 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:07:54 np0005548789.localdomain podman[299639]: 2025-12-06 10:07:54.973943966 +0000 UTC m=+0.129946379 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:07:54 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:07:55 np0005548789.localdomain podman[299639]: 2025-12-06 10:07:55.008943951 +0000 UTC m=+0.164946374 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Dec 06 10:07:55 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:07:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:55.277 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:07:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:55.279 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:07:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:55.279 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:07:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:55.279 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:07:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:55.325 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:07:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:07:55.326 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:07:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:07:57 np0005548789.localdomain podman[299677]: 2025-12-06 10:07:57.932041814 +0000 UTC m=+0.089300103 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 06 10:07:57 np0005548789.localdomain podman[299677]: 2025-12-06 10:07:57.944595692 +0000 UTC m=+0.101853971 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:07:57 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:07:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).osd e90 _set_new_cache_sizes cache_size:1020047688 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:00.327 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:00.329 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:00.329 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:08:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:00.329 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:00.349 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:08:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:00.350 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).osd e90 _set_new_cache_sizes cache_size:1020054592 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:08:04 np0005548789.localdomain podman[299696]: 2025-12-06 10:08:04.933354327 +0000 UTC m=+0.090529160 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:08:04 np0005548789.localdomain podman[299696]: 2025-12-06 10:08:04.969158187 +0000 UTC m=+0.126332940 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:08:04 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:08:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:05.350 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:05.352 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:05.353 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:08:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:05.353 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:05.390 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:08:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:05.390 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:05 np0005548789.localdomain sshd[299720]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:08:06 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).osd e91 e91: 6 total, 6 up, 6 in
Dec 06 10:08:06 np0005548789.localdomain sshd[299722]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:08:06 np0005548789.localdomain sshd[299720]: Received disconnect from 14.194.101.210 port 49068:11: Bye Bye [preauth]
Dec 06 10:08:06 np0005548789.localdomain sshd[299720]: Disconnected from authenticating user root 14.194.101.210 port 49068 [preauth]
Dec 06 10:08:06 np0005548789.localdomain sshd[299722]: Accepted publickey for ceph-admin from 192.168.122.108 port 37478 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 10:08:06 np0005548789.localdomain systemd-logind[766]: New session 69 of user ceph-admin.
Dec 06 10:08:06 np0005548789.localdomain systemd[1]: Created slice User Slice of UID 1002.
Dec 06 10:08:06 np0005548789.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Dec 06 10:08:06 np0005548789.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Dec 06 10:08:06 np0005548789.localdomain systemd[1]: Starting User Manager for UID 1002...
Dec 06 10:08:06 np0005548789.localdomain systemd[299726]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:08:07 np0005548789.localdomain systemd[299726]: Queued start job for default target Main User Target.
Dec 06 10:08:07 np0005548789.localdomain systemd[299726]: Created slice User Application Slice.
Dec 06 10:08:07 np0005548789.localdomain systemd[299726]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:08:07 np0005548789.localdomain systemd[299726]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 10:08:07 np0005548789.localdomain systemd[299726]: Reached target Paths.
Dec 06 10:08:07 np0005548789.localdomain systemd[299726]: Reached target Timers.
Dec 06 10:08:07 np0005548789.localdomain systemd[299726]: Starting D-Bus User Message Bus Socket...
Dec 06 10:08:07 np0005548789.localdomain systemd[299726]: Starting Create User's Volatile Files and Directories...
Dec 06 10:08:07 np0005548789.localdomain systemd[299726]: Listening on D-Bus User Message Bus Socket.
Dec 06 10:08:07 np0005548789.localdomain systemd[299726]: Finished Create User's Volatile Files and Directories.
Dec 06 10:08:07 np0005548789.localdomain systemd[299726]: Reached target Sockets.
Dec 06 10:08:07 np0005548789.localdomain systemd[299726]: Reached target Basic System.
Dec 06 10:08:07 np0005548789.localdomain systemd[1]: Started User Manager for UID 1002.
Dec 06 10:08:07 np0005548789.localdomain systemd[299726]: Reached target Main User Target.
Dec 06 10:08:07 np0005548789.localdomain systemd[299726]: Startup finished in 158ms.
Dec 06 10:08:07 np0005548789.localdomain systemd[1]: Started Session 69 of User ceph-admin.
Dec 06 10:08:07 np0005548789.localdomain sshd[299722]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/1889957737' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: Activating manager daemon np0005548790.kvkfyr
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: osdmap e91: 6 total, 6 up, 6 in
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: mgrmap e26: np0005548790.kvkfyr(active, starting, since 0.0602071s), standbys: np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548787.umwsra", "id": "np0005548787.umwsra"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: Manager daemon np0005548790.kvkfyr is now available
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain.devices.0"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain.devices.0"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain.devices.0"}]': finished
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain.devices.0"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain.devices.0"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548786.localdomain.devices.0"}]': finished
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/mirror_snapshot_schedule"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/mirror_snapshot_schedule"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/trash_purge_schedule"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/trash_purge_schedule"} : dispatch
Dec 06 10:08:07 np0005548789.localdomain sudo[299742]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:07 np0005548789.localdomain sudo[299742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:07 np0005548789.localdomain sudo[299742]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:07 np0005548789.localdomain sudo[299760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:08:07 np0005548789.localdomain sudo[299760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).osd e91 _set_new_cache_sizes cache_size:1020054729 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:08 np0005548789.localdomain ceph-mon[298582]: removing stray HostCache host record np0005548786.localdomain.devices.0
Dec 06 10:08:08 np0005548789.localdomain ceph-mon[298582]: mgrmap e27: np0005548790.kvkfyr(active, since 1.07281s), standbys: np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:08:08 np0005548789.localdomain podman[299851]: 2025-12-06 10:08:08.234800743 +0000 UTC m=+0.102218351 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=1763362218, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:08:08 np0005548789.localdomain podman[299851]: 2025-12-06 10:08:08.366835313 +0000 UTC m=+0.234252901 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Dec 06 10:08:08 np0005548789.localdomain sudo[299760]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:09 np0005548789.localdomain sudo[299971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:09 np0005548789.localdomain sudo[299971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:08:09 np0005548789.localdomain sudo[299971]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: from='client.26943 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:08:07] ENGINE Bus STARTING
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:08:07] ENGINE Serving on https://172.18.0.108:7150
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:08:07] ENGINE Client ('172.18.0.108', 58740) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:08:07] ENGINE Serving on http://172.18.0.108:8765
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:08:07] ENGINE Bus STARTED
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: from='client.44342 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: Cluster is now healthy
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:09 np0005548789.localdomain sudo[299991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:08:09 np0005548789.localdomain sudo[299991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:09 np0005548789.localdomain podman[299989]: 2025-12-06 10:08:09.244037536 +0000 UTC m=+0.091801418 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:08:09 np0005548789.localdomain podman[299989]: 2025-12-06 10:08:09.324386038 +0000 UTC m=+0.172149910 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:08:09 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:08:09 np0005548789.localdomain sudo[299991]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:10 np0005548789.localdomain sudo[300063]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:10 np0005548789.localdomain sudo[300063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:10 np0005548789.localdomain sudo[300063]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:10 np0005548789.localdomain sudo[300081]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 10:08:10 np0005548789.localdomain sudo[300081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:10 np0005548789.localdomain ceph-mon[298582]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:10 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:10 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:10 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:10 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:10.430 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:10.432 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:10.432 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5041 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:08:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:10.432 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:10.439 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:08:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:10.439 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:10.442 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:08:10 np0005548789.localdomain sudo[300081]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:10 np0005548789.localdomain sudo[300118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:08:10 np0005548789.localdomain sudo[300118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:10 np0005548789.localdomain sudo[300118]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:10 np0005548789.localdomain sudo[300136]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:08:10 np0005548789.localdomain sudo[300136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:10 np0005548789.localdomain sudo[300136]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:10 np0005548789.localdomain sudo[300154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:10 np0005548789.localdomain sudo[300154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:10 np0005548789.localdomain sudo[300154]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:10 np0005548789.localdomain sudo[300172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:10 np0005548789.localdomain sudo[300172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:10 np0005548789.localdomain sudo[300172]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:10 np0005548789.localdomain sudo[300190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:10 np0005548789.localdomain sudo[300190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:10 np0005548789.localdomain sudo[300190]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548789.localdomain sudo[300224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:11 np0005548789.localdomain sudo[300224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548789.localdomain sudo[300224]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548789.localdomain sudo[300242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:11 np0005548789.localdomain sudo[300242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548789.localdomain sudo[300242]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548789.localdomain sudo[300260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:08:11 np0005548789.localdomain sudo[300260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548789.localdomain sudo[300260]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: mgrmap e28: np0005548790.kvkfyr(active, since 3s), standbys: np0005548789.mzhmje, np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='client.54127 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: Saving service mon spec with placement label:mon
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:11 np0005548789.localdomain ceph-mon[298582]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:11 np0005548789.localdomain sudo[300278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:11 np0005548789.localdomain sudo[300278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548789.localdomain sudo[300278]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548789.localdomain sudo[300296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:11 np0005548789.localdomain sudo[300296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548789.localdomain sudo[300296]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548789.localdomain sudo[300314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:11 np0005548789.localdomain sudo[300314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548789.localdomain sudo[300314]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548789.localdomain sudo[300332]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:11 np0005548789.localdomain sudo[300332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548789.localdomain sudo[300332]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548789.localdomain sudo[300350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:11 np0005548789.localdomain sudo[300350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548789.localdomain sudo[300350]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548789.localdomain sudo[300384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:11 np0005548789.localdomain sudo[300384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548789.localdomain sudo[300384]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548789.localdomain sudo[300402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:11 np0005548789.localdomain sudo[300402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548789.localdomain sudo[300402]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:11 np0005548789.localdomain sudo[300420]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:11 np0005548789.localdomain sudo[300420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:11 np0005548789.localdomain sudo[300420]: pam_unix(sudo:session): session closed for user root
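[Editor's note] The sudo trail above shows how cephadm distributes `ceph.conf`: it creates the target directory, stages the content as a `.new` file under `/tmp/cephadm-<fsid>/…`, fixes ownership and mode (`chown`, `chmod 644`), and finally runs `/bin/mv` so the file appears in place atomically. The sketch below (illustrative only, not cephadm's actual code) reproduces that stage-then-rename pattern in Python; `atomic_install` is a made-up helper name:

```python
import os
import tempfile

def atomic_install(path: str, data: bytes, mode: int = 0o644) -> None:
    """Stage-then-rename, mirroring the mkdir/touch/chmod/mv sequence
    in the log: readers never observe a partially written file."""
    dirname = os.path.dirname(path) or "."
    os.makedirs(dirname, exist_ok=True)                      # /bin/mkdir -p
    fd, tmp = tempfile.mkstemp(dir=dirname, suffix=".new")   # touch ...new
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
        os.chmod(tmp, mode)                                  # /bin/chmod 644
        os.replace(tmp, path)                                # /bin/mv (atomic rename)
    except BaseException:
        if os.path.exists(tmp):
            os.unlink(tmp)                                   # clean up the staged copy
        raise
```

Staging in the *same* filesystem as the destination matters: `os.replace` (like `mv` within one filesystem) is an atomic rename, whereas a cross-filesystem move degrades to copy-plus-delete.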
Dec 06 10:08:12 np0005548789.localdomain sudo[300438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:08:12 np0005548789.localdomain sudo[300438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548789.localdomain sudo[300438]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548789.localdomain sudo[300456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:08:12 np0005548789.localdomain sudo[300456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548789.localdomain sudo[300456]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548789.localdomain sudo[300474]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:12 np0005548789.localdomain sudo[300474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548789.localdomain sudo[300474]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548789.localdomain sudo[300492]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:12 np0005548789.localdomain sudo[300492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548789.localdomain sudo[300492]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:12 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:12 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:12 np0005548789.localdomain ceph-mon[298582]: Updating np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:12 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:12 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:12 np0005548789.localdomain ceph-mon[298582]: from='client.44354 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548789", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:12 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:12 np0005548789.localdomain sudo[300510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:12 np0005548789.localdomain sudo[300510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548789.localdomain sudo[300510]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548789.localdomain sudo[300544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:12 np0005548789.localdomain sudo[300544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548789.localdomain sudo[300544]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548789.localdomain sudo[300562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:12 np0005548789.localdomain sudo[300562]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548789.localdomain sudo[300562]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548789.localdomain sudo[300580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:12 np0005548789.localdomain sudo[300580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548789.localdomain sudo[300580]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548789.localdomain sudo[300598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:12 np0005548789.localdomain sudo[300598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548789.localdomain sudo[300598]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548789.localdomain sudo[300616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:12 np0005548789.localdomain sudo[300616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548789.localdomain sudo[300616]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 np0005548789.localdomain sudo[300634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:12 np0005548789.localdomain sudo[300634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:12 np0005548789.localdomain sudo[300634]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:13 np0005548789.localdomain sudo[300652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:13 np0005548789.localdomain sudo[300652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:13 np0005548789.localdomain sudo[300652]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:13 np0005548789.localdomain sudo[300670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:13 np0005548789.localdomain sudo[300670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:13 np0005548789.localdomain sudo[300670]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:13 np0005548789.localdomain sudo[300704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:13 np0005548789.localdomain sudo[300704]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:13 np0005548789.localdomain sudo[300704]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:13 np0005548789.localdomain sudo[300722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:13 np0005548789.localdomain sudo[300722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:13 np0005548789.localdomain sudo[300722]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:13 np0005548789.localdomain sudo[300740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:13 np0005548789.localdomain sudo[300740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:13 np0005548789.localdomain sudo[300740]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:13 np0005548789.localdomain sudo[300758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:08:13 np0005548789.localdomain sudo[300758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:13 np0005548789.localdomain sudo[300758]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/3734042444' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:08:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:15 np0005548789.localdomain ceph-mon[298582]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 0 B/s wr, 19 op/s
Dec 06 10:08:15 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548787 (monmap changed)...
Dec 06 10:08:15 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548787 on np0005548787.localdomain
Dec 06 10:08:15 np0005548789.localdomain ceph-mon[298582]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:08:15 np0005548789.localdomain ceph-mon[298582]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:08:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548787.umwsra", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:08:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:15.480 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:15.482 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:15.482 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5039 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:08:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:15.482 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:15.487 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:08:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:15.487 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
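[Editor's note] The `ovsdbapp` lines above show the OVSDB client's keepalive cycle: after roughly 5000 ms with no traffic on `tcp:127.0.0.1:6640` it logs "sending inactivity probe" and "entering IDLE", then returns to ACTIVE as soon as the probe reply arrives (`[POLLIN] on fd 20`). The toy state machine below models that cycle; it is an illustration of the behavior visible in the log, not the `ovs.reconnect` implementation:

```python
class ProbeConn:
    """Minimal model of an inactivity-probe cycle: ACTIVE -> IDLE when the
    link has been quiet for `probe_interval_ms`, back to ACTIVE on any
    inbound data (e.g. the probe reply)."""

    def __init__(self, probe_interval_ms: int = 5000):
        self.probe_interval = probe_interval_ms
        self.state = "ACTIVE"
        self.last_activity = 0  # monotonic time in ms

    def run(self, now_ms: int):
        """Periodic tick; returns 'probe' when one should be sent."""
        if self.state == "ACTIVE" and now_ms - self.last_activity >= self.probe_interval:
            self.state = "IDLE"          # log: "entering IDLE"
            return "probe"               # log: "sending inactivity probe"
        return None

    def received(self, now_ms: int) -> None:
        """Any inbound data reactivates the link."""
        self.last_activity = now_ms
        self.state = "ACTIVE"            # log: "entering ACTIVE"
```

If the probe itself goes unanswered, the real FSM eventually drops and reconnects the session; that path is omitted here for brevity.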
Dec 06 10:08:16 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548787.umwsra (monmap changed)...
Dec 06 10:08:16 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548787.umwsra on np0005548787.localdomain
Dec 06 10:08:16 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:16 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:16 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:16 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:16 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:08:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:08:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:08:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:08:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:08:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:08:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:08:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:08:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:08:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:08:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:08:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:08:17 np0005548789.localdomain ceph-mon[298582]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 0 B/s wr, 14 op/s
Dec 06 10:08:17 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:08:17 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:08:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:08:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:18 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:08:18 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:08:18 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:08:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:18 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:08:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mgr stat", "format": "json"} v 0)
Dec 06 10:08:18 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/327302380' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Dec 06 10:08:19 np0005548789.localdomain ceph-mon[298582]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Dec 06 10:08:19 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:19 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:19 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:08:19 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:08:19 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:19 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:08:19 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/327302380' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Dec 06 10:08:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:08:19 np0005548789.localdomain podman[300776]: 2025-12-06 10:08:19.922827012 +0000 UTC m=+0.082687643 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:08:19 np0005548789.localdomain podman[300776]: 2025-12-06 10:08:19.956133686 +0000 UTC m=+0.115994257 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:08:19 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:08:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:20.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:20.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:08:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:20.214 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26470 172.18.0.108:0/129842165' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).osd e92 e92: 6 total, 6 up, 6 in
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr handle_mgr_map Activating!
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr handle_mgr_map I am now activating
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548787"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:20.491 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:20.493 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:20.493 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:08:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:20.493 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548788"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548789"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005548790"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).mds e16 all = 0
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).mds e16 all = 0
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).mds e16 all = 0
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005548787.umwsra", "id": "np0005548787.umwsra"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548787.umwsra", "id": "np0005548787.umwsra"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:08:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:20.526 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:08:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:20.527 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:08:20 np0005548789.localdomain sshd[299722]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mds metadata"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).mds e16 all = 1
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mon metadata"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain systemd[1]: session-69.scope: Deactivated successfully.
Dec 06 10:08:20 np0005548789.localdomain systemd[1]: session-69.scope: Consumed 6.077s CPU time.
Dec 06 10:08:20 np0005548789.localdomain systemd-logind[766]: Session 69 logged out. Waiting for processes to exit.
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr load Constructed class from module: balancer
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [balancer INFO root] Starting
Dec 06 10:08:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:08:20 np0005548789.localdomain systemd-logind[766]: Removed session 69.
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [balancer INFO root] Optimize plan auto_2025-12-06_10:08:20
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr load Constructed class from module: cephadm
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr load Constructed class from module: crash
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr load Constructed class from module: devicehealth
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr load Constructed class from module: iostat
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr load Constructed class from module: nfs
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr load Constructed class from module: orchestrator
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [devicehealth INFO root] Starting
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr load Constructed class from module: pg_autoscaler
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr load Constructed class from module: progress
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Loading...
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Loaded [<progress.module.GhostEvent object at 0x7f0406bf1580>, <progress.module.GhostEvent object at 0x7f0406bf15b0>, <progress.module.GhostEvent object at 0x7f0406bf1610>, <progress.module.GhostEvent object at 0x7f0406bf1640>, <progress.module.GhostEvent object at 0x7f0406bf1670>, <progress.module.GhostEvent object at 0x7f0406bf16a0>, <progress.module.GhostEvent object at 0x7f0406bf16d0>, <progress.module.GhostEvent object at 0x7f0406bf1700>, <progress.module.GhostEvent object at 0x7f0406bf1730>, <progress.module.GhostEvent object at 0x7f0406bf1760>, <progress.module.GhostEvent object at 0x7f0406bf1790>, <progress.module.GhostEvent object at 0x7f0406bf17c0>, <progress.module.GhostEvent object at 0x7f0406bf17f0>, <progress.module.GhostEvent object at 0x7f0406bf1820>, <progress.module.GhostEvent object at 0x7f0406bf1850>, <progress.module.GhostEvent object at 0x7f0406bf1880>, <progress.module.GhostEvent object at 0x7f0406bf18b0>, <progress.module.GhostEvent object at 0x7f0406bf18e0>, <progress.module.GhostEvent object at 0x7f0406bf1910>, <progress.module.GhostEvent object at 0x7f0406bf1940>, <progress.module.GhostEvent object at 0x7f0406bf1970>, <progress.module.GhostEvent object at 0x7f0406bf19a0>, <progress.module.GhostEvent object at 0x7f0406bf19d0>, <progress.module.GhostEvent object at 0x7f0406bf1a00>, <progress.module.GhostEvent object at 0x7f0406bf1a30>, <progress.module.GhostEvent object at 0x7f0406bf1a60>, <progress.module.GhostEvent object at 0x7f0406bf1a90>, <progress.module.GhostEvent object at 0x7f0406bf1ac0>, <progress.module.GhostEvent object at 0x7f0406bf1af0>, <progress.module.GhostEvent object at 0x7f0406bf1b20>, <progress.module.GhostEvent object at 0x7f0406bf1b50>, <progress.module.GhostEvent object at 0x7f0406bf1b80>, <progress.module.GhostEvent object at 0x7f0406bf1bb0>, <progress.module.GhostEvent object at 0x7f0406bf1be0>, <progress.module.GhostEvent object at 0x7f0406bf1c10>, <progress.module.GhostEvent object at 0x7f0406bf1c40>, <progress.module.GhostEvent object at 0x7f0406bf1c70>, <progress.module.GhostEvent object at 0x7f0406bf1ca0>, <progress.module.GhostEvent object at 0x7f0406bf1cd0>, <progress.module.GhostEvent object at 0x7f0406bf1d00>, <progress.module.GhostEvent object at 0x7f0406bf1d30>, <progress.module.GhostEvent object at 0x7f0406bf1d60>, <progress.module.GhostEvent object at 0x7f0406bf1d90>, <progress.module.GhostEvent object at 0x7f0406bf1dc0>, <progress.module.GhostEvent object at 0x7f0406bf1df0>, <progress.module.GhostEvent object at 0x7f0406bf1e20>, <progress.module.GhostEvent object at 0x7f0406bf1e50>, <progress.module.GhostEvent object at 0x7f0406bf1e80>, <progress.module.GhostEvent object at 0x7f0406bf1eb0>, <progress.module.GhostEvent object at 0x7f0406bf1ee0>] historic events
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Loaded OSDMap, ready.
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] _maybe_adjust
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] recovery thread starting
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] starting setup
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr load Constructed class from module: rbd_support
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr load Constructed class from module: restful
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [restful INFO root] server_addr: :: server_port: 8003
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [restful WARNING root] server not running: no certificate configured
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr load Constructed class from module: status
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr load Constructed class from module: telemetry
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548789.mzhmje/mirror_snapshot_schedule"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548789.mzhmje/mirror_snapshot_schedule"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 06 10:08:20 np0005548789.localdomain podman[300811]: 2025-12-06 10:08:20.660297753 +0000 UTC m=+0.089699475 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 06 10:08:20 np0005548789.localdomain podman[300811]: 2025-12-06 10:08:20.672173911 +0000 UTC m=+0.101575633 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: mgr load Constructed class from module: volumes
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:08:20.682+0000 7f03f12ef640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:08:20.682+0000 7f03f12ef640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:08:20.682+0000 7f03f12ef640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:08:20.682+0000 7f03f12ef640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:08:20.682+0000 7f03f12ef640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] PerfHandler: starting
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:08:20.692+0000 7f03eeaea640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:08:20.692+0000 7f03eeaea640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:08:20.692+0000 7f03eeaea640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:08:20.692+0000 7f03eeaea640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:08:20.692+0000 7f03eeaea640 -1 client.0 error registering admin socket command: (17) File exists
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_task_task: vms, start_after=
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_task_task: volumes, start_after=
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_task_task: images, start_after=
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_task_task: backups, start_after=
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] TaskHandler: starting
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548789.mzhmje/trash_purge_schedule"} v 0)
Dec 06 10:08:20 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548789.mzhmje/trash_purge_schedule"} : dispatch
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Dec 06 10:08:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] setup complete
Dec 06 10:08:20 np0005548789.localdomain sshd[300957]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:08:20 np0005548789.localdomain sshd[300957]: Accepted publickey for ceph-admin from 192.168.122.107 port 33678 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 10:08:20 np0005548789.localdomain systemd-logind[766]: New session 71 of user ceph-admin.
Dec 06 10:08:20 np0005548789.localdomain systemd[1]: Started Session 71 of User ceph-admin.
Dec 06 10:08:20 np0005548789.localdomain sshd[300957]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:08:21 np0005548789.localdomain sudo[300961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:21 np0005548789.localdomain sudo[300961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:21 np0005548789.localdomain sudo[300961]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:21 np0005548789.localdomain sudo[300979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:08:21 np0005548789.localdomain sudo[300979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:21.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/2304971504' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: Activating manager daemon np0005548789.mzhmje
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: mgrmap e29: np0005548789.mzhmje(active, starting, since 0.0455985s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548787"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548787.umwsra", "id": "np0005548787.umwsra"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: Manager daemon np0005548789.mzhmje is now available
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548789.mzhmje/mirror_snapshot_schedule"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548789.mzhmje/mirror_snapshot_schedule"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548789.mzhmje/trash_purge_schedule"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548789.mzhmje/trash_purge_schedule"} : dispatch
Dec 06 10:08:21 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:21 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:08:21] ENGINE Bus STARTING
Dec 06 10:08:21 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:08:21] ENGINE Bus STARTING
Dec 06 10:08:21 np0005548789.localdomain podman[301071]: 2025-12-06 10:08:21.795005367 +0000 UTC m=+0.068918429 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, release=1763362218, version=7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z)
Dec 06 10:08:21 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:08:21] ENGINE Serving on https://172.18.0.107:7150
Dec 06 10:08:21 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:08:21] ENGINE Serving on https://172.18.0.107:7150
Dec 06 10:08:21 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:08:21] ENGINE Client ('172.18.0.107', 48298) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:08:21 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:08:21] ENGINE Client ('172.18.0.107', 48298) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:08:21 np0005548789.localdomain podman[301071]: 2025-12-06 10:08:21.86310668 +0000 UTC m=+0.137019782 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, vcs-type=git, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:08:21 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:08:21] ENGINE Serving on http://172.18.0.107:8765
Dec 06 10:08:21 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:08:21] ENGINE Serving on http://172.18.0.107:8765
Dec 06 10:08:21 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cherrypy.error] [06/Dec/2025:10:08:21] ENGINE Bus STARTED
Dec 06 10:08:21 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : [06/Dec/2025:10:08:21] ENGINE Bus STARTED
Dec 06 10:08:22 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548787.localdomain.devices.0}] v 0)
Dec 06 10:08:22 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548787.localdomain}] v 0)
Dec 06 10:08:22 np0005548789.localdomain ceph-mon[298582]: mgrmap e30: np0005548789.mzhmje(active, since 1.06844s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:08:22 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:08:21] ENGINE Bus STARTING
Dec 06 10:08:22 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:08:21] ENGINE Serving on https://172.18.0.107:7150
Dec 06 10:08:22 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:08:21] ENGINE Client ('172.18.0.107', 48298) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:08:22 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:08:21] ENGINE Serving on http://172.18.0.107:8765
Dec 06 10:08:22 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:08:21] ENGINE Bus STARTED
Dec 06 10:08:22 np0005548789.localdomain sudo[300979]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:22 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:08:22 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:08:22 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:22 np0005548789.localdomain sudo[301209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:22 np0005548789.localdomain sudo[301209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:22 np0005548789.localdomain sudo[301209]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:22 np0005548789.localdomain sudo[301228]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:08:22 np0005548789.localdomain sudo[301228]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:22 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:08:22 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:08:22 np0005548789.localdomain ceph-mgr[288591]: [devicehealth INFO root] Check health
Dec 06 10:08:22 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:08:22 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:08:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:23 np0005548789.localdomain sudo[301228]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:23 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548789.localdomain ceph-mon[298582]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Dec 06 10:08:23 np0005548789.localdomain ceph-mon[298582]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Dec 06 10:08:23 np0005548789.localdomain ceph-mon[298582]: Cluster is now healthy
Dec 06 10:08:23 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548789.localdomain ceph-mon[298582]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:23 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:23 np0005548789.localdomain sudo[301287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:23 np0005548789.localdomain sudo[301287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:23 np0005548789.localdomain sudo[301287]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:23 np0005548789.localdomain sudo[301305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 10:08:23 np0005548789.localdomain sudo[301305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:08:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:08:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:08:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:08:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:08:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19220 "" "Go-http-client/1.1"
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:08:24 np0005548789.localdomain sudo[301305]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548787.localdomain.devices.0}] v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548787.localdomain}] v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO root] Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO root] Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO root] Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:24 np0005548789.localdomain sudo[301342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:08:24 np0005548789.localdomain sudo[301342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548789.localdomain sudo[301342]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548789.localdomain sudo[301360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:08:24 np0005548789.localdomain sudo[301360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548789.localdomain sudo[301360]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:24 np0005548789.localdomain sudo[301378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:24 np0005548789.localdomain sudo[301378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548789.localdomain sudo[301378]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548789.localdomain sudo[301396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:24 np0005548789.localdomain sudo[301396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548789.localdomain sudo[301396]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548789.localdomain sudo[301414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:24 np0005548789.localdomain sudo[301414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548789.localdomain sudo[301414]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548789.localdomain sudo[301448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:24 np0005548789.localdomain sudo[301448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548789.localdomain sudo[301448]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: mgrmap e31: np0005548789.mzhmje(active, since 3s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd/host:np0005548787", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:08:24 np0005548789.localdomain sudo[301466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:24 np0005548789.localdomain sudo[301466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:24 np0005548789.localdomain sudo[301466]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain sudo[301484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain sudo[301484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:08:25 np0005548789.localdomain sudo[301484]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain sudo[301504]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:25 np0005548789.localdomain sudo[301504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548789.localdomain sudo[301504]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain podman[301502]: 2025-12-06 10:08:25.171037723 +0000 UTC m=+0.100588293 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, name=ubi9-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, release=1755695350, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_id=edpm, maintainer=Red Hat, Inc.)
Dec 06 10:08:25 np0005548789.localdomain podman[301502]: 2025-12-06 10:08:25.183598681 +0000 UTC m=+0.113149161 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 06 10:08:25 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:08:25 np0005548789.localdomain sudo[301544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:25 np0005548789.localdomain sudo[301544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548789.localdomain sudo[301544]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548789.localdomain podman[301503]: 2025-12-06 10:08:25.27410705 +0000 UTC m=+0.201860456 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 10:08:25 np0005548789.localdomain podman[301503]: 2025-12-06 10:08:25.282419941 +0000 UTC m=+0.210173377 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:08:25 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:08:25 np0005548789.localdomain sudo[301569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:25 np0005548789.localdomain sudo[301569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548789.localdomain sudo[301569]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548789.localdomain sudo[301595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:25 np0005548789.localdomain sudo[301595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548789.localdomain sudo[301595]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548789.localdomain sudo[301613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:25 np0005548789.localdomain sudo[301613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548789.localdomain sudo[301613]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548789.localdomain sshd[301630]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:08:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:25.528 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:25.530 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:25.530 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:08:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:25.531 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:25.559 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:08:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:25.559 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:25 np0005548789.localdomain sudo[301649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:25 np0005548789.localdomain sudo[301649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548789.localdomain sudo[301649]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: mgr.server handle_open ignoring open from mgr.np0005548790.kvkfyr 172.18.0.108:0/2122066654; not ready for session (expect reconnect)
Dec 06 10:08:25 np0005548789.localdomain sudo[301667]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548789.localdomain sudo[301667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548789.localdomain sudo[301667]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548789.localdomain sudo[301685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain sudo[301685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548789.localdomain sudo[301685]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548789.localdomain sudo[301703]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:08:25 np0005548789.localdomain sudo[301703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548789.localdomain sudo[301703]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548789.localdomain sudo[301721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:08:25 np0005548789.localdomain sudo[301721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:25 np0005548789.localdomain sudo[301721]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Updating np0005548787.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Standby manager daemon np0005548790.kvkfyr started
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Updating np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:25 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:26 np0005548789.localdomain sudo[301739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:26 np0005548789.localdomain sudo[301739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548789.localdomain sudo[301739]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548789.localdomain sudo[301757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:26 np0005548789.localdomain sudo[301757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548789.localdomain sudo[301757]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548789.localdomain sudo[301775]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:26 np0005548789.localdomain sudo[301775]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548789.localdomain sudo[301775]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:26.204 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:26.226 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:08:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:26.226 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:08:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:26.226 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:08:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:26.227 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:08:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:26.227 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:08:26 np0005548789.localdomain sudo[301810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:26 np0005548789.localdomain sudo[301810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548789.localdomain sudo[301810]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:26 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:26 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} v 0)
Dec 06 10:08:26 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:08:26 np0005548789.localdomain sudo[301828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:08:26 np0005548789.localdomain sudo[301828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548789.localdomain sudo[301828]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:26 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:26 np0005548789.localdomain sudo[301865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:26 np0005548789.localdomain sudo[301865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548789.localdomain sudo[301865]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:26 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:26 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:26 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:26 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 0 B/s wr, 23 op/s
Dec 06 10:08:26 np0005548789.localdomain sudo[301883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:26 np0005548789.localdomain sudo[301883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548789.localdomain sudo[301883]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548789.localdomain sudo[301901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:26 np0005548789.localdomain sudo[301901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548789.localdomain sudo[301901]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:08:26 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1224196971' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:26 np0005548789.localdomain sudo[301919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:26 np0005548789.localdomain sudo[301919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:26.717 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:08:26 np0005548789.localdomain sudo[301919]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548789.localdomain sudo[301939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:26 np0005548789.localdomain sudo[301939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548789.localdomain sudo[301939]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:26.786 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:08:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:26.787 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:08:26 np0005548789.localdomain sshd[301630]: Received disconnect from 118.219.234.233 port 50132:11: Bye Bye [preauth]
Dec 06 10:08:26 np0005548789.localdomain sshd[301630]: Disconnected from authenticating user root 118.219.234.233 port 50132 [preauth]
Dec 06 10:08:26 np0005548789.localdomain sudo[301957]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:26 np0005548789.localdomain sudo[301957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:26 np0005548789.localdomain sudo[301957]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:26.982 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:08:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:26.983 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11480MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:08:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:26.983 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:08:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:26.983 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:08:27 np0005548789.localdomain sudo[301991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:27 np0005548789.localdomain sudo[301991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:27 np0005548789.localdomain sudo[301991]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:08:27 np0005548789.localdomain sudo[302009]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:08:27 np0005548789.localdomain sudo[302009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:27 np0005548789.localdomain sudo[302009]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:08:27 np0005548789.localdomain sudo[302027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:27 np0005548789.localdomain sudo[302027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:27 np0005548789.localdomain sudo[302027]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548787.localdomain.devices.0}] v 0)
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548787.localdomain}] v 0)
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:08:27 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 0 B/s wr, 20 op/s
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 06 10:08:27 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] update: starting ev e5b18223-81f9-4f8a-872e-b03e40a6e131 (Updating node-proxy deployment (+4 -> 4))
Dec 06 10:08:27 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] complete: finished ev e5b18223-81f9-4f8a-872e-b03e40a6e131 (Updating node-proxy deployment (+4 -> 4))
Dec 06 10:08:27 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Completed event e5b18223-81f9-4f8a-872e-b03e40a6e131 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Dec 06 10:08:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:27.292 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:08:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:27.293 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:08:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:27.293 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:08:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:27.362 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing inventories for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mgrmap e32: np0005548789.mzhmje(active, since 5s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: Updating np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 0 B/s wr, 23 op/s
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1224196971' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:08:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:27.445 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Updating ProviderTree inventory for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:08:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:27.445 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Updating inventory in ProviderTree for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:08:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:27.459 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing aggregate associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:08:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:27.483 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing trait associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, traits: HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AMD_SVM,HW_CPU_X86_FMA3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:08:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:27.523 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:08:27 np0005548789.localdomain sudo[302045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:08:27 np0005548789.localdomain sudo[302045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:27 np0005548789.localdomain sudo[302045]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:27 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:08:27 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:27 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:08:27 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:08:27 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3249850813' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:27.974 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:08:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:27.981 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:08:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:28.005 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:08:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:28.008 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:08:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:28.008 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.025s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 0 B/s wr, 20 op/s
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3249850813' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:08:28 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:08:28 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:08:28 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:28 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:08:28 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:08:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:08:28 np0005548789.localdomain podman[302085]: 2025-12-06 10:08:28.918078552 +0000 UTC m=+0.079182798 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd)
Dec 06 10:08:28 np0005548789.localdomain podman[302085]: 2025-12-06 10:08:28.929703003 +0000 UTC m=+0.090807259 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:08:28 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:08:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:28.986 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:28.987 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:08:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:28.987 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:08:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:29.054 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:08:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:29.054 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:08:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:29.054 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:08:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:29.055 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:08:29 np0005548789.localdomain sshd[302105]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:08:29 np0005548789.localdomain sshd[302106]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:08:29 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 14 op/s
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain.devices.0}] v 0)
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548788.localdomain}] v 0)
Dec 06 10:08:29 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:08:29 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:29 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:08:29 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:29 np0005548789.localdomain sudo[302108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:29 np0005548789.localdomain sudo[302108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:29 np0005548789.localdomain sudo[302108]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:29 np0005548789.localdomain sudo[302126]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:29 np0005548789.localdomain sudo[302126]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:29.733 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:08:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:29.763 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:08:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:29.763 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:08:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:29.764 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:29.764 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:29.764 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:29.765 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:29.765 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:08:29 np0005548789.localdomain sshd[302105]: Received disconnect from 64.227.102.57 port 33218:11: Bye Bye [preauth]
Dec 06 10:08:29 np0005548789.localdomain sshd[302105]: Disconnected from authenticating user root 64.227.102.57 port 33218 [preauth]
Dec 06 10:08:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:29.954 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:30 np0005548789.localdomain podman[302161]: 
Dec 06 10:08:30 np0005548789.localdomain podman[302161]: 2025-12-06 10:08:30.045334052 +0000 UTC m=+0.075230809 container create 5dc8672dcc55de10c05d73a12defb379021b8c2ffc14b1de89fbefa3cd7e65e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_beaver, CEPH_POINT_RELEASE=, vcs-type=git, release=1763362218, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True)
Dec 06 10:08:30 np0005548789.localdomain systemd[1]: Started libpod-conmon-5dc8672dcc55de10c05d73a12defb379021b8c2ffc14b1de89fbefa3cd7e65e5.scope.
Dec 06 10:08:30 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:30 np0005548789.localdomain podman[302161]: 2025-12-06 10:08:30.017578875 +0000 UTC m=+0.047475702 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:30 np0005548789.localdomain podman[302161]: 2025-12-06 10:08:30.120937311 +0000 UTC m=+0.150834088 container init 5dc8672dcc55de10c05d73a12defb379021b8c2ffc14b1de89fbefa3cd7e65e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_beaver, architecture=x86_64, distribution-scope=public, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.41.4, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=1763362218, vcs-type=git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:08:30 np0005548789.localdomain podman[302161]: 2025-12-06 10:08:30.134237392 +0000 UTC m=+0.164134169 container start 5dc8672dcc55de10c05d73a12defb379021b8c2ffc14b1de89fbefa3cd7e65e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_beaver, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_CLEAN=True, release=1763362218, RELEASE=main, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:08:30 np0005548789.localdomain podman[302161]: 2025-12-06 10:08:30.13450555 +0000 UTC m=+0.164402337 container attach 5dc8672dcc55de10c05d73a12defb379021b8c2ffc14b1de89fbefa3cd7e65e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_beaver, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:08:30 np0005548789.localdomain systemd[1]: tmp-crun.2Z1ppV.mount: Deactivated successfully.
Dec 06 10:08:30 np0005548789.localdomain systemd[1]: libpod-5dc8672dcc55de10c05d73a12defb379021b8c2ffc14b1de89fbefa3cd7e65e5.scope: Deactivated successfully.
Dec 06 10:08:30 np0005548789.localdomain sweet_beaver[302176]: 167 167
Dec 06 10:08:30 np0005548789.localdomain podman[302161]: 2025-12-06 10:08:30.139354125 +0000 UTC m=+0.169250932 container died 5dc8672dcc55de10c05d73a12defb379021b8c2ffc14b1de89fbefa3cd7e65e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_beaver, build-date=2025-11-26T19:44:28Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, release=1763362218, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, version=7)
Dec 06 10:08:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:30.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:30.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:30 np0005548789.localdomain podman[302181]: 2025-12-06 10:08:30.242218056 +0000 UTC m=+0.092216211 container remove 5dc8672dcc55de10c05d73a12defb379021b8c2ffc14b1de89fbefa3cd7e65e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_beaver, architecture=x86_64, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, name=rhceph, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, release=1763362218, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:08:30 np0005548789.localdomain systemd[1]: libpod-conmon-5dc8672dcc55de10c05d73a12defb379021b8c2ffc14b1de89fbefa3cd7e65e5.scope: Deactivated successfully.
Dec 06 10:08:30 np0005548789.localdomain sudo[302126]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:30 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:08:30 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:08:30 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Dec 06 10:08:30 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Dec 06 10:08:30 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Dec 06 10:08:30 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:08:30 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:08:30 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:30 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:08:30 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:08:30 np0005548789.localdomain sudo[302197]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:30 np0005548789.localdomain sudo[302197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:30 np0005548789.localdomain sudo[302197]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:30 np0005548789.localdomain sudo[302215]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:30 np0005548789.localdomain sudo[302215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:30 np0005548789.localdomain ceph-mon[298582]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 14 op/s
Dec 06 10:08:30 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:08:30 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:08:30 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:30 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:30 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:08:30 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:30.558 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:08:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:30.559 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:08:30 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Writing back 50 completed events
Dec 06 10:08:30 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 06 10:08:31 np0005548789.localdomain podman[302250]: 
Dec 06 10:08:31 np0005548789.localdomain podman[302250]: 2025-12-06 10:08:31.027411004 +0000 UTC m=+0.076423664 container create c57769c602aa8fc07e7d2b1598e3b0868c82b8e00754fb14f03609c43673f914 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_shaw, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, name=rhceph, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=1763362218, version=7, RELEASE=main, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:08:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a1faae6de27bb8ff17635f97614e04310a7989935942b68c5b17c874b9a4b85a-merged.mount: Deactivated successfully.
Dec 06 10:08:31 np0005548789.localdomain systemd[1]: Started libpod-conmon-c57769c602aa8fc07e7d2b1598e3b0868c82b8e00754fb14f03609c43673f914.scope.
Dec 06 10:08:31 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:31 np0005548789.localdomain podman[302250]: 2025-12-06 10:08:30.993863863 +0000 UTC m=+0.042876533 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:31 np0005548789.localdomain podman[302250]: 2025-12-06 10:08:31.099072005 +0000 UTC m=+0.148084665 container init c57769c602aa8fc07e7d2b1598e3b0868c82b8e00754fb14f03609c43673f914 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_shaw, GIT_CLEAN=True, release=1763362218, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:08:31 np0005548789.localdomain systemd[1]: tmp-crun.qHMFqX.mount: Deactivated successfully.
Dec 06 10:08:31 np0005548789.localdomain podman[302250]: 2025-12-06 10:08:31.111593602 +0000 UTC m=+0.160606272 container start c57769c602aa8fc07e7d2b1598e3b0868c82b8e00754fb14f03609c43673f914 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_shaw, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, release=1763362218, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:08:31 np0005548789.localdomain vibrant_shaw[302267]: 167 167
Dec 06 10:08:31 np0005548789.localdomain podman[302250]: 2025-12-06 10:08:31.112362266 +0000 UTC m=+0.161374926 container attach c57769c602aa8fc07e7d2b1598e3b0868c82b8e00754fb14f03609c43673f914 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_shaw, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, name=rhceph, io.buildah.version=1.41.4, GIT_BRANCH=main, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:08:31 np0005548789.localdomain systemd[1]: libpod-c57769c602aa8fc07e7d2b1598e3b0868c82b8e00754fb14f03609c43673f914.scope: Deactivated successfully.
Dec 06 10:08:31 np0005548789.localdomain podman[302250]: 2025-12-06 10:08:31.11448107 +0000 UTC m=+0.163493740 container died c57769c602aa8fc07e7d2b1598e3b0868c82b8e00754fb14f03609c43673f914 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_shaw, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.component=rhceph-container, release=1763362218, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main)
Dec 06 10:08:31 np0005548789.localdomain podman[302272]: 2025-12-06 10:08:31.25748279 +0000 UTC m=+0.133203276 container remove c57769c602aa8fc07e7d2b1598e3b0868c82b8e00754fb14f03609c43673f914 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_shaw, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, distribution-scope=public, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.expose-services=, io.buildah.version=1.41.4)
Dec 06 10:08:31 np0005548789.localdomain systemd[1]: libpod-conmon-c57769c602aa8fc07e7d2b1598e3b0868c82b8e00754fb14f03609c43673f914.scope: Deactivated successfully.
Dec 06 10:08:31 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Dec 06 10:08:31 np0005548789.localdomain sudo[302215]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:08:31 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Dec 06 10:08:31 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:31 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:08:31 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:08:31 np0005548789.localdomain sudo[302296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:31 np0005548789.localdomain sudo[302296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:31 np0005548789.localdomain sudo[302296]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:08:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:31 np0005548789.localdomain sudo[302314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:31 np0005548789.localdomain sudo[302314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:32 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2511db499222bfceb614c3294a4a3aac44d207f9f92cae0ea8d7358c72b4c212-merged.mount: Deactivated successfully.
Dec 06 10:08:32 np0005548789.localdomain ceph-mgr[288591]: log_channel(audit) log [DBG] : from='client.44410 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:32 np0005548789.localdomain podman[302350]: 
Dec 06 10:08:32 np0005548789.localdomain podman[302350]: 2025-12-06 10:08:32.1578425 +0000 UTC m=+0.079646131 container create 70160df232d73ab8d4735fbd215f2d538fca8d687fdbb9baecff2d9d848a0738 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_bouman, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, RELEASE=main, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.tags=rhceph ceph, version=7, vendor=Red Hat, Inc., release=1763362218, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:08:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:32.178 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:32.196 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:32.197 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:08:32 np0005548789.localdomain systemd[1]: Started libpod-conmon-70160df232d73ab8d4735fbd215f2d538fca8d687fdbb9baecff2d9d848a0738.scope.
Dec 06 10:08:32 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:32 np0005548789.localdomain podman[302350]: 2025-12-06 10:08:32.127245178 +0000 UTC m=+0.049048859 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:32 np0005548789.localdomain podman[302350]: 2025-12-06 10:08:32.237522461 +0000 UTC m=+0.159326092 container init 70160df232d73ab8d4735fbd215f2d538fca8d687fdbb9baecff2d9d848a0738 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_bouman, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, ceph=True, GIT_BRANCH=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, distribution-scope=public, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, name=rhceph)
Dec 06 10:08:32 np0005548789.localdomain podman[302350]: 2025-12-06 10:08:32.246628337 +0000 UTC m=+0.168431968 container start 70160df232d73ab8d4735fbd215f2d538fca8d687fdbb9baecff2d9d848a0738 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_bouman, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, version=7, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:08:32 np0005548789.localdomain podman[302350]: 2025-12-06 10:08:32.247145582 +0000 UTC m=+0.168949243 container attach 70160df232d73ab8d4735fbd215f2d538fca8d687fdbb9baecff2d9d848a0738 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_bouman, version=7, RELEASE=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph)
Dec 06 10:08:32 np0005548789.localdomain lucid_bouman[302365]: 167 167
Dec 06 10:08:32 np0005548789.localdomain systemd[1]: libpod-70160df232d73ab8d4735fbd215f2d538fca8d687fdbb9baecff2d9d848a0738.scope: Deactivated successfully.
Dec 06 10:08:32 np0005548789.localdomain podman[302350]: 2025-12-06 10:08:32.250709039 +0000 UTC m=+0.172512710 container died 70160df232d73ab8d4735fbd215f2d538fca8d687fdbb9baecff2d9d848a0738 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_bouman, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., ceph=True, io.openshift.expose-services=, GIT_BRANCH=main, architecture=x86_64, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, RELEASE=main, vcs-type=git, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7)
Dec 06 10:08:32 np0005548789.localdomain podman[302370]: 2025-12-06 10:08:32.34228331 +0000 UTC m=+0.082324443 container remove 70160df232d73ab8d4735fbd215f2d538fca8d687fdbb9baecff2d9d848a0738 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_bouman, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, com.redhat.component=rhceph-container, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, ceph=True)
Dec 06 10:08:32 np0005548789.localdomain systemd[1]: libpod-conmon-70160df232d73ab8d4735fbd215f2d538fca8d687fdbb9baecff2d9d848a0738.scope: Deactivated successfully.
Dec 06 10:08:32 np0005548789.localdomain sshd[302106]: Received disconnect from 123.160.164.187 port 54828:11: Bye Bye [preauth]
Dec 06 10:08:32 np0005548789.localdomain sshd[302106]: Disconnected from authenticating user root 123.160.164.187 port 54828 [preauth]
Dec 06 10:08:32 np0005548789.localdomain sudo[302314]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:08:32 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:08:32 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:32 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:08:32 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:08:32 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:32 np0005548789.localdomain sudo[302394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:32 np0005548789.localdomain sudo[302394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:32 np0005548789.localdomain sudo[302394]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:32 np0005548789.localdomain sudo[302412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:32 np0005548789.localdomain sudo[302412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:33 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ecacb212a165db74fc22ba98139a6a8f12b12d9e169791da2c0eab0a950b21fb-merged.mount: Deactivated successfully.
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:33 np0005548789.localdomain podman[302448]: 
Dec 06 10:08:33 np0005548789.localdomain podman[302448]: 2025-12-06 10:08:33.211545262 +0000 UTC m=+0.070206627 container create bb23283739c220df69ac7cb4b33992937af847fccc471d58831b3f293be512f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_wright, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, RELEASE=main)
Dec 06 10:08:33 np0005548789.localdomain systemd[1]: Started libpod-conmon-bb23283739c220df69ac7cb4b33992937af847fccc471d58831b3f293be512f1.scope.
Dec 06 10:08:33 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:33 np0005548789.localdomain podman[302448]: 2025-12-06 10:08:33.276226422 +0000 UTC m=+0.134887797 container init bb23283739c220df69ac7cb4b33992937af847fccc471d58831b3f293be512f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_wright, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, release=1763362218, io.openshift.expose-services=, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:08:33 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:08:33 np0005548789.localdomain podman[302448]: 2025-12-06 10:08:33.18360686 +0000 UTC m=+0.042268245 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:33 np0005548789.localdomain goofy_wright[302464]: 167 167
Dec 06 10:08:33 np0005548789.localdomain systemd[1]: libpod-bb23283739c220df69ac7cb4b33992937af847fccc471d58831b3f293be512f1.scope: Deactivated successfully.
Dec 06 10:08:33 np0005548789.localdomain podman[302448]: 2025-12-06 10:08:33.293664357 +0000 UTC m=+0.152325722 container start bb23283739c220df69ac7cb4b33992937af847fccc471d58831b3f293be512f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_wright, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, version=7, ceph=True, architecture=x86_64, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Dec 06 10:08:33 np0005548789.localdomain podman[302448]: 2025-12-06 10:08:33.293855023 +0000 UTC m=+0.152516398 container attach bb23283739c220df69ac7cb4b33992937af847fccc471d58831b3f293be512f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_wright, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, version=7, release=1763362218, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, name=rhceph)
Dec 06 10:08:33 np0005548789.localdomain podman[302448]: 2025-12-06 10:08:33.295468122 +0000 UTC m=+0.154129497 container died bb23283739c220df69ac7cb4b33992937af847fccc471d58831b3f293be512f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_wright, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, release=1763362218, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_CLEAN=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, version=7)
Dec 06 10:08:33 np0005548789.localdomain podman[302469]: 2025-12-06 10:08:33.378890317 +0000 UTC m=+0.084548229 container remove bb23283739c220df69ac7cb4b33992937af847fccc471d58831b3f293be512f1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_wright, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:08:33 np0005548789.localdomain systemd[1]: libpod-conmon-bb23283739c220df69ac7cb4b33992937af847fccc471d58831b3f293be512f1.scope: Deactivated successfully.
Dec 06 10:08:33 np0005548789.localdomain sudo[302412]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:08:33 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:08:33 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:33 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:08:33 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:08:33 np0005548789.localdomain sudo[302485]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:33 np0005548789.localdomain sudo[302485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:33 np0005548789.localdomain sudo[302485]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:33 np0005548789.localdomain sudo[302503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:33 np0005548789.localdomain sudo[302503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: from='client.44410 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:08:33 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ec464f3513e5c3afa95c75e49fa45403e3e530dfd4f37daad73d1739d88190da-merged.mount: Deactivated successfully.
Dec 06 10:08:34 np0005548789.localdomain podman[302537]: 
Dec 06 10:08:34 np0005548789.localdomain podman[302537]: 2025-12-06 10:08:34.07067186 +0000 UTC m=+0.076189328 container create 2f3d8dc3c20a1ca236e1c6eaddf9538f0c868bac6a90acc5b4c7a6488b807a9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_kepler, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., architecture=x86_64, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, version=7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-type=git)
Dec 06 10:08:34 np0005548789.localdomain systemd[1]: Started libpod-conmon-2f3d8dc3c20a1ca236e1c6eaddf9538f0c868bac6a90acc5b4c7a6488b807a9e.scope.
Dec 06 10:08:34 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:34 np0005548789.localdomain podman[302537]: 2025-12-06 10:08:34.039009975 +0000 UTC m=+0.044527483 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:34 np0005548789.localdomain podman[302537]: 2025-12-06 10:08:34.144315119 +0000 UTC m=+0.149832577 container init 2f3d8dc3c20a1ca236e1c6eaddf9538f0c868bac6a90acc5b4c7a6488b807a9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_kepler, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, release=1763362218, name=rhceph)
Dec 06 10:08:34 np0005548789.localdomain podman[302537]: 2025-12-06 10:08:34.15528214 +0000 UTC m=+0.160799598 container start 2f3d8dc3c20a1ca236e1c6eaddf9538f0c868bac6a90acc5b4c7a6488b807a9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_kepler, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_BRANCH=main, version=7, GIT_CLEAN=True, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:08:34 np0005548789.localdomain silly_kepler[302552]: 167 167
Dec 06 10:08:34 np0005548789.localdomain systemd[1]: libpod-2f3d8dc3c20a1ca236e1c6eaddf9538f0c868bac6a90acc5b4c7a6488b807a9e.scope: Deactivated successfully.
Dec 06 10:08:34 np0005548789.localdomain podman[302537]: 2025-12-06 10:08:34.155578199 +0000 UTC m=+0.161095697 container attach 2f3d8dc3c20a1ca236e1c6eaddf9538f0c868bac6a90acc5b4c7a6488b807a9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_kepler, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, vendor=Red Hat, Inc., release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_CLEAN=True)
Dec 06 10:08:34 np0005548789.localdomain podman[302537]: 2025-12-06 10:08:34.158956921 +0000 UTC m=+0.164474419 container died 2f3d8dc3c20a1ca236e1c6eaddf9538f0c868bac6a90acc5b4c7a6488b807a9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_kepler, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-type=git, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True)
Dec 06 10:08:34 np0005548789.localdomain podman[302557]: 2025-12-06 10:08:34.261933085 +0000 UTC m=+0.094550121 container remove 2f3d8dc3c20a1ca236e1c6eaddf9538f0c868bac6a90acc5b4c7a6488b807a9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_kepler, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph)
Dec 06 10:08:34 np0005548789.localdomain systemd[1]: libpod-conmon-2f3d8dc3c20a1ca236e1c6eaddf9538f0c868bac6a90acc5b4c7a6488b807a9e.scope: Deactivated successfully.
Dec 06 10:08:34 np0005548789.localdomain sudo[302503]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:08:34 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:08:34 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:34 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:08:34 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:08:34 np0005548789.localdomain sudo[302574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:34 np0005548789.localdomain sudo[302574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:34 np0005548789.localdomain sudo[302574]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:34 np0005548789.localdomain sudo[302592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:34 np0005548789.localdomain sudo[302592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3333674039' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:08:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:34 np0005548789.localdomain ceph-mgr[288591]: log_channel(audit) log [DBG] : from='client.44426 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548787", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:08:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ce761430d69334dd94a31804d71973070c50376e38dd1b1c3f5c1f6dc6ff4a15-merged.mount: Deactivated successfully.
Dec 06 10:08:35 np0005548789.localdomain podman[302628]: 
Dec 06 10:08:35 np0005548789.localdomain podman[302628]: 2025-12-06 10:08:35.101898465 +0000 UTC m=+0.078148487 container create 210320d4ba75a9e4beba51d8c7e11f3b04123b8d733dc4f0bf68baab67644166 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_elbakyan, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vcs-type=git, build-date=2025-11-26T19:44:28Z, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_CLEAN=True, RELEASE=main)
Dec 06 10:08:35 np0005548789.localdomain systemd[1]: tmp-crun.613DUo.mount: Deactivated successfully.
Dec 06 10:08:35 np0005548789.localdomain podman[302627]: 2025-12-06 10:08:35.130889978 +0000 UTC m=+0.112726069 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:08:35 np0005548789.localdomain systemd[1]: Started libpod-conmon-210320d4ba75a9e4beba51d8c7e11f3b04123b8d733dc4f0bf68baab67644166.scope.
Dec 06 10:08:35 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:08:35 np0005548789.localdomain podman[302627]: 2025-12-06 10:08:35.166561604 +0000 UTC m=+0.148397655 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:08:35 np0005548789.localdomain podman[302628]: 2025-12-06 10:08:35.074400455 +0000 UTC m=+0.050650537 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:08:35 np0005548789.localdomain podman[302628]: 2025-12-06 10:08:35.177603947 +0000 UTC m=+0.153853989 container init 210320d4ba75a9e4beba51d8c7e11f3b04123b8d733dc4f0bf68baab67644166 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_elbakyan, CEPH_POINT_RELEASE=, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, distribution-scope=public, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., ceph=True, release=1763362218, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:08:35 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:08:35 np0005548789.localdomain podman[302628]: 2025-12-06 10:08:35.186128233 +0000 UTC m=+0.162378295 container start 210320d4ba75a9e4beba51d8c7e11f3b04123b8d733dc4f0bf68baab67644166 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_elbakyan, GIT_CLEAN=True, vcs-type=git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, distribution-scope=public, RELEASE=main, version=7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:08:35 np0005548789.localdomain podman[302628]: 2025-12-06 10:08:35.186515385 +0000 UTC m=+0.162765407 container attach 210320d4ba75a9e4beba51d8c7e11f3b04123b8d733dc4f0bf68baab67644166 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_elbakyan, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.buildah.version=1.41.4, version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, release=1763362218, com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, GIT_BRANCH=main)
Dec 06 10:08:35 np0005548789.localdomain sad_elbakyan[302666]: 167 167
Dec 06 10:08:35 np0005548789.localdomain systemd[1]: libpod-210320d4ba75a9e4beba51d8c7e11f3b04123b8d733dc4f0bf68baab67644166.scope: Deactivated successfully.
Dec 06 10:08:35 np0005548789.localdomain podman[302628]: 2025-12-06 10:08:35.190140574 +0000 UTC m=+0.166390656 container died 210320d4ba75a9e4beba51d8c7e11f3b04123b8d733dc4f0bf68baab67644166 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_elbakyan, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, version=7)
Dec 06 10:08:35 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:08:35 np0005548789.localdomain podman[302671]: 2025-12-06 10:08:35.280686934 +0000 UTC m=+0.080176308 container remove 210320d4ba75a9e4beba51d8c7e11f3b04123b8d733dc4f0bf68baab67644166 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_elbakyan, version=7, io.openshift.expose-services=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, release=1763362218, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7)
Dec 06 10:08:35 np0005548789.localdomain systemd[1]: libpod-conmon-210320d4ba75a9e4beba51d8c7e11f3b04123b8d733dc4f0bf68baab67644166.scope: Deactivated successfully.
Dec 06 10:08:35 np0005548789.localdomain sudo[302592]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain.devices.0}] v 0)
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548789.localdomain}] v 0)
Dec 06 10:08:35 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:08:35 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:35 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:08:35 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:08:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:35.560 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:35.563 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:35.563 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:08:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:35.563 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:35.592 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:08:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:35.593 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: from='client.44426 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548787", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/109628701' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2265627899' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 ' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:35 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-cfb8622338409a0aa7cb66aee8e9c1c408ae7d98bc5f4ace1d1c936a1a8635fe-merged.mount: Deactivated successfully.
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain.devices.0}] v 0)
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005548790.localdomain}] v 0)
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: log_channel(audit) log [DBG] : from='client.34469 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548787"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO root] Remove daemons mon.np0005548787
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005548787
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "quorum_status"} : dispatch
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005548787: new quorum should be ['np0005548790', 'np0005548788', 'np0005548789'] (from ['np0005548790', 'np0005548788', 'np0005548789'])
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005548787: new quorum should be ['np0005548790', 'np0005548788', 'np0005548789'] (from ['np0005548790', 'np0005548788', 'np0005548789'])
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005548787 from monmap...
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Removing monitor np0005548787 from monmap...
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "mon rm", "name": "np0005548787"} v 0)
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [INF] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon rm", "name": "np0005548787"} : dispatch
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005548787 from np0005548787.localdomain -- ports []
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005548787 from np0005548787.localdomain -- ports []
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: client.44398 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@3(peon) e13  my rank is now 2 (was 3)
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: client.44398 ms_handle_reset on v2:172.18.0.104:3300/0
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: client.54179 ms_handle_reset on v2:172.18.0.104:3300/0
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Dec 06 10:08:36 np0005548789.localdomain ceph-mgr[288591]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: log_channel(cluster) log [INF] : mon.np0005548789 calling monitor election
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: paxos.2).electionLogic(56) init, last seen epoch 56
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:08:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:08:37 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(electing) e13 handle_auth_request failed to assign global_id
Dec 06 10:08:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(electing) e13 handle_auth_request failed to assign global_id
Dec 06 10:08:39 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(electing) e13 handle_auth_request failed to assign global_id
Dec 06 10:08:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:08:39 np0005548789.localdomain podman[302687]: 2025-12-06 10:08:39.928394061 +0000 UTC m=+0.081974731 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125)
Dec 06 10:08:39 np0005548789.localdomain podman[302687]: 2025-12-06 10:08:39.971381267 +0000 UTC m=+0.124961917 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:08:39 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:08:40 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(electing) e13 handle_auth_request failed to assign global_id
Dec 06 10:08:40 np0005548789.localdomain ceph-mds[287313]: mds.beacon.mds.np0005548789.vxwwsq missed beacon ack from the monitors
Dec 06 10:08:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:40.593 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:41 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(peon) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: from='client.34469 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548787"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: Remove daemons mon.np0005548787
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "quorum_status"} : dispatch
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: Safe to remove mon.np0005548787: new quorum should be ['np0005548790', 'np0005548788', 'np0005548789'] (from ['np0005548790', 'np0005548788', 'np0005548789'])
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: Removing monitor np0005548787 from monmap...
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon rm", "name": "np0005548787"} : dispatch
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: Removing daemon mon.np0005548787 from np0005548787.localdomain -- ports []
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: mon.np0005548790 calling monitor election
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789 calling monitor election
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: mon.np0005548790 is new leader, mons np0005548790,np0005548789 in quorum (ranks 0,2)
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: monmap epoch 13
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: last_changed 2025-12-06T10:08:36.308855+0000
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: min_mon_release 18 (reef)
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: election_strategy: 1
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: mgrmap e32: np0005548789.mzhmje(active, since 20s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: Health check failed: 1/3 mons down, quorum np0005548790,np0005548789 (MON_DOWN)
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005548790,np0005548789
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005548790,np0005548789
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]:     mon.np0005548788 (rank 1) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum)
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:41 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Dec 06 10:08:41 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Dec 06 10:08:41 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:08:41 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(peon) e13 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2722608319' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(peon) e13 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:08:41 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2722608319' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2722608319' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2722608319' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.478207) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015722478260, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 2567, "num_deletes": 272, "total_data_size": 10789467, "memory_usage": 11593992, "flush_reason": "Manual Compaction"}
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015722520175, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 7196772, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10571, "largest_seqno": 13137, "table_properties": {"data_size": 7184980, "index_size": 7273, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3461, "raw_key_size": 31353, "raw_average_key_size": 22, "raw_value_size": 7158759, "raw_average_value_size": 5221, "num_data_blocks": 313, "num_entries": 1371, "num_filter_entries": 1371, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015627, "oldest_key_time": 1765015627, "file_creation_time": 1765015722, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 42018 microseconds, and 9532 cpu microseconds.
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.520230) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 7196772 bytes OK
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.520253) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.521657) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.521675) EVENT_LOG_v1 {"time_micros": 1765015722521670, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.521693) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 10775666, prev total WAL file size 10781099, number of live WAL files 2.
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.523596) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. '7061786F73003131303435' seq:0, type:0; will stop at (end)
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(7028KB)], [15(12MB)]
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015722523681, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 20664240, "oldest_snapshot_seqno": -1}
Dec 06 10:08:42 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:08:42 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:08:42 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:08:42 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 11123 keys, 17441653 bytes, temperature: kUnknown
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015722623010, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 17441653, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17376655, "index_size": 36097, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27845, "raw_key_size": 297757, "raw_average_key_size": 26, "raw_value_size": 17185459, "raw_average_value_size": 1545, "num_data_blocks": 1384, "num_entries": 11123, "num_filter_entries": 11123, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765015722, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.623245) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 17441653 bytes
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.624804) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 208.5 rd, 176.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(6.9, 12.8 +0.0 blob) out(16.6 +0.0 blob), read-write-amplify(5.3) write-amplify(2.4) OK, records in: 11676, records dropped: 553 output_compression: NoCompression
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.624825) EVENT_LOG_v1 {"time_micros": 1765015722624816, "job": 6, "event": "compaction_finished", "compaction_time_micros": 99113, "compaction_time_cpu_micros": 45640, "output_level": 6, "num_output_files": 1, "total_output_size": 17441653, "num_input_records": 11676, "num_output_records": 11123, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015722625543, "job": 6, "event": "table_file_deletion", "file_number": 17}
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015722626979, "job": 6, "event": "table_file_deletion", "file_number": 15}
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.523446) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.627004) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.627008) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.627010) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.627012) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:42.627014) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:42 np0005548789.localdomain ceph-mgr[288591]: log_channel(audit) log [DBG] : from='client.34476 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548787.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:42 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO root] Removed label mon from host np0005548787.localdomain
Dec 06 10:08:42 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Removed label mon from host np0005548787.localdomain
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:43 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(peon) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:08:43 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:08:43 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:08:43 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:08:43 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2689790601' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548788 calling monitor election
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548790 calling monitor election
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548790 is new leader, mons np0005548790,np0005548788,np0005548789 in quorum (ranks 0,1,2)
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: monmap epoch 13
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: last_changed 2025-12-06T10:08:36.308855+0000
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: min_mon_release 18 (reef)
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: election_strategy: 1
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005548790
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: mgrmap e32: np0005548789.mzhmje(active, since 22s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005548790,np0005548789)
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:08:43 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:44 np0005548789.localdomain ceph-mgr[288591]: log_channel(audit) log [DBG] : from='client.44452 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548787.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:44 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO root] Removed label mgr from host np0005548787.localdomain
Dec 06 10:08:44 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Removed label mgr from host np0005548787.localdomain
Dec 06 10:08:44 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:08:44 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:08:44 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:08:44 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:08:44 np0005548789.localdomain ceph-mon[298582]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:44 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:08:44 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:08:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:08:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:08:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:45 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:45 np0005548789.localdomain ceph-mgr[288591]: log_channel(audit) log [DBG] : from='client.44458 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548787.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:45 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO root] Removed label _admin from host np0005548787.localdomain
Dec 06 10:08:45 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Removed label _admin from host np0005548787.localdomain
Dec 06 10:08:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:45.596 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:45.599 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:45.599 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:08:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:45.599 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:45.631 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:08:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:45.632 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:45 np0005548789.localdomain ceph-mon[298582]: from='client.44452 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548787.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:45 np0005548789.localdomain ceph-mon[298582]: Removed label mgr from host np0005548787.localdomain
Dec 06 10:08:45 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:08:45 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:08:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:08:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:08:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:08:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:08:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:08:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:08:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:08:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:08:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:08:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:08:47 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:08:47.299 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:08:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:08:47.299 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:08:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:08:47.300 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:08:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:48 np0005548789.localdomain ceph-mon[298582]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:48 np0005548789.localdomain ceph-mon[298582]: from='client.44458 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005548787.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:48 np0005548789.localdomain ceph-mon[298582]: Removed label _admin from host np0005548787.localdomain
Dec 06 10:08:48 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Removing np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:48 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Removing np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:48 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:48 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:48 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:48 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:48 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:48 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:48 np0005548789.localdomain sudo[302713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:08:48 np0005548789.localdomain sudo[302713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:48 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Removing np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:48 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Removing np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:48 np0005548789.localdomain sudo[302713]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:48 np0005548789.localdomain sudo[302731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:08:48 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Removing np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:48 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Removing np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:48 np0005548789.localdomain sudo[302731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:48 np0005548789.localdomain sudo[302731]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:48 np0005548789.localdomain sudo[302749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:48 np0005548789.localdomain sudo[302749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:48 np0005548789.localdomain sudo[302749]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:48 np0005548789.localdomain sudo[302767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:48 np0005548789.localdomain sudo[302767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:48 np0005548789.localdomain sudo[302767]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548789.localdomain sudo[302785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:49 np0005548789.localdomain sudo[302785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548789.localdomain sudo[302785]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548789.localdomain sudo[302819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:49 np0005548789.localdomain sudo[302819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548789.localdomain sudo[302819]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548789.localdomain sudo[302837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:08:49 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:49 np0005548789.localdomain sudo[302837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548789.localdomain sudo[302837]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:49 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:49 np0005548789.localdomain sudo[302855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:08:49 np0005548789.localdomain sudo[302855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548789.localdomain sudo[302855]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:49 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:49 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:49 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:49 np0005548789.localdomain sudo[302873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:49 np0005548789.localdomain sudo[302873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548789.localdomain sudo[302873]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548789.localdomain sudo[302891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:08:49 np0005548789.localdomain sudo[302891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548789.localdomain sudo[302891]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548789.localdomain sudo[302909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:49 np0005548789.localdomain sudo[302909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548789.localdomain ceph-mon[298582]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:08:49 np0005548789.localdomain ceph-mon[298582]: Removing np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:49 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:49 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:49 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:08:49 np0005548789.localdomain ceph-mon[298582]: Removing np0005548787.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:08:49 np0005548789.localdomain ceph-mon[298582]: Removing np0005548787.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:08:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:49 np0005548789.localdomain sudo[302909]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548789.localdomain sudo[302927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:08:49 np0005548789.localdomain sudo[302927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548789.localdomain sudo[302927]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548789.localdomain sudo[302945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:49 np0005548789.localdomain sudo[302945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548789.localdomain sudo[302945]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548789.localdomain sudo[302979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:49 np0005548789.localdomain sudo[302979]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548789.localdomain sudo[302979]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548789.localdomain sudo[302997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:08:49 np0005548789.localdomain sudo[302997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:49 np0005548789.localdomain sudo[302997]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:08:50 np0005548789.localdomain sudo[303021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:50 np0005548789.localdomain sudo[303021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:50 np0005548789.localdomain sudo[303021]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:50 np0005548789.localdomain podman[303015]: 2025-12-06 10:08:50.129629781 +0000 UTC m=+0.119757470 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Dec 06 10:08:50 np0005548789.localdomain podman[303015]: 2025-12-06 10:08:50.163021858 +0000 UTC m=+0.153149517 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 10:08:50 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:08:50 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] update: starting ev 4259bd03-1a0b-415c-a9e9-199a778c03cd (Updating mgr deployment (-1 -> 3))
Dec 06 10:08:50 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Removing daemon mgr.np0005548787.umwsra from np0005548787.localdomain -- ports [8765]
Dec 06 10:08:50 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Removing daemon mgr.np0005548787.umwsra from np0005548787.localdomain -- ports [8765]
Dec 06 10:08:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:50.633 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:50.635 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:50.635 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:08:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:50.635 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:50 np0005548789.localdomain ceph-mgr[288591]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:08:50 np0005548789.localdomain ceph-mgr[288591]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:08:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:50.671 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:08:50 np0005548789.localdomain ceph-mgr[288591]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:08:50 np0005548789.localdomain ceph-mgr[288591]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:08:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:50.672 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:50 np0005548789.localdomain ceph-mgr[288591]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:08:50 np0005548789.localdomain ceph-mgr[288591]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:08:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:08:50 np0005548789.localdomain systemd[1]: tmp-crun.kVY8vI.mount: Deactivated successfully.
Dec 06 10:08:50 np0005548789.localdomain podman[303050]: 2025-12-06 10:08:50.909421027 +0000 UTC m=+0.078815978 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:08:50 np0005548789.localdomain ceph-mon[298582]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:50 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:50 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:50 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:08:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:50 np0005548789.localdomain podman[303050]: 2025-12-06 10:08:50.950439433 +0000 UTC m=+0.119834374 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:08:50 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:08:51 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: Removing daemon mgr.np0005548787.umwsra from np0005548787.localdomain -- ports [8765]
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:51.957432) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015731957480, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 630, "num_deletes": 250, "total_data_size": 765562, "memory_usage": 778872, "flush_reason": "Manual Compaction"}
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015731963321, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 453098, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13142, "largest_seqno": 13767, "table_properties": {"data_size": 449748, "index_size": 1205, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8100, "raw_average_key_size": 19, "raw_value_size": 442687, "raw_average_value_size": 1054, "num_data_blocks": 48, "num_entries": 420, "num_filter_entries": 420, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015722, "oldest_key_time": 1765015722, "file_creation_time": 1765015731, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 5912 microseconds, and 2282 cpu microseconds.
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:51.963350) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 453098 bytes OK
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:51.963364) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:51.964934) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:51.964946) EVENT_LOG_v1 {"time_micros": 1765015731964943, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:51.964959) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 761862, prev total WAL file size 761862, number of live WAL files 2.
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:51.965291) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323836' seq:72057594037927935, type:22 .. '6B760031353337' seq:0, type:0; will stop at (end)
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(442KB)], [18(16MB)]
Dec 06 10:08:51 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015731965351, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17894751, "oldest_snapshot_seqno": -1}
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 11016 keys, 16889865 bytes, temperature: kUnknown
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015732047602, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 16889865, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16826454, "index_size": 34766, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27589, "raw_key_size": 297268, "raw_average_key_size": 26, "raw_value_size": 16637818, "raw_average_value_size": 1510, "num_data_blocks": 1311, "num_entries": 11016, "num_filter_entries": 11016, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765015731, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:52.047944) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 16889865 bytes
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:52.049890) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 217.3 rd, 205.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 16.6 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(76.8) write-amplify(37.3) OK, records in: 11543, records dropped: 527 output_compression: NoCompression
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:52.049922) EVENT_LOG_v1 {"time_micros": 1765015732049908, "job": 8, "event": "compaction_finished", "compaction_time_micros": 82339, "compaction_time_cpu_micros": 36245, "output_level": 6, "num_output_files": 1, "total_output_size": 16889865, "num_input_records": 11543, "num_output_records": 11016, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015732050137, "job": 8, "event": "table_file_deletion", "file_number": 20}
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015732053118, "job": 8, "event": "table_file_deletion", "file_number": 18}
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:51.965229) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:52.053160) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:52.053164) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:52.053167) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:52.053168) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:08:52.053170) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:08:52 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.services.cephadmservice] Removing key for mgr.np0005548787.umwsra
Dec 06 10:08:52 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Removing key for mgr.np0005548787.umwsra
Dec 06 10:08:52 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] complete: finished ev 4259bd03-1a0b-415c-a9e9-199a778c03cd (Updating mgr deployment (-1 -> 3))
Dec 06 10:08:52 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Completed event 4259bd03-1a0b-415c-a9e9-199a778c03cd (Updating mgr deployment (-1 -> 3)) in 2 seconds
Dec 06 10:08:52 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] update: starting ev d378aec2-e8d0-4148-bcb8-8a9d09701394 (Updating node-proxy deployment (+4 -> 4))
Dec 06 10:08:52 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] complete: finished ev d378aec2-e8d0-4148-bcb8-8a9d09701394 (Updating node-proxy deployment (+4 -> 4))
Dec 06 10:08:52 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Completed event d378aec2-e8d0-4148-bcb8-8a9d09701394 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Dec 06 10:08:52 np0005548789.localdomain sudo[303074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:08:52 np0005548789.localdomain sudo[303074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:52 np0005548789.localdomain sudo[303074]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth rm", "entity": "mgr.np0005548787.umwsra"} : dispatch
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005548787.umwsra"}]': finished
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:08:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:53 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:08:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:08:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:08:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:08:53 np0005548789.localdomain ceph-mon[298582]: Removing key for mgr.np0005548787.umwsra
Dec 06 10:08:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:08:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19228 "" "Go-http-client/1.1"
Dec 06 10:08:54 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] update: starting ev 9c69ec7c-cfed-449b-bbea-090d12f32dd8 (Updating node-proxy deployment (+4 -> 4))
Dec 06 10:08:54 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] complete: finished ev 9c69ec7c-cfed-449b-bbea-090d12f32dd8 (Updating node-proxy deployment (+4 -> 4))
Dec 06 10:08:54 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Completed event 9c69ec7c-cfed-449b-bbea-090d12f32dd8 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Dec 06 10:08:54 np0005548789.localdomain sudo[303093]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:08:54 np0005548789.localdomain sudo[303093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:08:54 np0005548789.localdomain sudo[303093]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:54 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:08:54 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:08:54 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:08:54 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:08:55 np0005548789.localdomain ceph-mon[298582]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:55 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:55 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:55 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:55 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:08:55 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:55 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:08:55 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548787.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:55 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:55.073 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:55.093 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Triggering sync for uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 06 10:08:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:55.094 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:08:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:55.094 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:08:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:55.121 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:08:55 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:55 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:08:55 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:08:55 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:08:55 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:08:55 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Writing back 50 completed events
Dec 06 10:08:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:55.673 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:55.675 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:08:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:55.676 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:08:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:55.676 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:55.708 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:08:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:08:55.709 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:08:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:08:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:08:55 np0005548789.localdomain podman[303112]: 2025-12-06 10:08:55.927228761 +0000 UTC m=+0.084020664 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute)
Dec 06 10:08:55 np0005548789.localdomain podman[303112]: 2025-12-06 10:08:55.965104752 +0000 UTC m=+0.121896665 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2)
Dec 06 10:08:55 np0005548789.localdomain systemd[1]: tmp-crun.RrNnuT.mount: Deactivated successfully.
Dec 06 10:08:55 np0005548789.localdomain podman[303111]: 2025-12-06 10:08:55.986957951 +0000 UTC m=+0.146855677 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:08:55 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:08:56 np0005548789.localdomain podman[303111]: 2025-12-06 10:08:56.00019173 +0000 UTC m=+0.160089516 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, version=9.6, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:08:56 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:08:56 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548787 (monmap changed)...
Dec 06 10:08:56 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548787 on np0005548787.localdomain
Dec 06 10:08:56 np0005548789.localdomain ceph-mon[298582]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:56 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:08:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:08:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:56 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:08:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:56 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Dec 06 10:08:56 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Dec 06 10:08:56 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:08:56 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:08:56 np0005548789.localdomain ceph-mgr[288591]: log_channel(audit) log [DBG] : from='client.54257 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005548787.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:56 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO root] Added label _no_schedule to host np0005548787.localdomain
Dec 06 10:08:56 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Added label _no_schedule to host np0005548787.localdomain
Dec 06 10:08:56 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO root] Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005548787.localdomain
Dec 06 10:08:56 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005548787.localdomain
Dec 06 10:08:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:57 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:08:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:08:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:57 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:08:57 np0005548789.localdomain ceph-mon[298582]: from='client.54257 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005548787.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:57 np0005548789.localdomain ceph-mon[298582]: Added label _no_schedule to host np0005548787.localdomain
Dec 06 10:08:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:57 np0005548789.localdomain ceph-mon[298582]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005548787.localdomain
Dec 06 10:08:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:57 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:57 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Dec 06 10:08:57 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Dec 06 10:08:57 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:08:57 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:08:57 np0005548789.localdomain sshd[303151]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:08:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:08:58 np0005548789.localdomain ceph-mgr[288591]: log_channel(audit) log [DBG] : from='client.44467 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005548787.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:58 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:08:58 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:08:58 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:08:58 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:08:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:58 np0005548789.localdomain ceph-mon[298582]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:58 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:08:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:08:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:58 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:08:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:08:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:58 np0005548789.localdomain sshd[303151]: Received disconnect from 154.113.10.34 port 51818:11: Bye Bye [preauth]
Dec 06 10:08:58 np0005548789.localdomain sshd[303151]: Disconnected from authenticating user root 154.113.10.34 port 51818 [preauth]
Dec 06 10:08:59 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:08:59 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:08:59 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:08:59 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:08:59 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:08:59 np0005548789.localdomain ceph-mon[298582]: from='client.44467 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005548787.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:08:59 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:08:59 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:08:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:08:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:08:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:08:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:08:59 np0005548789.localdomain ceph-mgr[288591]: log_channel(audit) log [DBG] : from='client.44470 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005548787.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:08:59 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO root] Removed host np0005548787.localdomain
Dec 06 10:08:59 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Removed host np0005548787.localdomain
Dec 06 10:08:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:08:59 np0005548789.localdomain podman[303153]: 2025-12-06 10:08:59.934340999 +0000 UTC m=+0.089060746 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:08:59 np0005548789.localdomain podman[303153]: 2025-12-06 10:08:59.954222358 +0000 UTC m=+0.108942065 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:08:59 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:08:59 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:08:59 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:08:59 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:08:59 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: from='client.44470 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005548787.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain"} : dispatch
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain"}]': finished
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: Removed host np0005548787.localdomain
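The ceph-mon audit records above share a regular shape: a `from=`/`entity=` prefix followed by the dispatched command as inline JSON (`cmd={…}` or `cmd=[{…}]`) and a `: dispatch` suffix. The following sketch (not part of the log; the `parse_audit` helper name is invented for illustration) shows one way to pull the acting entity and the command structure out of such a line; the sample line is copied from the `orch host rm` dispatch above.

```python
import json
import re

# Sample ceph-mon audit line, copied from the journal above.
line = ("Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: "
        "from='client.44470 -' entity='client.admin' "
        'cmd=[{"prefix": "orch host rm", "hostname": "np0005548787.localdomain", '
        '"force": true, "target": ["mon-mgr", ""]}]: dispatch')

def parse_audit(line):
    """Return (entity, commands) for a ceph audit 'dispatch' line.

    The cmd payload is valid JSON: either a single object or a list of
    command objects, each carrying a "prefix" naming the mon command.
    """
    entity = re.search(r"entity='([^']+)'", line).group(1)
    # Greedy match: from the first bracket after cmd= to the last closing
    # bracket on the line (the ": dispatch" suffix contains no brackets).
    payload = re.search(r"cmd=(\[.*\]|\{.*\})", line).group(1)
    cmds = json.loads(payload)
    return entity, cmds if isinstance(cmds, list) else [cmds]

entity, cmds = parse_audit(line)
```

A filter like this makes it easy to reconstruct the orchestration sequence from a noisy journal — e.g. grouping dispatches by `prefix` shows the `auth get`/`config generate-minimal-conf` pairs issued for each daemon being reconfigured.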
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:00 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:09:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:00.710 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:00.712 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:00.713 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:09:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:00.713 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:00.744 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:09:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:00.745 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:01 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:09:01 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:09:01 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:09:01 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:09:01 np0005548789.localdomain sudo[303171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:01 np0005548789.localdomain sudo[303171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:01 np0005548789.localdomain sudo[303171]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:01 np0005548789.localdomain sudo[303189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:01 np0005548789.localdomain sudo[303189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:01 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:01 np0005548789.localdomain podman[303224]: 
Dec 06 10:09:01 np0005548789.localdomain podman[303224]: 2025-12-06 10:09:01.606156434 +0000 UTC m=+0.078029583 container create fa41c3b4817d1ba97d63748dff0081f0f0d64ae58b286e1ed7eca52c81ba70b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_satoshi, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, RELEASE=main, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1763362218, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:09:01 np0005548789.localdomain systemd[1]: Started libpod-conmon-fa41c3b4817d1ba97d63748dff0081f0f0d64ae58b286e1ed7eca52c81ba70b0.scope.
Dec 06 10:09:01 np0005548789.localdomain podman[303224]: 2025-12-06 10:09:01.573370625 +0000 UTC m=+0.045243794 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:01 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:01 np0005548789.localdomain podman[303224]: 2025-12-06 10:09:01.695088315 +0000 UTC m=+0.166961434 container init fa41c3b4817d1ba97d63748dff0081f0f0d64ae58b286e1ed7eca52c81ba70b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_satoshi, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, vcs-type=git, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public)
Dec 06 10:09:01 np0005548789.localdomain podman[303224]: 2025-12-06 10:09:01.70619626 +0000 UTC m=+0.178069369 container start fa41c3b4817d1ba97d63748dff0081f0f0d64ae58b286e1ed7eca52c81ba70b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_satoshi, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git)
Dec 06 10:09:01 np0005548789.localdomain podman[303224]: 2025-12-06 10:09:01.706379585 +0000 UTC m=+0.178252774 container attach fa41c3b4817d1ba97d63748dff0081f0f0d64ae58b286e1ed7eca52c81ba70b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_satoshi, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, release=1763362218, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:09:01 np0005548789.localdomain silly_satoshi[303239]: 167 167
Dec 06 10:09:01 np0005548789.localdomain systemd[1]: libpod-fa41c3b4817d1ba97d63748dff0081f0f0d64ae58b286e1ed7eca52c81ba70b0.scope: Deactivated successfully.
Dec 06 10:09:01 np0005548789.localdomain podman[303224]: 2025-12-06 10:09:01.712275303 +0000 UTC m=+0.184148482 container died fa41c3b4817d1ba97d63748dff0081f0f0d64ae58b286e1ed7eca52c81ba70b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_satoshi, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, ceph=True, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4)
Dec 06 10:09:01 np0005548789.localdomain podman[303244]: 2025-12-06 10:09:01.810142813 +0000 UTC m=+0.088207771 container remove fa41c3b4817d1ba97d63748dff0081f0f0d64ae58b286e1ed7eca52c81ba70b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_satoshi, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, release=1763362218, name=rhceph, CEPH_POINT_RELEASE=, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, ceph=True)
Dec 06 10:09:01 np0005548789.localdomain systemd[1]: libpod-conmon-fa41c3b4817d1ba97d63748dff0081f0f0d64ae58b286e1ed7eca52c81ba70b0.scope: Deactivated successfully.
Dec 06 10:09:01 np0005548789.localdomain sudo[303189]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:01 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Dec 06 10:09:01 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Dec 06 10:09:01 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:09:01 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:09:01 np0005548789.localdomain sudo[303260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:01 np0005548789.localdomain sudo[303260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:01 np0005548789.localdomain sudo[303260]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:02 np0005548789.localdomain sudo[303278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:02 np0005548789.localdomain sudo[303278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:02 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:09:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:02 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:09:02 np0005548789.localdomain ceph-mon[298582]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:02 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:09:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:09:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:02 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:09:02 np0005548789.localdomain podman[303315]: 
Dec 06 10:09:02 np0005548789.localdomain podman[303315]: 2025-12-06 10:09:02.511174494 +0000 UTC m=+0.078339712 container create 8e8aad40d60cb7d2666c9e0298855f7e8694c601063a8c41068bdd18ec961b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_lumiere, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, ceph=True, distribution-scope=public)
Dec 06 10:09:02 np0005548789.localdomain systemd[1]: Started libpod-conmon-8e8aad40d60cb7d2666c9e0298855f7e8694c601063a8c41068bdd18ec961b92.scope.
Dec 06 10:09:02 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:02 np0005548789.localdomain podman[303315]: 2025-12-06 10:09:02.575822413 +0000 UTC m=+0.142987621 container init 8e8aad40d60cb7d2666c9e0298855f7e8694c601063a8c41068bdd18ec961b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_lumiere, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, name=rhceph, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, GIT_CLEAN=True)
Dec 06 10:09:02 np0005548789.localdomain podman[303315]: 2025-12-06 10:09:02.479603893 +0000 UTC m=+0.046769151 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:02 np0005548789.localdomain podman[303315]: 2025-12-06 10:09:02.588178586 +0000 UTC m=+0.155343804 container start 8e8aad40d60cb7d2666c9e0298855f7e8694c601063a8c41068bdd18ec961b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_lumiere, GIT_CLEAN=True, distribution-scope=public, version=7, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., RELEASE=main, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Dec 06 10:09:02 np0005548789.localdomain podman[303315]: 2025-12-06 10:09:02.588486295 +0000 UTC m=+0.155651553 container attach 8e8aad40d60cb7d2666c9e0298855f7e8694c601063a8c41068bdd18ec961b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_lumiere, version=7, vcs-type=git, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=1763362218, GIT_CLEAN=True, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, com.redhat.component=rhceph-container, distribution-scope=public, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph)
Dec 06 10:09:02 np0005548789.localdomain musing_lumiere[303331]: 167 167
Dec 06 10:09:02 np0005548789.localdomain systemd[1]: libpod-8e8aad40d60cb7d2666c9e0298855f7e8694c601063a8c41068bdd18ec961b92.scope: Deactivated successfully.
Dec 06 10:09:02 np0005548789.localdomain podman[303315]: 2025-12-06 10:09:02.592351851 +0000 UTC m=+0.159517089 container died 8e8aad40d60cb7d2666c9e0298855f7e8694c601063a8c41068bdd18ec961b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_lumiere, RELEASE=main, ceph=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, build-date=2025-11-26T19:44:28Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_BRANCH=main)
Dec 06 10:09:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-160781fb2c69432426e8f5bf9a30f81c6704cc23fd369d42dfdae832a22f1f35-merged.mount: Deactivated successfully.
Dec 06 10:09:02 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-1505fa216ab21d4b3717abef1ea26f4fc36c0bea09c76937a08d203c5784e4e3-merged.mount: Deactivated successfully.
Dec 06 10:09:02 np0005548789.localdomain podman[303336]: 2025-12-06 10:09:02.697648326 +0000 UTC m=+0.092896903 container remove 8e8aad40d60cb7d2666c9e0298855f7e8694c601063a8c41068bdd18ec961b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_lumiere, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, distribution-scope=public, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, architecture=x86_64, GIT_BRANCH=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:09:02 np0005548789.localdomain systemd[1]: libpod-conmon-8e8aad40d60cb7d2666c9e0298855f7e8694c601063a8c41068bdd18ec961b92.scope: Deactivated successfully.
Dec 06 10:09:02 np0005548789.localdomain sudo[303278]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:02 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Dec 06 10:09:02 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Dec 06 10:09:02 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:09:02 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:09:02 np0005548789.localdomain sudo[303359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:02 np0005548789.localdomain sudo[303359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:02 np0005548789.localdomain sudo[303359]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:03 np0005548789.localdomain sudo[303377]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:03 np0005548789.localdomain sudo[303377]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:03 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:03 np0005548789.localdomain podman[303412]: 
Dec 06 10:09:03 np0005548789.localdomain podman[303412]: 2025-12-06 10:09:03.520584161 +0000 UTC m=+0.058429613 container create 1f9dde573b26de883910e7ad1b6b72c5b90046fb8228d88d567a5f83c5f03730 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_johnson, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, GIT_BRANCH=main, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True)
Dec 06 10:09:03 np0005548789.localdomain systemd[1]: Started libpod-conmon-1f9dde573b26de883910e7ad1b6b72c5b90046fb8228d88d567a5f83c5f03730.scope.
Dec 06 10:09:03 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:03 np0005548789.localdomain podman[303412]: 2025-12-06 10:09:03.577010472 +0000 UTC m=+0.114855924 container init 1f9dde573b26de883910e7ad1b6b72c5b90046fb8228d88d567a5f83c5f03730 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_johnson, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, ceph=True, distribution-scope=public, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, RELEASE=main, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:09:03 np0005548789.localdomain podman[303412]: 2025-12-06 10:09:03.583747275 +0000 UTC m=+0.121592717 container start 1f9dde573b26de883910e7ad1b6b72c5b90046fb8228d88d567a5f83c5f03730 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_johnson, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_CLEAN=True, ceph=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, name=rhceph)
Dec 06 10:09:03 np0005548789.localdomain podman[303412]: 2025-12-06 10:09:03.584055455 +0000 UTC m=+0.121900907 container attach 1f9dde573b26de883910e7ad1b6b72c5b90046fb8228d88d567a5f83c5f03730 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_johnson, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, name=rhceph, architecture=x86_64, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7)
Dec 06 10:09:03 np0005548789.localdomain wizardly_johnson[303427]: 167 167
Dec 06 10:09:03 np0005548789.localdomain systemd[1]: libpod-1f9dde573b26de883910e7ad1b6b72c5b90046fb8228d88d567a5f83c5f03730.scope: Deactivated successfully.
Dec 06 10:09:03 np0005548789.localdomain podman[303412]: 2025-12-06 10:09:03.586595291 +0000 UTC m=+0.124440763 container died 1f9dde573b26de883910e7ad1b6b72c5b90046fb8228d88d567a5f83c5f03730 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_johnson, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=1763362218, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=)
Dec 06 10:09:03 np0005548789.localdomain podman[303412]: 2025-12-06 10:09:03.49831047 +0000 UTC m=+0.036155952 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-51fe94b548087a4d2b9c980263d69cb47e2f14c82c2e0bb7c4d98d250280af2e-merged.mount: Deactivated successfully.
Dec 06 10:09:03 np0005548789.localdomain podman[303432]: 2025-12-06 10:09:03.669101428 +0000 UTC m=+0.076931280 container remove 1f9dde573b26de883910e7ad1b6b72c5b90046fb8228d88d567a5f83c5f03730 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_johnson, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, ceph=True, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, version=7, name=rhceph, release=1763362218)
Dec 06 10:09:03 np0005548789.localdomain systemd[1]: libpod-conmon-1f9dde573b26de883910e7ad1b6b72c5b90046fb8228d88d567a5f83c5f03730.scope: Deactivated successfully.
Dec 06 10:09:03 np0005548789.localdomain sudo[303377]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:03 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:09:03 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:09:03 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:09:03 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:09:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:09:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:09:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:09:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:03 np0005548789.localdomain sudo[303455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:03 np0005548789.localdomain sudo[303455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:03 np0005548789.localdomain sudo[303455]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:03 np0005548789.localdomain sudo[303473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:03 np0005548789.localdomain sudo[303473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:04 np0005548789.localdomain podman[303508]: 
Dec 06 10:09:04 np0005548789.localdomain podman[303508]: 2025-12-06 10:09:04.404136605 +0000 UTC m=+0.053227465 container create 58ed546b5412fd9d5b6c7594c0164c59f851f6c370af0aab1bd531be2c59de17 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_williamson, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, CEPH_POINT_RELEASE=, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, ceph=True, GIT_BRANCH=main, release=1763362218, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:09:04 np0005548789.localdomain systemd[1]: Started libpod-conmon-58ed546b5412fd9d5b6c7594c0164c59f851f6c370af0aab1bd531be2c59de17.scope.
Dec 06 10:09:04 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:04 np0005548789.localdomain podman[303508]: 2025-12-06 10:09:04.470475524 +0000 UTC m=+0.119566394 container init 58ed546b5412fd9d5b6c7594c0164c59f851f6c370af0aab1bd531be2c59de17 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_williamson, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, version=7, release=1763362218, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Dec 06 10:09:04 np0005548789.localdomain youthful_williamson[303524]: 167 167
Dec 06 10:09:04 np0005548789.localdomain podman[303508]: 2025-12-06 10:09:04.478781614 +0000 UTC m=+0.127872514 container start 58ed546b5412fd9d5b6c7594c0164c59f851f6c370af0aab1bd531be2c59de17 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_williamson, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vcs-type=git, version=7, io.openshift.expose-services=, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main, release=1763362218, RELEASE=main)
Dec 06 10:09:04 np0005548789.localdomain systemd[1]: libpod-58ed546b5412fd9d5b6c7594c0164c59f851f6c370af0aab1bd531be2c59de17.scope: Deactivated successfully.
Dec 06 10:09:04 np0005548789.localdomain podman[303508]: 2025-12-06 10:09:04.479560879 +0000 UTC m=+0.128651779 container attach 58ed546b5412fd9d5b6c7594c0164c59f851f6c370af0aab1bd531be2c59de17 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_williamson, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, version=7, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:09:04 np0005548789.localdomain podman[303508]: 2025-12-06 10:09:04.381002587 +0000 UTC m=+0.030093507 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:04 np0005548789.localdomain podman[303508]: 2025-12-06 10:09:04.483018683 +0000 UTC m=+0.132109593 container died 58ed546b5412fd9d5b6c7594c0164c59f851f6c370af0aab1bd531be2c59de17 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_williamson, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1763362218)
Dec 06 10:09:04 np0005548789.localdomain podman[303529]: 2025-12-06 10:09:04.573702386 +0000 UTC m=+0.082388024 container remove 58ed546b5412fd9d5b6c7594c0164c59f851f6c370af0aab1bd531be2c59de17 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_williamson, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main)
Dec 06 10:09:04 np0005548789.localdomain systemd[1]: libpod-conmon-58ed546b5412fd9d5b6c7594c0164c59f851f6c370af0aab1bd531be2c59de17.scope: Deactivated successfully.
Dec 06 10:09:04 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-42957f4825799ccfbe6e3a866bfa2de1e7f738c2726206560a06458a98a3ac1b-merged.mount: Deactivated successfully.
Dec 06 10:09:04 np0005548789.localdomain sudo[303473]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:04 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:09:04 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:09:04 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:09:04 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:09:04 np0005548789.localdomain sudo[303545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:04 np0005548789.localdomain sudo[303545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:04 np0005548789.localdomain sudo[303545]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:04 np0005548789.localdomain sudo[303563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:04 np0005548789.localdomain sudo[303563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:04 np0005548789.localdomain ceph-mon[298582]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:04 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:09:04 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:09:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:05 np0005548789.localdomain podman[303598]: 
Dec 06 10:09:05 np0005548789.localdomain podman[303598]: 2025-12-06 10:09:05.258818238 +0000 UTC m=+0.078632081 container create ed8a31d2db9ce08fa6f793d07f13d59211c11c6af2078461d266d081e6300712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermat, com.redhat.component=rhceph-container, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., RELEASE=main, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vcs-type=git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, version=7)
Dec 06 10:09:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:09:05 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:05 np0005548789.localdomain systemd[1]: Started libpod-conmon-ed8a31d2db9ce08fa6f793d07f13d59211c11c6af2078461d266d081e6300712.scope.
Dec 06 10:09:05 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:05 np0005548789.localdomain podman[303598]: 2025-12-06 10:09:05.322593321 +0000 UTC m=+0.142406984 container init ed8a31d2db9ce08fa6f793d07f13d59211c11c6af2078461d266d081e6300712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermat, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=1763362218, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, GIT_BRANCH=main, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, build-date=2025-11-26T19:44:28Z, version=7, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:09:05 np0005548789.localdomain podman[303598]: 2025-12-06 10:09:05.227635138 +0000 UTC m=+0.047448811 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:05 np0005548789.localdomain podman[303598]: 2025-12-06 10:09:05.33650283 +0000 UTC m=+0.156316423 container start ed8a31d2db9ce08fa6f793d07f13d59211c11c6af2078461d266d081e6300712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermat, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, version=7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.component=rhceph-container, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:09:05 np0005548789.localdomain podman[303598]: 2025-12-06 10:09:05.336815149 +0000 UTC m=+0.156628852 container attach ed8a31d2db9ce08fa6f793d07f13d59211c11c6af2078461d266d081e6300712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermat, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, distribution-scope=public, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, CEPH_POINT_RELEASE=)
Dec 06 10:09:05 np0005548789.localdomain systemd[1]: libpod-ed8a31d2db9ce08fa6f793d07f13d59211c11c6af2078461d266d081e6300712.scope: Deactivated successfully.
Dec 06 10:09:05 np0005548789.localdomain recursing_fermat[303614]: 167 167
Dec 06 10:09:05 np0005548789.localdomain podman[303598]: 2025-12-06 10:09:05.340469679 +0000 UTC m=+0.160283322 container died ed8a31d2db9ce08fa6f793d07f13d59211c11c6af2078461d266d081e6300712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermat, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Dec 06 10:09:05 np0005548789.localdomain podman[303613]: 2025-12-06 10:09:05.415025247 +0000 UTC m=+0.110068170 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:09:05 np0005548789.localdomain podman[303613]: 2025-12-06 10:09:05.42707519 +0000 UTC m=+0.122118123 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:09:05 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:09:05 np0005548789.localdomain podman[303624]: 2025-12-06 10:09:05.473448907 +0000 UTC m=+0.118732499 container remove ed8a31d2db9ce08fa6f793d07f13d59211c11c6af2078461d266d081e6300712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermat, version=7, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, release=1763362218, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:09:05 np0005548789.localdomain systemd[1]: libpod-conmon-ed8a31d2db9ce08fa6f793d07f13d59211c11c6af2078461d266d081e6300712.scope: Deactivated successfully.
Dec 06 10:09:05 np0005548789.localdomain sudo[303563]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:05 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:09:05 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:09:05 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:09:05 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:09:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-15d924119a66538052ca412c2fea855bdd1bc0e5a463af0ad04c1cb15818c648-merged.mount: Deactivated successfully.
Dec 06 10:09:05 np0005548789.localdomain sudo[303658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:05 np0005548789.localdomain sudo[303658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:05 np0005548789.localdomain sudo[303658]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:05 np0005548789.localdomain sudo[303676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:05 np0005548789.localdomain sudo[303676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:05.745 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:05.748 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:05.748 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:09:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:05.748 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:05.778 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:09:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:05.779 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:05 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:09:05 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:09:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:09:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:09:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:06 np0005548789.localdomain podman[303710]: 
Dec 06 10:09:06 np0005548789.localdomain podman[303710]: 2025-12-06 10:09:06.099962233 +0000 UTC m=+0.056759732 container create c3d0fcc5c5e81adb76fb46916c2076f0100700785d2c0ca57847e46b66847d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_sinoussi, io.openshift.expose-services=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, version=7, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, GIT_BRANCH=main, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=)
Dec 06 10:09:06 np0005548789.localdomain systemd[1]: Started libpod-conmon-c3d0fcc5c5e81adb76fb46916c2076f0100700785d2c0ca57847e46b66847d3d.scope.
Dec 06 10:09:06 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:06 np0005548789.localdomain podman[303710]: 2025-12-06 10:09:06.15958262 +0000 UTC m=+0.116380149 container init c3d0fcc5c5e81adb76fb46916c2076f0100700785d2c0ca57847e46b66847d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_sinoussi, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, RELEASE=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, release=1763362218, GIT_BRANCH=main)
Dec 06 10:09:06 np0005548789.localdomain systemd[1]: tmp-crun.H1ylue.mount: Deactivated successfully.
Dec 06 10:09:06 np0005548789.localdomain podman[303710]: 2025-12-06 10:09:06.167134178 +0000 UTC m=+0.123931677 container start c3d0fcc5c5e81adb76fb46916c2076f0100700785d2c0ca57847e46b66847d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_sinoussi, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, build-date=2025-11-26T19:44:28Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, GIT_CLEAN=True, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main, release=1763362218)
Dec 06 10:09:06 np0005548789.localdomain podman[303710]: 2025-12-06 10:09:06.167339994 +0000 UTC m=+0.124137513 container attach c3d0fcc5c5e81adb76fb46916c2076f0100700785d2c0ca57847e46b66847d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_sinoussi, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, build-date=2025-11-26T19:44:28Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:09:06 np0005548789.localdomain peaceful_sinoussi[303725]: 167 167
Dec 06 10:09:06 np0005548789.localdomain systemd[1]: libpod-c3d0fcc5c5e81adb76fb46916c2076f0100700785d2c0ca57847e46b66847d3d.scope: Deactivated successfully.
Dec 06 10:09:06 np0005548789.localdomain podman[303710]: 2025-12-06 10:09:06.169045006 +0000 UTC m=+0.125842555 container died c3d0fcc5c5e81adb76fb46916c2076f0100700785d2c0ca57847e46b66847d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_sinoussi, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, release=1763362218, version=7, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:09:06 np0005548789.localdomain podman[303710]: 2025-12-06 10:09:06.08363124 +0000 UTC m=+0.040428749 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:06 np0005548789.localdomain podman[303730]: 2025-12-06 10:09:06.223980601 +0000 UTC m=+0.050185994 container remove c3d0fcc5c5e81adb76fb46916c2076f0100700785d2c0ca57847e46b66847d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_sinoussi, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhceph, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, vcs-type=git, io.openshift.tags=rhceph ceph, ceph=True)
Dec 06 10:09:06 np0005548789.localdomain systemd[1]: libpod-conmon-c3d0fcc5c5e81adb76fb46916c2076f0100700785d2c0ca57847e46b66847d3d.scope: Deactivated successfully.
Dec 06 10:09:06 np0005548789.localdomain sudo[303676]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:06 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:09:06 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:09:06 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:09:06 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:09:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e81a04cbad3dd485b0a36ad607e5bfe38cbaa19f802f345f531188613a4eb80c-merged.mount: Deactivated successfully.
Dec 06 10:09:06 np0005548789.localdomain ceph-mon[298582]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:06 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:09:06 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:09:06 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:06 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:06 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:06 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:07 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] update: starting ev 023838e0-b52a-42a7-bd47-8afddb7f894e (Updating node-proxy deployment (+3 -> 3))
Dec 06 10:09:07 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] complete: finished ev 023838e0-b52a-42a7-bd47-8afddb7f894e (Updating node-proxy deployment (+3 -> 3))
Dec 06 10:09:07 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Completed event 023838e0-b52a-42a7-bd47-8afddb7f894e (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Dec 06 10:09:07 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:07 np0005548789.localdomain sudo[303747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:09:07 np0005548789.localdomain sudo[303747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:07 np0005548789.localdomain sudo[303747]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:07 np0005548789.localdomain ceph-mgr[288591]: log_channel(audit) log [DBG] : from='client.44476 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:09:07 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO root] Saving service mon spec with placement label:mon
Dec 06 10:09:07 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Dec 06 10:09:07 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] update: starting ev dbac2153-14b5-4981-983f-b8320034ab00 (Updating node-proxy deployment (+3 -> 3))
Dec 06 10:09:07 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] complete: finished ev dbac2153-14b5-4981-983f-b8320034ab00 (Updating node-proxy deployment (+3 -> 3))
Dec 06 10:09:07 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Completed event dbac2153-14b5-4981-983f-b8320034ab00 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Dec 06 10:09:07 np0005548789.localdomain sudo[303765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:09:07 np0005548789.localdomain sudo[303765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:07 np0005548789.localdomain sudo[303765]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.914 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.916 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.922 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a74db33-f5e3-4bfb-802b-56d08e915c65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:09:07.916529', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '9a48bc6a-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.165895086, 'message_signature': '604b0fe8a30e6fc746636727d66b33a6bfe1f16569108ae47a425ff14c6dca16'}]}, 'timestamp': '2025-12-06 10:09:07.923424', '_unique_id': '9ffb0f41dd004788ba0e8bf789163c47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.926 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.927 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.927 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.944 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 14300000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e6bfb6e-17a2-4d99-b9e1-2e460155dee8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14300000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:09:07.927796', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '9a4c1b58-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.194140248, 'message_signature': 'b14003508cc1792cf6ab57feac829e6ef53fa4fed1213552d3036f09d9dd4981'}]}, 'timestamp': '2025-12-06 10:09:07.945382', '_unique_id': '96108dd34d754a7a8a451cdd03e2c427'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.946 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.947 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.970 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.971 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7377136b-fba2-4943-b744-fe5219b058a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:09:07.947705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a50138e-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.197088356, 'message_signature': '59e777b8f51ae92e5455a1ef6bcfb1ab440ee1fc84b4f1b576753a55d6248461'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:09:07.947705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a5025d6-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.197088356, 'message_signature': '893c1bb2cbcac082aaa0de0970c25271ff2ad835b95e3d0ef402a16749844ab2'}]}, 'timestamp': '2025-12-06 10:09:07.971976', '_unique_id': 'bc7eca61326d4f4fafb57daa4f9042f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.973 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.974 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.974 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2a634b5-7260-4592-b5c0-cf19da10dcc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:09:07.974551', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '9a50a394-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.165895086, 'message_signature': 'af0fedd61bd3b9cba1dfdb1390b80753218a8c0946d71b1303b948c7999d7a18'}]}, 'timestamp': '2025-12-06 10:09:07.975096', '_unique_id': '9fa49bfbfe1241e2a60160818fb4ff2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.976 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.977 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.977 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:07 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:09:07 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:09:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:09:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:09:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:09:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '568db3ea-97d1-4cf8-aea8-521a383c2d1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:09:07.977671', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '9a511d24-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.165895086, 'message_signature': '8e35e5af3533db9fe8c00799c8b7bee8cdbf5097ade216878792d00505946985'}]}, 'timestamp': '2025-12-06 10:09:07.978214', '_unique_id': 'f81f80a1261f40069449ce529ee59ccd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.979 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.980 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.980 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62ec6ad3-3791-4aff-9913-af8e7250053e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:09:07.980710', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '9a51945c-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.194140248, 'message_signature': 'ee885a0b88cca73d6467c0c7289ab9b55a4e26d1253fff6cbfda2c897874d125'}]}, 'timestamp': '2025-12-06 10:09:07.981242', '_unique_id': '67d4846308544145aaf71da92743ef8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.982 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.983 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.995 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.995 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2fdc82b-f56d-485b-92aa-9bf588ee50a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:09:07.983681', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a53d33e-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.233193805, 'message_signature': 'ee436cc1d242b18fbd7a387e5b78bcc6cd28cd7efb083088bcedc0ae0bb803ac'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:09:07.983681', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a53e5ae-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.233193805, 'message_signature': '2bc2e55c3ed6739033c81903590fc79e35696a6d4ee8c0ec3beb6384470a0428'}]}, 'timestamp': '2025-12-06 10:09:07.996410', '_unique_id': 'd54382dae000454ca0c4c2f73e6620f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.998 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.998 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:07.999 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aaac2366-808f-4f39-87b4-c3ba92fd5e16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:09:07.998781', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a5453c2-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.197088356, 'message_signature': 'a9c38a7d3598a55dab7fdccafab4b7877727ff7b63307e2d766bd3dce945afe4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:09:07.998781', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a546434-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.197088356, 'message_signature': 'd51b9ec1f63d1d739b9e2ac609d735b5faba4a1c87673073dc8cc5ae2f44cbf9'}]}, 'timestamp': '2025-12-06 10:09:07.999648', '_unique_id': 'e3a62b382e5f4e2898c18a2a5ac00003'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.000 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.001 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.001 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.002 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79533855-fca1-4678-8645-de57faa31b02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:09:08.001896', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a54ccda-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.197088356, 'message_signature': 'bfe68996535373d04a54e58ab8fde3ee61da3347103247466180586c6e9a1833'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:09:08.001896', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a54dfb8-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.197088356, 'message_signature': '389b67240902636f2bd756a12c55dc80ab13fc20863a135b87fe08e7ab1dfb1b'}]}, 'timestamp': '2025-12-06 10:09:08.002839', '_unique_id': '144cf6e118754343bc17395b376f41a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.005 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.005 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.006 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db8bb643-15b7-4506-9db8-aa49ce3d13ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:09:08.005837', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a5568de-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.197088356, 'message_signature': '9195028ca352f5fafd5b7a57baf6c4d2b90165aa6d8dd80c0293b0db1345735a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:09:08.005837', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a557acc-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.197088356, 'message_signature': '76507ac9ddc570eb6e719e4ad71d101812320c8535b28299684f147121b77ace'}]}, 'timestamp': '2025-12-06 10:09:08.006807', '_unique_id': '6f07b9707e544abebff5fa16a060e1c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.009 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.009 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91d8a94d-c432-43d5-a94a-564b7198714d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:09:08.009212', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '9a55eb7e-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.165895086, 'message_signature': 'df50f57b4f088f24367f81d98729355043a2e111dc2369483459ef8ca85afefa'}]}, 'timestamp': '2025-12-06 10:09:08.009700', '_unique_id': '14bd775412214432a7c99ae24e2f5caa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.011 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.012 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.012 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6402536-0e8d-4f52-a675-79edb9f7db10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:09:08.012111', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a565c12-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.197088356, 'message_signature': '1f6bdff7fc69932eb05a364e57978644b6a03cb3b666b785d95a11dcee484eca'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:09:08.012111', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a566e14-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.197088356, 'message_signature': '9544c32e09cb334dd7bc09cf08545ed18d66815f973e6d935d7129c80f30347f'}]}, 'timestamp': '2025-12-06 10:09:08.013042', '_unique_id': '2b4d3158e8a149cf9325bc62dc24fe3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.015 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.015 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.015 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04029069-3005-4c41-8274-5cc826e8e7e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:09:08.015615', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '9a56ef42-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.165895086, 'message_signature': '20e482ee207a888a0fcf8ca495923d28bfbee1478b6456f5dd942c3b71f75af7'}]}, 'timestamp': '2025-12-06 10:09:08.016352', '_unique_id': '0555c68978934b578fa46c66c9ae04ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.018 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.018 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04f09ab1-c9a2-4a0a-bce6-604c5eddf788', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:09:08.018809', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '9a5761fc-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.165895086, 'message_signature': 'dc712f8a041299bb5a2b65b87b0df7f0a070573fb1ba66bf1324220a4e07168f'}]}, 'timestamp': '2025-12-06 10:09:08.019280', '_unique_id': '9a913791a8414cdd8719c9f2fa43391f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.021 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.022 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.022 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a56ad630-89f4-4c81-a291-5b58e0ee6c25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:09:08.022265', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '9a57e9c4-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.165895086, 'message_signature': 'dc6e2d90cb4ca766e5944a9eda4fd26d0c19b8167997aa46fc75b01aa435fc3d'}]}, 'timestamp': '2025-12-06 10:09:08.022784', '_unique_id': '8934bd79f96c4ece9c16db6097344789'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.023 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.024 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.024 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.025 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cddc65a1-6a4a-4dc4-8550-6e9a394b152d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:09:08.024942', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a585116-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.233193805, 'message_signature': '4437f6a9139835bc15d9ddf79e7d8a97a731402806be07a09a67a08a3888eec0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:09:08.024942', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a586138-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.233193805, 'message_signature': '6990bd1e7aeaeb0345f7b44c93311d628dc9e53aeea9663dbaff64987bef9ca7'}]}, 'timestamp': '2025-12-06 10:09:08.025815', '_unique_id': '9deeca6c202744b5b80fc8c4e40e051d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.028 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.028 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fa15156-881e-455e-b5a3-e57410faad47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:09:08.028023', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a58c98e-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.233193805, 'message_signature': 'b61c9d10450a6bd7fd201f739253e30a491d7d20176ab090ec81cab05fbf883b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:09:08.028023', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a58d9ba-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.233193805, 'message_signature': '600083dd58ef12f0ff6558f0326bf23cde59eb4b237f8de0ce57945ff90bc90c'}]}, 'timestamp': '2025-12-06 10:09:08.028903', '_unique_id': '0557cc51e5ca4cb480d54511b805d11d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.030 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.031 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9ee31bc-2166-420d-95cc-bca96b469a3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:09:08.031106', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '9a594238-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.165895086, 'message_signature': '9f71dcd24c72e5efa69e2b07b5607330219d173162fdf46636848438a56f0746'}]}, 'timestamp': '2025-12-06 10:09:08.031503', '_unique_id': '883e5d61c461495eaadc60d23f29275d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.032 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.033 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.033 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a13b122-6252-484a-ad57-0ffb04398e1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:09:08.032999', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9a59891e-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.197088356, 'message_signature': '2a0177182359ac2fdc901523ef68d4b29855930573662d291c9e862621126893'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:09:08.032999', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9a59935a-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.197088356, 'message_signature': '22ef2aa59a46eee400d642b2336020e8fe817246866ee9f9f0626a3aa00e9691'}]}, 'timestamp': '2025-12-06 10:09:08.033544', '_unique_id': 'd1ea4cad9af04c7f90cb7c1c6e8af5b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.034 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1260772-b3d8-4dd9-9a00-3840287eb1a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:09:08.035027', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '9a59d7ca-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.165895086, 'message_signature': '2cbd916679658742058a2777228614d77e159fd0f6f39248ed46105b9b8711ab'}]}, 'timestamp': '2025-12-06 10:09:08.035315', '_unique_id': 'f536d06add394d0cb8da1e8686c6582e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.036 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.036 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f975d04-3d9f-4d66-9697-0dd8e701c99e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:09:08.036651', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '9a5a180c-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12166.165895086, 'message_signature': '9cb1512d39863d528032b913d7d9c7524a8da001df0d1083a81c77bcec1bc083'}]}, 'timestamp': '2025-12-06 10:09:08.036960', '_unique_id': '59e5a15d092546bd86f80260ebad8bcc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:09:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:09:08.037 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:09:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:08 np0005548789.localdomain ceph-mon[298582]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:08 np0005548789.localdomain ceph-mon[298582]: from='client.44476 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:09:08 np0005548789.localdomain ceph-mon[298582]: Saving service mon spec with placement label:mon
Dec 06 10:09:09 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:10 np0005548789.localdomain ceph-mon[298582]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:10 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Writing back 50 completed events
Dec 06 10:09:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:10.780 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:10.783 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:10.783 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:09:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:10.783 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:10.817 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:09:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:10.818 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:09:10 np0005548789.localdomain podman[303783]: 2025-12-06 10:09:10.937997189 +0000 UTC m=+0.090768357 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:09:10 np0005548789.localdomain podman[303783]: 2025-12-06 10:09:10.980255513 +0000 UTC m=+0.133026691 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125)
Dec 06 10:09:10 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:09:11 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:11 np0005548789.localdomain ceph-mgr[288591]: log_channel(audit) log [DBG] : from='client.44482 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548790", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:09:12 np0005548789.localdomain ceph-mon[298582]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:12 np0005548789.localdomain ceph-mon[298582]: from='client.44482 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548790", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: log_channel(audit) log [DBG] : from='client.44488 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548790"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO root] Remove daemons mon.np0005548790
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005548790
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005548790: new quorum should be ['np0005548788', 'np0005548789'] (from ['np0005548788', 'np0005548789'])
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005548790: new quorum should be ['np0005548788', 'np0005548789'] (from ['np0005548788', 'np0005548789'])
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005548790 from monmap...
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Removing monitor np0005548790 from monmap...
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005548790 from np0005548790.localdomain -- ports []
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005548790 from np0005548790.localdomain -- ports []
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: client.54179 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: client.44398 ms_handle_reset on v2:172.18.0.104:3300/0
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@2(peon) e14  my rank is now 1 (was 2)
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: log_channel(cluster) log [INF] : mon.np0005548789 calling monitor election
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: paxos.1).electionLogic(62) init, last seen epoch 62
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:13 np0005548789.localdomain sudo[303808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:09:13 np0005548789.localdomain sudo[303808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548789.localdomain sudo[303808]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: from='client.44488 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005548790"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: Remove daemons mon.np0005548790
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: Safe to remove mon.np0005548790: new quorum should be ['np0005548788', 'np0005548789'] (from ['np0005548788', 'np0005548789'])
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: Removing monitor np0005548790 from monmap...
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: Removing daemon mon.np0005548790 from np0005548790.localdomain -- ports []
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789 calling monitor election
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548788 calling monitor election
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548788 is new leader, mons np0005548788,np0005548789 in quorum (ranks 0,1)
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: monmap epoch 14
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: last_changed 2025-12-06T10:09:13.351903+0000
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: min_mon_release 18 (reef)
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: election_strategy: 1
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: mgrmap e32: np0005548789.mzhmje(active, since 53s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:13 np0005548789.localdomain ceph-mon[298582]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:09:13 np0005548789.localdomain sudo[303826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:09:13 np0005548789.localdomain sudo[303826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548789.localdomain sudo[303826]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:13 np0005548789.localdomain sudo[303844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:13 np0005548789.localdomain sudo[303844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548789.localdomain sudo[303844]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:13 np0005548789.localdomain sudo[303862]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:13 np0005548789.localdomain sudo[303862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548789.localdomain sudo[303862]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:13 np0005548789.localdomain sudo[303880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:13 np0005548789.localdomain sudo[303880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548789.localdomain sudo[303880]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:13 np0005548789.localdomain sudo[303914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:13 np0005548789.localdomain sudo[303914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548789.localdomain sudo[303914]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:13 np0005548789.localdomain sudo[303932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:13 np0005548789.localdomain sudo[303932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548789.localdomain sudo[303932]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:13 np0005548789.localdomain sudo[303950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:09:13 np0005548789.localdomain sudo[303950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548789.localdomain sudo[303950]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:13 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:13 np0005548789.localdomain sudo[303968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:13 np0005548789.localdomain sudo[303968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:13 np0005548789.localdomain sudo[303968]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548789.localdomain sudo[303986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:14 np0005548789.localdomain sudo[303986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548789.localdomain sudo[303986]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548789.localdomain sudo[304004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:14 np0005548789.localdomain sudo[304004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548789.localdomain sudo[304004]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:14 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:14 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:14 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:14 np0005548789.localdomain sudo[304022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:14 np0005548789.localdomain sudo[304022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548789.localdomain sudo[304022]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548789.localdomain sudo[304040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:14 np0005548789.localdomain sudo[304040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548789.localdomain sudo[304040]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548789.localdomain sudo[304074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:14 np0005548789.localdomain sudo[304074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548789.localdomain sudo[304074]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548789.localdomain sudo[304092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:14 np0005548789.localdomain sudo[304092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548789.localdomain sudo[304092]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548789.localdomain sudo[304110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:14 np0005548789.localdomain sudo[304110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:14 np0005548789.localdomain sudo[304110]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:14 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:14 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:14 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:14 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] update: starting ev f6dd0353-489f-4408-b0d7-0fd7319bdf9d (Updating node-proxy deployment (+3 -> 3))
Dec 06 10:09:15 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] complete: finished ev f6dd0353-489f-4408-b0d7-0fd7319bdf9d (Updating node-proxy deployment (+3 -> 3))
Dec 06 10:09:15 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Completed event f6dd0353-489f-4408-b0d7-0fd7319bdf9d (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Dec 06 10:09:15 np0005548789.localdomain sudo[304128]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:09:15 np0005548789.localdomain sudo[304128]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:15 np0005548789.localdomain sudo[304128]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:15 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:09:15 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:09:15 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:09:15 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:09:15 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:15 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Writing back 50 completed events
Dec 06 10:09:15 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:15 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:09:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:15.820 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:15.821 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:15.822 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:09:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:15.822 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:15.864 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:09:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:15.864 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:16 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Dec 06 10:09:16 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Dec 06 10:09:16 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:09:16 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:09:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:09:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:09:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:09:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:09:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:09:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:09:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:09:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:09:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:09:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:09:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:09:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:09:16 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:09:16 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:09:16 np0005548789.localdomain ceph-mon[298582]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:16 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:16 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:16 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:09:16 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:17 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Dec 06 10:09:17 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Dec 06 10:09:17 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:09:17 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:09:17 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:17 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:09:17 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:09:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:09:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:18 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:09:18 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:09:18 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:09:18 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:09:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:18 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:09:18 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:09:18 np0005548789.localdomain ceph-mon[298582]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:19 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:09:19 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:09:19 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:09:19 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:09:19 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:19 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:09:19 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:09:19 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:19 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:19 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:19 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:19 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:09:20 np0005548789.localdomain sudo[304146]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:20 np0005548789.localdomain sudo[304146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:20 np0005548789.localdomain sudo[304146]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:09:20 np0005548789.localdomain sudo[304170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:20 np0005548789.localdomain sudo[304170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:20 np0005548789.localdomain podman[304164]: 2025-12-06 10:09:20.32184571 +0000 UTC m=+0.093027535 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:09:20 np0005548789.localdomain podman[304164]: 2025-12-06 10:09:20.355202216 +0000 UTC m=+0.126384051 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:09:20 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [balancer INFO root] Optimize plan auto_2025-12-06_10:09:20
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [balancer INFO root] do_upmap
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [balancer INFO root] pools ['backups', 'manila_metadata', 'volumes', 'vms', 'images', 'manila_data', '.mgr']
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [balancer INFO root] prepared 0/10 changes
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] _maybe_adjust
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0033250017448352874 of space, bias 1.0, pg target 0.6650003489670575 quantized to 32 (current 32)
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32)
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019596681323283084 quantized to 16 (current 16)
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [volumes INFO mgr_util] scanning for idle connections..
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [volumes INFO mgr_util] cleaning up connections: []
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 06 10:09:20 np0005548789.localdomain podman[304219]: 
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 06 10:09:20 np0005548789.localdomain ceph-mgr[288591]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 06 10:09:20 np0005548789.localdomain podman[304219]: 2025-12-06 10:09:20.720957901 +0000 UTC m=+0.065901987 container create 6e4737215ee694683c0f9a82af81724c69b513afc301c921570d671606094687 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jackson, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 06 10:09:20 np0005548789.localdomain systemd[1]: Started libpod-conmon-6e4737215ee694683c0f9a82af81724c69b513afc301c921570d671606094687.scope.
Dec 06 10:09:20 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:20 np0005548789.localdomain podman[304219]: 2025-12-06 10:09:20.699122712 +0000 UTC m=+0.044066778 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:20 np0005548789.localdomain podman[304219]: 2025-12-06 10:09:20.80720159 +0000 UTC m=+0.152145716 container init 6e4737215ee694683c0f9a82af81724c69b513afc301c921570d671606094687 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jackson, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, architecture=x86_64, io.buildah.version=1.41.4, release=1763362218, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, vendor=Red Hat, Inc.)
Dec 06 10:09:20 np0005548789.localdomain podman[304219]: 2025-12-06 10:09:20.815342976 +0000 UTC m=+0.160287072 container start 6e4737215ee694683c0f9a82af81724c69b513afc301c921570d671606094687 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jackson, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, ceph=True, GIT_BRANCH=main, release=1763362218, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, name=rhceph)
Dec 06 10:09:20 np0005548789.localdomain podman[304219]: 2025-12-06 10:09:20.815691976 +0000 UTC m=+0.160636112 container attach 6e4737215ee694683c0f9a82af81724c69b513afc301c921570d671606094687 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jackson, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, io.buildah.version=1.41.4, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, version=7, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, release=1763362218)
Dec 06 10:09:20 np0005548789.localdomain jovial_jackson[304234]: 167 167
Dec 06 10:09:20 np0005548789.localdomain systemd[1]: libpod-6e4737215ee694683c0f9a82af81724c69b513afc301c921570d671606094687.scope: Deactivated successfully.
Dec 06 10:09:20 np0005548789.localdomain podman[304219]: 2025-12-06 10:09:20.82144264 +0000 UTC m=+0.166386776 container died 6e4737215ee694683c0f9a82af81724c69b513afc301c921570d671606094687 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jackson, GIT_BRANCH=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=7, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, RELEASE=main, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:09:20 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:09:20 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:09:20 np0005548789.localdomain ceph-mon[298582]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:20 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:20.865 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:20.868 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:20.868 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:09:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:20.868 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:20.913 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:09:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:20.914 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:20 np0005548789.localdomain podman[304239]: 2025-12-06 10:09:20.933313832 +0000 UTC m=+0.100067938 container remove 6e4737215ee694683c0f9a82af81724c69b513afc301c921570d671606094687 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_jackson, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vcs-type=git)
Dec 06 10:09:20 np0005548789.localdomain systemd[1]: libpod-conmon-6e4737215ee694683c0f9a82af81724c69b513afc301c921570d671606094687.scope: Deactivated successfully.
Dec 06 10:09:20 np0005548789.localdomain sudo[304170]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:21 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Dec 06 10:09:21 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Dec 06 10:09:21 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:09:21 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:09:21 np0005548789.localdomain sudo[304253]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:21 np0005548789.localdomain sudo[304253]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:09:21 np0005548789.localdomain sudo[304253]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:21 np0005548789.localdomain sudo[304272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:21 np0005548789.localdomain sudo[304272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:21 np0005548789.localdomain podman[304271]: 2025-12-06 10:09:21.173164342 +0000 UTC m=+0.073343171 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:09:21 np0005548789.localdomain podman[304271]: 2025-12-06 10:09:21.192128023 +0000 UTC m=+0.092306782 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:09:21 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:09:21 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:21 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-611581aab770f0d2936712ca6cd3f2d8c9fe954b0e76cfcbcb173c0d435482f3-merged.mount: Deactivated successfully.
Dec 06 10:09:21 np0005548789.localdomain podman[304328]: 
Dec 06 10:09:21 np0005548789.localdomain podman[304328]: 2025-12-06 10:09:21.537447893 +0000 UTC m=+0.048789082 container create a9cac8564f7819ca77ca51a7098684245a5f57c398d249d124f49ba77b1b44bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_diffie, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, RELEASE=main, distribution-scope=public, vcs-type=git, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True)
Dec 06 10:09:21 np0005548789.localdomain systemd[1]: Started libpod-conmon-a9cac8564f7819ca77ca51a7098684245a5f57c398d249d124f49ba77b1b44bd.scope.
Dec 06 10:09:21 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:21 np0005548789.localdomain podman[304328]: 2025-12-06 10:09:21.598828033 +0000 UTC m=+0.110169202 container init a9cac8564f7819ca77ca51a7098684245a5f57c398d249d124f49ba77b1b44bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_diffie, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.expose-services=, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, distribution-scope=public, ceph=True, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64)
Dec 06 10:09:21 np0005548789.localdomain podman[304328]: 2025-12-06 10:09:21.606620487 +0000 UTC m=+0.117961656 container start a9cac8564f7819ca77ca51a7098684245a5f57c398d249d124f49ba77b1b44bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_diffie, io.buildah.version=1.41.4, RELEASE=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, version=7, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, architecture=x86_64, release=1763362218, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:09:21 np0005548789.localdomain podman[304328]: 2025-12-06 10:09:21.606734881 +0000 UTC m=+0.118076050 container attach a9cac8564f7819ca77ca51a7098684245a5f57c398d249d124f49ba77b1b44bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_diffie, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.component=rhceph-container, name=rhceph, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.openshift.expose-services=)
Dec 06 10:09:21 np0005548789.localdomain thirsty_diffie[304343]: 167 167
Dec 06 10:09:21 np0005548789.localdomain systemd[1]: libpod-a9cac8564f7819ca77ca51a7098684245a5f57c398d249d124f49ba77b1b44bd.scope: Deactivated successfully.
Dec 06 10:09:21 np0005548789.localdomain podman[304328]: 2025-12-06 10:09:21.610508515 +0000 UTC m=+0.121849674 container died a9cac8564f7819ca77ca51a7098684245a5f57c398d249d124f49ba77b1b44bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_diffie, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, version=7, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, release=1763362218, vcs-type=git, ceph=True)
Dec 06 10:09:21 np0005548789.localdomain podman[304328]: 2025-12-06 10:09:21.516345766 +0000 UTC m=+0.027686935 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:21 np0005548789.localdomain podman[304348]: 2025-12-06 10:09:21.706236221 +0000 UTC m=+0.082690524 container remove a9cac8564f7819ca77ca51a7098684245a5f57c398d249d124f49ba77b1b44bd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_diffie, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, release=1763362218, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, name=rhceph)
Dec 06 10:09:21 np0005548789.localdomain systemd[1]: libpod-conmon-a9cac8564f7819ca77ca51a7098684245a5f57c398d249d124f49ba77b1b44bd.scope: Deactivated successfully.
Dec 06 10:09:21 np0005548789.localdomain sudo[304272]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:21 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Dec 06 10:09:21 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Dec 06 10:09:21 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:09:21 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:09:21 np0005548789.localdomain sudo[304372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:21 np0005548789.localdomain sudo[304372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:21 np0005548789.localdomain sudo[304372]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:22 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:09:22 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:09:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:09:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:09:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:22 np0005548789.localdomain sudo[304390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:22 np0005548789.localdomain sudo[304390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5c7cc5a5d0690136dea4bb69342d81df8a57b4f96ab089a52c849728e0ab90c3-merged.mount: Deactivated successfully.
Dec 06 10:09:22 np0005548789.localdomain podman[304424]: 
Dec 06 10:09:22 np0005548789.localdomain podman[304424]: 2025-12-06 10:09:22.576015889 +0000 UTC m=+0.101119050 container create 569bf7cec4cdf9541b9e157b2039f3939248bf1dde9d75105a43956e8e9155fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_dirac, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-type=git, name=rhceph, architecture=x86_64, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, distribution-scope=public, GIT_CLEAN=True, release=1763362218, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:09:22 np0005548789.localdomain systemd[1]: Started libpod-conmon-569bf7cec4cdf9541b9e157b2039f3939248bf1dde9d75105a43956e8e9155fa.scope.
Dec 06 10:09:22 np0005548789.localdomain podman[304424]: 2025-12-06 10:09:22.523343982 +0000 UTC m=+0.048447163 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:22 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:22 np0005548789.localdomain podman[304424]: 2025-12-06 10:09:22.666823956 +0000 UTC m=+0.191927117 container init 569bf7cec4cdf9541b9e157b2039f3939248bf1dde9d75105a43956e8e9155fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_dirac, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, CEPH_POINT_RELEASE=, version=7, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, com.redhat.component=rhceph-container)
Dec 06 10:09:22 np0005548789.localdomain podman[304424]: 2025-12-06 10:09:22.682798738 +0000 UTC m=+0.207901899 container start 569bf7cec4cdf9541b9e157b2039f3939248bf1dde9d75105a43956e8e9155fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_dirac, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, distribution-scope=public, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z)
Dec 06 10:09:22 np0005548789.localdomain podman[304424]: 2025-12-06 10:09:22.683052906 +0000 UTC m=+0.208156067 container attach 569bf7cec4cdf9541b9e157b2039f3939248bf1dde9d75105a43956e8e9155fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_dirac, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, build-date=2025-11-26T19:44:28Z, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.buildah.version=1.41.4, io.openshift.expose-services=, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7)
Dec 06 10:09:22 np0005548789.localdomain vigorous_dirac[304439]: 167 167
Dec 06 10:09:22 np0005548789.localdomain systemd[1]: libpod-569bf7cec4cdf9541b9e157b2039f3939248bf1dde9d75105a43956e8e9155fa.scope: Deactivated successfully.
Dec 06 10:09:22 np0005548789.localdomain podman[304424]: 2025-12-06 10:09:22.686471549 +0000 UTC m=+0.211574710 container died 569bf7cec4cdf9541b9e157b2039f3939248bf1dde9d75105a43956e8e9155fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_dirac, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_CLEAN=True, RELEASE=main, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:09:22 np0005548789.localdomain podman[304444]: 2025-12-06 10:09:22.770146311 +0000 UTC m=+0.076164797 container remove 569bf7cec4cdf9541b9e157b2039f3939248bf1dde9d75105a43956e8e9155fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_dirac, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:09:22 np0005548789.localdomain systemd[1]: libpod-conmon-569bf7cec4cdf9541b9e157b2039f3939248bf1dde9d75105a43956e8e9155fa.scope: Deactivated successfully.
Dec 06 10:09:22 np0005548789.localdomain sudo[304390]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:22 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:09:22 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:09:22 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:09:22 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:09:23 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:09:23 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:09:23 np0005548789.localdomain ceph-mon[298582]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:23 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:09:23 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:09:23 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:23 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:23 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:09:23 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:23 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:23 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:09:23 np0005548789.localdomain sudo[304469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:23 np0005548789.localdomain sudo[304469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:23 np0005548789.localdomain sudo[304469]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:23 np0005548789.localdomain sudo[304487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:23 np0005548789.localdomain sudo[304487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:23 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-66130f7b57de69b0e3bdc61098a6f4dbf902ffa1162a5e5655a4955145a639f2-merged.mount: Deactivated successfully.
Dec 06 10:09:23 np0005548789.localdomain podman[304522]: 
Dec 06 10:09:23 np0005548789.localdomain podman[304522]: 2025-12-06 10:09:23.609961006 +0000 UTC m=+0.072236788 container create e8649da0f2dff4fe654db5f6715d5360bce5b3ed589db7263bcb4f7da69a14ae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_curie, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, release=1763362218, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, version=7, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Dec 06 10:09:23 np0005548789.localdomain systemd[1]: Started libpod-conmon-e8649da0f2dff4fe654db5f6715d5360bce5b3ed589db7263bcb4f7da69a14ae.scope.
Dec 06 10:09:23 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:23 np0005548789.localdomain podman[304522]: 2025-12-06 10:09:23.576404715 +0000 UTC m=+0.038680527 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:23 np0005548789.localdomain podman[304522]: 2025-12-06 10:09:23.681796051 +0000 UTC m=+0.144071833 container init e8649da0f2dff4fe654db5f6715d5360bce5b3ed589db7263bcb4f7da69a14ae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_curie, version=7, name=rhceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git, RELEASE=main, GIT_CLEAN=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:09:23 np0005548789.localdomain elastic_curie[304537]: 167 167
Dec 06 10:09:23 np0005548789.localdomain podman[304522]: 2025-12-06 10:09:23.693795273 +0000 UTC m=+0.156071045 container start e8649da0f2dff4fe654db5f6715d5360bce5b3ed589db7263bcb4f7da69a14ae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_curie, build-date=2025-11-26T19:44:28Z, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, RELEASE=main, version=7, distribution-scope=public, io.openshift.tags=rhceph ceph)
Dec 06 10:09:23 np0005548789.localdomain systemd[1]: libpod-e8649da0f2dff4fe654db5f6715d5360bce5b3ed589db7263bcb4f7da69a14ae.scope: Deactivated successfully.
Dec 06 10:09:23 np0005548789.localdomain podman[304522]: 2025-12-06 10:09:23.694136144 +0000 UTC m=+0.156412156 container attach e8649da0f2dff4fe654db5f6715d5360bce5b3ed589db7263bcb4f7da69a14ae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_curie, build-date=2025-11-26T19:44:28Z, version=7, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:09:23 np0005548789.localdomain podman[304522]: 2025-12-06 10:09:23.697966479 +0000 UTC m=+0.160242321 container died e8649da0f2dff4fe654db5f6715d5360bce5b3ed589db7263bcb4f7da69a14ae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_curie, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, release=1763362218, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, RELEASE=main, name=rhceph, io.buildah.version=1.41.4, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7)
Dec 06 10:09:23 np0005548789.localdomain podman[304542]: 2025-12-06 10:09:23.79886214 +0000 UTC m=+0.089565101 container remove e8649da0f2dff4fe654db5f6715d5360bce5b3ed589db7263bcb4f7da69a14ae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_curie, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, release=1763362218, CEPH_POINT_RELEASE=, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:09:23 np0005548789.localdomain systemd[1]: libpod-conmon-e8649da0f2dff4fe654db5f6715d5360bce5b3ed589db7263bcb4f7da69a14ae.scope: Deactivated successfully.
Dec 06 10:09:23 np0005548789.localdomain sudo[304487]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:23 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:09:23 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:09:23 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:09:23 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:09:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:09:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:09:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:09:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:09:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:09:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19241 "" "Go-http-client/1.1"
Dec 06 10:09:23 np0005548789.localdomain sudo[304556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:24 np0005548789.localdomain sudo[304556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:24 np0005548789.localdomain sudo[304556]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:24 np0005548789.localdomain sudo[304574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:24 np0005548789.localdomain sudo[304574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-32cf92c56e986e9cf0392dcebe075c9952a8504acfcf5920943c983ef8630b5c-merged.mount: Deactivated successfully.
Dec 06 10:09:24 np0005548789.localdomain podman[304610]: 
Dec 06 10:09:24 np0005548789.localdomain podman[304610]: 2025-12-06 10:09:24.534701231 +0000 UTC m=+0.066310230 container create 590e29022460d2dde343d7968e63e9b340f4a4fddf6dd24e3b4368d22477ba34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_pascal, name=rhceph, description=Red Hat Ceph Storage 7, RELEASE=main, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:09:24 np0005548789.localdomain systemd[1]: Started libpod-conmon-590e29022460d2dde343d7968e63e9b340f4a4fddf6dd24e3b4368d22477ba34.scope.
Dec 06 10:09:24 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:24 np0005548789.localdomain podman[304610]: 2025-12-06 10:09:24.501750948 +0000 UTC m=+0.033359977 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:24 np0005548789.localdomain podman[304610]: 2025-12-06 10:09:24.605195176 +0000 UTC m=+0.136804165 container init 590e29022460d2dde343d7968e63e9b340f4a4fddf6dd24e3b4368d22477ba34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_pascal, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_BRANCH=main, release=1763362218, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, name=rhceph, RELEASE=main, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, vcs-type=git)
Dec 06 10:09:24 np0005548789.localdomain podman[304610]: 2025-12-06 10:09:24.615247298 +0000 UTC m=+0.146856287 container start 590e29022460d2dde343d7968e63e9b340f4a4fddf6dd24e3b4368d22477ba34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_pascal, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, release=1763362218, RELEASE=main, version=7, name=rhceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4)
Dec 06 10:09:24 np0005548789.localdomain podman[304610]: 2025-12-06 10:09:24.61560394 +0000 UTC m=+0.147212929 container attach 590e29022460d2dde343d7968e63e9b340f4a4fddf6dd24e3b4368d22477ba34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_pascal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.41.4, ceph=True, version=7, release=1763362218, architecture=x86_64, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph)
Dec 06 10:09:24 np0005548789.localdomain elastic_pascal[304625]: 167 167
Dec 06 10:09:24 np0005548789.localdomain systemd[1]: libpod-590e29022460d2dde343d7968e63e9b340f4a4fddf6dd24e3b4368d22477ba34.scope: Deactivated successfully.
Dec 06 10:09:24 np0005548789.localdomain podman[304610]: 2025-12-06 10:09:24.620148756 +0000 UTC m=+0.151757755 container died 590e29022460d2dde343d7968e63e9b340f4a4fddf6dd24e3b4368d22477ba34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_pascal, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_BRANCH=main, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, distribution-scope=public, com.redhat.component=rhceph-container, release=1763362218)
Dec 06 10:09:24 np0005548789.localdomain podman[304630]: 2025-12-06 10:09:24.728129061 +0000 UTC m=+0.095405126 container remove 590e29022460d2dde343d7968e63e9b340f4a4fddf6dd24e3b4368d22477ba34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_pascal, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git)
Dec 06 10:09:24 np0005548789.localdomain systemd[1]: libpod-conmon-590e29022460d2dde343d7968e63e9b340f4a4fddf6dd24e3b4368d22477ba34.scope: Deactivated successfully.
Dec 06 10:09:24 np0005548789.localdomain sudo[304574]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:24 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:09:24 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:09:24 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:09:24 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:09:24 np0005548789.localdomain ceph-mon[298582]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:24 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:09:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:24 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:09:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:25 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2fff68c4181a8b7a4e6d1d95fc9f12e1e4bd51405bbc9ff7551e61a53747ac48-merged.mount: Deactivated successfully.
Dec 06 10:09:25 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Dec 06 10:09:25 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Dec 06 10:09:25 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:09:25 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:09:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:25.915 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:25.918 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:25.919 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:09:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:25.919 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:25 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:09:25 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:09:25 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:25 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:25 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:09:25 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:25.961 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:09:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:25.963 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:26 np0005548789.localdomain ceph-mgr[288591]: log_channel(audit) log [DBG] : from='client.54285 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005548790.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:09:26 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:09:26 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:09:26 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Dec 06 10:09:26 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Dec 06 10:09:26 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:09:26 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:09:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:09:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:09:26 np0005548789.localdomain ceph-mon[298582]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:26 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:09:26 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:09:26 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:26 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:09:26 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:26 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:26 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:26 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:09:26 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:26 np0005548789.localdomain podman[304648]: 2025-12-06 10:09:26.937839209 +0000 UTC m=+0.093655174 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:09:26 np0005548789.localdomain podman[304648]: 2025-12-06 10:09:26.976890216 +0000 UTC m=+0.132706171 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Dec 06 10:09:26 np0005548789.localdomain podman[304647]: 2025-12-06 10:09:26.984403883 +0000 UTC m=+0.142878278 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:09:26 np0005548789.localdomain podman[304647]: 2025-12-06 10:09:26.996479407 +0000 UTC m=+0.154953792 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Dec 06 10:09:27 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:09:27 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:09:27 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:27 np0005548789.localdomain ceph-mon[298582]: from='client.54285 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005548790.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:09:27 np0005548789.localdomain ceph-mon[298582]: Deploying daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:09:27 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:09:27 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:09:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:28.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:28.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:09:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:28.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:09:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:28.586 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:09:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:28.587 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:09:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:28.587 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:09:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:28.587 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:09:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e14  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Dec 06 10:09:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e14  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Dec 06 10:09:28 np0005548789.localdomain ceph-mon[298582]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:28.986 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.000 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.001 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.002 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.018 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.019 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.019 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.020 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.020 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:09:29 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:29 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e14 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:09:29 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/788914456' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.472 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:09:29 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e14  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Dec 06 10:09:29 np0005548789.localdomain ceph-mgr[288591]: mgr.server handle_open ignoring open from mon.np0005548790 172.18.0.108:0/3264568232; not ready for session (expect reconnect)
Dec 06 10:09:29 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:09:29 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:09:29 np0005548789.localdomain ceph-mgr[288591]: mgr finish mon failed to return metadata for mon.np0005548790: (2) No such file or directory
Dec 06 10:09:29 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:09:29 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:09:29 np0005548789.localdomain ceph-mon[298582]: log_channel(cluster) log [INF] : mon.np0005548789 calling monitor election
Dec 06 10:09:29 np0005548789.localdomain ceph-mon[298582]: paxos.1).electionLogic(64) init, last seen epoch 64
Dec 06 10:09:29 np0005548789.localdomain ceph-mgr[288591]: mgr finish mon failed to return metadata for mon.np0005548790: (22) Invalid argument
Dec 06 10:09:29 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:29 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.532 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.533 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.687 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.688 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11483MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.688 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.689 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.754 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.754 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.755 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:09:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:29.799 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:09:29 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:30 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:30 np0005548789.localdomain ceph-mgr[288591]: mgr.server handle_open ignoring open from mon.np0005548790 172.18.0.108:0/3264568232; not ready for session (expect reconnect)
Dec 06 10:09:30 np0005548789.localdomain ceph-mgr[288591]: mgr finish mon failed to return metadata for mon.np0005548790: (22) Invalid argument
Dec 06 10:09:30 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:09:30 np0005548789.localdomain podman[304718]: 2025-12-06 10:09:30.921888532 +0000 UTC m=+0.081279311 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 06 10:09:30 np0005548789.localdomain podman[304718]: 2025-12-06 10:09:30.935971787 +0000 UTC m=+0.095362566 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:09:30 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:09:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:30.964 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:30.966 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:30.966 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:09:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:30.967 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:31.001 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:09:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:31.001 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:31 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:31 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:31 np0005548789.localdomain ceph-mgr[288591]: mgr.server handle_open ignoring open from mon.np0005548790 172.18.0.108:0/3264568232; not ready for session (expect reconnect)
Dec 06 10:09:31 np0005548789.localdomain ceph-mgr[288591]: mgr finish mon failed to return metadata for mon.np0005548790: (22) Invalid argument
Dec 06 10:09:32 np0005548789.localdomain ceph-mgr[288591]: mgr.server handle_open ignoring open from mon.np0005548790 172.18.0.108:0/3264568232; not ready for session (expect reconnect)
Dec 06 10:09:32 np0005548789.localdomain ceph-mgr[288591]: mgr finish mon failed to return metadata for mon.np0005548790: (22) Invalid argument
Dec 06 10:09:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:33 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:33 np0005548789.localdomain ceph-mgr[288591]: mgr.server handle_open ignoring open from mon.np0005548790 172.18.0.108:0/3264568232; not ready for session (expect reconnect)
Dec 06 10:09:33 np0005548789.localdomain ceph-mgr[288591]: mgr finish mon failed to return metadata for mon.np0005548790: (22) Invalid argument
Dec 06 10:09:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:34 np0005548789.localdomain ceph-mgr[288591]: mgr.server handle_open ignoring open from mon.np0005548790 172.18.0.108:0/3264568232; not ready for session (expect reconnect)
Dec 06 10:09:34 np0005548789.localdomain ceph-mgr[288591]: mgr finish mon failed to return metadata for mon.np0005548790: (22) Invalid argument
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e15 handle_auth_request failed to assign global_id
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:34 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:09:34 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:09:34 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:09:34 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548790.vhcezv (monmap changed)...
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548790.vhcezv on np0005548790.localdomain
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: mon.np0005548788 calling monitor election
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789 calling monitor election
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: mon.np0005548788 is new leader, mons np0005548788,np0005548789 in quorum (ranks 0,1)
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: monmap epoch 15
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: last_changed 2025-12-06T10:09:29.475464+0000
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: min_mon_release 18 (reef)
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: election_strategy: 1
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548790
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: mgrmap e32: np0005548789.mzhmje(active, since 74s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: Health check failed: 1/3 mons down, quorum np0005548788,np0005548789 (MON_DOWN)
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005548788,np0005548789
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005548788,np0005548789
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]:     mon.np0005548790 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum)
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548790.kvkfyr", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:35 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:35 np0005548789.localdomain ceph-mgr[288591]: mgr.server handle_open ignoring open from mon.np0005548790 172.18.0.108:0/3264568232; not ready for session (expect reconnect)
Dec 06 10:09:35 np0005548789.localdomain ceph-mgr[288591]: mgr finish mon failed to return metadata for mon.np0005548790: (22) Invalid argument
Dec 06 10:09:35 np0005548789.localdomain sudo[304738]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:35 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548790.kvkfyr (monmap changed)...
Dec 06 10:09:35 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548790.kvkfyr on np0005548790.localdomain
Dec 06 10:09:35 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/862062483' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:09:35 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:35 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:35 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:35 np0005548789.localdomain sudo[304738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:09:35 np0005548789.localdomain sudo[304738]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:35 np0005548789.localdomain sudo[304762]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:09:35 np0005548789.localdomain sudo[304762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:35 np0005548789.localdomain podman[304756]: 2025-12-06 10:09:35.732268884 +0000 UTC m=+0.084020984 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:09:35 np0005548789.localdomain podman[304756]: 2025-12-06 10:09:35.742482381 +0000 UTC m=+0.094234481 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:09:35 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:09:35 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:09:35 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3172055493' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.002 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.004 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.004 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.004 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.036 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.037 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2247057544' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.297 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 6.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.303 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.320 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.322 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.323 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 6.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:09:36 np0005548789.localdomain sudo[304762]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:36 np0005548789.localdomain ceph-mgr[288591]: mgr.server handle_open ignoring open from mon.np0005548790 172.18.0.108:0/3264568232; not ready for session (expect reconnect)
Dec 06 10:09:36 np0005548789.localdomain ceph-mgr[288591]: mgr finish mon failed to return metadata for mon.np0005548790: (22) Invalid argument
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.502 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.503 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.503 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.504 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.504 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.504 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.504 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:36.505 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: log_channel(cluster) log [INF] : mon.np0005548789 calling monitor election
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: paxos.1).electionLogic(66) init, last seen epoch 66
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548790 calling monitor election
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548790 calling monitor election
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789 calling monitor election
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548788 calling monitor election
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548788 is new leader, mons np0005548788,np0005548789,np0005548790 in quorum (ranks 0,1,2)
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: monmap epoch 15
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: last_changed 2025-12-06T10:09:29.475464+0000
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: created 2025-12-06T07:57:14.295835+0000
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: min_mon_release 18 (reef)
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: election_strategy: 1
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005548788
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005548789
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005548790
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: fsmap cephfs:1 {0=mds.np0005548790.vhcezv=up:active} 2 up:standby
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: osdmap e92: 6 total, 6 up, 6 in
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: mgrmap e32: np0005548789.mzhmje(active, since 76s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005548788,np0005548789)
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]:     stray daemon mgr.np0005548785.vhqlsq on host np0005548785.localdomain not managed by cephadm
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:09:36 np0005548789.localdomain ceph-mon[298582]:     stray host np0005548785.localdomain has 1 stray daemons: ['mgr.np0005548785.vhqlsq']
Dec 06 10:09:37 np0005548789.localdomain sshd[304839]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:09:37 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:37 np0005548789.localdomain ceph-mgr[288591]: mgr.server handle_open ignoring open from mon.np0005548790 172.18.0.108:0/3264568232; not ready for session (expect reconnect)
Dec 06 10:09:37 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2288131588' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:09:37 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:37 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:37 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:37 np0005548789.localdomain sshd[304839]: Received disconnect from 64.227.102.57 port 44878:11: Bye Bye [preauth]
Dec 06 10:09:37 np0005548789.localdomain sshd[304839]: Disconnected from authenticating user root 64.227.102.57 port 44878 [preauth]
Dec 06 10:09:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:38 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:38 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:38 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:38 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:38 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:38 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:38 np0005548789.localdomain ceph-mgr[288591]: mgr.server handle_report got status from non-daemon mon.np0005548790
Dec 06 10:09:38 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:38.480+0000 7f041dcc8640 -1 mgr.server handle_report got status from non-daemon mon.np0005548790
Dec 06 10:09:38 np0005548789.localdomain sudo[304841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:09:38 np0005548789.localdomain sudo[304841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:38 np0005548789.localdomain sudo[304841]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:38 np0005548789.localdomain sudo[304859]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:09:38 np0005548789.localdomain sudo[304859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:38 np0005548789.localdomain sudo[304859]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:38 np0005548789.localdomain sudo[304877]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:38 np0005548789.localdomain sudo[304877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:38 np0005548789.localdomain sudo[304877]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:38 np0005548789.localdomain sudo[304895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:38 np0005548789.localdomain sudo[304895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:38 np0005548789.localdomain sudo[304895]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:38 np0005548789.localdomain ceph-mon[298582]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:09:38 np0005548789.localdomain sudo[304913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:38 np0005548789.localdomain sudo[304913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:38 np0005548789.localdomain sudo[304913]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:38 np0005548789.localdomain sudo[304947]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:38 np0005548789.localdomain sudo[304947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:38 np0005548789.localdomain sudo[304947]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:09:38 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2596033626' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:09:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:09:38 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2596033626' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:09:38 np0005548789.localdomain sudo[304965]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:38 np0005548789.localdomain sudo[304965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:38 np0005548789.localdomain sudo[304965]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548789.localdomain sudo[304983]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:09:39 np0005548789.localdomain sudo[304983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548789.localdomain sudo[304983]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:39 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:39 np0005548789.localdomain sudo[305001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:39 np0005548789.localdomain sudo[305001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548789.localdomain sudo[305001]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:39 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:39 np0005548789.localdomain sudo[305019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:39 np0005548789.localdomain sudo[305019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548789.localdomain sudo[305019]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:39 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:39 np0005548789.localdomain sudo[305037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:39 np0005548789.localdomain sudo[305037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548789.localdomain sudo[305037]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:39 np0005548789.localdomain sudo[305055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:39 np0005548789.localdomain sudo[305055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548789.localdomain sudo[305055]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548789.localdomain sudo[305073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:39 np0005548789.localdomain sudo[305073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548789.localdomain sudo[305073]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548789.localdomain sudo[305107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:39 np0005548789.localdomain sudo[305107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548789.localdomain sudo[305107]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548789.localdomain sudo[305125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:39 np0005548789.localdomain sudo[305125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548789.localdomain sudo[305125]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548789.localdomain sudo[305143]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:39 np0005548789.localdomain sudo[305143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:39 np0005548789.localdomain sudo[305143]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:39 np0005548789.localdomain sshd[305161]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:09:39 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] update: starting ev 1f510e1d-7b73-4412-b039-2de466b2d3e6 (Updating node-proxy deployment (+3 -> 3))
Dec 06 10:09:39 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] complete: finished ev 1f510e1d-7b73-4412-b039-2de466b2d3e6 (Updating node-proxy deployment (+3 -> 3))
Dec 06 10:09:39 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Completed event 1f510e1d-7b73-4412-b039-2de466b2d3e6 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Dec 06 10:09:39 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:39 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:39 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2596033626' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:09:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2596033626' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:09:39 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:39 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:40 np0005548789.localdomain sudo[305163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:09:40 np0005548789.localdomain sudo[305163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:40 np0005548789.localdomain sudo[305163]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:40 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:09:40 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:09:40 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:09:40 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:09:40 np0005548789.localdomain ceph-mgr[288591]: [progress INFO root] Writing back 50 completed events
Dec 06 10:09:40 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:40 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:40 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:40 np0005548789.localdomain ceph-mon[298582]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:40 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:40 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:09:40 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548788.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:40 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:40 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/3895678344' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:09:40 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:41.081 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:41.084 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:41.084 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5047 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:09:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:41.085 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:41.086 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:41 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Dec 06 10:09:41 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Dec 06 10:09:41 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:09:41 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:09:41 np0005548789.localdomain sshd[305161]: Received disconnect from 14.194.101.210 port 54974:11: Bye Bye [preauth]
Dec 06 10:09:41 np0005548789.localdomain sshd[305161]: Disconnected from authenticating user root 14.194.101.210 port 54974 [preauth]
Dec 06 10:09:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:09:41 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:41 np0005548789.localdomain podman[305181]: 2025-12-06 10:09:41.377075778 +0000 UTC m=+0.086380745 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:09:41 np0005548789.localdomain podman[305181]: 2025-12-06 10:09:41.413865687 +0000 UTC m=+0.123170694 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:09:41 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:09:41 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548788 (monmap changed)...
Dec 06 10:09:41 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548788 on np0005548788.localdomain
Dec 06 10:09:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:09:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:42 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Dec 06 10:09:42 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Dec 06 10:09:42 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:09:42 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:09:43 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.2 (monmap changed)...
Dec 06 10:09:43 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:09:43 np0005548789.localdomain ceph-mon[298582]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:43 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:43 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:43 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:09:43 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:43 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:09:43 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:09:43 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:09:43 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:09:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:43 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:43 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:09:43 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:09:43 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:09:43 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:09:44 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.5 (monmap changed)...
Dec 06 10:09:44 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:09:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548788.erzujf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548788.yvwbqq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:44 np0005548789.localdomain ceph-mgr[288591]: log_channel(audit) log [DBG] : from='client.44541 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:09:44 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO root] Reconfig service osd.default_drive_group
Dec 06 10:09:44 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfig service osd.default_drive_group
Dec 06 10:09:44 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:09:44 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:09:44 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:09:44 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:09:44 np0005548789.localdomain sudo[305206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:44 np0005548789.localdomain sudo[305206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:44 np0005548789.localdomain sudo[305206]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:44 np0005548789.localdomain sudo[305224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:44 np0005548789.localdomain sudo[305224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548788.erzujf (monmap changed)...
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548788.erzujf on np0005548788.localdomain
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548788.yvwbqq (monmap changed)...
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548788.yvwbqq on np0005548788.localdomain
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='client.44541 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: Reconfig service osd.default_drive_group
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548789 (monmap changed)...
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548789.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548789 on np0005548789.localdomain
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:45 np0005548789.localdomain sshd[305242]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:09:45 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:45 np0005548789.localdomain podman[305261]: 
Dec 06 10:09:45 np0005548789.localdomain podman[305261]: 2025-12-06 10:09:45.461872219 +0000 UTC m=+0.081794076 container create 1453efeb067aa7ef657551153a9eeb617d8cd97f6a690338d20648c06f0e9657 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_kare, CEPH_POINT_RELEASE=, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, release=1763362218, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_CLEAN=True)
Dec 06 10:09:45 np0005548789.localdomain systemd[1]: Started libpod-conmon-1453efeb067aa7ef657551153a9eeb617d8cd97f6a690338d20648c06f0e9657.scope.
Dec 06 10:09:45 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:45 np0005548789.localdomain podman[305261]: 2025-12-06 10:09:45.427890195 +0000 UTC m=+0.047812092 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:45 np0005548789.localdomain podman[305261]: 2025-12-06 10:09:45.539310034 +0000 UTC m=+0.159231901 container init 1453efeb067aa7ef657551153a9eeb617d8cd97f6a690338d20648c06f0e9657 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_kare, RELEASE=main, build-date=2025-11-26T19:44:28Z, version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, ceph=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:09:45 np0005548789.localdomain systemd[1]: tmp-crun.XTJiFQ.mount: Deactivated successfully.
Dec 06 10:09:45 np0005548789.localdomain podman[305261]: 2025-12-06 10:09:45.562128241 +0000 UTC m=+0.182050108 container start 1453efeb067aa7ef657551153a9eeb617d8cd97f6a690338d20648c06f0e9657 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_kare, version=7, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., RELEASE=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, name=rhceph, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:09:45 np0005548789.localdomain podman[305261]: 2025-12-06 10:09:45.562530393 +0000 UTC m=+0.182452300 container attach 1453efeb067aa7ef657551153a9eeb617d8cd97f6a690338d20648c06f0e9657 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_kare, release=1763362218, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, ceph=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, name=rhceph, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7)
Dec 06 10:09:45 np0005548789.localdomain priceless_kare[305276]: 167 167
Dec 06 10:09:45 np0005548789.localdomain systemd[1]: libpod-1453efeb067aa7ef657551153a9eeb617d8cd97f6a690338d20648c06f0e9657.scope: Deactivated successfully.
Dec 06 10:09:45 np0005548789.localdomain podman[305261]: 2025-12-06 10:09:45.570809943 +0000 UTC m=+0.190731850 container died 1453efeb067aa7ef657551153a9eeb617d8cd97f6a690338d20648c06f0e9657 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_kare, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, version=7, GIT_CLEAN=True, name=rhceph, ceph=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=)
Dec 06 10:09:45 np0005548789.localdomain podman[305281]: 2025-12-06 10:09:45.667057814 +0000 UTC m=+0.083966412 container remove 1453efeb067aa7ef657551153a9eeb617d8cd97f6a690338d20648c06f0e9657 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_kare, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, build-date=2025-11-26T19:44:28Z, RELEASE=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_BRANCH=main, architecture=x86_64, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 06 10:09:45 np0005548789.localdomain systemd[1]: libpod-conmon-1453efeb067aa7ef657551153a9eeb617d8cd97f6a690338d20648c06f0e9657.scope: Deactivated successfully.
Dec 06 10:09:45 np0005548789.localdomain sudo[305224]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:45 np0005548789.localdomain sshd[305242]: Received disconnect from 179.33.210.213 port 35242:11: Bye Bye [preauth]
Dec 06 10:09:45 np0005548789.localdomain sshd[305242]: Disconnected from authenticating user root 179.33.210.213 port 35242 [preauth]
Dec 06 10:09:46 np0005548789.localdomain ceph-mon[298582]: pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:46 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Dec 06 10:09:46 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Dec 06 10:09:46 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:09:46 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:09:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:46.087 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:46 np0005548789.localdomain sudo[305297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:46 np0005548789.localdomain sudo[305297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:46 np0005548789.localdomain sudo[305297]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:46 np0005548789.localdomain sudo[305315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:46 np0005548789.localdomain sudo[305315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-679ae9f13e4f0608335b72542c899868251d91ca02cf84fe368978b2df8ba5ad-merged.mount: Deactivated successfully.
Dec 06 10:09:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:09:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:09:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:09:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:09:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:09:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:09:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:09:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:09:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:09:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:09:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:09:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:09:46 np0005548789.localdomain podman[305349]: 
Dec 06 10:09:46 np0005548789.localdomain podman[305349]: 2025-12-06 10:09:46.702356822 +0000 UTC m=+0.071582469 container create 93a2256d59a75689ea06cc06c4c2e400ecceff275d86e50e434b8577bd760712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_curran, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhceph, ceph=True, RELEASE=main, release=1763362218, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=rhceph-container)
Dec 06 10:09:46 np0005548789.localdomain systemd[1]: Started libpod-conmon-93a2256d59a75689ea06cc06c4c2e400ecceff275d86e50e434b8577bd760712.scope.
Dec 06 10:09:46 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:46 np0005548789.localdomain podman[305349]: 2025-12-06 10:09:46.776688362 +0000 UTC m=+0.145913989 container init 93a2256d59a75689ea06cc06c4c2e400ecceff275d86e50e434b8577bd760712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_curran, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, version=7, distribution-scope=public, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:09:46 np0005548789.localdomain podman[305349]: 2025-12-06 10:09:46.679979898 +0000 UTC m=+0.049205575 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:46 np0005548789.localdomain podman[305349]: 2025-12-06 10:09:46.790726205 +0000 UTC m=+0.159951872 container start 93a2256d59a75689ea06cc06c4c2e400ecceff275d86e50e434b8577bd760712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_curran, name=rhceph, GIT_BRANCH=main, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, architecture=x86_64, ceph=True, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:09:46 np0005548789.localdomain podman[305349]: 2025-12-06 10:09:46.791034795 +0000 UTC m=+0.160260422 container attach 93a2256d59a75689ea06cc06c4c2e400ecceff275d86e50e434b8577bd760712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_curran, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.41.4, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, release=1763362218, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, ceph=True)
Dec 06 10:09:46 np0005548789.localdomain nice_curran[305364]: 167 167
Dec 06 10:09:46 np0005548789.localdomain systemd[1]: libpod-93a2256d59a75689ea06cc06c4c2e400ecceff275d86e50e434b8577bd760712.scope: Deactivated successfully.
Dec 06 10:09:46 np0005548789.localdomain podman[305349]: 2025-12-06 10:09:46.795400097 +0000 UTC m=+0.164625764 container died 93a2256d59a75689ea06cc06c4c2e400ecceff275d86e50e434b8577bd760712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_curran, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, vendor=Red Hat, Inc., version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main)
Dec 06 10:09:46 np0005548789.localdomain podman[305369]: 2025-12-06 10:09:46.902288438 +0000 UTC m=+0.092852589 container remove 93a2256d59a75689ea06cc06c4c2e400ecceff275d86e50e434b8577bd760712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_curran, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:09:46 np0005548789.localdomain systemd[1]: libpod-conmon-93a2256d59a75689ea06cc06c4c2e400ecceff275d86e50e434b8577bd760712.scope: Deactivated successfully.
Dec 06 10:09:47 np0005548789.localdomain sudo[305315]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:47 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.1 (monmap changed)...
Dec 06 10:09:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 06 10:09:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:47 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.1 on np0005548789.localdomain
Dec 06 10:09:47 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Dec 06 10:09:47 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Dec 06 10:09:47 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:09:47 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:09:47 np0005548789.localdomain sudo[305392]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:47 np0005548789.localdomain sudo[305392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:47 np0005548789.localdomain sudo[305392]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:47 np0005548789.localdomain sudo[305410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:47 np0005548789.localdomain sudo[305410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:09:47.300 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:09:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:09:47.300 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:09:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:09:47.302 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:09:47 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:47 np0005548789.localdomain systemd[1]: tmp-crun.vyGaOg.mount: Deactivated successfully.
Dec 06 10:09:47 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-8d8c68b29bf19d373668becd5f287da884895351e5e14a57cddc81e1575095ab-merged.mount: Deactivated successfully.
Dec 06 10:09:47 np0005548789.localdomain podman[305444]: 
Dec 06 10:09:47 np0005548789.localdomain podman[305444]: 2025-12-06 10:09:47.772927233 +0000 UTC m=+0.081102616 container create 68942d5883816e3d102f42db7795f09748c3590865964ed5c4cf129f257e0da3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_mclean, vcs-type=git, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=7, distribution-scope=public, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, release=1763362218, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4)
Dec 06 10:09:47 np0005548789.localdomain systemd[1]: Started libpod-conmon-68942d5883816e3d102f42db7795f09748c3590865964ed5c4cf129f257e0da3.scope.
Dec 06 10:09:47 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:47 np0005548789.localdomain podman[305444]: 2025-12-06 10:09:47.839280153 +0000 UTC m=+0.147455566 container init 68942d5883816e3d102f42db7795f09748c3590865964ed5c4cf129f257e0da3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_mclean, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, version=7, io.buildah.version=1.41.4, ceph=True, CEPH_POINT_RELEASE=, release=1763362218, GIT_CLEAN=True)
Dec 06 10:09:47 np0005548789.localdomain podman[305444]: 2025-12-06 10:09:47.740883476 +0000 UTC m=+0.049058949 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:47 np0005548789.localdomain podman[305444]: 2025-12-06 10:09:47.852230433 +0000 UTC m=+0.160405836 container start 68942d5883816e3d102f42db7795f09748c3590865964ed5c4cf129f257e0da3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_mclean, ceph=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, com.redhat.component=rhceph-container, version=7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vendor=Red Hat, Inc.)
Dec 06 10:09:47 np0005548789.localdomain podman[305444]: 2025-12-06 10:09:47.852689167 +0000 UTC m=+0.160864610 container attach 68942d5883816e3d102f42db7795f09748c3590865964ed5c4cf129f257e0da3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_mclean, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True)
Dec 06 10:09:47 np0005548789.localdomain flamboyant_mclean[305460]: 167 167
Dec 06 10:09:47 np0005548789.localdomain systemd[1]: libpod-68942d5883816e3d102f42db7795f09748c3590865964ed5c4cf129f257e0da3.scope: Deactivated successfully.
Dec 06 10:09:47 np0005548789.localdomain podman[305444]: 2025-12-06 10:09:47.856141351 +0000 UTC m=+0.164316754 container died 68942d5883816e3d102f42db7795f09748c3590865964ed5c4cf129f257e0da3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_mclean, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, io.openshift.expose-services=, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph)
Dec 06 10:09:47 np0005548789.localdomain podman[305465]: 2025-12-06 10:09:47.943594957 +0000 UTC m=+0.079303981 container remove 68942d5883816e3d102f42db7795f09748c3590865964ed5c4cf129f257e0da3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_mclean, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, GIT_CLEAN=True, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, version=7, architecture=x86_64)
Dec 06 10:09:47 np0005548789.localdomain systemd[1]: libpod-conmon-68942d5883816e3d102f42db7795f09748c3590865964ed5c4cf129f257e0da3.scope: Deactivated successfully.
Dec 06 10:09:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:48 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.4 (monmap changed)...
Dec 06 10:09:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 06 10:09:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:48 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.4 on np0005548789.localdomain
Dec 06 10:09:48 np0005548789.localdomain ceph-mon[298582]: pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:48 np0005548789.localdomain sudo[305410]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:48 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:09:48 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:09:48 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:09:48 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:09:48 np0005548789.localdomain sudo[305490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:48 np0005548789.localdomain sudo[305490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:48 np0005548789.localdomain sudo[305490]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:48 np0005548789.localdomain sudo[305508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:48 np0005548789.localdomain sudo[305508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-bd991d8484c1e009835b28be4e3f899c0fdbc2213d3c353b5186210df8014171-merged.mount: Deactivated successfully.
Dec 06 10:09:48 np0005548789.localdomain podman[305544]: 
Dec 06 10:09:48 np0005548789.localdomain podman[305544]: 2025-12-06 10:09:48.826708587 +0000 UTC m=+0.078229199 container create fab2676db316fcb64c5307ff7c4419e7fd2cf5070798ef79f13a4482dc71b669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_bell, distribution-scope=public, vcs-type=git, name=rhceph, version=7, GIT_BRANCH=main, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:09:48 np0005548789.localdomain systemd[1]: Started libpod-conmon-fab2676db316fcb64c5307ff7c4419e7fd2cf5070798ef79f13a4482dc71b669.scope.
Dec 06 10:09:48 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:48 np0005548789.localdomain podman[305544]: 2025-12-06 10:09:48.796049943 +0000 UTC m=+0.047570575 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:48 np0005548789.localdomain podman[305544]: 2025-12-06 10:09:48.898351517 +0000 UTC m=+0.149872129 container init fab2676db316fcb64c5307ff7c4419e7fd2cf5070798ef79f13a4482dc71b669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_bell, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, RELEASE=main, release=1763362218, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, com.redhat.component=rhceph-container, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Dec 06 10:09:48 np0005548789.localdomain podman[305544]: 2025-12-06 10:09:48.911239955 +0000 UTC m=+0.162760557 container start fab2676db316fcb64c5307ff7c4419e7fd2cf5070798ef79f13a4482dc71b669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_bell, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.buildah.version=1.41.4, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, CEPH_POINT_RELEASE=, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_BRANCH=main)
Dec 06 10:09:48 np0005548789.localdomain podman[305544]: 2025-12-06 10:09:48.911715539 +0000 UTC m=+0.163236221 container attach fab2676db316fcb64c5307ff7c4419e7fd2cf5070798ef79f13a4482dc71b669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_bell, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=1763362218, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:09:48 np0005548789.localdomain sweet_bell[305559]: 167 167
Dec 06 10:09:48 np0005548789.localdomain systemd[1]: libpod-fab2676db316fcb64c5307ff7c4419e7fd2cf5070798ef79f13a4482dc71b669.scope: Deactivated successfully.
Dec 06 10:09:48 np0005548789.localdomain podman[305544]: 2025-12-06 10:09:48.916056051 +0000 UTC m=+0.167576703 container died fab2676db316fcb64c5307ff7c4419e7fd2cf5070798ef79f13a4482dc71b669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_bell, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, ceph=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, io.openshift.expose-services=, RELEASE=main, vcs-type=git, name=rhceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:09:49 np0005548789.localdomain podman[305564]: 2025-12-06 10:09:49.01585783 +0000 UTC m=+0.087756947 container remove fab2676db316fcb64c5307ff7c4419e7fd2cf5070798ef79f13a4482dc71b669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_bell, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, CEPH_POINT_RELEASE=, ceph=True, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, distribution-scope=public, version=7, build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 06 10:09:49 np0005548789.localdomain systemd[1]: libpod-conmon-fab2676db316fcb64c5307ff7c4419e7fd2cf5070798ef79f13a4482dc71b669.scope: Deactivated successfully.
Dec 06 10:09:49 np0005548789.localdomain sudo[305508]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.131139) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015789131230, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2414, "num_deletes": 252, "total_data_size": 4580199, "memory_usage": 4650600, "flush_reason": "Manual Compaction"}
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015789146518, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 2535389, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13772, "largest_seqno": 16181, "table_properties": {"data_size": 2525616, "index_size": 5830, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 25522, "raw_average_key_size": 22, "raw_value_size": 2504070, "raw_average_value_size": 2204, "num_data_blocks": 257, "num_entries": 1136, "num_filter_entries": 1136, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015732, "oldest_key_time": 1765015732, "file_creation_time": 1765015789, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 15430 microseconds, and 6261 cpu microseconds.
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.146578) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 2535389 bytes OK
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.146603) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.148321) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.148341) EVENT_LOG_v1 {"time_micros": 1765015789148335, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.148364) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 4568461, prev total WAL file size 4584813, number of live WAL files 2.
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.149373) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. '7061786F73003131323936' seq:0, type:0; will stop at (end)
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(2475KB)], [21(16MB)]
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015789149415, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 19425254, "oldest_snapshot_seqno": -1}
Dec 06 10:09:49 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:09:49 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:09:49 np0005548789.localdomain ceph-mgr[288591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:09:49 np0005548789.localdomain ceph-mgr[288591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mds.mds.np0005548789.vxwwsq (monmap changed)...
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005548789.vxwwsq", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mds.mds.np0005548789.vxwwsq on np0005548789.localdomain
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:49 np0005548789.localdomain sudo[305581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:49 np0005548789.localdomain sudo[305581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:49 np0005548789.localdomain sudo[305581]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 11613 keys, 17625003 bytes, temperature: kUnknown
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015789275779, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 17625003, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17557789, "index_size": 37097, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29061, "raw_key_size": 311818, "raw_average_key_size": 26, "raw_value_size": 17358916, "raw_average_value_size": 1494, "num_data_blocks": 1411, "num_entries": 11613, "num_filter_entries": 11613, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765015789, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.276891) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 17625003 bytes
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.278717) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 152.7 rd, 138.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 16.1 +0.0 blob) out(16.8 +0.0 blob), read-write-amplify(14.6) write-amplify(7.0) OK, records in: 12152, records dropped: 539 output_compression: NoCompression
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.278749) EVENT_LOG_v1 {"time_micros": 1765015789278736, "job": 10, "event": "compaction_finished", "compaction_time_micros": 127189, "compaction_time_cpu_micros": 49550, "output_level": 6, "num_output_files": 1, "total_output_size": 17625003, "num_input_records": 12152, "num_output_records": 11613, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015789279232, "job": 10, "event": "table_file_deletion", "file_number": 23}
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015789281796, "job": 10, "event": "table_file_deletion", "file_number": 21}
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.149283) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.281909) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.281918) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.281922) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.281925) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:09:49.281928) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:09:49 np0005548789.localdomain sudo[305599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:49 np0005548789.localdomain sudo[305599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:49 np0005548789.localdomain ceph-mgr[288591]: log_channel(cluster) log [DBG] : pgmap v48: 177 pgs: 177 active+clean; 104 MiB data, 570 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:09:49 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e93 e93: 6 total, 6 up, 6 in
Dec 06 10:09:49 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:49.449+0000 7f047a21a640 -1 mgr handle_mgr_map I was active but no longer am
Dec 06 10:09:49 np0005548789.localdomain systemd[1]: tmp-crun.m5gbKK.mount: Deactivated successfully.
Dec 06 10:09:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-bec86e4695a48de2dcc3233b55a28b20a086b1cadb21042654e00eb469465d46-merged.mount: Deactivated successfully.
Dec 06 10:09:49 np0005548789.localdomain sshd[300957]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:09:49 np0005548789.localdomain systemd-logind[766]: Session 71 logged out. Waiting for processes to exit.
Dec 06 10:09:49 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: ignoring --setuser ceph since I am not root
Dec 06 10:09:49 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: ignoring --setgroup ceph since I am not root
Dec 06 10:09:49 np0005548789.localdomain ceph-mgr[288591]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Dec 06 10:09:49 np0005548789.localdomain ceph-mgr[288591]: pidfile_write: ignore empty --pid-file
Dec 06 10:09:49 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'alerts'
Dec 06 10:09:49 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 06 10:09:49 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 10:09:49 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'balancer'
Dec 06 10:09:49 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:49.670+0000 7f2667da7140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 10:09:49 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 10:09:49 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'cephadm'
Dec 06 10:09:49 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:49.766+0000 7f2667da7140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 10:09:49 np0005548789.localdomain podman[305657]: 
Dec 06 10:09:49 np0005548789.localdomain podman[305657]: 2025-12-06 10:09:49.790989804 +0000 UTC m=+0.074651811 container create e735288c00e33917682d2ac61c72386afbdfe943ce3e3fae6ceb4b55ce872562 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_edison, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, version=7, name=rhceph, CEPH_POINT_RELEASE=, RELEASE=main)
Dec 06 10:09:49 np0005548789.localdomain sshd[305672]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:09:49 np0005548789.localdomain systemd[1]: Started libpod-conmon-e735288c00e33917682d2ac61c72386afbdfe943ce3e3fae6ceb4b55ce872562.scope.
Dec 06 10:09:49 np0005548789.localdomain podman[305657]: 2025-12-06 10:09:49.754241906 +0000 UTC m=+0.037903923 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:49 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:49 np0005548789.localdomain podman[305657]: 2025-12-06 10:09:49.885650618 +0000 UTC m=+0.169312615 container init e735288c00e33917682d2ac61c72386afbdfe943ce3e3fae6ceb4b55ce872562 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_edison, release=1763362218, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Dec 06 10:09:49 np0005548789.localdomain sshd[305672]: Accepted publickey for ceph-admin from 192.168.122.103 port 57128 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 10:09:49 np0005548789.localdomain systemd[1]: libpod-e735288c00e33917682d2ac61c72386afbdfe943ce3e3fae6ceb4b55ce872562.scope: Deactivated successfully.
Dec 06 10:09:49 np0005548789.localdomain sharp_edison[305675]: 167 167
Dec 06 10:09:49 np0005548789.localdomain podman[305657]: 2025-12-06 10:09:49.912612671 +0000 UTC m=+0.196274668 container start e735288c00e33917682d2ac61c72386afbdfe943ce3e3fae6ceb4b55ce872562 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_edison, ceph=True, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:09:49 np0005548789.localdomain podman[305657]: 2025-12-06 10:09:49.912951451 +0000 UTC m=+0.196613438 container attach e735288c00e33917682d2ac61c72386afbdfe943ce3e3fae6ceb4b55ce872562 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_edison, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:09:49 np0005548789.localdomain podman[305657]: 2025-12-06 10:09:49.914860338 +0000 UTC m=+0.198522325 container died e735288c00e33917682d2ac61c72386afbdfe943ce3e3fae6ceb4b55ce872562 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_edison, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:09:49 np0005548789.localdomain systemd-logind[766]: New session 72 of user ceph-admin.
Dec 06 10:09:49 np0005548789.localdomain systemd[1]: Started Session 72 of User ceph-admin.
Dec 06 10:09:49 np0005548789.localdomain sshd[305672]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:09:50 np0005548789.localdomain podman[305682]: 2025-12-06 10:09:50.012034947 +0000 UTC m=+0.085083405 container remove e735288c00e33917682d2ac61c72386afbdfe943ce3e3fae6ceb4b55ce872562 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_edison, io.openshift.tags=rhceph ceph, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=1763362218, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.buildah.version=1.41.4, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7)
Dec 06 10:09:50 np0005548789.localdomain systemd[1]: libpod-conmon-e735288c00e33917682d2ac61c72386afbdfe943ce3e3fae6ceb4b55ce872562.scope: Deactivated successfully.
Dec 06 10:09:50 np0005548789.localdomain sudo[305696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:50 np0005548789.localdomain sudo[305696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:50 np0005548789.localdomain sudo[305696]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:50 np0005548789.localdomain sudo[305599]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:50 np0005548789.localdomain systemd[1]: session-71.scope: Deactivated successfully.
Dec 06 10:09:50 np0005548789.localdomain systemd[1]: session-71.scope: Consumed 27.292s CPU time.
Dec 06 10:09:50 np0005548789.localdomain systemd-logind[766]: Removed session 71.
Dec 06 10:09:50 np0005548789.localdomain sudo[305719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:09:50 np0005548789.localdomain sudo[305719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' 
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26618 172.18.0.107:0/2196335751' entity='mgr.np0005548789.mzhmje' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/90840268' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: Activating manager daemon np0005548785.vhqlsq
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: osdmap e93: 6 total, 6 up, 6 in
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/90840268' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: mgrmap e33: np0005548785.vhqlsq(active, starting, since 0.0485672s), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548787.umwsra", "id": "np0005548787.umwsra"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: Manager daemon np0005548785.vhqlsq is now available
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: removing stray HostCache host record np0005548787.localdomain.devices.0
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain.devices.0"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain.devices.0"}]': finished
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain.devices.0"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005548787.localdomain.devices.0"}]': finished
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548785.vhqlsq/mirror_snapshot_schedule"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548785.vhqlsq/trash_purge_schedule"} : dispatch
Dec 06 10:09:50 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'crash'
Dec 06 10:09:50 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 10:09:50 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'dashboard'
Dec 06 10:09:50 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:50.456+0000 7f2667da7140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 10:09:50 np0005548789.localdomain systemd[1]: tmp-crun.akhboo.mount: Deactivated successfully.
Dec 06 10:09:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-35ec4e3289f7e23672bf5f8de3af2ac5e3fcf72697529a3b00546e774b912359-merged.mount: Deactivated successfully.
Dec 06 10:09:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:09:50 np0005548789.localdomain podman[305749]: 2025-12-06 10:09:50.599220788 +0000 UTC m=+0.097267224 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 10:09:50 np0005548789.localdomain podman[305749]: 2025-12-06 10:09:50.605004452 +0000 UTC m=+0.103050868 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:09:50 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:09:51 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'devicehealth'
Dec 06 10:09:51 np0005548789.localdomain podman[305829]: 2025-12-06 10:09:51.026516907 +0000 UTC m=+0.117286456 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, GIT_CLEAN=True, architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, build-date=2025-11-26T19:44:28Z, RELEASE=main, CEPH_POINT_RELEASE=)
Dec 06 10:09:51 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 10:09:51 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'diskprediction_local'
Dec 06 10:09:51 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:51.069+0000 7f2667da7140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 10:09:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:51.089 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:51.090 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:51.091 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:09:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:51.091 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:51.111 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:09:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:51.112 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:51 np0005548789.localdomain podman[305829]: 2025-12-06 10:09:51.139306238 +0000 UTC m=+0.230075827 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, ceph=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, version=7, io.buildah.version=1.41.4, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph)
Dec 06 10:09:51 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 06 10:09:51 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 06 10:09:51 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]:   from numpy import show_config as show_numpy_config
Dec 06 10:09:51 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 10:09:51 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'influx'
Dec 06 10:09:51 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:51.215+0000 7f2667da7140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 10:09:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:09:51 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 10:09:51 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'insights'
Dec 06 10:09:51 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:51.278+0000 7f2667da7140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 10:09:51 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'iostat'
Dec 06 10:09:51 np0005548789.localdomain podman[305875]: 2025-12-06 10:09:51.364819445 +0000 UTC m=+0.128802933 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:09:51 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 10:09:51 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'k8sevents'
Dec 06 10:09:51 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:51.400+0000 7f2667da7140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 10:09:51 np0005548789.localdomain podman[305875]: 2025-12-06 10:09:51.402748429 +0000 UTC m=+0.166731857 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:09:51 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:09:51 np0005548789.localdomain ceph-mon[298582]: mgrmap e34: np0005548785.vhqlsq(active, since 1.08564s), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:51 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:09:50] ENGINE Bus STARTING
Dec 06 10:09:51 np0005548789.localdomain systemd[1]: tmp-crun.WuPcYX.mount: Deactivated successfully.
Dec 06 10:09:51 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'localpool'
Dec 06 10:09:51 np0005548789.localdomain sudo[305719]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:51 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'mds_autoscaler'
Dec 06 10:09:51 np0005548789.localdomain sudo[305966]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:51 np0005548789.localdomain sudo[305966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:51 np0005548789.localdomain sudo[305966]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:51 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'mirroring'
Dec 06 10:09:51 np0005548789.localdomain sudo[305984]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:09:51 np0005548789.localdomain sudo[305984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'nfs'
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 10:09:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:52.188+0000 7f2667da7140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'orchestrator'
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'osd_perf_query'
Dec 06 10:09:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:52.356+0000 7f2667da7140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'osd_support'
Dec 06 10:09:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:52.420+0000 7f2667da7140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'pg_autoscaler'
Dec 06 10:09:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:52.481+0000 7f2667da7140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'progress'
Dec 06 10:09:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:52.550+0000 7f2667da7140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'prometheus'
Dec 06 10:09:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:52.610+0000 7f2667da7140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 10:09:52 np0005548789.localdomain sudo[305984]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:52 np0005548789.localdomain sudo[306033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:52 np0005548789.localdomain sudo[306033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:52 np0005548789.localdomain sudo[306033]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:52 np0005548789.localdomain sudo[306051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 10:09:52 np0005548789.localdomain sudo[306051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 10:09:52 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'rbd_support'
Dec 06 10:09:52 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:52.928+0000 7f2667da7140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 10:09:53 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 10:09:53 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'restful'
Dec 06 10:09:53 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:53.014+0000 7f2667da7140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:53 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'rgw'
Dec 06 10:09:53 np0005548789.localdomain sudo[306051]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:53 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 10:09:53 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'rook'
Dec 06 10:09:53 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:53.366+0000 7f2667da7140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 10:09:53 np0005548789.localdomain sudo[306088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:09:53 np0005548789.localdomain sudo[306088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:53 np0005548789.localdomain sudo[306088]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:53 np0005548789.localdomain sudo[306106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:09:53 np0005548789.localdomain sudo[306106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:53 np0005548789.localdomain sudo[306106]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:53 np0005548789.localdomain sudo[306124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:53 np0005548789.localdomain sudo[306124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:53 np0005548789.localdomain sudo[306124]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:53 np0005548789.localdomain sudo[306142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:53 np0005548789.localdomain sudo[306142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:53 np0005548789.localdomain sudo[306142]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:53 np0005548789.localdomain sudo[306160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:53 np0005548789.localdomain sudo[306160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:53 np0005548789.localdomain sudo[306160]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:53 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 10:09:53 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'selftest'
Dec 06 10:09:53 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:53.793+0000 7f2667da7140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: mgrmap e35: np0005548785.vhqlsq(active, since 3s), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:09:53 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 10:09:53 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'snap_schedule'
Dec 06 10:09:53 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:53.854+0000 7f2667da7140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 10:09:53 np0005548789.localdomain sudo[306194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:53 np0005548789.localdomain sudo[306194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:53 np0005548789.localdomain sudo[306194]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:09:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:09:53 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'stats'
Dec 06 10:09:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:09:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:09:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:09:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19235 "" "Go-http-client/1.1"
Dec 06 10:09:53 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'status'
Dec 06 10:09:53 np0005548789.localdomain sudo[306212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:09:53 np0005548789.localdomain sudo[306212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548789.localdomain sudo[306212]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 10:09:54 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'telegraf'
Dec 06 10:09:54 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:54.050+0000 7f2667da7140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 10:09:54 np0005548789.localdomain sudo[306230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:09:54 np0005548789.localdomain sudo[306230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548789.localdomain sudo[306230]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 10:09:54 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'telemetry'
Dec 06 10:09:54 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:54.110+0000 7f2667da7140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 10:09:54 np0005548789.localdomain sudo[306248]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:54 np0005548789.localdomain sudo[306248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548789.localdomain sudo[306248]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548789.localdomain sudo[306266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:54 np0005548789.localdomain sudo[306266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548789.localdomain sudo[306266]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 10:09:54 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'test_orchestrator'
Dec 06 10:09:54 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:54.242+0000 7f2667da7140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 10:09:54 np0005548789.localdomain sudo[306284]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:54 np0005548789.localdomain sudo[306284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548789.localdomain sudo[306284]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548789.localdomain sudo[306302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:54 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 10:09:54 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'volumes'
Dec 06 10:09:54 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:54.389+0000 7f2667da7140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 10:09:54 np0005548789.localdomain sudo[306302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548789.localdomain sudo[306302]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548789.localdomain sudo[306320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:54 np0005548789.localdomain sudo[306320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548789.localdomain sudo[306320]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 10:09:54 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Loading python module 'zabbix'
Dec 06 10:09:54 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:54.577+0000 7f2667da7140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 10:09:54 np0005548789.localdomain sudo[306354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:54 np0005548789.localdomain sudo[306354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548789.localdomain sudo[306354]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548789.localdomain ceph-mgr[288591]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 10:09:54 np0005548789.localdomain ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-mgr-np0005548789-mzhmje[288579]: 2025-12-06T10:09:54.635+0000 7f2667da7140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 10:09:54 np0005548789.localdomain ceph-mgr[288591]: ms_deliver_dispatch: unhandled message 0x55d45c1ff600 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Dec 06 10:09:54 np0005548789.localdomain sudo[306372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:09:54 np0005548789.localdomain sudo[306372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548789.localdomain sudo[306372]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548789.localdomain sudo[306390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:54 np0005548789.localdomain sudo[306390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548789.localdomain sudo[306390]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548789.localdomain sudo[306408]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:09:54 np0005548789.localdomain sudo[306408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548789.localdomain sudo[306408]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:09:54 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:09:54 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:09:54 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:09:54 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:09:54 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:09:54 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:54 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:54 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:09:54 np0005548789.localdomain ceph-mon[298582]: Standby manager daemon np0005548789.mzhmje started
Dec 06 10:09:54 np0005548789.localdomain sudo[306426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:09:54 np0005548789.localdomain sudo[306426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548789.localdomain sudo[306426]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:54 np0005548789.localdomain sudo[306444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:09:54 np0005548789.localdomain sudo[306444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:54 np0005548789.localdomain sudo[306444]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548789.localdomain sudo[306462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:55 np0005548789.localdomain sudo[306462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548789.localdomain sudo[306462]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548789.localdomain sudo[306480]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:09:55 np0005548789.localdomain sudo[306480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548789.localdomain sudo[306480]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548789.localdomain sudo[306514]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:09:55 np0005548789.localdomain sudo[306514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548789.localdomain sudo[306514]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548789.localdomain sudo[306532]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:09:55 np0005548789.localdomain sudo[306532]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548789.localdomain sudo[306532]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548789.localdomain sudo[306550]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:09:55 np0005548789.localdomain sudo[306550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548789.localdomain sudo[306550]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548789.localdomain sudo[306568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:55 np0005548789.localdomain sudo[306568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548789.localdomain sudo[306568]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548789.localdomain sudo[306586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:09:55 np0005548789.localdomain sudo[306586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548789.localdomain sudo[306586]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548789.localdomain sudo[306604]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:09:55 np0005548789.localdomain sudo[306604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548789.localdomain sudo[306604]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548789.localdomain sudo[306622]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:55 np0005548789.localdomain sudo[306622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548789.localdomain sudo[306622]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548789.localdomain sudo[306640]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:09:55 np0005548789.localdomain sudo[306640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548789.localdomain sudo[306640]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548789.localdomain sudo[306674]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:09:55 np0005548789.localdomain sudo[306674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548789.localdomain sudo[306674]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:55 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:55 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:55 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:09:55 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:09:55 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:09:55 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:09:55 np0005548789.localdomain ceph-mon[298582]: mgrmap e36: np0005548785.vhqlsq(active, since 5s), standbys: np0005548788.yvwbqq, np0005548787.umwsra, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:09:55 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:09:55 np0005548789.localdomain sudo[306692]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:09:55 np0005548789.localdomain sudo[306692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:55 np0005548789.localdomain sudo[306692]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:56 np0005548789.localdomain sudo[306710]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:09:56 np0005548789.localdomain sudo[306710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:56 np0005548789.localdomain sudo[306710]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:56.113 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:56.114 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:09:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:56.114 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:09:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:56.115 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:56.160 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:09:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:09:56.161 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:09:56 np0005548789.localdomain sudo[306728]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:09:56 np0005548789.localdomain sudo[306728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:56 np0005548789.localdomain sudo[306728]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:56 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:09:56 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:09:56 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:09:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:09:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 06 10:09:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:09:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:09:57 np0005548789.localdomain podman[306747]: 2025-12-06 10:09:57.923423754 +0000 UTC m=+0.078446885 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm)
Dec 06 10:09:57 np0005548789.localdomain podman[306747]: 2025-12-06 10:09:57.941233212 +0000 UTC m=+0.096256393 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 10:09:57 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:09:56] ENGINE Error in 'start' listener <bound method Server.start of <cephadm.service_discovery.Root object at 0x7fef2d81f340>>
                                                           Traceback (most recent call last):
                                                             File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 230, in publish
                                                               output.append(listener(*args, **kwargs))
                                                             File "/lib/python3.9/site-packages/cherrypy/_cpserver.py", line 180, in start
                                                               super(Server, self).start()
                                                             File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 184, in start
                                                               self.wait()
                                                             File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 260, in wait
                                                               portend.occupied(*self.bound_addr, timeout=Timeouts.occupied)
                                                             File "/lib/python3.9/site-packages/portend.py", line 162, in occupied
                                                               raise Timeout("Port {port} not bound on {host}.".format(**locals()))
                                                           portend.Timeout: Port 8765 not bound on 172.18.0.103.
Dec 06 10:09:57 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.2 on np0005548788.localdomain
Dec 06 10:09:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 06 10:09:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:57 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:09:57 np0005548789.localdomain podman[306746]: 2025-12-06 10:09:57.996674312 +0000 UTC m=+0.152351203 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal)
Dec 06 10:09:58 np0005548789.localdomain podman[306746]: 2025-12-06 10:09:58.013290883 +0000 UTC m=+0.168967744 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Dec 06 10:09:58 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:09:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:09:58 np0005548789.localdomain sudo[306786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:58 np0005548789.localdomain sudo[306786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:58 np0005548789.localdomain sudo[306786]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:58 np0005548789.localdomain sudo[306804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:09:58 np0005548789.localdomain sudo[306804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:09:58 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.5 on np0005548788.localdomain
Dec 06 10:09:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005548789.mzhmje", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 06 10:09:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:09:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:59 np0005548789.localdomain podman[306838]: 
Dec 06 10:09:59 np0005548789.localdomain podman[306838]: 2025-12-06 10:09:59.162548117 +0000 UTC m=+0.066083433 container create 630e4f393655121908b1b1f3e21b14d4d91ffef938b2c9e4235cc5b78cffb6df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_elbakyan, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_CLEAN=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7)
Dec 06 10:09:59 np0005548789.localdomain systemd[1]: Started libpod-conmon-630e4f393655121908b1b1f3e21b14d4d91ffef938b2c9e4235cc5b78cffb6df.scope.
Dec 06 10:09:59 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:09:59 np0005548789.localdomain podman[306838]: 2025-12-06 10:09:59.130954624 +0000 UTC m=+0.034489950 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:09:59 np0005548789.localdomain podman[306838]: 2025-12-06 10:09:59.242709803 +0000 UTC m=+0.146245119 container init 630e4f393655121908b1b1f3e21b14d4d91ffef938b2c9e4235cc5b78cffb6df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_elbakyan, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, release=1763362218, vcs-type=git, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, ceph=True, io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 06 10:09:59 np0005548789.localdomain podman[306838]: 2025-12-06 10:09:59.25457393 +0000 UTC m=+0.158109236 container start 630e4f393655121908b1b1f3e21b14d4d91ffef938b2c9e4235cc5b78cffb6df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_elbakyan, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, version=7, release=1763362218, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main)
Dec 06 10:09:59 np0005548789.localdomain podman[306838]: 2025-12-06 10:09:59.254966543 +0000 UTC m=+0.158501909 container attach 630e4f393655121908b1b1f3e21b14d4d91ffef938b2c9e4235cc5b78cffb6df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_elbakyan, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, ceph=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., release=1763362218, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64)
Dec 06 10:09:59 np0005548789.localdomain tender_elbakyan[306851]: 167 167
Dec 06 10:09:59 np0005548789.localdomain podman[306838]: 2025-12-06 10:09:59.258686875 +0000 UTC m=+0.162222241 container died 630e4f393655121908b1b1f3e21b14d4d91ffef938b2c9e4235cc5b78cffb6df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_elbakyan, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, GIT_CLEAN=True, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, release=1763362218, ceph=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.)
Dec 06 10:09:59 np0005548789.localdomain systemd[1]: libpod-630e4f393655121908b1b1f3e21b14d4d91ffef938b2c9e4235cc5b78cffb6df.scope: Deactivated successfully.
Dec 06 10:09:59 np0005548789.localdomain podman[306856]: 2025-12-06 10:09:59.369817445 +0000 UTC m=+0.099985525 container remove 630e4f393655121908b1b1f3e21b14d4d91ffef938b2c9e4235cc5b78cffb6df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_elbakyan, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=1763362218, vcs-type=git, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=)
Dec 06 10:09:59 np0005548789.localdomain systemd[1]: libpod-conmon-630e4f393655121908b1b1f3e21b14d4d91ffef938b2c9e4235cc5b78cffb6df.scope: Deactivated successfully.
Dec 06 10:09:59 np0005548789.localdomain sudo[306804]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:59 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mgr.np0005548789.mzhmje (monmap changed)...
Dec 06 10:09:59 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mgr.np0005548789.mzhmje on np0005548789.localdomain
Dec 06 10:09:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:09:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005548790.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 06 10:09:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:09:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:00 np0005548789.localdomain systemd[1]: tmp-crun.UPjbGj.mount: Deactivated successfully.
Dec 06 10:10:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-fa0c81e5c034f4a4210e5bf4fb8134fb99265b846dd7a1aa120726d16b34aa90-merged.mount: Deactivated successfully.
Dec 06 10:10:00 np0005548789.localdomain ceph-mon[298582]: Reconfiguring crash.np0005548790 (monmap changed)...
Dec 06 10:10:00 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon crash.np0005548790 on np0005548790.localdomain
Dec 06 10:10:00 np0005548789.localdomain ceph-mon[298582]: overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:10:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 06 10:10:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:01.162 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:10:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:01.165 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:10:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:01.165 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:10:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:01.165 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:10:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:01.198 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:10:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:01.199 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:10:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:10:02 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.0 (monmap changed)...
Dec 06 10:10:02 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.0 on np0005548790.localdomain
Dec 06 10:10:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 06 10:10:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:02 np0005548789.localdomain podman[306872]: 2025-12-06 10:10:02.223137533 +0000 UTC m=+0.076497876 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:10:02 np0005548789.localdomain podman[306872]: 2025-12-06 10:10:02.234100354 +0000 UTC m=+0.087460697 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 06 10:10:02 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:10:02 np0005548789.localdomain sudo[306891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:10:02 np0005548789.localdomain sudo[306891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:02 np0005548789.localdomain sudo[306891]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:10:01] ENGINE Error in 'start' listener <bound method Server.start of <cephadm.agent.HostData object at 0x7fefc03cbf40>>
                                                           Traceback (most recent call last):
                                                             File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 230, in publish
                                                               output.append(listener(*args, **kwargs))
                                                             File "/lib/python3.9/site-packages/cherrypy/_cpserver.py", line 180, in start
                                                               super(Server, self).start()
                                                             File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 184, in start
                                                               self.wait()
                                                             File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 260, in wait
                                                               portend.occupied(*self.bound_addr, timeout=Timeouts.occupied)
                                                             File "/lib/python3.9/site-packages/portend.py", line 162, in occupied
                                                               raise Timeout("Port {port} not bound on {host}.".format(**locals()))
                                                           portend.Timeout: Port 7150 not bound on 172.18.0.103.
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:10:01] ENGINE Shutting down due to error in start listener:
                                                           Traceback (most recent call last):
                                                             File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 268, in start
                                                               self.publish('start')
                                                             File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 248, in publish
                                                               raise exc
                                                           cherrypy.process.wspbus.ChannelFailures: Timeout('Port 8765 not bound on 172.18.0.103.')
                                                           Timeout('Port 7150 not bound on 172.18.0.103.')
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:10:01] ENGINE Bus STOPPING
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:10:01] ENGINE HTTP Server cherrypy._cpwsgi_server.CPWSGIServer(('172.18.0.103', 8765)) already shut down
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:10:01] ENGINE HTTP Server cherrypy._cpwsgi_server.CPWSGIServer(('172.18.0.103', 7150)) already shut down
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:10:01] ENGINE Bus STOPPED
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:10:01] ENGINE Bus EXITING
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:10:01] ENGINE Bus EXITED
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: Failed to run cephadm http server: Timeout('Port 8765 not bound on 172.18.0.103.')
                                                           Timeout('Port 7150 not bound on 172.18.0.103.')
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring osd.3 (monmap changed)...
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon osd.3 on np0005548790.localdomain
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:10:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:03 np0005548789.localdomain sshd[306909]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:10:05 np0005548789.localdomain sshd[306909]: Received disconnect from 118.219.234.233 port 51900:11: Bye Bye [preauth]
Dec 06 10:10:05 np0005548789.localdomain sshd[306909]: Disconnected from authenticating user root 118.219.234.233 port 51900 [preauth]
Dec 06 10:10:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.26690 172.18.0.103:0/2299561010' entity='mgr.np0005548785.vhqlsq' 
Dec 06 10:10:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:10:05 np0005548789.localdomain podman[306911]: 2025-12-06 10:10:05.92863217 +0000 UTC m=+0.091607532 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:10:05 np0005548789.localdomain podman[306911]: 2025-12-06 10:10:05.939527249 +0000 UTC m=+0.102502571 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:10:05 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:10:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:06.200 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:10:07 np0005548789.localdomain ceph-mon[298582]: mgrmap e37: np0005548785.vhqlsq(active, since 17s), standbys: np0005548788.yvwbqq, np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:10:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:10:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5895 writes, 25K keys, 5895 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5895 writes, 817 syncs, 7.22 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 46 writes, 173 keys, 46 commit groups, 1.0 writes per commit group, ingest: 0.27 MB, 0.00 MB/s
                                                          Interval WAL: 46 writes, 20 syncs, 2.30 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:10:10 np0005548789.localdomain ceph-mon[298582]: Health check update: 2 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:10:10 np0005548789.localdomain ceph-mon[298582]: Health check update: 2 stray host(s) with 2 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:10:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:11.203 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:10:11 np0005548789.localdomain ceph-mon[298582]: pgmap v3: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:10:11 np0005548789.localdomain podman[306933]: 2025-12-06 10:10:11.91502064 +0000 UTC m=+0.074635201 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:10:11 np0005548789.localdomain systemd[299726]: Starting Mark boot as successful...
Dec 06 10:10:11 np0005548789.localdomain systemd[299726]: Finished Mark boot as successful.
Dec 06 10:10:11 np0005548789.localdomain podman[306933]: 2025-12-06 10:10:11.95515914 +0000 UTC m=+0.114773701 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:10:11 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:10:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:10:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.2 total, 600.0 interval
                                                          Cumulative writes: 5115 writes, 22K keys, 5115 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5115 writes, 779 syncs, 6.57 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 201 writes, 475 keys, 201 commit groups, 1.0 writes per commit group, ingest: 0.43 MB, 0.00 MB/s
                                                          Interval WAL: 201 writes, 93 syncs, 2.16 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:10:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:13 np0005548789.localdomain ceph-mon[298582]: pgmap v4: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:15 np0005548789.localdomain ceph-mon[298582]: pgmap v5: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:16.205 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:10:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:16.206 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:10:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:16.207 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:10:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:16.207 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:10:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:16.237 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:10:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:16.237 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:10:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:10:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:10:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:10:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:10:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:10:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:10:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:10:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:10:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:10:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:10:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:10:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:10:17 np0005548789.localdomain ceph-mon[298582]: pgmap v6: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:19 np0005548789.localdomain ceph-mon[298582]: pgmap v7: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:10:20 np0005548789.localdomain podman[306959]: 2025-12-06 10:10:20.904227436 +0000 UTC m=+0.071131396 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:10:20 np0005548789.localdomain podman[306959]: 2025-12-06 10:10:20.911445513 +0000 UTC m=+0.078349473 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:10:20 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:10:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:21.238 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:10:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:21.240 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:10:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:21.240 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:10:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:21.240 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:10:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:21.271 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:10:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:21.272 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:10:21 np0005548789.localdomain ceph-mon[298582]: pgmap v8: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:10:21 np0005548789.localdomain podman[306976]: 2025-12-06 10:10:21.915348065 +0000 UTC m=+0.075803137 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:10:21 np0005548789.localdomain podman[306976]: 2025-12-06 10:10:21.929121399 +0000 UTC m=+0.089576471 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:10:21 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:10:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:23 np0005548789.localdomain ceph-mon[298582]: pgmap v9: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:10:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:10:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:10:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:10:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:10:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19240 "" "Go-http-client/1.1"
Dec 06 10:10:25 np0005548789.localdomain ceph-mon[298582]: pgmap v10: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:26.272 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:10:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:26.274 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:10:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:26.275 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:10:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:26.275 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:10:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:26.314 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:10:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:26.315 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:10:27 np0005548789.localdomain ceph-mon[298582]: pgmap v11: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:28.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:28.198 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:10:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:28.199 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:10:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:28.199 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:10:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:28.200 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:10:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:28.200 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:10:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:10:28 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3108344624' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:28.664 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:10:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:28.741 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:10:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:28.742 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:10:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:10:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:10:28 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3108344624' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:28 np0005548789.localdomain systemd[1]: tmp-crun.LbGKmq.mount: Deactivated successfully.
Dec 06 10:10:28 np0005548789.localdomain podman[307022]: 2025-12-06 10:10:28.947233039 +0000 UTC m=+0.101817891 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:10:28 np0005548789.localdomain podman[307022]: 2025-12-06 10:10:28.95921818 +0000 UTC m=+0.113803092 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:10:28 np0005548789.localdomain podman[307021]: 2025-12-06 10:10:28.921220124 +0000 UTC m=+0.083871829 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, config_id=edpm, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Dec 06 10:10:28 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:10:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:28.979 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:10:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:28.981 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11480MB free_disk=0.0GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:10:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:28.981 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:10:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:28.981 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:10:29 np0005548789.localdomain podman[307021]: 2025-12-06 10:10:29.005159424 +0000 UTC m=+0.167811129 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, distribution-scope=public, config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:10:29 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.093 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.093 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.094 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=0GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.140 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:10:29 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:10:29 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3639889960' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.612 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.619 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Updating inventory in ProviderTree for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.655 282197 ERROR nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [req-9a7f33cf-8803-4290-a03d-4b69fc5456e1] Failed to update inventory to [{'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}}] for resource provider with UUID 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n update conflict: Inventory for 'DISK_GB' on resource provider '0d33e88e-6335-4a94-8f21-32ba5b8bb7ad' in use.  ", "code": "placement.inventory.inuse", "request_id": "req-9a7f33cf-8803-4290-a03d-4b69fc5456e1"}]}
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.655 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Error updating PCI resources for node np0005548789.localdomain.: nova.exception.PlacementPciException: Failed to gather or report PCI resources to Placement: There was a conflict when trying to complete your request.
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]:  update conflict: Inventory for 'DISK_GB' on resource provider '0d33e88e-6335-4a94-8f21-32ba5b8bb7ad' in use.
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager Traceback (most recent call last):
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1288, in _update_to_placement
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager     self.reportclient.update_from_provider_tree(
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 1484, in update_from_provider_tree
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager     self.set_inventory_for_provider(
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 987, in set_inventory_for_provider
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager     raise exception.InventoryInUse(err['detail'])
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager nova.exception.InventoryInUse: There was a conflict when trying to complete your request.
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager 
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager  update conflict: Inventory for 'DISK_GB' on resource provider '0d33e88e-6335-4a94-8f21-32ba5b8bb7ad' in use.  
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager 
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager During handling of the above exception, another exception occurred:
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager 
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager Traceback (most recent call last):
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10513, in _update_available_resource_for_node
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager     self.rt.update_available_resource(context, nodename,
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 889, in update_available_resource
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager     self._update_available_resource(context, resources, startup=startup)
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager     return f(*args, **kwargs)
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 994, in _update_available_resource
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager     self._update(context, cn, startup=startup)
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1303, in _update
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager     self._update_to_placement(context, compute_node, startup)
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/retrying.py", line 49, in wrapped_f
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager     return Retrying(*dargs, **dkw).call(f, *args, **kw)
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/retrying.py", line 206, in call
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager     return attempt.get(self._wrap_exception)
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/retrying.py", line 247, in get
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager     six.reraise(self.value[0], self.value[1], self.value[2])
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/six.py", line 709, in reraise
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager     raise value
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/retrying.py", line 200, in call
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager     attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1298, in _update_to_placement
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager     raise exception.PlacementPciException(error=str(e))
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager nova.exception.PlacementPciException: Failed to gather or report PCI resources to Placement: There was a conflict when trying to complete your request.
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager 
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager  update conflict: Inventory for 'DISK_GB' on resource provider '0d33e88e-6335-4a94-8f21-32ba5b8bb7ad' in use.  
Dec 06 10:10:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:29.656 282197 ERROR nova.compute.manager 
Dec 06 10:10:29 np0005548789.localdomain ceph-mon[298582]: pgmap v12: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:29 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3639889960' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:30.662 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:30.662 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:10:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:30.662 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:10:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:30.906 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:10:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:30.907 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:10:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:30.908 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:10:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:30.908 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.316 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.318 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.319 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.319 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.321 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.335 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.336 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.336 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.337 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.337 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.338 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.339 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.339 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.369 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:10:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:31.371 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:10:31 np0005548789.localdomain ceph-mon[298582]: pgmap v13: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:32.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:32.184 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:10:32 np0005548789.localdomain podman[307083]: 2025-12-06 10:10:32.900998669 +0000 UTC m=+0.066724172 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 06 10:10:32 np0005548789.localdomain podman[307083]: 2025-12-06 10:10:32.916182116 +0000 UTC m=+0.081907639 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:10:32 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:10:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:33 np0005548789.localdomain ceph-mon[298582]: pgmap v14: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:34 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/4114425361' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:36 np0005548789.localdomain ceph-mon[298582]: pgmap v15: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:36 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1769226532' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:36.372 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:10:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:36.373 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:10:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:36.373 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:10:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:36.374 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:10:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:36.374 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:10:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:36.377 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:10:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:10:36 np0005548789.localdomain podman[307103]: 2025-12-06 10:10:36.893813166 +0000 UTC m=+0.058355079 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:10:36 np0005548789.localdomain podman[307103]: 2025-12-06 10:10:36.90820081 +0000 UTC m=+0.072742763 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:10:36 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:10:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:37.177 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:37 np0005548789.localdomain sshd[307127]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:10:37 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e94 e94: 6 total, 6 up, 6 in
Dec 06 10:10:37 np0005548789.localdomain sshd[305672]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:10:37 np0005548789.localdomain systemd[1]: session-72.scope: Deactivated successfully.
Dec 06 10:10:37 np0005548789.localdomain systemd[1]: session-72.scope: Consumed 6.581s CPU time.
Dec 06 10:10:37 np0005548789.localdomain systemd-logind[766]: Session 72 logged out. Waiting for processes to exit.
Dec 06 10:10:37 np0005548789.localdomain systemd-logind[766]: Removed session 72.
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: pgmap v16: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/3346912753' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: Activating manager daemon np0005548788.yvwbqq
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: osdmap e94: 6 total, 6 up, 6 in
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/3346912753' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: mgrmap e38: np0005548788.yvwbqq(active, starting, since 0.0607571s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: Manager daemon np0005548788.yvwbqq is now available
Dec 06 10:10:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:38 np0005548789.localdomain sshd[307128]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:10:38 np0005548789.localdomain sshd[307128]: Accepted publickey for ceph-admin from 192.168.122.106 port 60440 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 10:10:38 np0005548789.localdomain systemd-logind[766]: New session 73 of user ceph-admin.
Dec 06 10:10:38 np0005548789.localdomain systemd[1]: Started Session 73 of User ceph-admin.
Dec 06 10:10:38 np0005548789.localdomain sshd[307128]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:10:38 np0005548789.localdomain sudo[307132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:10:38 np0005548789.localdomain sudo[307132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:38 np0005548789.localdomain sudo[307132]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:38 np0005548789.localdomain sudo[307150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:10:38 np0005548789.localdomain sudo[307150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:39 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/mirror_snapshot_schedule"} : dispatch
Dec 06 10:10:39 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548788.yvwbqq/trash_purge_schedule"} : dispatch
Dec 06 10:10:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3616421039' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:39 np0005548789.localdomain ceph-mon[298582]: mgrmap e39: np0005548788.yvwbqq(active, since 1.0804s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:10:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/261091109' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:10:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/261091109' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:10:39 np0005548789.localdomain podman[307241]: 2025-12-06 10:10:39.422442928 +0000 UTC m=+0.100565323 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, release=1763362218, vcs-type=git, distribution-scope=public, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vendor=Red Hat, Inc., ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, version=7, com.redhat.component=rhceph-container)
Dec 06 10:10:39 np0005548789.localdomain podman[307241]: 2025-12-06 10:10:39.531235137 +0000 UTC m=+0.209357482 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, vcs-type=git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, RELEASE=main)
Dec 06 10:10:39 np0005548789.localdomain sshd[307289]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:10:40 np0005548789.localdomain sudo[307150]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:40 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/212188230' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:10:40 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:10:39] ENGINE Bus STARTING
Dec 06 10:10:40 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:10:39] ENGINE Serving on http://172.18.0.106:8765
Dec 06 10:10:40 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:10:39] ENGINE Serving on https://172.18.0.106:7150
Dec 06 10:10:40 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:10:39] ENGINE Bus STARTED
Dec 06 10:10:40 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:10:39] ENGINE Client ('172.18.0.106', 48356) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:10:40 np0005548789.localdomain ceph-mon[298582]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 2 stray daemon(s) not managed by cephadm)
Dec 06 10:10:40 np0005548789.localdomain ceph-mon[298582]: Health check cleared: CEPHADM_STRAY_HOST (was: 2 stray host(s) with 2 daemon(s) not managed by cephadm)
Dec 06 10:10:40 np0005548789.localdomain ceph-mon[298582]: Cluster is now healthy
Dec 06 10:10:40 np0005548789.localdomain sudo[307362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:10:40 np0005548789.localdomain sudo[307362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:40 np0005548789.localdomain sudo[307362]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:40 np0005548789.localdomain sudo[307380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:10:40 np0005548789.localdomain sudo[307380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:40 np0005548789.localdomain sshd[307289]: Received disconnect from 154.113.10.34 port 45648:11: Bye Bye [preauth]
Dec 06 10:10:40 np0005548789.localdomain sshd[307289]: Disconnected from authenticating user root 154.113.10.34 port 45648 [preauth]
Dec 06 10:10:40 np0005548789.localdomain sudo[307380]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:41 np0005548789.localdomain ceph-mon[298582]: from='client.44646 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:10:41 np0005548789.localdomain ceph-mon[298582]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:41 np0005548789.localdomain sudo[307430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:10:41 np0005548789.localdomain sudo[307430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:41 np0005548789.localdomain sudo[307430]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:41 np0005548789.localdomain sudo[307448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 10:10:41 np0005548789.localdomain sudo[307448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:41.375 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:10:41 np0005548789.localdomain sudo[307448]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548789.localdomain sudo[307485]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:10:42 np0005548789.localdomain sudo[307485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:10:42 np0005548789.localdomain sudo[307485]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548789.localdomain sudo[307509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:10:42 np0005548789.localdomain sudo[307509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548789.localdomain sudo[307509]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548789.localdomain podman[307503]: 2025-12-06 10:10:42.108025579 +0000 UTC m=+0.081462529 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 10:10:42 np0005548789.localdomain podman[307503]: 2025-12-06 10:10:42.152197457 +0000 UTC m=+0.125634397 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Dec 06 10:10:42 np0005548789.localdomain sudo[307536]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:10:42 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:10:42 np0005548789.localdomain sudo[307536]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548789.localdomain sudo[307536]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548789.localdomain sudo[307564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:10:42 np0005548789.localdomain sudo[307564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548789.localdomain sudo[307564]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548789.localdomain sudo[307582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:10:42 np0005548789.localdomain sudo[307582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548789.localdomain sudo[307582]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548789.localdomain sudo[307616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:10:42 np0005548789.localdomain sudo[307616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548789.localdomain sudo[307616]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: mgrmap e40: np0005548788.yvwbqq(active, since 3s), standbys: np0005548790.kvkfyr, np0005548789.mzhmje
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:10:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:42 np0005548789.localdomain sudo[307634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:10:42 np0005548789.localdomain sudo[307634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548789.localdomain sudo[307634]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548789.localdomain sudo[307652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:10:42 np0005548789.localdomain sudo[307652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548789.localdomain sudo[307652]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548789.localdomain sudo[307670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:10:42 np0005548789.localdomain sudo[307670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548789.localdomain sudo[307670]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548789.localdomain sudo[307688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:10:42 np0005548789.localdomain sudo[307688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548789.localdomain sudo[307688]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548789.localdomain sudo[307706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:10:42 np0005548789.localdomain sudo[307706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548789.localdomain sudo[307706]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548789.localdomain sudo[307724]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:10:42 np0005548789.localdomain sudo[307724]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548789.localdomain sudo[307724]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:42 np0005548789.localdomain sudo[307742]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:10:42 np0005548789.localdomain sudo[307742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:42 np0005548789.localdomain sudo[307742]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548789.localdomain sudo[307776]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:10:43 np0005548789.localdomain sudo[307776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548789.localdomain sudo[307776]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548789.localdomain sudo[307794]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:10:43 np0005548789.localdomain sudo[307794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548789.localdomain sudo[307794]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:43 np0005548789.localdomain sudo[307812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:10:43 np0005548789.localdomain sudo[307812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548789.localdomain sudo[307812]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548789.localdomain sudo[307830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:10:43 np0005548789.localdomain sudo[307830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548789.localdomain sudo[307830]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548789.localdomain sudo[307848]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:10:43 np0005548789.localdomain sudo[307848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548789.localdomain sudo[307848]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548789.localdomain sudo[307866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:10:43 np0005548789.localdomain sudo[307866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:10:43 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:10:43 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:10:43 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:10:43 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:10:43 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:10:43 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:10:43 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:10:43 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:10:43 np0005548789.localdomain ceph-mon[298582]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:43 np0005548789.localdomain ceph-mon[298582]: from='client.44655 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:10:43 np0005548789.localdomain ceph-mon[298582]: Saving service mon spec with placement label:mon
Dec 06 10:10:43 np0005548789.localdomain ceph-mon[298582]: Standby manager daemon np0005548785.vhqlsq started
Dec 06 10:10:43 np0005548789.localdomain sudo[307866]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548789.localdomain sudo[307884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:10:43 np0005548789.localdomain sudo[307884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548789.localdomain sudo[307884]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548789.localdomain sudo[307902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:10:43 np0005548789.localdomain sudo[307902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548789.localdomain sudo[307902]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548789.localdomain sudo[307936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:10:43 np0005548789.localdomain sudo[307936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548789.localdomain sudo[307936]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548789.localdomain sudo[307954]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:10:43 np0005548789.localdomain sudo[307954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548789.localdomain sudo[307954]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548789.localdomain sudo[307972]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:10:43 np0005548789.localdomain sudo[307972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548789.localdomain sudo[307972]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548789.localdomain sudo[307990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:10:43 np0005548789.localdomain sudo[307990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548789.localdomain sudo[307990]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:43 np0005548789.localdomain sudo[308008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:10:43 np0005548789.localdomain sudo[308008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:43 np0005548789.localdomain sudo[308008]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548789.localdomain sudo[308026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:10:44 np0005548789.localdomain sudo[308026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548789.localdomain sudo[308026]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548789.localdomain sudo[308044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:10:44 np0005548789.localdomain sudo[308044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548789.localdomain sudo[308044]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548789.localdomain sudo[308062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:10:44 np0005548789.localdomain sudo[308062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548789.localdomain sudo[308062]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548789.localdomain sudo[308096]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:10:44 np0005548789.localdomain sudo[308096]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548789.localdomain sudo[308096]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548789.localdomain sudo[308114]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:10:44 np0005548789.localdomain sudo[308114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548789.localdomain sudo[308114]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:10:44 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:10:44 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:10:44 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:10:44 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:10:44 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:10:44 np0005548789.localdomain ceph-mon[298582]: from='client.44658 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005548790", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:10:44 np0005548789.localdomain ceph-mon[298582]: mgrmap e41: np0005548788.yvwbqq(active, since 6s), standbys: np0005548790.kvkfyr, np0005548785.vhqlsq, np0005548789.mzhmje
Dec 06 10:10:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:10:44 np0005548789.localdomain sudo[308132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:10:44 np0005548789.localdomain sudo[308132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548789.localdomain sudo[308132]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:44 np0005548789.localdomain sudo[308150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:10:44 np0005548789.localdomain sudo[308150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:44 np0005548789.localdomain sudo[308150]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:45 np0005548789.localdomain sshd[308168]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:10:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:46 np0005548789.localdomain sudo[308170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:10:46 np0005548789.localdomain sudo[308170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:46 np0005548789.localdomain sudo[308170]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:46 np0005548789.localdomain sshd[308168]: Received disconnect from 64.227.102.57 port 57974:11: Bye Bye [preauth]
Dec 06 10:10:46 np0005548789.localdomain sshd[308168]: Disconnected from authenticating user root 64.227.102.57 port 57974 [preauth]
Dec 06 10:10:46 np0005548789.localdomain sudo[308188]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:10:46 np0005548789.localdomain sudo[308188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:46.379 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:10:46 np0005548789.localdomain podman[308222]: 
Dec 06 10:10:46 np0005548789.localdomain podman[308222]: 2025-12-06 10:10:46.514772759 +0000 UTC m=+0.070666833 container create bc433e600361f755858e9c8ddb792925505b20a4e8da8c435ef0b72f3d813569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_mendeleev, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, GIT_BRANCH=main, ceph=True, GIT_CLEAN=True, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container)
Dec 06 10:10:46 np0005548789.localdomain ceph-mon[298582]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 0 B/s wr, 19 op/s
Dec 06 10:10:46 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548788 (monmap changed)...
Dec 06 10:10:46 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548788 on np0005548788.localdomain
Dec 06 10:10:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:10:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:10:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:46 np0005548789.localdomain systemd[1]: Started libpod-conmon-bc433e600361f755858e9c8ddb792925505b20a4e8da8c435ef0b72f3d813569.scope.
Dec 06 10:10:46 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:10:46 np0005548789.localdomain podman[308222]: 2025-12-06 10:10:46.490902205 +0000 UTC m=+0.046796259 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:10:46 np0005548789.localdomain podman[308222]: 2025-12-06 10:10:46.593436413 +0000 UTC m=+0.149330467 container init bc433e600361f755858e9c8ddb792925505b20a4e8da8c435ef0b72f3d813569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_mendeleev, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, vendor=Red Hat, Inc., ceph=True, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, distribution-scope=public)
Dec 06 10:10:46 np0005548789.localdomain podman[308222]: 2025-12-06 10:10:46.605614041 +0000 UTC m=+0.161508135 container start bc433e600361f755858e9c8ddb792925505b20a4e8da8c435ef0b72f3d813569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_mendeleev, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, com.redhat.component=rhceph-container, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., RELEASE=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, version=7, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, distribution-scope=public)
Dec 06 10:10:46 np0005548789.localdomain podman[308222]: 2025-12-06 10:10:46.606114766 +0000 UTC m=+0.162008860 container attach bc433e600361f755858e9c8ddb792925505b20a4e8da8c435ef0b72f3d813569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_mendeleev, CEPH_POINT_RELEASE=, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Dec 06 10:10:46 np0005548789.localdomain nostalgic_mendeleev[308237]: 167 167
Dec 06 10:10:46 np0005548789.localdomain podman[308222]: 2025-12-06 10:10:46.612990145 +0000 UTC m=+0.168884269 container died bc433e600361f755858e9c8ddb792925505b20a4e8da8c435ef0b72f3d813569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_mendeleev, com.redhat.component=rhceph-container, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, ceph=True, GIT_CLEAN=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:10:46 np0005548789.localdomain systemd[1]: libpod-bc433e600361f755858e9c8ddb792925505b20a4e8da8c435ef0b72f3d813569.scope: Deactivated successfully.
Dec 06 10:10:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:10:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:10:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:10:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:10:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:10:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:10:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:10:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:10:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:10:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:10:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:10:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:10:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-14d99ef0677c403f82710e2aea2b890e704bbbb80c25b89182d4ba91110aaa2c-merged.mount: Deactivated successfully.
Dec 06 10:10:46 np0005548789.localdomain podman[308242]: 2025-12-06 10:10:46.711692386 +0000 UTC m=+0.083336227 container remove bc433e600361f755858e9c8ddb792925505b20a4e8da8c435ef0b72f3d813569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_mendeleev, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, ceph=True, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, CEPH_POINT_RELEASE=, release=1763362218, version=7, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, com.redhat.component=rhceph-container, distribution-scope=public, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Dec 06 10:10:46 np0005548789.localdomain systemd[1]: libpod-conmon-bc433e600361f755858e9c8ddb792925505b20a4e8da8c435ef0b72f3d813569.scope: Deactivated successfully.
Dec 06 10:10:46 np0005548789.localdomain sudo[308188]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:10:47.301 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:10:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:10:47.301 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:10:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:10:47.303 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:10:47 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548789 (monmap changed)...
Dec 06 10:10:47 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548789 on np0005548789.localdomain
Dec 06 10:10:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:47 np0005548789.localdomain ceph-mon[298582]: Reconfiguring mon.np0005548790 (monmap changed)...
Dec 06 10:10:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 06 10:10:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 06 10:10:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:47 np0005548789.localdomain ceph-mon[298582]: Reconfiguring daemon mon.np0005548790 on np0005548790.localdomain
Dec 06 10:10:47 np0005548789.localdomain ceph-mon[298582]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 0 B/s wr, 14 op/s
Dec 06 10:10:47 np0005548789.localdomain sshd[307127]: error: kex_exchange_identification: read: Connection timed out
Dec 06 10:10:47 np0005548789.localdomain sshd[307127]: banner exchange: Connection from 123.160.164.187 port 51662: Connection timed out
Dec 06 10:10:47 np0005548789.localdomain sudo[308258]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:10:47 np0005548789.localdomain sudo[308258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:10:47 np0005548789.localdomain sudo[308258]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:10:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:10:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:10:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.26871 172.18.0.106:0/3380714700' entity='mgr.np0005548788.yvwbqq' 
Dec 06 10:10:49 np0005548789.localdomain ceph-mon[298582]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Dec 06 10:10:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:51.381 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:10:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:10:51 np0005548789.localdomain ceph-mon[298582]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:10:51 np0005548789.localdomain podman[308276]: 2025-12-06 10:10:51.926087107 +0000 UTC m=+0.078839329 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:10:51 np0005548789.localdomain podman[308276]: 2025-12-06 10:10:51.931469061 +0000 UTC m=+0.084221333 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:10:51 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:10:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:10:52 np0005548789.localdomain podman[308294]: 2025-12-06 10:10:52.925370578 +0000 UTC m=+0.094719302 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:10:52 np0005548789.localdomain podman[308294]: 2025-12-06 10:10:52.929981368 +0000 UTC m=+0.099330092 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:10:52 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:10:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:53 np0005548789.localdomain ceph-mon[298582]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:10:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:10:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:10:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:10:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:10:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:10:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19244 "" "Go-http-client/1.1"
Dec 06 10:10:55 np0005548789.localdomain ceph-mon[298582]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:10:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:10:56.384 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:10:57 np0005548789.localdomain ceph-mon[298582]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:10:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:10:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:10:59 np0005548789.localdomain ceph-mon[298582]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:10:59 np0005548789.localdomain podman[308318]: 2025-12-06 10:10:59.926953924 +0000 UTC m=+0.082987214 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Dec 06 10:10:59 np0005548789.localdomain podman[308318]: 2025-12-06 10:10:59.935603117 +0000 UTC m=+0.091636417 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:10:59 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:10:59 np0005548789.localdomain podman[308317]: 2025-12-06 10:10:59.89707975 +0000 UTC m=+0.060303779 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, release=1755695350, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 06 10:10:59 np0005548789.localdomain podman[308317]: 2025-12-06 10:10:59.982193119 +0000 UTC m=+0.145417228 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:10:59 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:11:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:01.386 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:01.390 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:01 np0005548789.localdomain ceph-mon[298582]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:11:03 np0005548789.localdomain systemd[1]: tmp-crun.ApMCtM.mount: Deactivated successfully.
Dec 06 10:11:03 np0005548789.localdomain podman[308359]: 2025-12-06 10:11:03.910796979 +0000 UTC m=+0.073333723 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:11:03 np0005548789.localdomain podman[308359]: 2025-12-06 10:11:03.921948307 +0000 UTC m=+0.084485011 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:11:03 np0005548789.localdomain ceph-mon[298582]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:03 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:11:05 np0005548789.localdomain ceph-mon[298582]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:06.391 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:06.393 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:06.393 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:11:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:06.393 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:06.420 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:06.421 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:07 np0005548789.localdomain ceph-mon[298582]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.915 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.916 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:11:07 np0005548789.localdomain systemd[1]: tmp-crun.7rrq45.mount: Deactivated successfully.
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.925 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:07 np0005548789.localdomain podman[308378]: 2025-12-06 10:11:07.927971845 +0000 UTC m=+0.087546803 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a551a3c-5c4e-4690-8df9-c915229b41fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:11:07.916200', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'e1cfb804-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.165523842, 'message_signature': '4ff9b19a2bbe21e0b342f04f5438c8369e2ed19fce3da36dc64bff1e7ac99fdd'}]}, 'timestamp': '2025-12-06 10:11:07.926183', '_unique_id': 'c4fe21413cdc4558a717cef866749f8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.928 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.929 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.944 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.945 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3371413f-2436-4b3c-b422-f480a56c7bbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:11:07.929093', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1d2a91a-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.178458474, 'message_signature': '579ddb45c1481bd5ff6e8d83bff62dd9d08a1277ffcb3219a091a128838ea3f9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:11:07.929093', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1d2b32e-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.178458474, 'message_signature': '87af5c97462660c6943020825028b6c58fd8515592385074ae68b904302edfd3'}]}, 'timestamp': '2025-12-06 10:11:07.945559', '_unique_id': 'a3377aa847ea4b3890ef68fb3135b7fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.946 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.947 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:07.947 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:11:07 np0005548789.localdomain podman[308378]: 2025-12-06 10:11:07.999271276 +0000 UTC m=+0.158846214 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.009 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.010 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47141ab3-0844-411d-8a2c-86c31a55f19d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:11:07.947244', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1dc87fa-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.196594964, 'message_signature': 'a03ee8df6507e66a9c5054914ee68622480bb16486de4536f98681e550910909'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:11:07.947244', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1dc968c-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.196594964, 'message_signature': 'cb773b9a6e2258a72d25c49719c77c611123629c2a11e5add9a89add9374a6fe'}]}, 'timestamp': '2025-12-06 10:11:08.010401', '_unique_id': 'b981ba402786401aba6c11677f6e9ddb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.012 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'abb17b2f-d625-422d-bcf0-2b81c851ca47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:11:08.012446', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'e1dcf2a8-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.165523842, 'message_signature': '8d5cb15c54668a4f2a665a14a02421684d8844ee1c7f810327b160134b8faec8'}]}, 'timestamp': '2025-12-06 10:11:08.012785', '_unique_id': '4e178d8ae13d4701b104235527f49640'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.014 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.014 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.014 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8eb6bad-8909-4e6d-9109-58c964ed42a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:11:08.014269', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1dd39ac-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.178458474, 'message_signature': 'def9db4621ad36b1cd440cdbead3b7a7cf575eb91cc36f7312912186efdbe6d7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:11:08.014269', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1dd44ec-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.178458474, 'message_signature': 'a6599481f00e49e156c75968a3ed332dcf2a6b1ab913f1508e2a19ff16b7bc2b'}]}, 'timestamp': '2025-12-06 10:11:08.014869', '_unique_id': '62c84d8b43d74cdc83b29e27b4332cc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.016 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.016 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.016 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d5ecb1b-e75f-48ca-8a05-d4ef7b539682', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:11:08.016391', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1dd8c90-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.196594964, 'message_signature': '498edbc9ab5dd7d053492497e6a4afe6cf4d5816883424ada0fe19fc8ed55949'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:11:08.016391', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1dd97ee-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.196594964, 'message_signature': '6d905890c6e527350b732d7d7bffb073f05af027b42a8d2446cc3f5712b31a3c'}]}, 'timestamp': '2025-12-06 10:11:08.016972', '_unique_id': '3fcc958705cb4e2fa88980385e0b47d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.018 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.030 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 14920000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'abdf334c-9f5d-491f-b136-6f24a3afcfe4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14920000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:11:08.018373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e1dfc848-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.280201277, 'message_signature': 'a393bb92b4ebdd70b5e3d8fc85438ccc2bc13d8688220f51dea336a82c8b7eeb'}]}, 'timestamp': '2025-12-06 10:11:08.031355', '_unique_id': 'ff8d8883aadd462dbd06deca35ce8714'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.032 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.033 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.033 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.033 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.033 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb6ad6d5-af84-4065-9dd1-e35a94fb7300', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:11:08.033509', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1e02928-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.196594964, 'message_signature': '26f8c293abd67e70ecde2ef7ab1e48f880a1317395c227170c98f1ecd39c44a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:11:08.033509', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1e036d4-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.196594964, 'message_signature': 'e2b6af6dfda0889900732ca3e6f58256b5cd0fd2e70efdf01754c7553b67ee79'}]}, 'timestamp': '2025-12-06 10:11:08.034156', '_unique_id': '5d1c6b007e554e8295ba036ee8906c66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.035 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.035 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '576d1ba6-b4d4-4ff0-8a9a-cdf027b08f1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:11:08.035690', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1e0808a-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.196594964, 'message_signature': '3ade1e527dfea1d2a8f37ed72c6950c3c830cbb08df78550f5777065a5f34b02'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:11:08.035690', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1e08b70-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.196594964, 'message_signature': '99a5fc80f66a0d9ffb67f204ca88e691645787a06762e6b3c70100fe1efec849'}]}, 'timestamp': '2025-12-06 10:11:08.036338', '_unique_id': '361d7c6b65ad49308926a18b9832b555'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.037 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.037 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.037 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3edc4c1-1c7b-4194-ad67-77748fd0ca18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:11:08.037961', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'e1e0d788-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.280201277, 'message_signature': 'd221d7ecf72babe72b7689c61db18ff7903176a001fe9f1c31de0549528e1d5b'}]}, 'timestamp': '2025-12-06 10:11:08.038326', '_unique_id': '6e4e88b9d4224921a9d91ef46eb1be12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.039 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.040 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.040 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5a9c4bb-6953-4aae-bf50-264632178113', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:11:08.040428', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'e1e13854-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.165523842, 'message_signature': 'b0584c8517b5aeb58e3ebf2ac0d5ac46c809cb6822f9ebf323553758ad6c3b77'}]}, 'timestamp': '2025-12-06 10:11:08.040779', '_unique_id': 'a89fda6a6e5d436e965aa8b7e59512b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.042 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.042 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e42604a0-c74d-4e58-980b-d8d3c2aa85f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:11:08.042293', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'e1e1817e-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.165523842, 'message_signature': '65c66e14a1a7f0ef84afb9efae57c92121649756b8eda912d1f3455ce61c67c5'}]}, 'timestamp': '2025-12-06 10:11:08.042632', '_unique_id': 'a54d12c419bb4dfaad6fc63e3fe284c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.043 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8356347-dfd5-4df8-a35a-3bd83cd74f37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:11:08.044006', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'e1e1c30a-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.165523842, 'message_signature': '46e7392be5eed51750f7bfe4a4dab5fd8f5bc7ced6d6b74350d9f7a2543d5933'}]}, 'timestamp': '2025-12-06 10:11:08.044342', '_unique_id': '82a10df32f184630a946cdd1a8354de0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.045 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.045 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.045 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0999b32-1164-4f82-9d4e-2a05337460e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:11:08.045864', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'e1e20b80-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.165523842, 'message_signature': '28aea3f12c222ac332f1ac4d0f61b28526867a2312214a0fd75e050fd89b25a3'}]}, 'timestamp': '2025-12-06 10:11:08.046164', '_unique_id': '1fb05da130f1463fa247a082f19bc716'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.046 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.047 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.047 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.048 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e69bb4cb-4a41-49ec-a1b0-41702cb0ffee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:11:08.047834', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1e25950-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.178458474, 'message_signature': '8786f9dec6dd16d13160104ec043a47f0c8d38d3db5fb8c7d9c745d597b8f4f0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:11:08.047834', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1e2654e-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.178458474, 'message_signature': '976c22fc89ab1969002a93838d426f7a755df6063116f40a2157636ae232d594'}]}, 'timestamp': '2025-12-06 10:11:08.048453', '_unique_id': '24c34027905b4e04be8be031a02ab3f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.049 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.050 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.050 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.050 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab63e8c2-438a-42b3-9ce8-1cd43cd5f6ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:11:08.050151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1e2b62a-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.196594964, 'message_signature': 'a565ce6f79354b3b8912b8fcafba7819d2fded9874087f77cd23788ff547b4a2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:11:08.050151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1e2c5d4-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.196594964, 'message_signature': 'aeb3f29ee8fa33a5535ecec1909853aa2929ee2f0a43a15f7eb624db973150a3'}]}, 'timestamp': '2025-12-06 10:11:08.050991', '_unique_id': '15eda63a75b043f7b922046d7208eeec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.051 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.052 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.052 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84a985dc-d2a7-4a88-aa69-f0afd4d48fec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:11:08.052752', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'e1e337e4-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.165523842, 'message_signature': 'd8d4f2e032942245d0a9aab4612ea629dd932383637302a8322e436a45485015'}]}, 'timestamp': '2025-12-06 10:11:08.053896', '_unique_id': '97178ea7420746909a913cf60291b5b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.054 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.055 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.055 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef50095b-d11f-45ba-86ae-afc6e951eed0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:11:08.055476', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'e1e38320-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.165523842, 'message_signature': '99c09749bcd2e78ff2a1206d61c0c2e45f25228da3b649b29db3a53e9d0264dc'}]}, 'timestamp': '2025-12-06 10:11:08.055801', '_unique_id': '2f03a6b9b72c435f8d296d718caa440c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.056 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.057 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.057 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d64ae6f-26d4-4885-810d-73b2264ac25b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:11:08.057529', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'e1e3d30c-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.165523842, 'message_signature': 'aa206b7a57a0ac6652d10ad99f1799e51496395ff2814924f7c616cc26c2fcd6'}]}, 'timestamp': '2025-12-06 10:11:08.057849', '_unique_id': 'a0bda3d5e99e492bb51a056b1bb21d11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.058 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.059 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.059 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4297a311-f951-4139-8860-51b4aabc733b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:11:08.059274', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'e1e4181c-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.165523842, 'message_signature': 'adf0cb14bef6b369759cf48b554e0bdc81e96a33951538a0b6f54ac0de1052fa'}]}, 'timestamp': '2025-12-06 10:11:08.059629', '_unique_id': '9883808d087c44adac469b9bfc48e80d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.060 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.061 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.061 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c1adc02-2068-4aa6-b7fc-86b3f3348cee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:11:08.061013', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e1e45ad4-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.196594964, 'message_signature': 'd255580a3bcbf86621893f86124e7e0e056920e0c847322ef5f111e4a1f5af00'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:11:08.061013', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e1e46736-d28b-11f0-aaf2-fa163e118844', 'monotonic_time': 12286.196594964, 'message_signature': 'bc19a6180a5ad0433d60a00b4757807e56f587affbf2ac06450f1a258edb190b'}]}, 'timestamp': '2025-12-06 10:11:08.061604', '_unique_id': '91e5e3cb2a0c4669901e349a58207373'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:11:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:11:08.062 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:11:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:09 np0005548789.localdomain ceph-mon[298582]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:11.422 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:11.423 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:11.423 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:11:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:11.424 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:11.463 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:11.464 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:11 np0005548789.localdomain ceph-mon[298582]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:11:12 np0005548789.localdomain podman[308400]: 2025-12-06 10:11:12.922057483 +0000 UTC m=+0.082000796 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:11:13 np0005548789.localdomain podman[308400]: 2025-12-06 10:11:13.013078242 +0000 UTC m=+0.173021545 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Dec 06 10:11:13 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:11:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:13 np0005548789.localdomain ceph-mon[298582]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:15 np0005548789.localdomain ceph-mon[298582]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:15 np0005548789.localdomain sshd[308425]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:11:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:16.465 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:16.467 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:16.467 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:11:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:16.468 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:16.509 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:16.510 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:11:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:11:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:11:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:11:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:11:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:11:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:11:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:11:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:11:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:11:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:11:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:11:17 np0005548789.localdomain sshd[308425]: Received disconnect from 14.194.101.210 port 44180:11: Bye Bye [preauth]
Dec 06 10:11:17 np0005548789.localdomain sshd[308425]: Disconnected from authenticating user root 14.194.101.210 port 44180 [preauth]
Dec 06 10:11:17 np0005548789.localdomain ceph-mon[298582]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:19 np0005548789.localdomain ceph-mon[298582]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:19 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/483164750' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 06 10:11:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:21.511 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:21.513 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:21.513 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:11:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:21.513 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:21.552 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:21.553 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:21 np0005548789.localdomain ceph-mon[298582]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:11:22 np0005548789.localdomain podman[308427]: 2025-12-06 10:11:22.933922745 +0000 UTC m=+0.089895795 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:11:22 np0005548789.localdomain podman[308427]: 2025-12-06 10:11:22.968295786 +0000 UTC m=+0.124268876 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 10:11:22 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:11:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:11:23 np0005548789.localdomain systemd[1]: tmp-crun.7QqjCW.mount: Deactivated successfully.
Dec 06 10:11:23 np0005548789.localdomain podman[308445]: 2025-12-06 10:11:23.095233843 +0000 UTC m=+0.088093250 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:11:23 np0005548789.localdomain podman[308445]: 2025-12-06 10:11:23.133580205 +0000 UTC m=+0.126439612 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:11:23 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:11:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:11:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:11:23 np0005548789.localdomain ceph-mon[298582]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:11:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:11:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:11:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19239 "" "Go-http-client/1.1"
Dec 06 10:11:24 np0005548789.localdomain ceph-mon[298582]: from='client.44667 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:11:25 np0005548789.localdomain ceph-mon[298582]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:26.555 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:26.558 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:26.558 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:11:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:26.558 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:26.587 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:26.588 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:27 np0005548789.localdomain ceph-mon[298582]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:28.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:28.275 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:11:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:28.276 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:11:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:28.276 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:11:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:28.276 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:11:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:28.277 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:11:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:11:28 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1823885354' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:28.760 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:11:28 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1823885354' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:28.822 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:11:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:28.822 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:11:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:29.036 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:11:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:29.038 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11452MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:11:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:29.038 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:11:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:29.039 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:11:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:29.133 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:11:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:29.134 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:11:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:29.134 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:11:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:29.187 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:11:29 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:11:29 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1834841947' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:29.600 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:11:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:29.606 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:11:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:29.627 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:11:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:29.630 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:11:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:29.631 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:11:29 np0005548789.localdomain ceph-mon[298582]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:29 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/2405516431' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 06 10:11:29 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1834841947' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:11:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:11:30 np0005548789.localdomain podman[308511]: 2025-12-06 10:11:30.927239702 +0000 UTC m=+0.086941875 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Dec 06 10:11:30 np0005548789.localdomain podman[308511]: 2025-12-06 10:11:30.969033229 +0000 UTC m=+0.128735402 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.6, config_id=edpm, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal)
Dec 06 10:11:30 np0005548789.localdomain systemd[1]: tmp-crun.xOHtTr.mount: Deactivated successfully.
Dec 06 10:11:30 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:11:30 np0005548789.localdomain podman[308512]: 2025-12-06 10:11:30.99050615 +0000 UTC m=+0.145407757 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true)
Dec 06 10:11:31 np0005548789.localdomain podman[308512]: 2025-12-06 10:11:31.003164123 +0000 UTC m=+0.158065740 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:11:31 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:11:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:31.589 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:31.591 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:31.591 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:11:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:31.591 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:31.623 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:31.624 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:31.632 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:31.633 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:11:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:31.633 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:11:31 np0005548789.localdomain ceph-mon[298582]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:32.147 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:11:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:32.147 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:11:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:32.148 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:11:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:32.148 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:11:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:32.542 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:11:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:32.563 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:11:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:32.564 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:11:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:32.564 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:32.564 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:32.565 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:32.565 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:32.565 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:11:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:33.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:33.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:33.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:34 np0005548789.localdomain ceph-mon[298582]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:11:34 np0005548789.localdomain podman[308550]: 2025-12-06 10:11:34.917208273 +0000 UTC m=+0.077836220 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3)
Dec 06 10:11:34 np0005548789.localdomain podman[308550]: 2025-12-06 10:11:34.927249697 +0000 UTC m=+0.087877634 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:11:34 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:11:35 np0005548789.localdomain ceph-mon[298582]: from='client.44673 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 10:11:36 np0005548789.localdomain ceph-mon[298582]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:36 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1742337711' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:36.625 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:36.627 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:36.627 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:11:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:36.627 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:36.661 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:36.662 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:36.663 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:37 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/246951837' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:38 np0005548789.localdomain ceph-mon[298582]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:11:38 np0005548789.localdomain podman[308570]: 2025-12-06 10:11:38.92490465 +0000 UTC m=+0.083656596 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:11:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:11:38 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/416774902' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:11:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:11:38 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/416774902' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:11:38 np0005548789.localdomain podman[308570]: 2025-12-06 10:11:38.960248802 +0000 UTC m=+0.119000798 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:11:38 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:11:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/291434841' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/416774902' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:11:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/416774902' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:11:40 np0005548789.localdomain ceph-mon[298582]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:40 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/259866061' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:11:40 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1381932184' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:11:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:41.697 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:41.699 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:41.700 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:11:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:41.700 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:41.701 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:41.704 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:41 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 e95: 6 total, 6 up, 6 in
Dec 06 10:11:41 np0005548789.localdomain sshd[307128]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 10:11:41 np0005548789.localdomain systemd[1]: session-73.scope: Deactivated successfully.
Dec 06 10:11:41 np0005548789.localdomain systemd[1]: session-73.scope: Consumed 6.320s CPU time.
Dec 06 10:11:41 np0005548789.localdomain systemd-logind[766]: Session 73 logged out. Waiting for processes to exit.
Dec 06 10:11:41 np0005548789.localdomain systemd-logind[766]: Removed session 73.
Dec 06 10:11:42 np0005548789.localdomain sshd[308593]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:11:42 np0005548789.localdomain sshd[308593]: Accepted publickey for ceph-admin from 192.168.122.108 port 60336 ssh2: RSA SHA256:HQYBT8n3HbnicDtP9tehd3+gJXMFtkw+fTMlmR2wCsE
Dec 06 10:11:42 np0005548789.localdomain systemd-logind[766]: New session 74 of user ceph-admin.
Dec 06 10:11:42 np0005548789.localdomain systemd[1]: Started Session 74 of User ceph-admin.
Dec 06 10:11:42 np0005548789.localdomain sshd[308593]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/3000437731' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: Activating manager daemon np0005548790.kvkfyr
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: osdmap e95: 6 total, 6 up, 6 in
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.200:0/3000437731' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: mgrmap e42: np0005548790.kvkfyr(active, starting, since 0.0402134s), standbys: np0005548785.vhqlsq, np0005548789.mzhmje
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata", "id": "np0005548788"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata", "id": "np0005548789"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata", "id": "np0005548790"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata", "who": "mds.np0005548789.vxwwsq"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata", "who": "mds.np0005548788.erzujf"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata", "who": "mds.np0005548790.vhcezv"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548790.kvkfyr", "id": "np0005548790.kvkfyr"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548785.vhqlsq", "id": "np0005548785.vhqlsq"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548789.mzhmje", "id": "np0005548789.mzhmje"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mds metadata"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mon metadata"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: Manager daemon np0005548790.kvkfyr is now available
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/mirror_snapshot_schedule"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/mirror_snapshot_schedule"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/trash_purge_schedule"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005548790.kvkfyr/trash_purge_schedule"} : dispatch
Dec 06 10:11:42 np0005548789.localdomain sudo[308597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:11:42 np0005548789.localdomain sudo[308597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:42 np0005548789.localdomain sudo[308597]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:42 np0005548789.localdomain sudo[308615]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:11:42 np0005548789.localdomain sudo[308615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:43 np0005548789.localdomain podman[308705]: 2025-12-06 10:11:43.288158903 +0000 UTC m=+0.145551711 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, GIT_BRANCH=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, vcs-type=git)
Dec 06 10:11:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:11:43 np0005548789.localdomain sshd[308732]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:11:43 np0005548789.localdomain podman[308724]: 2025-12-06 10:11:43.407292792 +0000 UTC m=+0.096947858 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:11:43 np0005548789.localdomain podman[308724]: 2025-12-06 10:11:43.443347655 +0000 UTC m=+0.133002701 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:11:43 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:11:43 np0005548789.localdomain podman[308705]: 2025-12-06 10:11:43.498868327 +0000 UTC m=+0.356261075 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public)
Dec 06 10:11:43 np0005548789.localdomain ceph-mon[298582]: mgrmap e43: np0005548790.kvkfyr(active, since 1.10215s), standbys: np0005548785.vhqlsq, np0005548789.mzhmje
Dec 06 10:11:43 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:11:43] ENGINE Bus STARTING
Dec 06 10:11:43 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:11:43] ENGINE Serving on https://172.18.0.108:7150
Dec 06 10:11:43 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:11:43] ENGINE Client ('172.18.0.108', 49786) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 10:11:43 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:11:43] ENGINE Serving on http://172.18.0.108:8765
Dec 06 10:11:43 np0005548789.localdomain ceph-mon[298582]: [06/Dec/2025:10:11:43] ENGINE Bus STARTED
Dec 06 10:11:44 np0005548789.localdomain sudo[308615]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:44 np0005548789.localdomain sudo[308848]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:11:44 np0005548789.localdomain sudo[308848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:44 np0005548789.localdomain sudo[308848]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:44 np0005548789.localdomain sudo[308866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:11:44 np0005548789.localdomain sudo[308866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:44 np0005548789.localdomain sshd[308732]: Received disconnect from 118.219.234.233 port 53670:11: Bye Bye [preauth]
Dec 06 10:11:44 np0005548789.localdomain sshd[308732]: Disconnected from authenticating user root 118.219.234.233 port 53670 [preauth]
Dec 06 10:11:44 np0005548789.localdomain ceph-mon[298582]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:44 np0005548789.localdomain ceph-mon[298582]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Dec 06 10:11:44 np0005548789.localdomain ceph-mon[298582]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Dec 06 10:11:44 np0005548789.localdomain ceph-mon[298582]: Cluster is now healthy
Dec 06 10:11:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:44 np0005548789.localdomain sudo[308866]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:45 np0005548789.localdomain sudo[308917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:11:45 np0005548789.localdomain sudo[308917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:45 np0005548789.localdomain sudo[308917]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:45 np0005548789.localdomain sudo[308935]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Dec 06 10:11:45 np0005548789.localdomain sudo[308935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:45 np0005548789.localdomain sudo[308935]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:45 np0005548789.localdomain sudo[308971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:11:45 np0005548789.localdomain sudo[308971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:45 np0005548789.localdomain sudo[308971]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:45 np0005548789.localdomain sudo[308989]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:11:45 np0005548789.localdomain sudo[308989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:45 np0005548789.localdomain sudo[308989]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:45 np0005548789.localdomain sudo[309007]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:11:45 np0005548789.localdomain sudo[309007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:45 np0005548789.localdomain sudo[309007]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:45 np0005548789.localdomain sudo[309025]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:11:45 np0005548789.localdomain sudo[309025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:45 np0005548789.localdomain sudo[309025]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:45 np0005548789.localdomain sudo[309043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:11:45 np0005548789.localdomain sudo[309043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:45 np0005548789.localdomain sudo[309043]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548789.localdomain sudo[309077]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:11:46 np0005548789.localdomain sudo[309077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548789.localdomain sudo[309077]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548789.localdomain sudo[309095]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new
Dec 06 10:11:46 np0005548789.localdomain sudo[309095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548789.localdomain sudo[309095]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548789.localdomain sudo[309113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 10:11:46 np0005548789.localdomain sudo[309113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548789.localdomain sudo[309113]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548789.localdomain sudo[309131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:11:46 np0005548789.localdomain sudo[309131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548789.localdomain sudo[309131]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548789.localdomain sudo[309149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:11:46 np0005548789.localdomain sudo[309149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548789.localdomain sudo[309149]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548789.localdomain sudo[309167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:11:46 np0005548789.localdomain sudo[309167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548789.localdomain sudo[309167]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: mgrmap e44: np0005548790.kvkfyr(active, since 3s), standbys: np0005548785.vhqlsq, np0005548789.mzhmje
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.conf
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.conf
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.conf
Dec 06 10:11:46 np0005548789.localdomain sudo[309185]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:11:46 np0005548789.localdomain sudo[309185]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548789.localdomain sudo[309185]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548789.localdomain sudo[309203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:11:46 np0005548789.localdomain sudo[309203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548789.localdomain sudo[309203]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:11:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:11:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:11:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:11:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:11:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:11:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:11:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:11:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:11:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:11:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:11:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:11:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:46.700 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:46.704 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:46 np0005548789.localdomain sudo[309237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:11:46 np0005548789.localdomain sudo[309237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548789.localdomain sudo[309237]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548789.localdomain sudo[309255]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new
Dec 06 10:11:46 np0005548789.localdomain sudo[309255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548789.localdomain sudo[309255]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.803870) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015906803901, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2726, "num_deletes": 257, "total_data_size": 11909141, "memory_usage": 12424696, "flush_reason": "Manual Compaction"}
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015906831401, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 7341396, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16186, "largest_seqno": 18907, "table_properties": {"data_size": 7330634, "index_size": 6691, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 26504, "raw_average_key_size": 22, "raw_value_size": 7307651, "raw_average_value_size": 6130, "num_data_blocks": 287, "num_entries": 1192, "num_filter_entries": 1192, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015789, "oldest_key_time": 1765015789, "file_creation_time": 1765015906, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 27572 microseconds, and 7306 cpu microseconds.
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.831436) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 7341396 bytes OK
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.831458) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.833356) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.833370) EVENT_LOG_v1 {"time_micros": 1765015906833366, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.833387) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 11896294, prev total WAL file size 11896294, number of live WAL files 2.
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.834570) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. '7061786F73003131353437' seq:0, type:0; will stop at (end)
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(7169KB)], [24(16MB)]
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015906834639, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 24966399, "oldest_snapshot_seqno": -1}
Dec 06 10:11:46 np0005548789.localdomain sudo[309273]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:11:46 np0005548789.localdomain sudo[309273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548789.localdomain sudo[309273]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548789.localdomain sudo[309291]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 10:11:46 np0005548789.localdomain sudo[309291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548789.localdomain sudo[309291]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548789.localdomain sudo[309309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph
Dec 06 10:11:46 np0005548789.localdomain sudo[309309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 12264 keys, 20846487 bytes, temperature: kUnknown
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015906952342, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 20846487, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20774394, "index_size": 40312, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30725, "raw_key_size": 327295, "raw_average_key_size": 26, "raw_value_size": 20563644, "raw_average_value_size": 1676, "num_data_blocks": 1548, "num_entries": 12264, "num_filter_entries": 12264, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765015906, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:11:46 np0005548789.localdomain sudo[309309]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.952566) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 20846487 bytes
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.954405) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 212.0 rd, 177.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(7.0, 16.8 +0.0 blob) out(19.9 +0.0 blob), read-write-amplify(6.2) write-amplify(2.8) OK, records in: 12805, records dropped: 541 output_compression: NoCompression
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.954421) EVENT_LOG_v1 {"time_micros": 1765015906954414, "job": 12, "event": "compaction_finished", "compaction_time_micros": 117766, "compaction_time_cpu_micros": 52577, "output_level": 6, "num_output_files": 1, "total_output_size": 20846487, "num_input_records": 12805, "num_output_records": 12264, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015906954981, "job": 12, "event": "table_file_deletion", "file_number": 26}
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015906956388, "job": 12, "event": "table_file_deletion", "file_number": 24}
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.834500) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.956413) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.956417) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.956418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.956419) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:11:46 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:11:46.956421) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:11:46 np0005548789.localdomain sudo[309327]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:11:47 np0005548789.localdomain sudo[309327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548789.localdomain sudo[309327]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548789.localdomain sudo[309345]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:11:47 np0005548789.localdomain sudo[309345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548789.localdomain sudo[309345]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548789.localdomain sudo[309363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:11:47 np0005548789.localdomain sudo[309363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548789.localdomain sudo[309363]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548789.localdomain sudo[309397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:11:47 np0005548789.localdomain sudo[309397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548789.localdomain sudo[309397]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:11:47.302 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:11:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:11:47.302 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:11:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:11:47.303 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:11:47 np0005548789.localdomain sudo[309415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new
Dec 06 10:11:47 np0005548789.localdomain sudo[309415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548789.localdomain sudo[309415]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548789.localdomain sudo[309433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 10:11:47 np0005548789.localdomain sudo[309433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548789.localdomain sudo[309433]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548789.localdomain sudo[309451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:11:47 np0005548789.localdomain sudo[309451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548789.localdomain sudo[309451]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548789.localdomain ceph-mon[298582]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:47 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:11:47 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:11:47 np0005548789.localdomain ceph-mon[298582]: Standby manager daemon np0005548788.yvwbqq started
Dec 06 10:11:47 np0005548789.localdomain ceph-mon[298582]: mgrmap e45: np0005548790.kvkfyr(active, since 4s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:11:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "mgr metadata", "who": "np0005548788.yvwbqq", "id": "np0005548788.yvwbqq"} : dispatch
Dec 06 10:11:47 np0005548789.localdomain sudo[309469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config
Dec 06 10:11:47 np0005548789.localdomain sudo[309469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548789.localdomain sudo[309469]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548789.localdomain sudo[309487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:11:47 np0005548789.localdomain sudo[309487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548789.localdomain sudo[309487]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548789.localdomain sudo[309505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8
Dec 06 10:11:47 np0005548789.localdomain sudo[309505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548789.localdomain sudo[309505]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548789.localdomain sudo[309523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:11:47 np0005548789.localdomain sudo[309523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548789.localdomain sudo[309523]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548789.localdomain sudo[309557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:11:47 np0005548789.localdomain sudo[309557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548789.localdomain sudo[309557]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:47 np0005548789.localdomain sudo[309575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new
Dec 06 10:11:47 np0005548789.localdomain sudo[309575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:47 np0005548789.localdomain sudo[309575]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:48 np0005548789.localdomain sudo[309593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-1939e851-b10c-5c3b-9bb7-8e7f380233e8/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring.new /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:11:48 np0005548789.localdomain sudo[309593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:48 np0005548789.localdomain sudo[309593]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:48 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:11:48 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.conf
Dec 06 10:11:48 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:11:48 np0005548789.localdomain ceph-mon[298582]: Updating np0005548789.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:11:48 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 06 10:11:48 np0005548789.localdomain ceph-mon[298582]: Updating np0005548788.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:11:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:49 np0005548789.localdomain sudo[309611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:11:49 np0005548789.localdomain sudo[309611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:49 np0005548789.localdomain sudo[309611]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:49 np0005548789.localdomain sudo[309629]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:11:49 np0005548789.localdomain sudo[309629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:11:49 np0005548789.localdomain sudo[309629]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:49 np0005548789.localdomain ceph-mon[298582]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:11:49 np0005548789.localdomain ceph-mon[298582]: Updating np0005548790.localdomain:/var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/config/ceph.client.admin.keyring
Dec 06 10:11:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:11:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:11:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:11:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:11:50 np0005548789.localdomain ceph-mon[298582]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 0 B/s wr, 18 op/s
Dec 06 10:11:50 np0005548789.localdomain ceph-mon[298582]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Dec 06 10:11:50 np0005548789.localdomain ceph-mon[298582]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Dec 06 10:11:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:51.704 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:11:52 np0005548789.localdomain ceph-mon[298582]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 0 B/s wr, 14 op/s
Dec 06 10:11:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:11:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:11:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:11:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:11:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:11:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:11:53 np0005548789.localdomain systemd[1]: tmp-crun.8jdRDF.mount: Deactivated successfully.
Dec 06 10:11:53 np0005548789.localdomain podman[309648]: 2025-12-06 10:11:53.962747965 +0000 UTC m=+0.118999387 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:11:53 np0005548789.localdomain podman[309648]: 2025-12-06 10:11:53.969957464 +0000 UTC m=+0.126208886 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:11:53 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:11:53 np0005548789.localdomain podman[309647]: 2025-12-06 10:11:53.934937862 +0000 UTC m=+0.091481132 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Dec 06 10:11:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:11:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:11:54 np0005548789.localdomain sshd[309688]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:11:54 np0005548789.localdomain podman[309647]: 2025-12-06 10:11:54.064025323 +0000 UTC m=+0.220568643 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:11:54 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:11:54 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:11:54 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19250 "" "Go-http-client/1.1"
Dec 06 10:11:54 np0005548789.localdomain ceph-mon[298582]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Dec 06 10:11:54 np0005548789.localdomain sshd[309688]: Received disconnect from 64.227.102.57 port 36036:11: Bye Bye [preauth]
Dec 06 10:11:54 np0005548789.localdomain sshd[309688]: Disconnected from authenticating user root 64.227.102.57 port 36036 [preauth]
Dec 06 10:11:55 np0005548789.localdomain ceph-mon[298582]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:11:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:56.706 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:56.706 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:56.707 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:11:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:56.707 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:56.707 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:11:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:11:56.711 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:11:57 np0005548789.localdomain ceph-mon[298582]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:11:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:00 np0005548789.localdomain ceph-mon[298582]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s
Dec 06 10:12:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:01.711 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:12:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:01.716 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:12:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:12:01 np0005548789.localdomain podman[309691]: 2025-12-06 10:12:01.924428353 +0000 UTC m=+0.077893681 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 10:12:01 np0005548789.localdomain podman[309691]: 2025-12-06 10:12:01.934267751 +0000 UTC m=+0.087733069 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:12:01 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:12:02 np0005548789.localdomain podman[309690]: 2025-12-06 10:12:02.014530053 +0000 UTC m=+0.171221479 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64)
Dec 06 10:12:02 np0005548789.localdomain podman[309690]: 2025-12-06 10:12:02.02698025 +0000 UTC m=+0.183671676 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, version=9.6)
Dec 06 10:12:02 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:12:02 np0005548789.localdomain ceph-mon[298582]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:04 np0005548789.localdomain ceph-mon[298582]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:12:05 np0005548789.localdomain systemd[1]: tmp-crun.gEpSPc.mount: Deactivated successfully.
Dec 06 10:12:05 np0005548789.localdomain podman[309730]: 2025-12-06 10:12:05.920026634 +0000 UTC m=+0.082001177 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd)
Dec 06 10:12:05 np0005548789.localdomain podman[309730]: 2025-12-06 10:12:05.938272706 +0000 UTC m=+0.100247279 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:12:05 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:12:06 np0005548789.localdomain ceph-mon[298582]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:06.713 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:06.719 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:07 np0005548789.localdomain ceph-mon[298582]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:12:09 np0005548789.localdomain podman[309747]: 2025-12-06 10:12:09.918227624 +0000 UTC m=+0.079829600 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:12:09 np0005548789.localdomain podman[309747]: 2025-12-06 10:12:09.924382181 +0000 UTC m=+0.085984137 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:12:09 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:12:10 np0005548789.localdomain ceph-mon[298582]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:11.714 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:11.720 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:12 np0005548789.localdomain ceph-mon[298582]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:12:13 np0005548789.localdomain podman[309771]: 2025-12-06 10:12:13.901357528 +0000 UTC m=+0.066425034 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller)
Dec 06 10:12:13 np0005548789.localdomain podman[309771]: 2025-12-06 10:12:13.94238547 +0000 UTC m=+0.107452966 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec 06 10:12:13 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:12:14 np0005548789.localdomain ceph-mon[298582]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:16 np0005548789.localdomain ceph-mon[298582]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:12:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:12:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:12:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:12:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:12:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:12:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:12:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:12:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:12:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:12:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:12:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:12:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:16.716 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:16.722 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:17 np0005548789.localdomain ceph-mon[298582]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:20 np0005548789.localdomain ceph-mon[298582]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:21.717 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:21.725 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:22 np0005548789.localdomain ceph-mon[298582]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:12:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:12:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:12:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:12:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:12:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19233 "" "Go-http-client/1.1"
Dec 06 10:12:24 np0005548789.localdomain ceph-mon[298582]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:24 np0005548789.localdomain sshd[309797]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:12:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:12:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:12:24 np0005548789.localdomain podman[309800]: 2025-12-06 10:12:24.909928831 +0000 UTC m=+0.068869907 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:12:24 np0005548789.localdomain podman[309800]: 2025-12-06 10:12:24.914868131 +0000 UTC m=+0.073809217 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:12:24 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:12:24 np0005548789.localdomain podman[309799]: 2025-12-06 10:12:24.983603314 +0000 UTC m=+0.142187330 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:12:24 np0005548789.localdomain podman[309799]: 2025-12-06 10:12:24.988602415 +0000 UTC m=+0.147186501 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:12:25 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:12:25 np0005548789.localdomain sshd[309797]: Received disconnect from 154.113.10.34 port 46178:11: Bye Bye [preauth]
Dec 06 10:12:25 np0005548789.localdomain sshd[309797]: Disconnected from authenticating user root 154.113.10.34 port 46178 [preauth]
Dec 06 10:12:26 np0005548789.localdomain ceph-mon[298582]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:26.719 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:26.728 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:27 np0005548789.localdomain ceph-mon[298582]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:28.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.134 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.135 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.135 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.135 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.136 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:12:29 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:12:29 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/825719281' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.567 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.623 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.624 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.766 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.767 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11439MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.767 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.768 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.841 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.842 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.842 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:12:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:29.898 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:12:30 np0005548789.localdomain ceph-mon[298582]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:30 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/825719281' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:30 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:12:30 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2307179959' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:30.351 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:12:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:30.355 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:12:31 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2307179959' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:31.089 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:12:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:31.092 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:12:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:31.093 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:12:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:31.721 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:31.730 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:32 np0005548789.localdomain ceph-mon[298582]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:12:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:12:32 np0005548789.localdomain systemd[1]: tmp-crun.qplCn8.mount: Deactivated successfully.
Dec 06 10:12:32 np0005548789.localdomain podman[309885]: 2025-12-06 10:12:32.93959262 +0000 UTC m=+0.100195778 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, architecture=x86_64, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=edpm)
Dec 06 10:12:32 np0005548789.localdomain systemd[1]: tmp-crun.rlKe8N.mount: Deactivated successfully.
Dec 06 10:12:32 np0005548789.localdomain podman[309886]: 2025-12-06 10:12:32.987097679 +0000 UTC m=+0.145028686 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:12:33 np0005548789.localdomain podman[309886]: 2025-12-06 10:12:33.003588198 +0000 UTC m=+0.161519315 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 06 10:12:33 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:12:33 np0005548789.localdomain podman[309885]: 2025-12-06 10:12:33.054845122 +0000 UTC m=+0.215448300 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Dec 06 10:12:33 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:12:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:33.094 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:33.095 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:12:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:33.095 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:12:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:34 np0005548789.localdomain ceph-mon[298582]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:34 np0005548789.localdomain sshd[309924]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:12:35 np0005548789.localdomain sshd[309924]: Received disconnect from 179.33.210.213 port 38260:11: Bye Bye [preauth]
Dec 06 10:12:35 np0005548789.localdomain sshd[309924]: Disconnected from authenticating user root 179.33.210.213 port 38260 [preauth]
Dec 06 10:12:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:36.301 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:12:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:36.301 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:12:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:36.302 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:12:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:36.302 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:12:36 np0005548789.localdomain ceph-mon[298582]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:36.722 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:36.733 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:12:36 np0005548789.localdomain podman[309926]: 2025-12-06 10:12:36.926687033 +0000 UTC m=+0.083972105 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Dec 06 10:12:36 np0005548789.localdomain podman[309926]: 2025-12-06 10:12:36.941192913 +0000 UTC m=+0.098478015 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:12:36 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:12:37 np0005548789.localdomain ceph-mon[298582]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:37 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1486122464' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:38 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/4173888289' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:38 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2157659444' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:38.927 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:12:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1131690724' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:12:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1131690724' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:12:39 np0005548789.localdomain ceph-mon[298582]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:40.289 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:12:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:40.290 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:12:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:40.290 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:40.291 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:40.291 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:40.291 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:40.292 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:40.292 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:40.292 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:12:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:40.375 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:12:40 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/233671784' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:12:40 np0005548789.localdomain podman[309945]: 2025-12-06 10:12:40.909123475 +0000 UTC m=+0.072346093 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:12:40 np0005548789.localdomain podman[309945]: 2025-12-06 10:12:40.943144937 +0000 UTC m=+0.106367545 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:12:40 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:12:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:41.724 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:41.735 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:41 np0005548789.localdomain ceph-mon[298582]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:42.176 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:44 np0005548789.localdomain ceph-mon[298582]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:12:44 np0005548789.localdomain podman[309969]: 2025-12-06 10:12:44.926180217 +0000 UTC m=+0.083982156 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:12:45 np0005548789.localdomain podman[309969]: 2025-12-06 10:12:45.006738718 +0000 UTC m=+0.164540667 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:12:45 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:12:46 np0005548789.localdomain ceph-mon[298582]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:46 np0005548789.localdomain sshd[309994]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:12:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:12:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:12:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:12:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:12:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:12:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:12:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:12:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:12:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:12:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:12:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:12:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:12:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:46.725 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:46.739 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:12:47.303 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:12:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:12:47.303 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:12:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:12:47.304 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:12:47 np0005548789.localdomain ceph-mon[298582]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:49 np0005548789.localdomain sudo[309996]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:12:49 np0005548789.localdomain sudo[309996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:12:49 np0005548789.localdomain sudo[309996]: pam_unix(sudo:session): session closed for user root
Dec 06 10:12:49 np0005548789.localdomain sudo[310014]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:12:49 np0005548789.localdomain sudo[310014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:12:50 np0005548789.localdomain ceph-mon[298582]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:50 np0005548789.localdomain sudo[310014]: pam_unix(sudo:session): session closed for user root
Dec 06 10:12:50 np0005548789.localdomain sshd[309994]: Received disconnect from 123.160.164.187 port 47722:11: Bye Bye [preauth]
Dec 06 10:12:50 np0005548789.localdomain sshd[309994]: Disconnected from authenticating user root 123.160.164.187 port 47722 [preauth]
Dec 06 10:12:50 np0005548789.localdomain sudo[310065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:12:50 np0005548789.localdomain sudo[310065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:12:50 np0005548789.localdomain sudo[310065]: pam_unix(sudo:session): session closed for user root
Dec 06 10:12:51 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:12:51 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:12:51 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:12:51 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:12:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:51.727 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:51.742 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:51 np0005548789.localdomain sshd[310083]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:12:52 np0005548789.localdomain ceph-mon[298582]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:12:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:53 np0005548789.localdomain sshd[310083]: Received disconnect from 14.194.101.210 port 40108:11: Bye Bye [preauth]
Dec 06 10:12:53 np0005548789.localdomain sshd[310083]: Disconnected from authenticating user root 14.194.101.210 port 40108 [preauth]
Dec 06 10:12:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:12:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:12:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:12:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:12:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:12:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19239 "" "Go-http-client/1.1"
Dec 06 10:12:54 np0005548789.localdomain ceph-mon[298582]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:12:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:12:55 np0005548789.localdomain podman[310085]: 2025-12-06 10:12:55.929422869 +0000 UTC m=+0.086683708 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:12:55 np0005548789.localdomain podman[310085]: 2025-12-06 10:12:55.958975875 +0000 UTC m=+0.116236764 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:12:55 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:12:56 np0005548789.localdomain podman[310086]: 2025-12-06 10:12:56.037151784 +0000 UTC m=+0.191666030 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:12:56 np0005548789.localdomain podman[310086]: 2025-12-06 10:12:56.048104096 +0000 UTC m=+0.202618352 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:12:56 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:12:56 np0005548789.localdomain ceph-mon[298582]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:56.729 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:12:56.746 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:12:57 np0005548789.localdomain ceph-mon[298582]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:12:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:12:59 np0005548789.localdomain sshd[310126]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:13:00 np0005548789.localdomain ceph-mon[298582]: pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:00 np0005548789.localdomain sshd[310126]: Received disconnect from 64.227.102.57 port 38346:11: Bye Bye [preauth]
Dec 06 10:13:00 np0005548789.localdomain sshd[310126]: Disconnected from authenticating user root 64.227.102.57 port 38346 [preauth]
Dec 06 10:13:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:01.731 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:01.748 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:02 np0005548789.localdomain ceph-mon[298582]: pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:13:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:13:03 np0005548789.localdomain podman[310129]: 2025-12-06 10:13:03.345298948 +0000 UTC m=+0.092395721 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:13:03 np0005548789.localdomain podman[310129]: 2025-12-06 10:13:03.356836109 +0000 UTC m=+0.103932872 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:13:03 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:13:03 np0005548789.localdomain podman[310128]: 2025-12-06 10:13:03.452140816 +0000 UTC m=+0.203487537 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Dec 06 10:13:03 np0005548789.localdomain podman[310128]: 2025-12-06 10:13:03.465423408 +0000 UTC m=+0.216770189 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350)
Dec 06 10:13:03 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:13:04 np0005548789.localdomain ceph-mon[298582]: pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:13:06.287 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:13:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:06.288 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:13:06.288 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:13:06 np0005548789.localdomain ceph-mon[298582]: pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:06.733 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:06.751 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:13:07 np0005548789.localdomain ceph-mon[298582]: pgmap v46: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:07 np0005548789.localdomain systemd[299726]: Created slice User Background Tasks Slice.
Dec 06 10:13:07 np0005548789.localdomain systemd[1]: tmp-crun.tAkA2i.mount: Deactivated successfully.
Dec 06 10:13:07 np0005548789.localdomain systemd[299726]: Starting Cleanup of User's Temporary Files and Directories...
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.921 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.922 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:13:07 np0005548789.localdomain podman[310167]: 2025-12-06 10:13:07.926164253 +0000 UTC m=+0.087734970 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:13:07 np0005548789.localdomain systemd[299726]: Finished Cleanup of User's Temporary Files and Directories.
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.960 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.961 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd84eb649-5227-4cd7-9961-62eaf2e2a530', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:13:07.922494', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '295ba7be-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.171877886, 'message_signature': 'f8c89a1b37b6045ff5c1c92552c7fa9e99e3ffb2ff17c10bdc4e6cdc9e07f153'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:13:07.922494', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '295bb998-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.171877886, 'message_signature': 'f0954ec12cd95355957bc131864a0e35028a7f28d9c2a6c582bd33b380571505'}]}, 'timestamp': '2025-12-06 10:13:07.961881', '_unique_id': 'ecb51f24b3884837a66c5cab4f58c35b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.965 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:13:07 np0005548789.localdomain podman[310167]: 2025-12-06 10:13:07.967895428 +0000 UTC m=+0.129466085 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.968 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82767e30-e191-42d4-bb9a-f621c20250be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:13:07.965205', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '295cc838-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.214585251, 'message_signature': '97f05359325c16d3e90e096fdbeb50d7eb61d9943c77e1efebc596c602670431'}]}, 'timestamp': '2025-12-06 10:13:07.968855', '_unique_id': 'bb7b6f00fad54ac8bf965980fad55020'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.969 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.970 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.970 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6679a9f2-5cc1-41d1-8470-eee04794afc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:13:07.970654', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '295d1edc-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.214585251, 'message_signature': '7a226da46dff33aae00674eae49a1a9a4e6d35343ed110eebb8b03e7f470ac23'}]}, 'timestamp': '2025-12-06 10:13:07.971016', '_unique_id': '2464464a28da452d92b7f61ff24c172c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.971 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.972 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.972 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.972 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1cd8b4b9-7381-4ca6-a307-b7b2c03af76c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:13:07.972587', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '295d6914-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.171877886, 'message_signature': 'c3c9cdf6f90716c69827bab286630150cb328f343b66571f7596c776b9bf9cdd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:13:07.972587', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '295d74fe-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.171877886, 'message_signature': '3e1f761ce008e07649e19432c8576fc78f3d4b75bfdaada092dbc7516e87a0f4'}]}, 'timestamp': '2025-12-06 10:13:07.973229', '_unique_id': '310903ac26f74793a3bf08b625576f9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.973 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.974 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.974 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d22e28c-cd9a-42ba-89d4-57a802fabfe7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:13:07.974779', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '295dbee6-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.214585251, 'message_signature': 'e0674405ee9494eac5f920f5e6f086a8924ce59dd9538a1ba2e6b8baf5371362'}]}, 'timestamp': '2025-12-06 10:13:07.975098', '_unique_id': '64b3c707908b4310ab93763918ded3e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.976 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.976 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:13:07 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.986 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.986 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b31efef-1dd8-468d-ba52-f3e2542fa5f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:13:07.977077', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '295f89d8-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.22645253, 'message_signature': '12ab0509efc7282082db6b1cc735ed5c8ddfb4e26012886731cfed2686063d45'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:13:07.977077', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '295f9838-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.22645253, 'message_signature': 'eef4be7e50b0dd05e65879eceaa9becaf25b6bd6d3fcfe3bfa1a6bcf59bd93db'}]}, 'timestamp': '2025-12-06 10:13:07.987199', '_unique_id': 'ab48a5b9c8d64bc0ac3525e1a696dcbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.988 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.989 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.989 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd83d2fcf-826b-415e-9482-f5a19897f5f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:13:07.989315', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '295ff6ac-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.214585251, 'message_signature': 'a32041bb09425331a3aed75144248c3ee77c3224b44540a13883e68432c810a4'}]}, 'timestamp': '2025-12-06 10:13:07.989618', '_unique_id': '9b6ad71cdc3a4d38a2b47fd5353eae07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.990 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.991 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.991 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5934d9e9-f746-45d9-bb42-cf7c264f1a44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:13:07.991042', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2960396e-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.171877886, 'message_signature': '63e0603a5631e9ca373897be9a8adeb2f5d1c25b037713045d1ba25a3d69ca24'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:13:07.991042', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2960435a-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.171877886, 'message_signature': '2d2cf43ed9d5d71b9d25102b04c02fd23544f966a36ba415acd0ce8b9751dc53'}]}, 'timestamp': '2025-12-06 10:13:07.991562', '_unique_id': '4bfee6b595a84a56bb672b2578198ac9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.992 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d3f11a5-3cbe-4604-a07f-0ac8326376a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:13:07.993027', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '29608720-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.214585251, 'message_signature': '473fb06ba7104563ceccf6f30afea1a2e2ccd162ae65e5f2909009135b7b767c'}]}, 'timestamp': '2025-12-06 10:13:07.993311', '_unique_id': '89c3129a836547e3b515c0d6c64451bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:07.994 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.005 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 15550000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '006dd3a3-913f-40ef-92e7-9a80f542a067', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15550000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:13:07.994616', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '296276a2-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.254781029, 'message_signature': 'd4f4e8db836963581e959dc5618d1ddcafaf0b0935bbefedcd6a15aea80564cb'}]}, 'timestamp': '2025-12-06 10:13:08.006028', '_unique_id': 'a07db29e67bc4d429e9787dc8627d08f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.006 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.007 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.008 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.008 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '527bcbb3-b029-4819-9d30-1f128ddb2f0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:13:08.008041', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2962d3e0-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.22645253, 'message_signature': 'd4ace50fa84f3bc20f3e6fa911f3545f6748ad76564051abf90b41c033540853'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:13:08.008041', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2962dfe8-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.22645253, 'message_signature': '9fea6bf5f97789402f66c587a71f73f914162a2e50c81fdac57b086827ada9fa'}]}, 'timestamp': '2025-12-06 10:13:08.008684', '_unique_id': 'cdee290f4e2c4f7f86df8a5431600ef9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.009 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.010 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.010 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c41d694-bada-4e7c-ab5b-1565ae640f08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:13:08.010396', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '29632fa2-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.214585251, 'message_signature': '27fc98996a0ae8ff7bd0eef1e7ff147efdb9daf7d1f20469b2f2bbd2f0c593d7'}]}, 'timestamp': '2025-12-06 10:13:08.010737', '_unique_id': 'eff96055e1324eb7956af407e00f7df1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.012 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '909ebb4b-7fa0-46bc-8170-99c354182877', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:13:08.012426', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '29637eda-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.214585251, 'message_signature': '65c6c90b585af6d58f91abf4724c515f2728373700803598d9f583dd88a716aa'}]}, 'timestamp': '2025-12-06 10:13:08.012781', '_unique_id': '507bc4c131344d309dd196b6131a5bd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.014 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.014 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.014 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e50b859-6142-4653-bd27-a02f4a1d36f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:13:08.014451', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2963cdd6-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.171877886, 'message_signature': 'ff1ccd3c03d3a0acec488e1e5ab0b18639c55133eee702545c480ec6811863de'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:13:08.014451', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2963daa6-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.171877886, 'message_signature': '1cb03bc822b3126907e7559b660efc8d0b5f04919f2e0ff2a4aacc9d42aae646'}]}, 'timestamp': '2025-12-06 10:13:08.015100', '_unique_id': '4a8a4c19159b4057aad05a5acb2ae83c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.017 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.017 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.017 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b807ea80-a38a-4253-957d-5e83ed2e1c99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:13:08.017260', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '29643ba4-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.171877886, 'message_signature': 'a4bc7859cd5b1305d8211378b074a8495e59e2ccbb9099a752866f072b72aee1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:13:08.017260', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '296448ec-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.171877886, 'message_signature': '2cc9cc4961d5055e92f41adb2d05c0a811fb22dd08f680f53e0ba70ece042fa6'}]}, 'timestamp': '2025-12-06 10:13:08.017934', '_unique_id': 'cecb6fd436f747e187ce6fed16d2344c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.018 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.019 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.019 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83f0051e-dd00-4759-98ff-8c7bb699e674', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:13:08.019863', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '2964a0c6-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.214585251, 'message_signature': 'a7443abf972f5c2b6dd30704b6d415c078241d16610cacd384666e1b0fe1fccc'}]}, 'timestamp': '2025-12-06 10:13:08.020152', '_unique_id': 'b771690ffc4042988081195b5841586e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.021 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.021 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7f632f6-5c6c-46de-8c74-c4bd5f205ad1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:13:08.021751', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '2964ea4a-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.214585251, 'message_signature': 'bb684431a5bb6be8875ccbe7772a61b30ac85af769de4aaf7752e988c90e8896'}]}, 'timestamp': '2025-12-06 10:13:08.022034', '_unique_id': '41524a40402c4dacb24738b9b3817894'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.022 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.023 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.023 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.023 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f5f4fa5-68cf-4a9f-a890-881e1659eb72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:13:08.023416', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '29652a50-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.22645253, 'message_signature': 'c4d4321a7ecd942e015b2326352518d84fd0b8d17e4bfc6836377c71b5a445b3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:13:08.023416', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '296536c6-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.22645253, 'message_signature': '5630a7b229904870f02f1246201dfcb48cd9a002b8cbcd2a53a5c394bca38ef1'}]}, 'timestamp': '2025-12-06 10:13:08.024011', '_unique_id': '24ae9ef4a58b4be394cd007b917ed9fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.024 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.025 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.025 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.025 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '518ef467-835d-4f25-8449-2aadd965aab0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:13:08.025541', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '29657d84-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.214585251, 'message_signature': 'fa06268950bcc911d86ad6b4fc003762519c451ed733f364e48d94e32c13d380'}]}, 'timestamp': '2025-12-06 10:13:08.025817', '_unique_id': 'a9edda383e224f14923adf0622c91eac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.027 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.027 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea89dc9a-fcfc-47cb-b9f1-530f0d606f82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:13:08.027099', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2965ba38-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.171877886, 'message_signature': '4fd3a8f16b4a960e077a071bb78e43e61b1bbf7832780b1811258d1f2c550183'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:13:08.027099', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2965c30c-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.171877886, 'message_signature': '0068336568ce3261075a4f9795d33df052ead2530746238e8a7d42b0228e7f61'}]}, 'timestamp': '2025-12-06 10:13:08.027566', '_unique_id': '08da8ccd66d8493ea5da49254c333e74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.028 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd35ccbb6-17da-4b82-9148-188756f59058', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:13:08.028899', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '29660074-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12406.254781029, 'message_signature': 'cf4674c1337ec75d576e67d9489e51a893ef1c0751483947e4beab36bd318cee'}]}, 'timestamp': '2025-12-06 10:13:08.029142', '_unique_id': '074b28bc3cd04fe3b1cc9390b3d01bdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:13:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:13:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:13:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:10 np0005548789.localdomain ceph-mon[298582]: pgmap v47: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:11.735 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:11.754 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:11.872983) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015991873029, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 1299, "num_deletes": 255, "total_data_size": 1639360, "memory_usage": 1671184, "flush_reason": "Manual Compaction"}
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015991881781, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 1044636, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18912, "largest_seqno": 20206, "table_properties": {"data_size": 1039395, "index_size": 2712, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11725, "raw_average_key_size": 20, "raw_value_size": 1028560, "raw_average_value_size": 1764, "num_data_blocks": 115, "num_entries": 583, "num_filter_entries": 583, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015907, "oldest_key_time": 1765015907, "file_creation_time": 1765015991, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 8874 microseconds, and 3540 cpu microseconds.
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:11.881855) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 1044636 bytes OK
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:11.881880) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:11.884123) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:11.884142) EVENT_LOG_v1 {"time_micros": 1765015991884136, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:11.884162) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 1633168, prev total WAL file size 1633492, number of live WAL files 2.
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:11.884713) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373637' seq:72057594037927935, type:22 .. '6C6F676D0034303138' seq:0, type:0; will stop at (end)
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(1020KB)], [27(19MB)]
Dec 06 10:13:11 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015991884790, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 21891123, "oldest_snapshot_seqno": -1}
Dec 06 10:13:11 np0005548789.localdomain podman[310186]: 2025-12-06 10:13:11.912777242 +0000 UTC m=+0.079761129 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:13:11 np0005548789.localdomain podman[310186]: 2025-12-06 10:13:11.919303349 +0000 UTC m=+0.086287256 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:13:11 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 12314 keys, 21753852 bytes, temperature: kUnknown
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015992030224, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 21753852, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 21679511, "index_size": 42432, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30853, "raw_key_size": 329412, "raw_average_key_size": 26, "raw_value_size": 21465963, "raw_average_value_size": 1743, "num_data_blocks": 1635, "num_entries": 12314, "num_filter_entries": 12314, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765015991, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:12.030557) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 21753852 bytes
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:12.032809) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 150.4 rd, 149.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 19.9 +0.0 blob) out(20.7 +0.0 blob), read-write-amplify(41.8) write-amplify(20.8) OK, records in: 12847, records dropped: 533 output_compression: NoCompression
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:12.032839) EVENT_LOG_v1 {"time_micros": 1765015992032826, "job": 14, "event": "compaction_finished", "compaction_time_micros": 145530, "compaction_time_cpu_micros": 54720, "output_level": 6, "num_output_files": 1, "total_output_size": 21753852, "num_input_records": 12847, "num_output_records": 12314, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015992033142, "job": 14, "event": "table_file_deletion", "file_number": 29}
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015992035864, "job": 14, "event": "table_file_deletion", "file_number": 27}
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:11.884602) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:12.035971) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:12.035979) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:12.035983) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:12.035986) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:12.035990) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:12 np0005548789.localdomain ceph-mon[298582]: pgmap v48: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:13:12.290 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:13:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e96 e96: 6 total, 6 up, 6 in
Dec 06 10:13:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:14 np0005548789.localdomain ceph-mon[298582]: pgmap v49: 177 pgs: 177 active+clean; 125 MiB data, 637 MiB used, 41 GiB / 42 GiB avail; 8.2 KiB/s rd, 1.7 MiB/s wr, 12 op/s
Dec 06 10:13:14 np0005548789.localdomain ceph-mon[298582]: osdmap e96: 6 total, 6 up, 6 in
Dec 06 10:13:14 np0005548789.localdomain ceph-mon[298582]: mgrmap e46: np0005548790.kvkfyr(active, since 91s), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:13:15 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e97 e97: 6 total, 6 up, 6 in
Dec 06 10:13:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:13:15 np0005548789.localdomain podman[310208]: 2025-12-06 10:13:15.901546476 +0000 UTC m=+0.066064323 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:13:15 np0005548789.localdomain podman[310208]: 2025-12-06 10:13:15.938146965 +0000 UTC m=+0.102664842 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:13:15 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:13:16 np0005548789.localdomain ceph-mon[298582]: pgmap v51: 177 pgs: 177 active+clean; 125 MiB data, 637 MiB used, 41 GiB / 42 GiB avail; 9.9 KiB/s rd, 2.0 MiB/s wr, 14 op/s
Dec 06 10:13:16 np0005548789.localdomain ceph-mon[298582]: osdmap e97: 6 total, 6 up, 6 in
Dec 06 10:13:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:13:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:13:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:13:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:13:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:13:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:13:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:13:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:13:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:13:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:13:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:13:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:13:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:16.739 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:16.758 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:17 np0005548789.localdomain ceph-mon[298582]: pgmap v53: 177 pgs: 177 active+clean; 125 MiB data, 637 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 2.6 MiB/s wr, 18 op/s
Dec 06 10:13:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:20 np0005548789.localdomain ceph-mon[298582]: pgmap v54: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 48 op/s
Dec 06 10:13:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:21.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:21.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:13:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:21.205 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:13:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:21.741 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:21.760 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:22 np0005548789.localdomain ceph-mon[298582]: pgmap v55: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 2.6 MiB/s wr, 29 op/s
Dec 06 10:13:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:13:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:13:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:13:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:13:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:13:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19232 "" "Go-http-client/1.1"
Dec 06 10:13:24 np0005548789.localdomain ceph-mon[298582]: pgmap v56: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 2.1 MiB/s wr, 23 op/s
Dec 06 10:13:25 np0005548789.localdomain sshd[310234]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:13:26 np0005548789.localdomain ceph-mon[298582]: pgmap v57: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 2.0 MiB/s wr, 23 op/s
Dec 06 10:13:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:26.743 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:26.763 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:13:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:13:26 np0005548789.localdomain podman[310237]: 2025-12-06 10:13:26.937740227 +0000 UTC m=+0.088437930 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:13:26 np0005548789.localdomain systemd[1]: tmp-crun.5pqDIj.mount: Deactivated successfully.
Dec 06 10:13:26 np0005548789.localdomain podman[310236]: 2025-12-06 10:13:26.997126886 +0000 UTC m=+0.148052307 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:13:27 np0005548789.localdomain podman[310237]: 2025-12-06 10:13:27.02332125 +0000 UTC m=+0.174018923 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:13:27 np0005548789.localdomain podman[310236]: 2025-12-06 10:13:27.030541729 +0000 UTC m=+0.181467200 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Dec 06 10:13:27 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:13:27 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:13:27 np0005548789.localdomain sshd[310234]: Received disconnect from 118.219.234.233 port 55434:11: Bye Bye [preauth]
Dec 06 10:13:27 np0005548789.localdomain sshd[310234]: Disconnected from authenticating user root 118.219.234.233 port 55434 [preauth]
Dec 06 10:13:27 np0005548789.localdomain ceph-mon[298582]: pgmap v58: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 1.7 MiB/s wr, 19 op/s
Dec 06 10:13:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:28.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:30 np0005548789.localdomain ceph-mon[298582]: pgmap v59: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 1.7 MiB/s wr, 19 op/s
Dec 06 10:13:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:30.200 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:30.225 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:13:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:30.226 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:13:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:30.226 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:13:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:30.227 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:13:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:30.227 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:13:30 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:13:30 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2010010665' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:30.693 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:13:31 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2010010665' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:31.205 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:13:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:31.206 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:13:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:31.427 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:13:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:31.428 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11432MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:13:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:31.429 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:13:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:31.429 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:13:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:31.745 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:31.765 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:31.889 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:13:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:31.889 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:13:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:31.890 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:13:32 np0005548789.localdomain ceph-mon[298582]: pgmap v60: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:32.192 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing inventories for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:13:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:32.436 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Updating ProviderTree inventory for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:13:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:32.437 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Updating inventory in ProviderTree for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:13:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:32.463 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing aggregate associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:13:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:32.507 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing trait associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, traits: HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AMD_SVM,HW_CPU_X86_FMA3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:13:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:32.553 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:13:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:13:32 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/113296038' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:32.980 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:13:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:32.987 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:13:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:33.341 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:13:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:33.344 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:13:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:33.344 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.915s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:13:33 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/113296038' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:13:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:13:33 np0005548789.localdomain podman[310323]: 2025-12-06 10:13:33.909088487 +0000 UTC m=+0.072163288 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:13:33 np0005548789.localdomain podman[310323]: 2025-12-06 10:13:33.950198662 +0000 UTC m=+0.113273433 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 10:13:33 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:13:33 np0005548789.localdomain podman[310322]: 2025-12-06 10:13:33.972211789 +0000 UTC m=+0.135461215 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:13:33 np0005548789.localdomain podman[310322]: 2025-12-06 10:13:33.988195124 +0000 UTC m=+0.151444590 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, version=9.6, maintainer=Red Hat, Inc.)
Dec 06 10:13:34 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:13:34 np0005548789.localdomain ceph-mon[298582]: pgmap v61: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:35.326 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:35.327 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:35.328 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:13:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:35.328 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:13:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:35.740 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:13:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:35.741 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:13:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:35.741 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:13:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:35.741 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:13:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:36.269 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:13:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:36.286 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:13:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:36.287 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:13:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:36.287 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:36.287 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:36.287 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:36.288 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:36.288 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:36.288 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:36.288 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:13:36 np0005548789.localdomain ceph-mon[298582]: pgmap v62: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:36.747 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:36.768 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:37 np0005548789.localdomain ceph-mon[298582]: pgmap v63: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:13:38 np0005548789.localdomain systemd[1]: tmp-crun.WgOfX3.mount: Deactivated successfully.
Dec 06 10:13:38 np0005548789.localdomain podman[310362]: 2025-12-06 10:13:38.923823359 +0000 UTC m=+0.085524462 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:13:38 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2709377790' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:38 np0005548789.localdomain podman[310362]: 2025-12-06 10:13:38.934972367 +0000 UTC m=+0.096673500 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 06 10:13:38 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:13:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2247424638' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:13:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2247424638' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:13:39 np0005548789.localdomain ceph-mon[298582]: pgmap v64: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2976675022' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e98 e98: 6 total, 6 up, 6 in
Dec 06 10:13:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:40.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:40.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:13:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:13:40Z|00062|memory_trim|INFO|Detected inactivity (last active 30024 ms ago): trimming memory
Dec 06 10:13:40 np0005548789.localdomain ceph-mon[298582]: osdmap e98: 6 total, 6 up, 6 in
Dec 06 10:13:40 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/92571660' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:41.770 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:13:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:41.772 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:13:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:41.772 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:13:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:41.772 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:13:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:41.779 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:41.779 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:41.900137) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016021900300, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 628, "num_deletes": 250, "total_data_size": 1101923, "memory_usage": 1118280, "flush_reason": "Manual Compaction"}
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016021909654, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 653774, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20211, "largest_seqno": 20834, "table_properties": {"data_size": 651051, "index_size": 706, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7627, "raw_average_key_size": 20, "raw_value_size": 645236, "raw_average_value_size": 1743, "num_data_blocks": 31, "num_entries": 370, "num_filter_entries": 370, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015991, "oldest_key_time": 1765015991, "file_creation_time": 1765016021, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 9602 microseconds, and 4370 cpu microseconds.
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:41.909733) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 653774 bytes OK
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:41.909797) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:41.911634) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:41.911657) EVENT_LOG_v1 {"time_micros": 1765016021911650, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:41.911683) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 1098429, prev total WAL file size 1099178, number of live WAL files 2.
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:41.912366) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373534' seq:72057594037927935, type:22 .. '6D6772737461740034303035' seq:0, type:0; will stop at (end)
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(638KB)], [30(20MB)]
Dec 06 10:13:41 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016021912410, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 22407626, "oldest_snapshot_seqno": -1}
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 12172 keys, 20264592 bytes, temperature: kUnknown
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016022020721, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 20264592, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20195696, "index_size": 37371, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30469, "raw_key_size": 326684, "raw_average_key_size": 26, "raw_value_size": 19989071, "raw_average_value_size": 1642, "num_data_blocks": 1423, "num_entries": 12172, "num_filter_entries": 12172, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765016021, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:42.021291) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 20264592 bytes
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:42.023427) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 206.4 rd, 186.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 20.7 +0.0 blob) out(19.3 +0.0 blob), read-write-amplify(65.3) write-amplify(31.0) OK, records in: 12684, records dropped: 512 output_compression: NoCompression
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:42.023469) EVENT_LOG_v1 {"time_micros": 1765016022023450, "job": 16, "event": "compaction_finished", "compaction_time_micros": 108547, "compaction_time_cpu_micros": 54425, "output_level": 6, "num_output_files": 1, "total_output_size": 20264592, "num_input_records": 12684, "num_output_records": 12172, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016022023862, "job": 16, "event": "table_file_deletion", "file_number": 32}
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016022027984, "job": 16, "event": "table_file_deletion", "file_number": 30}
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:41.912297) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:42.028077) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:42.028084) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:42.028088) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:42.028092) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:13:42.028095) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: pgmap v66: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:13:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3225179418' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:13:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:13:42 np0005548789.localdomain podman[310381]: 2025-12-06 10:13:42.919622617 +0000 UTC m=+0.078076107 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:13:42 np0005548789.localdomain podman[310381]: 2025-12-06 10:13:42.960086533 +0000 UTC m=+0.118540063 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:13:42 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:13:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e99 e99: 6 total, 6 up, 6 in
Dec 06 10:13:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:44 np0005548789.localdomain ceph-mon[298582]: pgmap v67: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 2.3 KiB/s wr, 23 op/s
Dec 06 10:13:44 np0005548789.localdomain ceph-mon[298582]: osdmap e99: 6 total, 6 up, 6 in
Dec 06 10:13:44 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:13:44.605 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:13:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:44.607 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:44 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:13:44.607 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:13:44 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:13:44.607 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:13:46 np0005548789.localdomain ceph-mon[298582]: pgmap v69: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 2.9 KiB/s wr, 29 op/s
Dec 06 10:13:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:13:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:13:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:13:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:13:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:13:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:13:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:13:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:13:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:13:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:13:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:46.807 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:13:46 np0005548789.localdomain podman[310404]: 2025-12-06 10:13:46.916995051 +0000 UTC m=+0.080147690 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:13:46 np0005548789.localdomain podman[310404]: 2025-12-06 10:13:46.98330018 +0000 UTC m=+0.146452779 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:13:46 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:13:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:13:47.304 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:13:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:13:47.305 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:13:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:13:47.305 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:13:47 np0005548789.localdomain ceph-mon[298582]: pgmap v70: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 2.9 KiB/s wr, 29 op/s
Dec 06 10:13:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:50 np0005548789.localdomain ceph-mon[298582]: pgmap v71: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 3.4 KiB/s wr, 44 op/s
Dec 06 10:13:50 np0005548789.localdomain sudo[310429]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:13:50 np0005548789.localdomain sudo[310429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:13:50 np0005548789.localdomain sudo[310429]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:50 np0005548789.localdomain sudo[310447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 10:13:50 np0005548789.localdomain sudo[310447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:13:51 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e100 e100: 6 total, 6 up, 6 in
Dec 06 10:13:51 np0005548789.localdomain sudo[310447]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:51 np0005548789.localdomain sudo[310485]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:13:51 np0005548789.localdomain sudo[310485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:13:51 np0005548789.localdomain sudo[310485]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:51 np0005548789.localdomain sudo[310503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:13:51 np0005548789.localdomain sudo[310503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:13:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:51.807 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:51.812 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:51 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e101 e101: 6 total, 6 up, 6 in
Dec 06 10:13:52 np0005548789.localdomain ceph-mon[298582]: pgmap v72: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 3.1 KiB/s wr, 39 op/s
Dec 06 10:13:52 np0005548789.localdomain ceph-mon[298582]: osdmap e100: 6 total, 6 up, 6 in
Dec 06 10:13:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:52 np0005548789.localdomain ceph-mon[298582]: osdmap e101: 6 total, 6 up, 6 in
Dec 06 10:13:52 np0005548789.localdomain sudo[310503]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:52 np0005548789.localdomain sudo[310553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:13:52 np0005548789.localdomain sudo[310553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:13:52 np0005548789.localdomain sudo[310553]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:13:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:13:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:13:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:13:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:13:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:13:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:13:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:13:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19245 "" "Go-http-client/1.1"
Dec 06 10:13:54 np0005548789.localdomain ceph-mon[298582]: pgmap v75: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 3.0 KiB/s wr, 38 op/s
Dec 06 10:13:56 np0005548789.localdomain ceph-mon[298582]: pgmap v76: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 3.0 KiB/s wr, 38 op/s
Dec 06 10:13:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:13:56.810 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:13:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:13:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:13:57 np0005548789.localdomain podman[310571]: 2025-12-06 10:13:57.942391224 +0000 UTC m=+0.092901276 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:13:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:13:57 np0005548789.localdomain ceph-mon[298582]: pgmap v77: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 KiB/s wr, 18 op/s
Dec 06 10:13:57 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e102 e102: 6 total, 6 up, 6 in
Dec 06 10:13:57 np0005548789.localdomain podman[310571]: 2025-12-06 10:13:57.979084095 +0000 UTC m=+0.129594087 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:13:57 np0005548789.localdomain systemd[1]: tmp-crun.44WKJy.mount: Deactivated successfully.
Dec 06 10:13:58 np0005548789.localdomain podman[310572]: 2025-12-06 10:13:58.00398483 +0000 UTC m=+0.151153431 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:13:58 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:13:58 np0005548789.localdomain podman[310572]: 2025-12-06 10:13:58.04194774 +0000 UTC m=+0.189116331 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:13:58 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:13:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:13:58 np0005548789.localdomain ceph-mon[298582]: osdmap e102: 6 total, 6 up, 6 in
Dec 06 10:13:59 np0005548789.localdomain ceph-mon[298582]: pgmap v79: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 2.8 KiB/s wr, 30 op/s
Dec 06 10:14:01 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:01.771 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:01Z, description=, device_id=0ab66a60-f76b-4775-891d-30b21387ddeb, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcb0340>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc66eb0>], id=16157674-15c9-4198-b993-24e8d7e375d4, ip_allocation=immediate, mac_address=fa:16:3e:fd:9f:e6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=261, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:14:01Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:14:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:01.813 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:01 np0005548789.localdomain podman[310627]: 2025-12-06 10:14:01.990387862 +0000 UTC m=+0.057287297 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:14:01 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:14:01 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:14:01 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:14:02 np0005548789.localdomain ceph-mon[298582]: pgmap v80: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 2.4 KiB/s wr, 26 op/s
Dec 06 10:14:02 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:02.255 263652 INFO neutron.agent.dhcp.agent [None req-42da3db9-5f85-4496-b637-b29e49580b80 - - - - - -] DHCP configuration for ports {'16157674-15c9-4198-b993-24e8d7e375d4'} is completed
Dec 06 10:14:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:02.695 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:02 np0005548789.localdomain sshd[310647]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:14:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:03 np0005548789.localdomain sshd[310647]: Received disconnect from 64.227.102.57 port 43480:11: Bye Bye [preauth]
Dec 06 10:14:03 np0005548789.localdomain sshd[310647]: Disconnected from authenticating user root 64.227.102.57 port 43480 [preauth]
Dec 06 10:14:04 np0005548789.localdomain ceph-mon[298582]: pgmap v81: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1.4 KiB/s wr, 24 op/s
Dec 06 10:14:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:14:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:14:04 np0005548789.localdomain systemd[1]: tmp-crun.FkoD5Z.mount: Deactivated successfully.
Dec 06 10:14:04 np0005548789.localdomain podman[310649]: 2025-12-06 10:14:04.998480821 +0000 UTC m=+0.156298436 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Dec 06 10:14:05 np0005548789.localdomain podman[310649]: 2025-12-06 10:14:05.012244448 +0000 UTC m=+0.170062053 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-type=git, version=9.6, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_id=edpm, architecture=x86_64, name=ubi9-minimal)
Dec 06 10:14:05 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:14:05 np0005548789.localdomain podman[310650]: 2025-12-06 10:14:04.950813557 +0000 UTC m=+0.106744926 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3)
Dec 06 10:14:05 np0005548789.localdomain podman[310650]: 2025-12-06 10:14:05.082068854 +0000 UTC m=+0.238000263 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:14:05 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:14:05 np0005548789.localdomain systemd[1]: tmp-crun.JD9ODh.mount: Deactivated successfully.
Dec 06 10:14:06 np0005548789.localdomain ceph-mon[298582]: pgmap v82: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1.4 KiB/s wr, 24 op/s
Dec 06 10:14:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:06.816 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:06 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e103 e103: 6 total, 6 up, 6 in
Dec 06 10:14:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:07.206 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:07 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:07.435 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:07Z, description=, device_id=2f787882-add0-4a41-8d77-7a5b4b61daf9, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc7ee20>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc7e1c0>], id=4357478b-5997-4b61-92d0-dc1f719e522a, ip_allocation=immediate, mac_address=fa:16:3e:66:1c:2e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=310, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:14:07Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:14:07 np0005548789.localdomain sshd[310691]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:14:07 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:07.497 263652 INFO neutron.agent.linux.ip_lib [None req-074d74f6-5db8-4c50-9a2c-39d6f0090f5e - - - - - -] Device tap0d21f8b1-1c cannot be used as it has no MAC address
Dec 06 10:14:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:07.516 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:07 np0005548789.localdomain kernel: device tap0d21f8b1-1c entered promiscuous mode
Dec 06 10:14:07 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016047.5247] manager: (tap0d21f8b1-1c): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Dec 06 10:14:07 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:07Z|00063|binding|INFO|Claiming lport 0d21f8b1-1c42-49c6-b23d-1bbbeb5764dc for this chassis.
Dec 06 10:14:07 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:07Z|00064|binding|INFO|0d21f8b1-1c42-49c6-b23d-1bbbeb5764dc: Claiming unknown
Dec 06 10:14:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:07.529 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:07 np0005548789.localdomain systemd-udevd[310717]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:14:07 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:07.537 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-36939d22-422f-458f-92f5-9d57586edeca', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36939d22-422f-458f-92f5-9d57586edeca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8b45ed0d762747b4a27ad78d879f59e8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44318dec-0297-43bb-9acd-7dd1c9b801f2, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=0d21f8b1-1c42-49c6-b23d-1bbbeb5764dc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:07 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:07.539 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 0d21f8b1-1c42-49c6-b23d-1bbbeb5764dc in datapath 36939d22-422f-458f-92f5-9d57586edeca bound to our chassis
Dec 06 10:14:07 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:07.540 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port ca98273b-9dbe-42be-bcd2-d67d252201d4 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:14:07 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:07.540 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 36939d22-422f-458f-92f5-9d57586edeca, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:14:07 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:07.541 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[72e7a196-c13e-4af3-b9f2-eca1136ce55a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:14:07 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:07Z|00065|binding|INFO|Setting lport 0d21f8b1-1c42-49c6-b23d-1bbbeb5764dc ovn-installed in OVS
Dec 06 10:14:07 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:07Z|00066|binding|INFO|Setting lport 0d21f8b1-1c42-49c6-b23d-1bbbeb5764dc up in Southbound
Dec 06 10:14:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:07.567 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:07.602 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:07.625 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:07 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:14:07 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:14:07 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:14:07 np0005548789.localdomain podman[310721]: 2025-12-06 10:14:07.643884771 +0000 UTC m=+0.068594260 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:14:07 np0005548789.localdomain ceph-mon[298582]: osdmap e103: 6 total, 6 up, 6 in
Dec 06 10:14:07 np0005548789.localdomain ceph-mon[298582]: pgmap v84: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 904 B/s wr, 17 op/s
Dec 06 10:14:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:08 np0005548789.localdomain sshd[310691]: Received disconnect from 154.113.10.34 port 57040:11: Bye Bye [preauth]
Dec 06 10:14:08 np0005548789.localdomain sshd[310691]: Disconnected from authenticating user root 154.113.10.34 port 57040 [preauth]
Dec 06 10:14:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:14:09 np0005548789.localdomain podman[310770]: 2025-12-06 10:14:09.923359572 +0000 UTC m=+0.083937344 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:14:09 np0005548789.localdomain podman[310770]: 2025-12-06 10:14:09.939249244 +0000 UTC m=+0.099827006 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:14:09 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:14:10 np0005548789.localdomain ceph-mon[298582]: pgmap v85: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 818 B/s wr, 16 op/s
Dec 06 10:14:10 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:10.582 263652 INFO neutron.agent.dhcp.agent [None req-ab70d153-a60d-4172-b9de-59342f3ebf6f - - - - - -] DHCP configuration for ports {'4357478b-5997-4b61-92d0-dc1f719e522a'} is completed
Dec 06 10:14:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:10.843 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:11 np0005548789.localdomain podman[310812]: 
Dec 06 10:14:11 np0005548789.localdomain podman[310812]: 2025-12-06 10:14:11.009322628 +0000 UTC m=+0.091230215 container create fd947986845710586151bc7ddabe4490ad234e4fd121303154ccf3ff85955923 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36939d22-422f-458f-92f5-9d57586edeca, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:14:11 np0005548789.localdomain systemd[1]: Started libpod-conmon-fd947986845710586151bc7ddabe4490ad234e4fd121303154ccf3ff85955923.scope.
Dec 06 10:14:11 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:14:11 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e3d569aeb509718e295757645bec02890965246579d7d2efa87e073e25a6102/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:14:11 np0005548789.localdomain podman[310812]: 2025-12-06 10:14:10.966682176 +0000 UTC m=+0.048589813 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:14:11 np0005548789.localdomain podman[310812]: 2025-12-06 10:14:11.066832741 +0000 UTC m=+0.148740358 container init fd947986845710586151bc7ddabe4490ad234e4fd121303154ccf3ff85955923 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36939d22-422f-458f-92f5-9d57586edeca, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:14:11 np0005548789.localdomain podman[310812]: 2025-12-06 10:14:11.074992099 +0000 UTC m=+0.156899716 container start fd947986845710586151bc7ddabe4490ad234e4fd121303154ccf3ff85955923 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36939d22-422f-458f-92f5-9d57586edeca, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:14:11 np0005548789.localdomain dnsmasq[310830]: started, version 2.85 cachesize 150
Dec 06 10:14:11 np0005548789.localdomain dnsmasq[310830]: DNS service limited to local subnets
Dec 06 10:14:11 np0005548789.localdomain dnsmasq[310830]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:14:11 np0005548789.localdomain dnsmasq[310830]: warning: no upstream servers configured
Dec 06 10:14:11 np0005548789.localdomain dnsmasq-dhcp[310830]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:14:11 np0005548789.localdomain dnsmasq[310830]: read /var/lib/neutron/dhcp/36939d22-422f-458f-92f5-9d57586edeca/addn_hosts - 0 addresses
Dec 06 10:14:11 np0005548789.localdomain dnsmasq-dhcp[310830]: read /var/lib/neutron/dhcp/36939d22-422f-458f-92f5-9d57586edeca/host
Dec 06 10:14:11 np0005548789.localdomain dnsmasq-dhcp[310830]: read /var/lib/neutron/dhcp/36939d22-422f-458f-92f5-9d57586edeca/opts
Dec 06 10:14:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:11.251 263652 INFO neutron.agent.dhcp.agent [None req-ae14176a-f701-4204-b1b3-90b50faffd79 - - - - - -] DHCP configuration for ports {'f98701d7-f054-4731-b824-f62710fa355c'} is completed
Dec 06 10:14:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:11.717 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:11Z, description=, device_id=2f787882-add0-4a41-8d77-7a5b4b61daf9, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc85130>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc85970>], id=38bb6272-0a5f-4360-9a7d-15229cc8d9fb, ip_allocation=immediate, mac_address=fa:16:3e:e8:15:5b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:03Z, description=, dns_domain=, id=36939d22-422f-458f-92f5-9d57586edeca, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ImagesNegativeTestJSON-1337158581-network, port_security_enabled=True, project_id=8b45ed0d762747b4a27ad78d879f59e8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=23147, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=303, status=ACTIVE, subnets=['cce3d12e-2f8e-42a3-aa58-2dc0ad5211f6'], tags=[], tenant_id=8b45ed0d762747b4a27ad78d879f59e8, updated_at=2025-12-06T10:14:04Z, vlan_transparent=None, network_id=36939d22-422f-458f-92f5-9d57586edeca, port_security_enabled=False, project_id=8b45ed0d762747b4a27ad78d879f59e8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=321, status=DOWN, tags=[], tenant_id=8b45ed0d762747b4a27ad78d879f59e8, updated_at=2025-12-06T10:14:11Z on network 36939d22-422f-458f-92f5-9d57586edeca
Dec 06 10:14:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:11.838 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:12 np0005548789.localdomain podman[310848]: 2025-12-06 10:14:12.016898739 +0000 UTC m=+0.067150095 container kill fd947986845710586151bc7ddabe4490ad234e4fd121303154ccf3ff85955923 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36939d22-422f-458f-92f5-9d57586edeca, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:14:12 np0005548789.localdomain dnsmasq[310830]: read /var/lib/neutron/dhcp/36939d22-422f-458f-92f5-9d57586edeca/addn_hosts - 1 addresses
Dec 06 10:14:12 np0005548789.localdomain dnsmasq-dhcp[310830]: read /var/lib/neutron/dhcp/36939d22-422f-458f-92f5-9d57586edeca/host
Dec 06 10:14:12 np0005548789.localdomain dnsmasq-dhcp[310830]: read /var/lib/neutron/dhcp/36939d22-422f-458f-92f5-9d57586edeca/opts
Dec 06 10:14:12 np0005548789.localdomain ceph-mon[298582]: pgmap v86: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 818 B/s wr, 16 op/s
Dec 06 10:14:12 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:12.308 263652 INFO neutron.agent.dhcp.agent [None req-8e73d45a-70c8-48ae-8c05-326d53f59fd3 - - - - - -] DHCP configuration for ports {'38bb6272-0a5f-4360-9a7d-15229cc8d9fb'} is completed
Dec 06 10:14:12 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:12.837 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:11Z, description=, device_id=2f787882-add0-4a41-8d77-7a5b4b61daf9, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc911f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe384530eb0>], id=38bb6272-0a5f-4360-9a7d-15229cc8d9fb, ip_allocation=immediate, mac_address=fa:16:3e:e8:15:5b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:03Z, description=, dns_domain=, id=36939d22-422f-458f-92f5-9d57586edeca, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ImagesNegativeTestJSON-1337158581-network, port_security_enabled=True, project_id=8b45ed0d762747b4a27ad78d879f59e8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=23147, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=303, status=ACTIVE, subnets=['cce3d12e-2f8e-42a3-aa58-2dc0ad5211f6'], tags=[], tenant_id=8b45ed0d762747b4a27ad78d879f59e8, updated_at=2025-12-06T10:14:04Z, vlan_transparent=None, network_id=36939d22-422f-458f-92f5-9d57586edeca, port_security_enabled=False, project_id=8b45ed0d762747b4a27ad78d879f59e8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=321, status=DOWN, tags=[], tenant_id=8b45ed0d762747b4a27ad78d879f59e8, updated_at=2025-12-06T10:14:11Z on network 36939d22-422f-458f-92f5-9d57586edeca
Dec 06 10:14:13 np0005548789.localdomain dnsmasq[310830]: read /var/lib/neutron/dhcp/36939d22-422f-458f-92f5-9d57586edeca/addn_hosts - 1 addresses
Dec 06 10:14:13 np0005548789.localdomain dnsmasq-dhcp[310830]: read /var/lib/neutron/dhcp/36939d22-422f-458f-92f5-9d57586edeca/host
Dec 06 10:14:13 np0005548789.localdomain podman[310885]: 2025-12-06 10:14:13.067996849 +0000 UTC m=+0.064735413 container kill fd947986845710586151bc7ddabe4490ad234e4fd121303154ccf3ff85955923 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36939d22-422f-458f-92f5-9d57586edeca, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:14:13 np0005548789.localdomain dnsmasq-dhcp[310830]: read /var/lib/neutron/dhcp/36939d22-422f-458f-92f5-9d57586edeca/opts
Dec 06 10:14:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:14:13 np0005548789.localdomain systemd[1]: tmp-crun.IggIgx.mount: Deactivated successfully.
Dec 06 10:14:13 np0005548789.localdomain podman[310901]: 2025-12-06 10:14:13.192539802 +0000 UTC m=+0.096744992 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:14:13 np0005548789.localdomain podman[310901]: 2025-12-06 10:14:13.226294075 +0000 UTC m=+0.130499305 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:14:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:13 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:14:13 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:13.271 263652 INFO neutron.agent.dhcp.agent [None req-6ea15905-d29b-41f1-9f31-b6463b7fd42b - - - - - -] DHCP configuration for ports {'38bb6272-0a5f-4360-9a7d-15229cc8d9fb'} is completed
Dec 06 10:14:14 np0005548789.localdomain ceph-mon[298582]: pgmap v87: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:14:15 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2845146463' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:16 np0005548789.localdomain podman[310944]: 2025-12-06 10:14:16.113712228 +0000 UTC m=+0.061742352 container kill fd947986845710586151bc7ddabe4490ad234e4fd121303154ccf3ff85955923 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36939d22-422f-458f-92f5-9d57586edeca, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:14:16 np0005548789.localdomain dnsmasq[310830]: read /var/lib/neutron/dhcp/36939d22-422f-458f-92f5-9d57586edeca/addn_hosts - 0 addresses
Dec 06 10:14:16 np0005548789.localdomain dnsmasq-dhcp[310830]: read /var/lib/neutron/dhcp/36939d22-422f-458f-92f5-9d57586edeca/host
Dec 06 10:14:16 np0005548789.localdomain dnsmasq-dhcp[310830]: read /var/lib/neutron/dhcp/36939d22-422f-458f-92f5-9d57586edeca/opts
Dec 06 10:14:16 np0005548789.localdomain ceph-mon[298582]: pgmap v88: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:14:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:16.270 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:16 np0005548789.localdomain kernel: device tap0d21f8b1-1c left promiscuous mode
Dec 06 10:14:16 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:16Z|00067|binding|INFO|Releasing lport 0d21f8b1-1c42-49c6-b23d-1bbbeb5764dc from this chassis (sb_readonly=0)
Dec 06 10:14:16 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:16Z|00068|binding|INFO|Setting lport 0d21f8b1-1c42-49c6-b23d-1bbbeb5764dc down in Southbound
Dec 06 10:14:16 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:16.279 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-36939d22-422f-458f-92f5-9d57586edeca', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36939d22-422f-458f-92f5-9d57586edeca', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8b45ed0d762747b4a27ad78d879f59e8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44318dec-0297-43bb-9acd-7dd1c9b801f2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=0d21f8b1-1c42-49c6-b23d-1bbbeb5764dc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:16 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:16.281 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 0d21f8b1-1c42-49c6-b23d-1bbbeb5764dc in datapath 36939d22-422f-458f-92f5-9d57586edeca unbound from our chassis
Dec 06 10:14:16 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:16.284 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 36939d22-422f-458f-92f5-9d57586edeca, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:14:16 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:16.285 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[0512d92d-4c69-481d-afba-85c9ce51bb8f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:14:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:16.292 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:14:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:14:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:14:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:14:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:14:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:14:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:14:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:14:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:14:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:14:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:14:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:14:16 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:16.829 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:16Z, description=, device_id=9588c462-2236-4443-8871-1214f0871ce4, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcbd310>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcbd280>], id=847a9809-6760-48bb-9d9c-06fbef3d7c32, ip_allocation=immediate, mac_address=fa:16:3e:a8:ca:5e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=373, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:14:16Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:14:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:16.841 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:16.845 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:17 np0005548789.localdomain systemd[1]: tmp-crun.JCEAoE.mount: Deactivated successfully.
Dec 06 10:14:17 np0005548789.localdomain podman[310984]: 2025-12-06 10:14:17.065708865 +0000 UTC m=+0.061753753 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:14:17 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 4 addresses
Dec 06 10:14:17 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:14:17 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:14:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:14:17 np0005548789.localdomain podman[310999]: 2025-12-06 10:14:17.159003351 +0000 UTC m=+0.070264680 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:14:17 np0005548789.localdomain podman[310999]: 2025-12-06 10:14:17.190251418 +0000 UTC m=+0.101512747 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:14:17 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:14:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:17.303 263652 INFO neutron.agent.dhcp.agent [None req-c3ea56a6-a536-42f9-82c5-a0ad1ac6f4ba - - - - - -] DHCP configuration for ports {'847a9809-6760-48bb-9d9c-06fbef3d7c32'} is completed
Dec 06 10:14:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:17.525 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:17 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:14:17 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:14:17 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:14:17 np0005548789.localdomain podman[311047]: 2025-12-06 10:14:17.870221452 +0000 UTC m=+0.059487453 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:14:17 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:17Z|00069|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:14:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:17.939 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:17 np0005548789.localdomain ceph-mon[298582]: pgmap v89: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:14:17 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/989352054' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:18 np0005548789.localdomain sshd[311097]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:14:18 np0005548789.localdomain podman[311087]: 2025-12-06 10:14:18.665313725 +0000 UTC m=+0.064688931 container kill fd947986845710586151bc7ddabe4490ad234e4fd121303154ccf3ff85955923 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36939d22-422f-458f-92f5-9d57586edeca, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:14:18 np0005548789.localdomain systemd[1]: tmp-crun.50wNEk.mount: Deactivated successfully.
Dec 06 10:14:18 np0005548789.localdomain dnsmasq[310830]: exiting on receipt of SIGTERM
Dec 06 10:14:18 np0005548789.localdomain systemd[1]: libpod-fd947986845710586151bc7ddabe4490ad234e4fd121303154ccf3ff85955923.scope: Deactivated successfully.
Dec 06 10:14:18 np0005548789.localdomain sshd[311116]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:14:18 np0005548789.localdomain podman[311104]: 2025-12-06 10:14:18.743290448 +0000 UTC m=+0.058328879 container died fd947986845710586151bc7ddabe4490ad234e4fd121303154ccf3ff85955923 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36939d22-422f-458f-92f5-9d57586edeca, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:14:18 np0005548789.localdomain systemd[1]: tmp-crun.Lr6h01.mount: Deactivated successfully.
Dec 06 10:14:18 np0005548789.localdomain podman[311104]: 2025-12-06 10:14:18.78726927 +0000 UTC m=+0.102307661 container cleanup fd947986845710586151bc7ddabe4490ad234e4fd121303154ccf3ff85955923 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36939d22-422f-458f-92f5-9d57586edeca, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:14:18 np0005548789.localdomain systemd[1]: libpod-conmon-fd947986845710586151bc7ddabe4490ad234e4fd121303154ccf3ff85955923.scope: Deactivated successfully.
Dec 06 10:14:18 np0005548789.localdomain podman[311105]: 2025-12-06 10:14:18.828441737 +0000 UTC m=+0.141193709 container remove fd947986845710586151bc7ddabe4490ad234e4fd121303154ccf3ff85955923 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36939d22-422f-458f-92f5-9d57586edeca, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:14:18 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3413727768' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:19.069 263652 INFO neutron.agent.dhcp.agent [None req-00688aeb-9ecb-47a3-bc4a-d83a6ef58ca8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:14:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:19.070 263652 INFO neutron.agent.dhcp.agent [None req-00688aeb-9ecb-47a3-bc4a-d83a6ef58ca8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:14:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:19.338 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:14:19 np0005548789.localdomain sshd[311116]: Received disconnect from 193.46.255.20 port 44154:11:  [preauth]
Dec 06 10:14:19 np0005548789.localdomain sshd[311116]: Disconnected from authenticating user root 193.46.255.20 port 44154 [preauth]
Dec 06 10:14:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-9e3d569aeb509718e295757645bec02890965246579d7d2efa87e073e25a6102-merged.mount: Deactivated successfully.
Dec 06 10:14:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd947986845710586151bc7ddabe4490ad234e4fd121303154ccf3ff85955923-userdata-shm.mount: Deactivated successfully.
Dec 06 10:14:19 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d36939d22\x2d422f\x2d458f\x2d92f5\x2d9d57586edeca.mount: Deactivated successfully.
Dec 06 10:14:20 np0005548789.localdomain ceph-mon[298582]: pgmap v90: 177 pgs: 177 active+clean; 192 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 33 op/s
Dec 06 10:14:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:21.843 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:21 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:21.858 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:21Z, description=, device_id=0e54fb37-e53e-4ada-9f5a-9b02f9c2b583, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc17850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc179a0>], id=e351be0c-02ee-47aa-b870-fb989dd95d2f, ip_allocation=immediate, mac_address=fa:16:3e:49:b2:2b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=421, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:14:21Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:14:22 np0005548789.localdomain ceph-mon[298582]: pgmap v91: 177 pgs: 177 active+clean; 192 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 33 op/s
Dec 06 10:14:22 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 4 addresses
Dec 06 10:14:22 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:14:22 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:14:22 np0005548789.localdomain podman[311151]: 2025-12-06 10:14:22.08814476 +0000 UTC m=+0.067576129 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:14:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:22.232 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:22 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:22.288 263652 INFO neutron.agent.dhcp.agent [None req-51e62f00-f029-41fd-9040-e8d3ceb145b6 - - - - - -] DHCP configuration for ports {'e351be0c-02ee-47aa-b870-fb989dd95d2f'} is completed
Dec 06 10:14:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:22.604 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:14:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:14:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:14:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:14:24 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:14:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19248 "" "Go-http-client/1.1"
Dec 06 10:14:24 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:14:24.050 2 INFO neutron.agent.securitygroups_rpc [None req-713c535f-db70-452f-a97f-68d844244da8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Security group member updated ['bfad329a-0ea3-4b02-8e91-9d15749f8c9b']
Dec 06 10:14:24 np0005548789.localdomain ceph-mon[298582]: pgmap v92: 177 pgs: 177 active+clean; 192 MiB data, 776 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 1.8 MiB/s wr, 99 op/s
Dec 06 10:14:26 np0005548789.localdomain ceph-mon[298582]: pgmap v93: 177 pgs: 177 active+clean; 192 MiB data, 776 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 1.8 MiB/s wr, 99 op/s
Dec 06 10:14:26 np0005548789.localdomain sshd[311171]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:14:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:26.347 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:26 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:26.816 263652 INFO neutron.agent.linux.ip_lib [None req-c9b7ffbf-b077-45de-930d-9e646f528dda - - - - - -] Device tap674505ce-f8 cannot be used as it has no MAC address
Dec 06 10:14:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:26.839 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:26.847 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:26 np0005548789.localdomain kernel: device tap674505ce-f8 entered promiscuous mode
Dec 06 10:14:26 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016066.8507] manager: (tap674505ce-f8): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Dec 06 10:14:26 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:26Z|00070|binding|INFO|Claiming lport 674505ce-f881-462b-a185-46ca8116f551 for this chassis.
Dec 06 10:14:26 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:26Z|00071|binding|INFO|674505ce-f881-462b-a185-46ca8116f551: Claiming unknown
Dec 06 10:14:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:26.851 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:26 np0005548789.localdomain systemd-udevd[311183]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:14:26 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:26.865 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-2716abb4-8339-437b-9952-fd22a3d3f838', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2716abb4-8339-437b-9952-fd22a3d3f838', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '550f4c3bf626406eac0d7f6f917d607c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a194418-47d8-46fb-95d0-765c18cf4dc9, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=674505ce-f881-462b-a185-46ca8116f551) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:26 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:26.866 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 674505ce-f881-462b-a185-46ca8116f551 in datapath 2716abb4-8339-437b-9952-fd22a3d3f838 bound to our chassis
Dec 06 10:14:26 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:26.867 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 4ce4da82-c04f-4bd0-8ba1-43ca2cb8db51 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:14:26 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:26.868 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2716abb4-8339-437b-9952-fd22a3d3f838, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:14:26 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:26.869 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[8bdb85ad-96a1-4e13-8ee7-183910db6d79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:14:26 np0005548789.localdomain virtnodedevd[230404]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 06 10:14:26 np0005548789.localdomain virtnodedevd[230404]: hostname: np0005548789.localdomain
Dec 06 10:14:26 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap674505ce-f8: No such device
Dec 06 10:14:26 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap674505ce-f8: No such device
Dec 06 10:14:26 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:26Z|00072|binding|INFO|Setting lport 674505ce-f881-462b-a185-46ca8116f551 ovn-installed in OVS
Dec 06 10:14:26 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:26Z|00073|binding|INFO|Setting lport 674505ce-f881-462b-a185-46ca8116f551 up in Southbound
Dec 06 10:14:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:26.900 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:26 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap674505ce-f8: No such device
Dec 06 10:14:26 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap674505ce-f8: No such device
Dec 06 10:14:26 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap674505ce-f8: No such device
Dec 06 10:14:26 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap674505ce-f8: No such device
Dec 06 10:14:26 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap674505ce-f8: No such device
Dec 06 10:14:26 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap674505ce-f8: No such device
Dec 06 10:14:26 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:26.945 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:26Z, description=, device_id=a657ed9a-4b90-4e6b-92ce-73d23dc898ea, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc4afa0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc4a130>], id=0aec7b47-5e43-49da-9f07-e6bebe4c2675, ip_allocation=immediate, mac_address=fa:16:3e:3f:86:8a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=459, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:14:26Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:14:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:26.997 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:27.001 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:27 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 5 addresses
Dec 06 10:14:27 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:14:27 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:14:27 np0005548789.localdomain podman[311231]: 2025-12-06 10:14:27.136582215 +0000 UTC m=+0.037020403 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 06 10:14:27 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:27.467 263652 INFO neutron.agent.dhcp.agent [None req-20cdf521-3e81-455f-b7fd-a3ad692b4482 - - - - - -] DHCP configuration for ports {'0aec7b47-5e43-49da-9f07-e6bebe4c2675'} is completed
Dec 06 10:14:27 np0005548789.localdomain sshd[311171]: Received disconnect from 14.194.101.210 port 59376:11: Bye Bye [preauth]
Dec 06 10:14:27 np0005548789.localdomain sshd[311171]: Disconnected from authenticating user root 14.194.101.210 port 59376 [preauth]
Dec 06 10:14:27 np0005548789.localdomain podman[311293]: 
Dec 06 10:14:27 np0005548789.localdomain podman[311293]: 2025-12-06 10:14:27.886519139 +0000 UTC m=+0.082240564 container create 70d88d91b5e20c247f4a6f4c63d79ee38d3e85e687937e8f7f521dc46eaf340f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2716abb4-8339-437b-9952-fd22a3d3f838, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:14:27 np0005548789.localdomain systemd[1]: Started libpod-conmon-70d88d91b5e20c247f4a6f4c63d79ee38d3e85e687937e8f7f521dc46eaf340f.scope.
Dec 06 10:14:27 np0005548789.localdomain podman[311293]: 2025-12-06 10:14:27.838927557 +0000 UTC m=+0.034649042 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:14:27 np0005548789.localdomain systemd[1]: tmp-crun.eRyIB8.mount: Deactivated successfully.
Dec 06 10:14:27 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:14:27 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b70efbcf1a478b69865e43308846dd732728820bb0a4447d04b09e0d0cf1219/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:14:27 np0005548789.localdomain podman[311293]: 2025-12-06 10:14:27.970671159 +0000 UTC m=+0.166392634 container init 70d88d91b5e20c247f4a6f4c63d79ee38d3e85e687937e8f7f521dc46eaf340f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2716abb4-8339-437b-9952-fd22a3d3f838, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:14:27 np0005548789.localdomain ceph-mon[298582]: pgmap v94: 177 pgs: 177 active+clean; 192 MiB data, 776 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 1.8 MiB/s wr, 99 op/s
Dec 06 10:14:27 np0005548789.localdomain podman[311293]: 2025-12-06 10:14:27.982855368 +0000 UTC m=+0.178576843 container start 70d88d91b5e20c247f4a6f4c63d79ee38d3e85e687937e8f7f521dc46eaf340f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2716abb4-8339-437b-9952-fd22a3d3f838, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:14:27 np0005548789.localdomain dnsmasq[311312]: started, version 2.85 cachesize 150
Dec 06 10:14:27 np0005548789.localdomain dnsmasq[311312]: DNS service limited to local subnets
Dec 06 10:14:27 np0005548789.localdomain dnsmasq[311312]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:14:27 np0005548789.localdomain dnsmasq[311312]: warning: no upstream servers configured
Dec 06 10:14:27 np0005548789.localdomain dnsmasq-dhcp[311312]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:14:27 np0005548789.localdomain dnsmasq[311312]: read /var/lib/neutron/dhcp/2716abb4-8339-437b-9952-fd22a3d3f838/addn_hosts - 0 addresses
Dec 06 10:14:27 np0005548789.localdomain dnsmasq-dhcp[311312]: read /var/lib/neutron/dhcp/2716abb4-8339-437b-9952-fd22a3d3f838/host
Dec 06 10:14:27 np0005548789.localdomain dnsmasq-dhcp[311312]: read /var/lib/neutron/dhcp/2716abb4-8339-437b-9952-fd22a3d3f838/opts
Dec 06 10:14:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:28.161 263652 INFO neutron.agent.dhcp.agent [None req-830a4aa6-d8ce-4027-9178-de848068f18c - - - - - -] DHCP configuration for ports {'5314d21d-0307-4cbc-ac4b-f181460f47a3'} is completed
Dec 06 10:14:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:28.658 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:14:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:14:28 np0005548789.localdomain systemd[1]: tmp-crun.5uIFN8.mount: Deactivated successfully.
Dec 06 10:14:28 np0005548789.localdomain podman[311313]: 2025-12-06 10:14:28.944635291 +0000 UTC m=+0.101777164 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 06 10:14:28 np0005548789.localdomain podman[311314]: 2025-12-06 10:14:28.982598922 +0000 UTC m=+0.135859828 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:14:29 np0005548789.localdomain podman[311313]: 2025-12-06 10:14:29.003812705 +0000 UTC m=+0.160954568 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Dec 06 10:14:29 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:14:29 np0005548789.localdomain podman[311314]: 2025-12-06 10:14:29.017411376 +0000 UTC m=+0.170672282 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:14:29 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:14:29 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:14:29.199 2 INFO neutron.agent.securitygroups_rpc [None req-42741e53-1189-4d3e-a617-18fc0438f9c5 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Security group member updated ['bfad329a-0ea3-4b02-8e91-9d15749f8c9b']
Dec 06 10:14:29 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:29.584 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:29Z, description=, device_id=a657ed9a-4b90-4e6b-92ce-73d23dc898ea, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe38458fe80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcd5a60>], id=1de05677-105c-406f-9cd7-f01d2cbdcdcd, ip_allocation=immediate, mac_address=fa:16:3e:c4:04:30, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:23Z, description=, dns_domain=, id=2716abb4-8339-437b-9952-fd22a3d3f838, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1521297530-network, port_security_enabled=True, project_id=550f4c3bf626406eac0d7f6f917d607c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35801, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=433, status=ACTIVE, subnets=['bacb9850-7f83-4210-a14f-8a65cd67ed70'], tags=[], tenant_id=550f4c3bf626406eac0d7f6f917d607c, updated_at=2025-12-06T10:14:24Z, vlan_transparent=None, network_id=2716abb4-8339-437b-9952-fd22a3d3f838, port_security_enabled=False, project_id=550f4c3bf626406eac0d7f6f917d607c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=484, status=DOWN, tags=[], tenant_id=550f4c3bf626406eac0d7f6f917d607c, updated_at=2025-12-06T10:14:29Z on network 2716abb4-8339-437b-9952-fd22a3d3f838
Dec 06 10:14:29 np0005548789.localdomain dnsmasq[311312]: read /var/lib/neutron/dhcp/2716abb4-8339-437b-9952-fd22a3d3f838/addn_hosts - 1 addresses
Dec 06 10:14:29 np0005548789.localdomain podman[311372]: 2025-12-06 10:14:29.799920217 +0000 UTC m=+0.058490533 container kill 70d88d91b5e20c247f4a6f4c63d79ee38d3e85e687937e8f7f521dc46eaf340f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2716abb4-8339-437b-9952-fd22a3d3f838, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:14:29 np0005548789.localdomain dnsmasq-dhcp[311312]: read /var/lib/neutron/dhcp/2716abb4-8339-437b-9952-fd22a3d3f838/host
Dec 06 10:14:29 np0005548789.localdomain dnsmasq-dhcp[311312]: read /var/lib/neutron/dhcp/2716abb4-8339-437b-9952-fd22a3d3f838/opts
Dec 06 10:14:30 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:30.017 263652 INFO neutron.agent.dhcp.agent [None req-f30b7f6b-2a16-4fc5-83a3-d9b073c2d4c5 - - - - - -] DHCP configuration for ports {'1de05677-105c-406f-9cd7-f01d2cbdcdcd'} is completed
Dec 06 10:14:30 np0005548789.localdomain ceph-mon[298582]: pgmap v95: 177 pgs: 177 active+clean; 192 MiB data, 776 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 1.8 MiB/s wr, 107 op/s
Dec 06 10:14:30 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:30.211 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:29Z, description=, device_id=fcb0956b-3e0b-42ed-82cc-dda3a3b5cf85, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc2a910>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc2a8b0>], id=6e17a10f-dbbc-42b2-aeeb-b43e917b0e3c, ip_allocation=immediate, mac_address=fa:16:3e:80:e4:79, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=485, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:14:29Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:14:30 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 6 addresses
Dec 06 10:14:30 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:14:30 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:14:30 np0005548789.localdomain podman[311409]: 2025-12-06 10:14:30.432386481 +0000 UTC m=+0.060859624 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:14:30 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:30.561 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:29Z, description=, device_id=a657ed9a-4b90-4e6b-92ce-73d23dc898ea, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc5bc10>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc5bfd0>], id=1de05677-105c-406f-9cd7-f01d2cbdcdcd, ip_allocation=immediate, mac_address=fa:16:3e:c4:04:30, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:23Z, description=, dns_domain=, id=2716abb4-8339-437b-9952-fd22a3d3f838, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1521297530-network, port_security_enabled=True, project_id=550f4c3bf626406eac0d7f6f917d607c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35801, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=433, status=ACTIVE, subnets=['bacb9850-7f83-4210-a14f-8a65cd67ed70'], tags=[], tenant_id=550f4c3bf626406eac0d7f6f917d607c, updated_at=2025-12-06T10:14:24Z, vlan_transparent=None, network_id=2716abb4-8339-437b-9952-fd22a3d3f838, port_security_enabled=False, project_id=550f4c3bf626406eac0d7f6f917d607c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=484, status=DOWN, tags=[], tenant_id=550f4c3bf626406eac0d7f6f917d607c, updated_at=2025-12-06T10:14:29Z on network 2716abb4-8339-437b-9952-fd22a3d3f838
Dec 06 10:14:30 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:30.673 263652 INFO neutron.agent.dhcp.agent [None req-7c139c2c-6361-457b-82f3-9a8e206f2299 - - - - - -] DHCP configuration for ports {'6e17a10f-dbbc-42b2-aeeb-b43e917b0e3c'} is completed
Dec 06 10:14:30 np0005548789.localdomain dnsmasq[311312]: read /var/lib/neutron/dhcp/2716abb4-8339-437b-9952-fd22a3d3f838/addn_hosts - 1 addresses
Dec 06 10:14:30 np0005548789.localdomain dnsmasq-dhcp[311312]: read /var/lib/neutron/dhcp/2716abb4-8339-437b-9952-fd22a3d3f838/host
Dec 06 10:14:30 np0005548789.localdomain dnsmasq-dhcp[311312]: read /var/lib/neutron/dhcp/2716abb4-8339-437b-9952-fd22a3d3f838/opts
Dec 06 10:14:30 np0005548789.localdomain podman[311448]: 2025-12-06 10:14:30.776912681 +0000 UTC m=+0.064177565 container kill 70d88d91b5e20c247f4a6f4c63d79ee38d3e85e687937e8f7f521dc46eaf340f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2716abb4-8339-437b-9952-fd22a3d3f838, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:14:31 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:31.018 263652 INFO neutron.agent.dhcp.agent [None req-0640b2e6-57f5-4931-980d-f8db1e7d2171 - - - - - -] DHCP configuration for ports {'1de05677-105c-406f-9cd7-f01d2cbdcdcd'} is completed
Dec 06 10:14:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:31.199 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:31.226 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:31.227 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:31.227 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:31.228 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:14:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:31.228 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:31 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:14:31 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3766673990' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:31.716 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:31.800 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:14:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:31.801 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:14:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:31.848 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:31.883 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:31.895 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:32.053 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:14:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:32.055 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11389MB free_disk=41.77429962158203GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:14:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:32.056 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:32.056 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:32 np0005548789.localdomain ceph-mon[298582]: pgmap v96: 177 pgs: 177 active+clean; 192 MiB data, 776 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Dec 06 10:14:32 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3766673990' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:32.155 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:14:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:32.155 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:14:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:32.155 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:14:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:32.258 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:32 np0005548789.localdomain systemd[1]: tmp-crun.so2RQB.mount: Deactivated successfully.
Dec 06 10:14:32 np0005548789.localdomain podman[311519]: 2025-12-06 10:14:32.46357968 +0000 UTC m=+0.063089973 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:14:32 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 5 addresses
Dec 06 10:14:32 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:14:32 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:14:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:14:32 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4228215768' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:32.725 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:32.733 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:14:32 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:32Z|00074|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:14:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:32.748 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:14:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:32.768 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:14:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:32.769 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.712s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:32.778 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:33 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/4228215768' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:33 np0005548789.localdomain dnsmasq[311312]: read /var/lib/neutron/dhcp/2716abb4-8339-437b-9952-fd22a3d3f838/addn_hosts - 0 addresses
Dec 06 10:14:33 np0005548789.localdomain dnsmasq-dhcp[311312]: read /var/lib/neutron/dhcp/2716abb4-8339-437b-9952-fd22a3d3f838/host
Dec 06 10:14:33 np0005548789.localdomain podman[311569]: 2025-12-06 10:14:33.245189504 +0000 UTC m=+0.048388748 container kill 70d88d91b5e20c247f4a6f4c63d79ee38d3e85e687937e8f7f521dc46eaf340f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2716abb4-8339-437b-9952-fd22a3d3f838, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:14:33 np0005548789.localdomain dnsmasq-dhcp[311312]: read /var/lib/neutron/dhcp/2716abb4-8339-437b-9952-fd22a3d3f838/opts
Dec 06 10:14:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:33.473 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:33 np0005548789.localdomain kernel: device tap674505ce-f8 left promiscuous mode
Dec 06 10:14:33 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:33Z|00075|binding|INFO|Releasing lport 674505ce-f881-462b-a185-46ca8116f551 from this chassis (sb_readonly=0)
Dec 06 10:14:33 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:33Z|00076|binding|INFO|Setting lport 674505ce-f881-462b-a185-46ca8116f551 down in Southbound
Dec 06 10:14:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:33.491 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-2716abb4-8339-437b-9952-fd22a3d3f838', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2716abb4-8339-437b-9952-fd22a3d3f838', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '550f4c3bf626406eac0d7f6f917d607c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a194418-47d8-46fb-95d0-765c18cf4dc9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=674505ce-f881-462b-a185-46ca8116f551) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:33.493 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:33.495 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 674505ce-f881-462b-a185-46ca8116f551 in datapath 2716abb4-8339-437b-9952-fd22a3d3f838 unbound from our chassis
Dec 06 10:14:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:33.495 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:33.498 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2716abb4-8339-437b-9952-fd22a3d3f838, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:14:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:33.499 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[022ea27f-9813-4b10-b73d-72166a23f636]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:14:34 np0005548789.localdomain ceph-mon[298582]: pgmap v97: 177 pgs: 177 active+clean; 213 MiB data, 849 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 116 op/s
Dec 06 10:14:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:34.751 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:34.752 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:34.752 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:14:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:34.752 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:14:34 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:34Z|00077|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:14:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:35.035 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:35 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 4 addresses
Dec 06 10:14:35 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:14:35 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:14:35 np0005548789.localdomain podman[311607]: 2025-12-06 10:14:35.06582072 +0000 UTC m=+0.061174334 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:14:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:35.071 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:14:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:35.072 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:14:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:35.072 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:14:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:35.072 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:14:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:14:35 np0005548789.localdomain systemd[1]: tmp-crun.RYWESP.mount: Deactivated successfully.
Dec 06 10:14:35 np0005548789.localdomain podman[311620]: 2025-12-06 10:14:35.183030011 +0000 UTC m=+0.093692719 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350)
Dec 06 10:14:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:14:35 np0005548789.localdomain podman[311620]: 2025-12-06 10:14:35.219843017 +0000 UTC m=+0.130505675 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:14:35 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:14:35 np0005548789.localdomain podman[311647]: 2025-12-06 10:14:35.268518382 +0000 UTC m=+0.066186237 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 10:14:35 np0005548789.localdomain podman[311647]: 2025-12-06 10:14:35.276803413 +0000 UTC m=+0.074471298 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:14:35 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:14:35 np0005548789.localdomain dnsmasq[311312]: exiting on receipt of SIGTERM
Dec 06 10:14:35 np0005548789.localdomain podman[311681]: 2025-12-06 10:14:35.389997323 +0000 UTC m=+0.039824998 container kill 70d88d91b5e20c247f4a6f4c63d79ee38d3e85e687937e8f7f521dc46eaf340f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2716abb4-8339-437b-9952-fd22a3d3f838, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 06 10:14:35 np0005548789.localdomain systemd[1]: libpod-70d88d91b5e20c247f4a6f4c63d79ee38d3e85e687937e8f7f521dc46eaf340f.scope: Deactivated successfully.
Dec 06 10:14:35 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/838118298' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:35 np0005548789.localdomain podman[311696]: 2025-12-06 10:14:35.450430564 +0000 UTC m=+0.043418077 container died 70d88d91b5e20c247f4a6f4c63d79ee38d3e85e687937e8f7f521dc46eaf340f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2716abb4-8339-437b-9952-fd22a3d3f838, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:14:35 np0005548789.localdomain podman[311696]: 2025-12-06 10:14:35.494656575 +0000 UTC m=+0.087644028 container remove 70d88d91b5e20c247f4a6f4c63d79ee38d3e85e687937e8f7f521dc46eaf340f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2716abb4-8339-437b-9952-fd22a3d3f838, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:14:35 np0005548789.localdomain systemd[1]: libpod-conmon-70d88d91b5e20c247f4a6f4c63d79ee38d3e85e687937e8f7f521dc46eaf340f.scope: Deactivated successfully.
Dec 06 10:14:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:35.523 263652 INFO neutron.agent.dhcp.agent [None req-8060e107-0d2d-468d-9810-0d8609c8862f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:14:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:35.742 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:14:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:35.758 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:14:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:35.759 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:14:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:35.759 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:35.760 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:35.842 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:14:36 np0005548789.localdomain systemd[1]: tmp-crun.oCAuY6.mount: Deactivated successfully.
Dec 06 10:14:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2b70efbcf1a478b69865e43308846dd732728820bb0a4447d04b09e0d0cf1219-merged.mount: Deactivated successfully.
Dec 06 10:14:36 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-70d88d91b5e20c247f4a6f4c63d79ee38d3e85e687937e8f7f521dc46eaf340f-userdata-shm.mount: Deactivated successfully.
Dec 06 10:14:36 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d2716abb4\x2d8339\x2d437b\x2d9952\x2dfd22a3d3f838.mount: Deactivated successfully.
Dec 06 10:14:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:36.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:36.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:36 np0005548789.localdomain ceph-mon[298582]: pgmap v98: 177 pgs: 177 active+clean; 213 MiB data, 849 MiB used, 41 GiB / 42 GiB avail; 475 KiB/s rd, 2.1 MiB/s wr, 50 op/s
Dec 06 10:14:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:36.851 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:36.898 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:37.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:37.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:37.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:14:37 np0005548789.localdomain ceph-mon[298582]: pgmap v99: 177 pgs: 177 active+clean; 213 MiB data, 849 MiB used, 41 GiB / 42 GiB avail; 475 KiB/s rd, 2.1 MiB/s wr, 50 op/s
Dec 06 10:14:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:38.317 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:14:38 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1657712167' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:14:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e104 e104: 6 total, 6 up, 6 in
Dec 06 10:14:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:14:38 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1657712167' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:14:38 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Dec 06 10:14:38 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:38.991459) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:14:38 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Dec 06 10:14:38 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016078991527, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1060, "num_deletes": 252, "total_data_size": 1315745, "memory_usage": 1341648, "flush_reason": "Manual Compaction"}
Dec 06 10:14:38 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Dec 06 10:14:38 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/4100360430' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016079000091, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 849236, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20839, "largest_seqno": 21894, "table_properties": {"data_size": 844735, "index_size": 2164, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10723, "raw_average_key_size": 20, "raw_value_size": 835388, "raw_average_value_size": 1634, "num_data_blocks": 90, "num_entries": 511, "num_filter_entries": 511, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016021, "oldest_key_time": 1765016021, "file_creation_time": 1765016078, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 8674 microseconds, and 3579 cpu microseconds.
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.000135) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 849236 bytes OK
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.000155) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.003382) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.003404) EVENT_LOG_v1 {"time_micros": 1765016079003397, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.003424) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 1310509, prev total WAL file size 1310509, number of live WAL files 2.
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.004077) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. '7061786F73003131373938' seq:0, type:0; will stop at (end)
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(829KB)], [33(19MB)]
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016079004158, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 21113828, "oldest_snapshot_seqno": -1}
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 12153 keys, 19283165 bytes, temperature: kUnknown
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016079115264, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 19283165, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19215590, "index_size": 36114, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30405, "raw_key_size": 326825, "raw_average_key_size": 26, "raw_value_size": 19010254, "raw_average_value_size": 1564, "num_data_blocks": 1367, "num_entries": 12153, "num_filter_entries": 12153, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765016079, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.115690) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 19283165 bytes
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.117635) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 189.7 rd, 173.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 19.3 +0.0 blob) out(18.4 +0.0 blob), read-write-amplify(47.6) write-amplify(22.7) OK, records in: 12683, records dropped: 530 output_compression: NoCompression
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.117676) EVENT_LOG_v1 {"time_micros": 1765016079117657, "job": 18, "event": "compaction_finished", "compaction_time_micros": 111287, "compaction_time_cpu_micros": 53740, "output_level": 6, "num_output_files": 1, "total_output_size": 19283165, "num_input_records": 12683, "num_output_records": 12153, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016079118007, "job": 18, "event": "table_file_deletion", "file_number": 35}
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016079121004, "job": 18, "event": "table_file_deletion", "file_number": 33}
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.003931) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.121130) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.121138) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.121142) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.121144) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:39 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:14:39.121147) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:39 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:39.542 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:39Z, description=, device_id=2e797ff2-91d9-444a-be81-92fadd15ae3f, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc85250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc85610>], id=623f4d15-d3e9-4201-b94a-f57e73649098, ip_allocation=immediate, mac_address=fa:16:3e:c1:a3:09, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=525, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:14:39Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:14:39 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 5 addresses
Dec 06 10:14:39 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:14:39 np0005548789.localdomain podman[311737]: 2025-12-06 10:14:39.798099574 +0000 UTC m=+0.078743357 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:14:39 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:14:40 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:40.009 263652 INFO neutron.agent.dhcp.agent [None req-cd843266-ec6a-4bb4-90b4-2e35f1e940c5 - - - - - -] DHCP configuration for ports {'623f4d15-d3e9-4201-b94a-f57e73649098'} is completed
Dec 06 10:14:40 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:40.011 263652 INFO neutron.agent.linux.ip_lib [None req-70ea0e9b-44da-4f7e-8e7f-a82adf4090f9 - - - - - -] Device tap9f077348-ed cannot be used as it has no MAC address
Dec 06 10:14:40 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1657712167' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:14:40 np0005548789.localdomain ceph-mon[298582]: osdmap e104: 6 total, 6 up, 6 in
Dec 06 10:14:40 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1657712167' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:14:40 np0005548789.localdomain ceph-mon[298582]: pgmap v101: 177 pgs: 177 active+clean; 257 MiB data, 870 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 3.8 MiB/s wr, 108 op/s
Dec 06 10:14:40 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/227781456' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:40 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/492198068' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:40.039 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:40 np0005548789.localdomain kernel: device tap9f077348-ed entered promiscuous mode
Dec 06 10:14:40 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016080.0511] manager: (tap9f077348-ed): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Dec 06 10:14:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:40Z|00078|binding|INFO|Claiming lport 9f077348-ed05-4cf4-8524-593431fbafaf for this chassis.
Dec 06 10:14:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:40Z|00079|binding|INFO|9f077348-ed05-4cf4-8524-593431fbafaf: Claiming unknown
Dec 06 10:14:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:40.052 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:40 np0005548789.localdomain systemd-udevd[311766]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:14:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:40.071 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-df3c5fcc-9cd4-4d33-9970-a165c712aad3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df3c5fcc-9cd4-4d33-9970-a165c712aad3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '024b6fbc052c4ed7a93c855bd2ae77da', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd535f07-7612-46c6-87c9-bf69c15a9a5d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=9f077348-ed05-4cf4-8524-593431fbafaf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:40.073 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 9f077348-ed05-4cf4-8524-593431fbafaf in datapath df3c5fcc-9cd4-4d33-9970-a165c712aad3 bound to our chassis
Dec 06 10:14:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:40.075 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 64f8fc93-0da5-442a-a910-eb65f721f2a4 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:14:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:40.076 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network df3c5fcc-9cd4-4d33-9970-a165c712aad3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:14:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:40.076 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[eab9cd6f-02d4-4261-a5c7-16dcfaeec1a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:14:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:40.087 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:14:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:40Z|00080|binding|INFO|Setting lport 9f077348-ed05-4cf4-8524-593431fbafaf ovn-installed in OVS
Dec 06 10:14:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:40Z|00081|binding|INFO|Setting lport 9f077348-ed05-4cf4-8524-593431fbafaf up in Southbound
Dec 06 10:14:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:40.094 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:40.099 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:40.122 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:40.149 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:40 np0005548789.localdomain podman[311770]: 2025-12-06 10:14:40.202511647 +0000 UTC m=+0.100227498 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125)
Dec 06 10:14:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:40.228 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:40 np0005548789.localdomain podman[311770]: 2025-12-06 10:14:40.243238092 +0000 UTC m=+0.140953953 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3)
Dec 06 10:14:40 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:14:40 np0005548789.localdomain systemd[1]: tmp-crun.d3IUwJ.mount: Deactivated successfully.
Dec 06 10:14:40 np0005548789.localdomain podman[311841]: 
Dec 06 10:14:40 np0005548789.localdomain podman[311841]: 2025-12-06 10:14:40.996069833 +0000 UTC m=+0.088188413 container create dbf1d79cf37c9a95da8c3fbfd17ed1db394008f8d27c162fbd5d4fc5de50fe81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-df3c5fcc-9cd4-4d33-9970-a165c712aad3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 06 10:14:41 np0005548789.localdomain systemd[1]: Started libpod-conmon-dbf1d79cf37c9a95da8c3fbfd17ed1db394008f8d27c162fbd5d4fc5de50fe81.scope.
Dec 06 10:14:41 np0005548789.localdomain podman[311841]: 2025-12-06 10:14:40.953604036 +0000 UTC m=+0.045722636 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:14:41 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3664938098' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:41 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:14:41 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a826d515f5bdfea048cffebccb4edfc28363d9139b831b1071c42234067ec609/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:14:41 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e105 e105: 6 total, 6 up, 6 in
Dec 06 10:14:41 np0005548789.localdomain podman[311841]: 2025-12-06 10:14:41.076918093 +0000 UTC m=+0.169036663 container init dbf1d79cf37c9a95da8c3fbfd17ed1db394008f8d27c162fbd5d4fc5de50fe81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-df3c5fcc-9cd4-4d33-9970-a165c712aad3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:14:41 np0005548789.localdomain podman[311841]: 2025-12-06 10:14:41.089743222 +0000 UTC m=+0.181861792 container start dbf1d79cf37c9a95da8c3fbfd17ed1db394008f8d27c162fbd5d4fc5de50fe81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-df3c5fcc-9cd4-4d33-9970-a165c712aad3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:14:41 np0005548789.localdomain dnsmasq[311859]: started, version 2.85 cachesize 150
Dec 06 10:14:41 np0005548789.localdomain dnsmasq[311859]: DNS service limited to local subnets
Dec 06 10:14:41 np0005548789.localdomain dnsmasq[311859]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:14:41 np0005548789.localdomain dnsmasq[311859]: warning: no upstream servers configured
Dec 06 10:14:41 np0005548789.localdomain dnsmasq-dhcp[311859]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:14:41 np0005548789.localdomain dnsmasq[311859]: read /var/lib/neutron/dhcp/df3c5fcc-9cd4-4d33-9970-a165c712aad3/addn_hosts - 0 addresses
Dec 06 10:14:41 np0005548789.localdomain dnsmasq-dhcp[311859]: read /var/lib/neutron/dhcp/df3c5fcc-9cd4-4d33-9970-a165c712aad3/host
Dec 06 10:14:41 np0005548789.localdomain dnsmasq-dhcp[311859]: read /var/lib/neutron/dhcp/df3c5fcc-9cd4-4d33-9970-a165c712aad3/opts
Dec 06 10:14:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:41.161 263652 INFO neutron.agent.dhcp.agent [None req-5a4831f2-83b0-42c5-9287-fea2f04a31d9 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:40Z, description=, device_id=2e797ff2-91d9-444a-be81-92fadd15ae3f, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc7e2e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc7eac0>], id=a17f9d15-299c-474e-9834-9e63c98a6a26, ip_allocation=immediate, mac_address=fa:16:3e:44:4d:23, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:37Z, description=, dns_domain=, id=df3c5fcc-9cd4-4d33-9970-a165c712aad3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesNegativeTestJSON-1583596222-network, port_security_enabled=True, project_id=024b6fbc052c4ed7a93c855bd2ae77da, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=43497, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=518, status=ACTIVE, subnets=['eaa29c6a-37af-4221-a05f-34273ec978f2'], tags=[], tenant_id=024b6fbc052c4ed7a93c855bd2ae77da, updated_at=2025-12-06T10:14:38Z, vlan_transparent=None, network_id=df3c5fcc-9cd4-4d33-9970-a165c712aad3, port_security_enabled=False, project_id=024b6fbc052c4ed7a93c855bd2ae77da, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=544, status=DOWN, tags=[], tenant_id=024b6fbc052c4ed7a93c855bd2ae77da, updated_at=2025-12-06T10:14:40Z on network df3c5fcc-9cd4-4d33-9970-a165c712aad3
Dec 06 10:14:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:41.260 263652 INFO neutron.agent.dhcp.agent [None req-6610962e-6428-4ab0-b76b-53db7804607c - - - - - -] DHCP configuration for ports {'a6b83e42-613c-482c-9542-5aae284b7256'} is completed
Dec 06 10:14:41 np0005548789.localdomain podman[311878]: 2025-12-06 10:14:41.380359008 +0000 UTC m=+0.071738085 container kill dbf1d79cf37c9a95da8c3fbfd17ed1db394008f8d27c162fbd5d4fc5de50fe81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-df3c5fcc-9cd4-4d33-9970-a165c712aad3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:14:41 np0005548789.localdomain dnsmasq[311859]: read /var/lib/neutron/dhcp/df3c5fcc-9cd4-4d33-9970-a165c712aad3/addn_hosts - 1 addresses
Dec 06 10:14:41 np0005548789.localdomain dnsmasq-dhcp[311859]: read /var/lib/neutron/dhcp/df3c5fcc-9cd4-4d33-9970-a165c712aad3/host
Dec 06 10:14:41 np0005548789.localdomain dnsmasq-dhcp[311859]: read /var/lib/neutron/dhcp/df3c5fcc-9cd4-4d33-9970-a165c712aad3/opts
Dec 06 10:14:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:41.720 263652 INFO neutron.agent.dhcp.agent [None req-935df81a-a49c-473d-a6f3-b996a76f6fea - - - - - -] DHCP configuration for ports {'a17f9d15-299c-474e-9834-9e63c98a6a26'} is completed
Dec 06 10:14:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:41.854 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:41.901 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:42 np0005548789.localdomain ceph-mon[298582]: pgmap v102: 177 pgs: 177 active+clean; 257 MiB data, 870 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 3.8 MiB/s wr, 108 op/s
Dec 06 10:14:42 np0005548789.localdomain ceph-mon[298582]: osdmap e105: 6 total, 6 up, 6 in
Dec 06 10:14:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/95423717' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/4088784122' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:42 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e106 e106: 6 total, 6 up, 6 in
Dec 06 10:14:42 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:42.177 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:40Z, description=, device_id=2e797ff2-91d9-444a-be81-92fadd15ae3f, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbec220>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbecf40>], id=a17f9d15-299c-474e-9834-9e63c98a6a26, ip_allocation=immediate, mac_address=fa:16:3e:44:4d:23, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:37Z, description=, dns_domain=, id=df3c5fcc-9cd4-4d33-9970-a165c712aad3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesNegativeTestJSON-1583596222-network, port_security_enabled=True, project_id=024b6fbc052c4ed7a93c855bd2ae77da, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=43497, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=518, status=ACTIVE, subnets=['eaa29c6a-37af-4221-a05f-34273ec978f2'], tags=[], tenant_id=024b6fbc052c4ed7a93c855bd2ae77da, updated_at=2025-12-06T10:14:38Z, vlan_transparent=None, network_id=df3c5fcc-9cd4-4d33-9970-a165c712aad3, port_security_enabled=False, project_id=024b6fbc052c4ed7a93c855bd2ae77da, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=544, status=DOWN, tags=[], tenant_id=024b6fbc052c4ed7a93c855bd2ae77da, updated_at=2025-12-06T10:14:40Z on network df3c5fcc-9cd4-4d33-9970-a165c712aad3
Dec 06 10:14:42 np0005548789.localdomain dnsmasq[311859]: read /var/lib/neutron/dhcp/df3c5fcc-9cd4-4d33-9970-a165c712aad3/addn_hosts - 1 addresses
Dec 06 10:14:42 np0005548789.localdomain dnsmasq-dhcp[311859]: read /var/lib/neutron/dhcp/df3c5fcc-9cd4-4d33-9970-a165c712aad3/host
Dec 06 10:14:42 np0005548789.localdomain dnsmasq-dhcp[311859]: read /var/lib/neutron/dhcp/df3c5fcc-9cd4-4d33-9970-a165c712aad3/opts
Dec 06 10:14:42 np0005548789.localdomain systemd[1]: tmp-crun.DWkabP.mount: Deactivated successfully.
Dec 06 10:14:42 np0005548789.localdomain podman[311917]: 2025-12-06 10:14:42.410901694 +0000 UTC m=+0.066933180 container kill dbf1d79cf37c9a95da8c3fbfd17ed1db394008f8d27c162fbd5d4fc5de50fe81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-df3c5fcc-9cd4-4d33-9970-a165c712aad3, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:14:42 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:42.701 263652 INFO neutron.agent.dhcp.agent [None req-9048a336-b13b-4d30-822e-ae84c0da904d - - - - - -] DHCP configuration for ports {'a17f9d15-299c-474e-9834-9e63c98a6a26'} is completed
Dec 06 10:14:43 np0005548789.localdomain ceph-mon[298582]: osdmap e106: 6 total, 6 up, 6 in
Dec 06 10:14:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1318618644' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3370650055' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:14:43 np0005548789.localdomain podman[311937]: 2025-12-06 10:14:43.900719077 +0000 UTC m=+0.065523296 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:14:43 np0005548789.localdomain podman[311937]: 2025-12-06 10:14:43.913128563 +0000 UTC m=+0.077932862 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:14:43 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:14:44 np0005548789.localdomain ceph-mon[298582]: pgmap v105: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 11 MiB/s rd, 11 MiB/s wr, 304 op/s
Dec 06 10:14:44 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:44.344 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:42Z, description=, device_id=8ac18363-2c8c-4254-a57a-690b1714b140, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbdf550>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbdf610>], id=3f202222-16a8-4488-bcc9-0691af80a9ba, ip_allocation=immediate, mac_address=fa:16:3e:6f:70:a0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=554, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:14:43Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:14:44 np0005548789.localdomain systemd[1]: tmp-crun.Gw48Jw.mount: Deactivated successfully.
Dec 06 10:14:44 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 6 addresses
Dec 06 10:14:44 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:14:44 np0005548789.localdomain podman[311975]: 2025-12-06 10:14:44.591257761 +0000 UTC m=+0.072313882 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:14:44 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:14:44 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:44.821 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:44 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:44.823 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:14:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:44.855 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:44 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:44.906 263652 INFO neutron.agent.dhcp.agent [None req-583292b8-a6c5-4bc8-bb21-4ee822382ed1 - - - - - -] DHCP configuration for ports {'3f202222-16a8-4488-bcc9-0691af80a9ba'} is completed
Dec 06 10:14:45 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2238572858' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:45.177 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:46 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:14:46.015 2 INFO neutron.agent.securitygroups_rpc [req-842dc70c-4c90-4d04-97b8-ca0a150f47f3 req-694d7e2d-322f-485d-ac12-0a632bb0d8f8 3a50fae64027482ba5b10005ed97189e 024b6fbc052c4ed7a93c855bd2ae77da - - default default] Security group rule updated ['e6cef3ed-f2f1-4e9f-8bb7-b8303074aa1b']
Dec 06 10:14:46 np0005548789.localdomain ceph-mon[298582]: pgmap v106: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 7.8 MiB/s rd, 9.2 MiB/s wr, 208 op/s
Dec 06 10:14:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:46.368 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:14:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:14:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:14:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:14:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:14:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:14:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:14:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:14:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:14:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:14:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:14:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:14:46 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:14:46.815 2 INFO neutron.agent.securitygroups_rpc [req-3aac6f95-4738-40dc-9407-49685a717c88 req-a8e1311a-c6c3-4f2f-8fef-2b7b3e5084e1 3a50fae64027482ba5b10005ed97189e 024b6fbc052c4ed7a93c855bd2ae77da - - default default] Security group rule updated ['e6cef3ed-f2f1-4e9f-8bb7-b8303074aa1b']
Dec 06 10:14:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:46.856 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:46.904 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:46 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e107 e107: 6 total, 6 up, 6 in
Dec 06 10:14:47 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:47.134 263652 INFO neutron.agent.linux.ip_lib [None req-e15adfb8-c949-47d1-b9c5-a07dd781f185 - - - - - -] Device tap2f6c7dc0-af cannot be used as it has no MAC address
Dec 06 10:14:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:47.166 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:47 np0005548789.localdomain kernel: device tap2f6c7dc0-af entered promiscuous mode
Dec 06 10:14:47 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016087.1738] manager: (tap2f6c7dc0-af): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Dec 06 10:14:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:47Z|00082|binding|INFO|Claiming lport 2f6c7dc0-af46-4cc2-99f3-f46a11be455c for this chassis.
Dec 06 10:14:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:47Z|00083|binding|INFO|2f6c7dc0-af46-4cc2-99f3-f46a11be455c: Claiming unknown
Dec 06 10:14:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:47.177 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:47 np0005548789.localdomain systemd-udevd[312007]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:14:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:47.186 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-feb354e1-97d5-4c74-804a-eeb06e5bb155', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-feb354e1-97d5-4c74-804a-eeb06e5bb155', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4185da56d12649bc8653dd9db208c0a0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd335efc-b05b-4aaa-a30a-c891a594ccf4, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=2f6c7dc0-af46-4cc2-99f3-f46a11be455c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:47.188 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 2f6c7dc0-af46-4cc2-99f3-f46a11be455c in datapath feb354e1-97d5-4c74-804a-eeb06e5bb155 bound to our chassis
Dec 06 10:14:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:47.190 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network feb354e1-97d5-4c74-804a-eeb06e5bb155 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:14:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:47.191 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[8a48b358-7bf7-4270-8250-2c4856cd7d34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:14:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2f6c7dc0-af: No such device
Dec 06 10:14:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:14:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2f6c7dc0-af: No such device
Dec 06 10:14:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:47Z|00084|binding|INFO|Setting lport 2f6c7dc0-af46-4cc2-99f3-f46a11be455c ovn-installed in OVS
Dec 06 10:14:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:47Z|00085|binding|INFO|Setting lport 2f6c7dc0-af46-4cc2-99f3-f46a11be455c up in Southbound
Dec 06 10:14:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:47.217 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2f6c7dc0-af: No such device
Dec 06 10:14:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2f6c7dc0-af: No such device
Dec 06 10:14:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2f6c7dc0-af: No such device
Dec 06 10:14:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2f6c7dc0-af: No such device
Dec 06 10:14:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2f6c7dc0-af: No such device
Dec 06 10:14:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2f6c7dc0-af: No such device
Dec 06 10:14:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:47.252 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:47.279 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:47 np0005548789.localdomain systemd[1]: tmp-crun.ppU5LU.mount: Deactivated successfully.
Dec 06 10:14:47 np0005548789.localdomain podman[312015]: 2025-12-06 10:14:47.297809774 +0000 UTC m=+0.070792436 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:14:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:47.304 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:47.305 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:47.305 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:47 np0005548789.localdomain podman[312015]: 2025-12-06 10:14:47.35216194 +0000 UTC m=+0.125144632 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec 06 10:14:47 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:14:47 np0005548789.localdomain ceph-mon[298582]: osdmap e107: 6 total, 6 up, 6 in
Dec 06 10:14:47 np0005548789.localdomain ceph-mon[298582]: pgmap v108: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 7.9 MiB/s rd, 9.3 MiB/s wr, 210 op/s
Dec 06 10:14:48 np0005548789.localdomain podman[312105]: 
Dec 06 10:14:48 np0005548789.localdomain podman[312105]: 2025-12-06 10:14:48.209181859 +0000 UTC m=+0.092996849 container create bf1840969a38cdc03ff2f446425ad666bbcdfe5ae29c7d80e81b374fc1a9dff9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-feb354e1-97d5-4c74-804a-eeb06e5bb155, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:14:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:48 np0005548789.localdomain systemd[1]: Started libpod-conmon-bf1840969a38cdc03ff2f446425ad666bbcdfe5ae29c7d80e81b374fc1a9dff9.scope.
Dec 06 10:14:48 np0005548789.localdomain podman[312105]: 2025-12-06 10:14:48.165806525 +0000 UTC m=+0.049621545 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:14:48 np0005548789.localdomain systemd[1]: tmp-crun.Ks6Hug.mount: Deactivated successfully.
Dec 06 10:14:48 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:14:48 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17d5803262906bbfc10a2359d065da806a9d9644144fa240df1cddd30ea542d6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:14:48 np0005548789.localdomain podman[312105]: 2025-12-06 10:14:48.295890096 +0000 UTC m=+0.179705096 container init bf1840969a38cdc03ff2f446425ad666bbcdfe5ae29c7d80e81b374fc1a9dff9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-feb354e1-97d5-4c74-804a-eeb06e5bb155, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:14:48 np0005548789.localdomain podman[312105]: 2025-12-06 10:14:48.302790876 +0000 UTC m=+0.186605866 container start bf1840969a38cdc03ff2f446425ad666bbcdfe5ae29c7d80e81b374fc1a9dff9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-feb354e1-97d5-4c74-804a-eeb06e5bb155, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:14:48 np0005548789.localdomain dnsmasq[312123]: started, version 2.85 cachesize 150
Dec 06 10:14:48 np0005548789.localdomain dnsmasq[312123]: DNS service limited to local subnets
Dec 06 10:14:48 np0005548789.localdomain dnsmasq[312123]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:14:48 np0005548789.localdomain dnsmasq[312123]: warning: no upstream servers configured
Dec 06 10:14:48 np0005548789.localdomain dnsmasq-dhcp[312123]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:14:48 np0005548789.localdomain dnsmasq[312123]: read /var/lib/neutron/dhcp/feb354e1-97d5-4c74-804a-eeb06e5bb155/addn_hosts - 0 addresses
Dec 06 10:14:48 np0005548789.localdomain dnsmasq-dhcp[312123]: read /var/lib/neutron/dhcp/feb354e1-97d5-4c74-804a-eeb06e5bb155/host
Dec 06 10:14:48 np0005548789.localdomain dnsmasq-dhcp[312123]: read /var/lib/neutron/dhcp/feb354e1-97d5-4c74-804a-eeb06e5bb155/opts
Dec 06 10:14:48 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:48.564 263652 INFO neutron.agent.dhcp.agent [None req-98aa3ae0-29fa-4189-8228-5041e80f18dc - - - - - -] DHCP configuration for ports {'fe6507e2-e590-4d81-bf58-28ec08e1216a'} is completed
Dec 06 10:14:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:48.654 282197 DEBUG nova.virt.libvirt.driver [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Creating tmpfile /var/lib/nova/instances/tmpe77a5ohg to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Dec 06 10:14:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:48.690 282197 DEBUG nova.compute.manager [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=12288,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe77a5ohg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Dec 06 10:14:48 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:48.699 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:48Z, description=, device_id=7e399e99-e656-47a1-9fb2-96abab06c114, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcd5a00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fccb160>], id=09b837f1-40ed-4eeb-8b33-2fe63cdb818e, ip_allocation=immediate, mac_address=fa:16:3e:28:6c:da, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=605, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:14:48Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:14:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:48.713 282197 DEBUG oslo_concurrency.lockutils [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:14:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:48.714 282197 DEBUG oslo_concurrency.lockutils [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:14:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:48.723 282197 INFO nova.compute.rpcapi [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Dec 06 10:14:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:48.723 282197 DEBUG oslo_concurrency.lockutils [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:14:48 np0005548789.localdomain podman[312140]: 2025-12-06 10:14:48.936731705 +0000 UTC m=+0.073444606 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:14:48 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 7 addresses
Dec 06 10:14:48 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:14:48 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:14:48 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1632793843' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:14:49 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:49.238 263652 INFO neutron.agent.dhcp.agent [None req-ccb857ae-a091-4b34-b048-483c3f1e56f0 - - - - - -] DHCP configuration for ports {'09b837f1-40ed-4eeb-8b33-2fe63cdb818e'} is completed
Dec 06 10:14:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:49.748 282197 DEBUG nova.compute.manager [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=12288,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe77a5ohg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='87dc2ce3-2b16-4764-9803-711c2d12c20f',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Dec 06 10:14:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:49.789 282197 DEBUG oslo_concurrency.lockutils [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Acquiring lock "refresh_cache-87dc2ce3-2b16-4764-9803-711c2d12c20f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:14:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:49.789 282197 DEBUG oslo_concurrency.lockutils [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Acquired lock "refresh_cache-87dc2ce3-2b16-4764-9803-711c2d12c20f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:14:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:49.790 282197 DEBUG nova.network.neutron [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 06 10:14:50 np0005548789.localdomain ceph-mon[298582]: pgmap v109: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 7.8 MiB/s rd, 7.0 MiB/s wr, 263 op/s
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.121 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:50.282 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:49Z, description=, device_id=110bfe4d-8dd3-4386-b8da-4c950d9b90e9, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbf4970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbf4a60>], id=bfbbd672-ac59-4d4f-97b0-0bfce9d5e0c5, ip_allocation=immediate, mac_address=fa:16:3e:22:72:81, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=610, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:14:49Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:14:50 np0005548789.localdomain podman[312178]: 2025-12-06 10:14:50.487966969 +0000 UTC m=+0.059860335 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:14:50 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 8 addresses
Dec 06 10:14:50 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:14:50 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:14:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:50.692 263652 INFO neutron.agent.dhcp.agent [None req-ba1fc9df-6730-465f-870d-4869d0b8fe05 - - - - - -] DHCP configuration for ports {'bfbbd672-ac59-4d4f-97b0-0bfce9d5e0c5'} is completed
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.699 282197 DEBUG nova.network.neutron [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Updating instance_info_cache with network_info: [{"id": "e87832d3-ffc3-44e0-9f77-cd2eb6073d62", "address": "fa:16:3e:0e:f5:37", "network": {"id": "47d636a7-c520-4320-aa94-bfb41f418584", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1313845827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "7897d6398eb64eb29c66df8db792e581", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape87832d3-ff", "ovs_interfaceid": "e87832d3-ffc3-44e0-9f77-cd2eb6073d62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.759 282197 DEBUG oslo_concurrency.lockutils [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Releasing lock "refresh_cache-87dc2ce3-2b16-4764-9803-711c2d12c20f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.762 282197 DEBUG nova.virt.libvirt.driver [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=12288,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe77a5ohg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='87dc2ce3-2b16-4764-9803-711c2d12c20f',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.763 282197 DEBUG nova.virt.libvirt.driver [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Creating instance directory: /var/lib/nova/instances/87dc2ce3-2b16-4764-9803-711c2d12c20f pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.764 282197 DEBUG nova.virt.libvirt.driver [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Ensure instance console log exists: /var/lib/nova/instances/87dc2ce3-2b16-4764-9803-711c2d12c20f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.765 282197 DEBUG nova.virt.libvirt.driver [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.766 282197 DEBUG nova.virt.libvirt.vif [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-06T10:14:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1999616987',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005548790.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-1999616987',id=7,image_ref='6a944ab6-8965-4055-b7fc-af6e395005ea',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-06T10:14:45Z,launched_on='np0005548790.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005548790.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7897d6398eb64eb29c66df8db792e581',ramdisk_id='',reservation_id='r-tcv45ne4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6a944ab6-8965-4055-b7fc-af6e395005ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-265776820',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-265776820-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T10:14:45Z,user_data=None,user_id='ac2e85103fd14829ad4e6df2357da95b',uuid=87dc2ce3-2b16-4764-9803-711c2d12c20f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e87832d3-ffc3-44e0-9f77-cd2eb6073d62", "address": "fa:16:3e:0e:f5:37", "network": {"id": "47d636a7-c520-4320-aa94-bfb41f418584", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1313845827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "7897d6398eb64eb29c66df8db792e581", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape87832d3-ff", "ovs_interfaceid": "e87832d3-ffc3-44e0-9f77-cd2eb6073d62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.767 282197 DEBUG nova.network.os_vif_util [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Converting VIF {"id": "e87832d3-ffc3-44e0-9f77-cd2eb6073d62", "address": "fa:16:3e:0e:f5:37", "network": {"id": "47d636a7-c520-4320-aa94-bfb41f418584", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1313845827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "7897d6398eb64eb29c66df8db792e581", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape87832d3-ff", "ovs_interfaceid": "e87832d3-ffc3-44e0-9f77-cd2eb6073d62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.769 282197 DEBUG nova.network.os_vif_util [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:f5:37,bridge_name='br-int',has_traffic_filtering=True,id=e87832d3-ffc3-44e0-9f77-cd2eb6073d62,network=Network(47d636a7-c520-4320-aa94-bfb41f418584),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape87832d3-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.769 282197 DEBUG os_vif [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:f5:37,bridge_name='br-int',has_traffic_filtering=True,id=e87832d3-ffc3-44e0-9f77-cd2eb6073d62,network=Network(47d636a7-c520-4320-aa94-bfb41f418584),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape87832d3-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.771 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.772 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.772 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.777 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.777 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape87832d3-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.778 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape87832d3-ff, col_values=(('external_ids', {'iface-id': 'e87832d3-ffc3-44e0-9f77-cd2eb6073d62', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0e:f5:37', 'vm-uuid': '87dc2ce3-2b16-4764-9803-711c2d12c20f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.816 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.819 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.822 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.823 282197 INFO os_vif [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:f5:37,bridge_name='br-int',has_traffic_filtering=True,id=e87832d3-ffc3-44e0-9f77-cd2eb6073d62,network=Network(47d636a7-c520-4320-aa94-bfb41f418584),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape87832d3-ff')
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.823 282197 DEBUG nova.virt.libvirt.driver [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.824 282197 DEBUG nova.compute.manager [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=12288,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe77a5ohg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='87dc2ce3-2b16-4764-9803-711c2d12c20f',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Dec 06 10:14:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:50.915 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:51 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2395424858' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:51.908 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:52 np0005548789.localdomain ceph-mon[298582]: pgmap v110: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 6.2 MiB/s wr, 234 op/s
Dec 06 10:14:52 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/840869338' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:14:52 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:52.295 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:51Z, description=, device_id=7e399e99-e656-47a1-9fb2-96abab06c114, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc06820>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc064f0>], id=3632540c-2981-4cf2-a512-17df5b6faa8d, ip_allocation=immediate, mac_address=fa:16:3e:ac:38:dd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:44Z, description=, dns_domain=, id=feb354e1-97d5-4c74-804a-eeb06e5bb155, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestJSON-1665562525-network, port_security_enabled=True, project_id=4185da56d12649bc8653dd9db208c0a0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=23466, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=570, status=ACTIVE, subnets=['0d3c1a86-b134-4467-9f13-385eed16e944'], tags=[], tenant_id=4185da56d12649bc8653dd9db208c0a0, updated_at=2025-12-06T10:14:45Z, vlan_transparent=None, network_id=feb354e1-97d5-4c74-804a-eeb06e5bb155, port_security_enabled=False, project_id=4185da56d12649bc8653dd9db208c0a0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=616, status=DOWN, tags=[], tenant_id=4185da56d12649bc8653dd9db208c0a0, updated_at=2025-12-06T10:14:51Z on network feb354e1-97d5-4c74-804a-eeb06e5bb155
Dec 06 10:14:52 np0005548789.localdomain dnsmasq[312123]: read /var/lib/neutron/dhcp/feb354e1-97d5-4c74-804a-eeb06e5bb155/addn_hosts - 1 addresses
Dec 06 10:14:52 np0005548789.localdomain dnsmasq-dhcp[312123]: read /var/lib/neutron/dhcp/feb354e1-97d5-4c74-804a-eeb06e5bb155/host
Dec 06 10:14:52 np0005548789.localdomain dnsmasq-dhcp[312123]: read /var/lib/neutron/dhcp/feb354e1-97d5-4c74-804a-eeb06e5bb155/opts
Dec 06 10:14:52 np0005548789.localdomain podman[312220]: 2025-12-06 10:14:52.510162083 +0000 UTC m=+0.065860136 container kill bf1840969a38cdc03ff2f446425ad666bbcdfe5ae29c7d80e81b374fc1a9dff9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-feb354e1-97d5-4c74-804a-eeb06e5bb155, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 06 10:14:52 np0005548789.localdomain sudo[312235]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:14:52 np0005548789.localdomain sudo[312235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:14:52 np0005548789.localdomain sudo[312235]: pam_unix(sudo:session): session closed for user root
Dec 06 10:14:52 np0005548789.localdomain sudo[312260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:14:52 np0005548789.localdomain sudo[312260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:14:52 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:52.767 263652 INFO neutron.agent.dhcp.agent [None req-498da11d-8da3-44d9-9d74-d37ab421f5eb - - - - - -] DHCP configuration for ports {'3632540c-2981-4cf2-a512-17df5b6faa8d'} is completed
Dec 06 10:14:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e108 e108: 6 total, 6 up, 6 in
Dec 06 10:14:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:53 np0005548789.localdomain sudo[312260]: pam_unix(sudo:session): session closed for user root
Dec 06 10:14:53 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:14:53.702 2 INFO neutron.agent.securitygroups_rpc [None req-2bc0f0e9-228c-4272-bb0d-cc31a9019510 b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Security group member updated ['4c82b56e-0fc5-4c7f-8922-ceb8236815fd']
Dec 06 10:14:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:53.718 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:53 np0005548789.localdomain sudo[312311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:14:53 np0005548789.localdomain sudo[312311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:14:53 np0005548789.localdomain sudo[312311]: pam_unix(sudo:session): session closed for user root
Dec 06 10:14:53 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:53.825 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:14:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:14:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:14:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:14:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159752 "" "Go-http-client/1.1"
Dec 06 10:14:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:14:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20207 "" "Go-http-client/1.1"
Dec 06 10:14:54 np0005548789.localdomain ceph-mon[298582]: pgmap v111: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 4.7 MiB/s wr, 221 op/s
Dec 06 10:14:54 np0005548789.localdomain ceph-mon[298582]: osdmap e108: 6 total, 6 up, 6 in
Dec 06 10:14:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:14:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:14:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:14:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:14:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:55.817 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:56 np0005548789.localdomain ceph-mon[298582]: pgmap v113: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 8.7 MiB/s rd, 5.8 MiB/s wr, 273 op/s
Dec 06 10:14:56 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:56.686 263652 INFO neutron.agent.linux.ip_lib [None req-5b3e8c07-fe2b-42c8-95dc-9e47a5c87336 - - - - - -] Device tapff588d77-fd cannot be used as it has no MAC address
Dec 06 10:14:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:56.708 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:56 np0005548789.localdomain kernel: device tapff588d77-fd entered promiscuous mode
Dec 06 10:14:56 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016096.7176] manager: (tapff588d77-fd): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Dec 06 10:14:56 np0005548789.localdomain systemd-udevd[312340]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:14:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:56.723 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:56Z|00086|binding|INFO|Claiming lport ff588d77-fd65-43a9-bd18-9402d0aef61a for this chassis.
Dec 06 10:14:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:56Z|00087|binding|INFO|ff588d77-fd65-43a9-bd18-9402d0aef61a: Claiming unknown
Dec 06 10:14:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:56.742 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-deb7774c-e96b-4e7f-88d7-ed9d740915f4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-deb7774c-e96b-4e7f-88d7-ed9d740915f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da995d8e002548889747013c0eeca935', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cc41455-e125-49b5-8c35-a9f7e38c8e70, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=ff588d77-fd65-43a9-bd18-9402d0aef61a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:56.745 160509 INFO neutron.agent.ovn.metadata.agent [-] Port ff588d77-fd65-43a9-bd18-9402d0aef61a in datapath deb7774c-e96b-4e7f-88d7-ed9d740915f4 bound to our chassis
Dec 06 10:14:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:56.747 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network deb7774c-e96b-4e7f-88d7-ed9d740915f4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:14:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:56.748 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[237947a5-5a99-423b-8004-d49acec8760b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:14:56 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapff588d77-fd: No such device
Dec 06 10:14:56 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapff588d77-fd: No such device
Dec 06 10:14:56 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapff588d77-fd: No such device
Dec 06 10:14:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:56Z|00088|binding|INFO|Setting lport ff588d77-fd65-43a9-bd18-9402d0aef61a ovn-installed in OVS
Dec 06 10:14:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:56Z|00089|binding|INFO|Setting lport ff588d77-fd65-43a9-bd18-9402d0aef61a up in Southbound
Dec 06 10:14:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:56.769 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:56 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapff588d77-fd: No such device
Dec 06 10:14:56 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapff588d77-fd: No such device
Dec 06 10:14:56 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapff588d77-fd: No such device
Dec 06 10:14:56 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapff588d77-fd: No such device
Dec 06 10:14:56 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapff588d77-fd: No such device
Dec 06 10:14:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:56.807 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:56.829 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:56.911 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:56.959 282197 DEBUG nova.network.neutron [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Port e87832d3-ffc3-44e0-9f77-cd2eb6073d62 updated with migration profile {'migrating_to': 'np0005548789.localdomain'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Dec 06 10:14:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:56.961 282197 DEBUG nova.compute.manager [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=12288,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpe77a5ohg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='87dc2ce3-2b16-4764-9803-711c2d12c20f',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Dec 06 10:14:57 np0005548789.localdomain dnsmasq[311859]: read /var/lib/neutron/dhcp/df3c5fcc-9cd4-4d33-9970-a165c712aad3/addn_hosts - 0 addresses
Dec 06 10:14:57 np0005548789.localdomain podman[312400]: 2025-12-06 10:14:57.159974879 +0000 UTC m=+0.064110314 container kill dbf1d79cf37c9a95da8c3fbfd17ed1db394008f8d27c162fbd5d4fc5de50fe81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-df3c5fcc-9cd4-4d33-9970-a165c712aad3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:14:57 np0005548789.localdomain dnsmasq-dhcp[311859]: read /var/lib/neutron/dhcp/df3c5fcc-9cd4-4d33-9970-a165c712aad3/host
Dec 06 10:14:57 np0005548789.localdomain dnsmasq-dhcp[311859]: read /var/lib/neutron/dhcp/df3c5fcc-9cd4-4d33-9970-a165c712aad3/opts
Dec 06 10:14:57 np0005548789.localdomain sshd[312417]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:14:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:57.179 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:14:51Z, description=, device_id=7e399e99-e656-47a1-9fb2-96abab06c114, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc98970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc98820>], id=3632540c-2981-4cf2-a512-17df5b6faa8d, ip_allocation=immediate, mac_address=fa:16:3e:ac:38:dd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:44Z, description=, dns_domain=, id=feb354e1-97d5-4c74-804a-eeb06e5bb155, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestJSON-1665562525-network, port_security_enabled=True, project_id=4185da56d12649bc8653dd9db208c0a0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=23466, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=570, status=ACTIVE, subnets=['0d3c1a86-b134-4467-9f13-385eed16e944'], tags=[], tenant_id=4185da56d12649bc8653dd9db208c0a0, updated_at=2025-12-06T10:14:45Z, vlan_transparent=None, network_id=feb354e1-97d5-4c74-804a-eeb06e5bb155, port_security_enabled=False, project_id=4185da56d12649bc8653dd9db208c0a0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=616, status=DOWN, tags=[], tenant_id=4185da56d12649bc8653dd9db208c0a0, updated_at=2025-12-06T10:14:51Z on network feb354e1-97d5-4c74-804a-eeb06e5bb155
Dec 06 10:14:57 np0005548789.localdomain sshd[312417]: Accepted publickey for nova from 172.17.0.108 port 43334 ssh2: ECDSA SHA256:d3QEZWuD7sjgJDZ2zlkF0Iu+WveFEzqnvMCo/RH6ucs
Dec 06 10:14:57 np0005548789.localdomain systemd[1]: Created slice User Slice of UID 42436.
Dec 06 10:14:57 np0005548789.localdomain systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec 06 10:14:57 np0005548789.localdomain systemd-logind[766]: New session 75 of user nova.
Dec 06 10:14:57 np0005548789.localdomain systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec 06 10:14:57 np0005548789.localdomain systemd[1]: Starting User Manager for UID 42436...
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by (uid=0)
Dec 06 10:14:57 np0005548789.localdomain podman[312444]: 2025-12-06 10:14:57.387257125 +0000 UTC m=+0.057560684 container kill bf1840969a38cdc03ff2f446425ad666bbcdfe5ae29c7d80e81b374fc1a9dff9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-feb354e1-97d5-4c74-804a-eeb06e5bb155, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:14:57 np0005548789.localdomain dnsmasq[312123]: read /var/lib/neutron/dhcp/feb354e1-97d5-4c74-804a-eeb06e5bb155/addn_hosts - 1 addresses
Dec 06 10:14:57 np0005548789.localdomain dnsmasq-dhcp[312123]: read /var/lib/neutron/dhcp/feb354e1-97d5-4c74-804a-eeb06e5bb155/host
Dec 06 10:14:57 np0005548789.localdomain dnsmasq-dhcp[312123]: read /var/lib/neutron/dhcp/feb354e1-97d5-4c74-804a-eeb06e5bb155/opts
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: Queued start job for default target Main User Target.
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: Created slice User Application Slice.
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: Reached target Paths.
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: Reached target Timers.
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: Starting D-Bus User Message Bus Socket...
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: Starting Create User's Volatile Files and Directories...
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: Finished Create User's Volatile Files and Directories.
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: Listening on D-Bus User Message Bus Socket.
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: Reached target Sockets.
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: Reached target Basic System.
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: Reached target Main User Target.
Dec 06 10:14:57 np0005548789.localdomain systemd[312446]: Startup finished in 152ms.
Dec 06 10:14:57 np0005548789.localdomain systemd[1]: Started User Manager for UID 42436.
Dec 06 10:14:57 np0005548789.localdomain systemd[1]: Started Session 75 of User nova.
Dec 06 10:14:57 np0005548789.localdomain sshd[312417]: pam_unix(sshd:session): session opened for user nova(uid=42436) by (uid=0)
Dec 06 10:14:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:57.554 263652 INFO neutron.agent.dhcp.agent [None req-29c685c6-210d-489d-8583-2f4ccab49cc6 - - - - - -] DHCP configuration for ports {'3632540c-2981-4cf2-a512-17df5b6faa8d'} is completed
Dec 06 10:14:57 np0005548789.localdomain systemd[1]: Started libvirt secret daemon.
Dec 06 10:14:57 np0005548789.localdomain podman[312512]: 
Dec 06 10:14:57 np0005548789.localdomain podman[312512]: 2025-12-06 10:14:57.779266874 +0000 UTC m=+0.083653626 container create 8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deb7774c-e96b-4e7f-88d7-ed9d740915f4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:14:57 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016097.7921] manager: (tape87832d3-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/23)
Dec 06 10:14:57 np0005548789.localdomain kernel: device tape87832d3-ff entered promiscuous mode
Dec 06 10:14:57 np0005548789.localdomain systemd-udevd[312344]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:14:57 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016097.8105] device (tape87832d3-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 06 10:14:57 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016097.8110] device (tape87832d3-ff): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Dec 06 10:14:57 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:57Z|00090|binding|INFO|Claiming lport e87832d3-ffc3-44e0-9f77-cd2eb6073d62 for this additional chassis.
Dec 06 10:14:57 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:57Z|00091|binding|INFO|e87832d3-ffc3-44e0-9f77-cd2eb6073d62: Claiming fa:16:3e:0e:f5:37 10.100.0.14
Dec 06 10:14:57 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:57Z|00092|binding|INFO|Claiming lport 3b69daca-b91a-4923-9795-2e6a02ee3d59 for this additional chassis.
Dec 06 10:14:57 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:57Z|00093|binding|INFO|3b69daca-b91a-4923-9795-2e6a02ee3d59: Claiming fa:16:3e:a8:e1:a6 19.80.0.214
Dec 06 10:14:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:57.831 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:57.835 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:57 np0005548789.localdomain podman[312512]: 2025-12-06 10:14:57.740636334 +0000 UTC m=+0.045023106 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:14:57 np0005548789.localdomain systemd[1]: Started libpod-conmon-8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362.scope.
Dec 06 10:14:57 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:57Z|00094|binding|INFO|Setting lport e87832d3-ffc3-44e0-9f77-cd2eb6073d62 ovn-installed in OVS
Dec 06 10:14:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:57.858 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:57 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:14:57 np0005548789.localdomain systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Dec 06 10:14:57 np0005548789.localdomain systemd-machined[84444]: New machine qemu-3-instance-00000007.
Dec 06 10:14:57 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06549e5dbf4ea1c819a27ad89b0090c0fd564fb1fbcc2e1eabbb66d37085811c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:14:57 np0005548789.localdomain podman[312512]: 2025-12-06 10:14:57.876324755 +0000 UTC m=+0.180711507 container init 8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deb7774c-e96b-4e7f-88d7-ed9d740915f4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:14:57 np0005548789.localdomain podman[312512]: 2025-12-06 10:14:57.885540964 +0000 UTC m=+0.189927716 container start 8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deb7774c-e96b-4e7f-88d7-ed9d740915f4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:14:57 np0005548789.localdomain dnsmasq[312566]: started, version 2.85 cachesize 150
Dec 06 10:14:57 np0005548789.localdomain dnsmasq[312566]: DNS service limited to local subnets
Dec 06 10:14:57 np0005548789.localdomain dnsmasq[312566]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:14:57 np0005548789.localdomain dnsmasq[312566]: warning: no upstream servers configured
Dec 06 10:14:57 np0005548789.localdomain dnsmasq-dhcp[312566]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:14:57 np0005548789.localdomain dnsmasq[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/addn_hosts - 0 addresses
Dec 06 10:14:57 np0005548789.localdomain dnsmasq-dhcp[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/host
Dec 06 10:14:57 np0005548789.localdomain dnsmasq-dhcp[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/opts
Dec 06 10:14:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:14:57 np0005548789.localdomain ceph-mon[298582]: pgmap v114: 177 pgs: 177 active+clean; 350 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 4.7 MiB/s wr, 221 op/s
Dec 06 10:14:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:58.137 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:58 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:58Z|00095|binding|INFO|Releasing lport 9f077348-ed05-4cf4-8524-593431fbafaf from this chassis (sb_readonly=1)
Dec 06 10:14:58 np0005548789.localdomain kernel: device tap9f077348-ed left promiscuous mode
Dec 06 10:14:58 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:58Z|00096|if_status|INFO|Not setting lport 9f077348-ed05-4cf4-8524-593431fbafaf down as sb is readonly
Dec 06 10:14:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:58.161 282197 DEBUG nova.virt.driver [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Emitting event <LifecycleEvent: 1765016098.1613727, 87dc2ce3-2b16-4764-9803-711c2d12c20f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:14:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:58.162 282197 INFO nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] VM Started (Lifecycle Event)
Dec 06 10:14:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:58.165 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:14:58 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:14:58Z|00097|binding|INFO|Setting lport 9f077348-ed05-4cf4-8524-593431fbafaf down in Southbound
Dec 06 10:14:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:58.203 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-df3c5fcc-9cd4-4d33-9970-a165c712aad3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df3c5fcc-9cd4-4d33-9970-a165c712aad3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '024b6fbc052c4ed7a93c855bd2ae77da', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd535f07-7612-46c6-87c9-bf69c15a9a5d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=9f077348-ed05-4cf4-8524-593431fbafaf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:58.205 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 9f077348-ed05-4cf4-8524-593431fbafaf in datapath df3c5fcc-9cd4-4d33-9970-a165c712aad3 unbound from our chassis
Dec 06 10:14:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:58.208 282197 DEBUG nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:14:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:58.210 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network df3c5fcc-9cd4-4d33-9970-a165c712aad3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:14:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:14:58.212 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[8b02852c-79cc-4146-ac84-72f4d4e51c42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:14:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:14:58 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:14:58.257 263652 INFO neutron.agent.dhcp.agent [None req-6f08096c-8cef-4708-b239-5eda78c66e32 - - - - - -] DHCP configuration for ports {'431aeba8-5962-4449-b69d-46c4360741a7'} is completed
Dec 06 10:14:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:58.951 282197 DEBUG nova.virt.driver [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Emitting event <LifecycleEvent: 1765016098.951262, 87dc2ce3-2b16-4764-9803-711c2d12c20f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:14:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:58.952 282197 INFO nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] VM Resumed (Lifecycle Event)
Dec 06 10:14:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:58.979 282197 DEBUG nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:14:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:58.984 282197 DEBUG nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 06 10:14:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:14:59.012 282197 INFO nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] During the sync_power process the instance has moved from host np0005548790.localdomain to host np0005548789.localdomain
Dec 06 10:14:59 np0005548789.localdomain sshd[312486]: Received disconnect from 172.17.0.108 port 43334:11: disconnected by user
Dec 06 10:14:59 np0005548789.localdomain sshd[312486]: Disconnected from user nova 172.17.0.108 port 43334
Dec 06 10:14:59 np0005548789.localdomain sshd[312417]: pam_unix(sshd:session): session closed for user nova
Dec 06 10:14:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:14:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:14:59 np0005548789.localdomain systemd[1]: session-75.scope: Deactivated successfully.
Dec 06 10:14:59 np0005548789.localdomain systemd-logind[766]: Session 75 logged out. Waiting for processes to exit.
Dec 06 10:14:59 np0005548789.localdomain systemd-logind[766]: Removed session 75.
Dec 06 10:14:59 np0005548789.localdomain systemd[1]: tmp-crun.BueLQA.mount: Deactivated successfully.
Dec 06 10:14:59 np0005548789.localdomain podman[312616]: 2025-12-06 10:14:59.348720642 +0000 UTC m=+0.091053180 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:14:59 np0005548789.localdomain podman[312616]: 2025-12-06 10:14:59.358737895 +0000 UTC m=+0.101070413 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:14:59 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:14:59 np0005548789.localdomain podman[312615]: 2025-12-06 10:14:59.454357613 +0000 UTC m=+0.200200657 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:14:59 np0005548789.localdomain podman[312615]: 2025-12-06 10:14:59.487378453 +0000 UTC m=+0.233221427 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:14:59 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:15:00 np0005548789.localdomain ceph-mon[298582]: pgmap v115: 177 pgs: 177 active+clean; 283 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 7.9 MiB/s rd, 6.3 MiB/s wr, 256 op/s
Dec 06 10:15:00 np0005548789.localdomain systemd[1]: tmp-crun.IyrE0m.mount: Deactivated successfully.
Dec 06 10:15:00 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:00Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0e:f5:37 10.100.0.14
Dec 06 10:15:00 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:00Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0e:f5:37 10.100.0.14
Dec 06 10:15:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:00.846 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:01 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:01.871 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:00Z, description=, device_id=f44ab79a-f9b8-4237-b1dc-a24e7d22c236, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd01d90>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd01f10>], id=5f9b5a36-6f9d-4432-a50f-3ba7cd01f2c4, ip_allocation=immediate, mac_address=fa:16:3e:ad:2b:28, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=650, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:15:01Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:15:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:01.945 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:01 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e109 e109: 6 total, 6 up, 6 in
Dec 06 10:15:02 np0005548789.localdomain podman[312673]: 2025-12-06 10:15:02.074556229 +0000 UTC m=+0.049155663 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:15:02 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 9 addresses
Dec 06 10:15:02 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:15:02 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:15:02 np0005548789.localdomain systemd[1]: tmp-crun.J0jLJw.mount: Deactivated successfully.
Dec 06 10:15:02 np0005548789.localdomain ceph-mon[298582]: pgmap v116: 177 pgs: 177 active+clean; 283 MiB data, 934 MiB used, 41 GiB / 42 GiB avail; 7.9 MiB/s rd, 6.3 MiB/s wr, 256 op/s
Dec 06 10:15:02 np0005548789.localdomain ceph-mon[298582]: osdmap e109: 6 total, 6 up, 6 in
Dec 06 10:15:02 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:02.343 263652 INFO neutron.agent.dhcp.agent [None req-ce92131b-5312-498a-a700-98ab0b647cb8 - - - - - -] DHCP configuration for ports {'5f9b5a36-6f9d-4432-a50f-3ba7cd01f2c4'} is completed
Dec 06 10:15:03 np0005548789.localdomain sshd[312694]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:15:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:04 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:04Z|00098|binding|INFO|Claiming lport e87832d3-ffc3-44e0-9f77-cd2eb6073d62 for this chassis.
Dec 06 10:15:04 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:04Z|00099|binding|INFO|e87832d3-ffc3-44e0-9f77-cd2eb6073d62: Claiming fa:16:3e:0e:f5:37 10.100.0.14
Dec 06 10:15:04 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:04Z|00100|binding|INFO|Claiming lport 3b69daca-b91a-4923-9795-2e6a02ee3d59 for this chassis.
Dec 06 10:15:04 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:04Z|00101|binding|INFO|3b69daca-b91a-4923-9795-2e6a02ee3d59: Claiming fa:16:3e:a8:e1:a6 19.80.0.214
Dec 06 10:15:04 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:04Z|00102|binding|INFO|Setting lport e87832d3-ffc3-44e0-9f77-cd2eb6073d62 up in Southbound
Dec 06 10:15:04 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:04Z|00103|binding|INFO|Setting lport 3b69daca-b91a-4923-9795-2e6a02ee3d59 up in Southbound
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.055 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:f5:37 10.100.0.14'], port_security=['fa:16:3e:0e:f5:37 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-876689022', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47d636a7-c520-4320-aa94-bfb41f418584', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-876689022', 'neutron:project_id': '7897d6398eb64eb29c66df8db792e581', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'bfad329a-0ea3-4b02-8e91-9d15749f8c9b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548790.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6898c302-0153-460c-9cb1-4c62ebc9ff31, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=e87832d3-ffc3-44e0-9f77-cd2eb6073d62) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.058 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:e1:a6 19.80.0.214'], port_security=['fa:16:3e:a8:e1:a6 19.80.0.214'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['e87832d3-ffc3-44e0-9f77-cd2eb6073d62'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-546955816', 'neutron:cidrs': '19.80.0.214/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-932e7489-8895-41d4-92c6-0d944505e7e6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-546955816', 'neutron:project_id': '7897d6398eb64eb29c66df8db792e581', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'bfad329a-0ea3-4b02-8e91-9d15749f8c9b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=f9bb405c-aea0-4a81-a300-475f8e1e8050, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=3b69daca-b91a-4923-9795-2e6a02ee3d59) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.060 160509 INFO neutron.agent.ovn.metadata.agent [-] Port e87832d3-ffc3-44e0-9f77-cd2eb6073d62 in datapath 47d636a7-c520-4320-aa94-bfb41f418584 bound to our chassis
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.065 160509 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 47d636a7-c520-4320-aa94-bfb41f418584
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.076 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[388dc684-1a5b-4f19-8015-ecdc7b8c8026]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.077 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap47d636a7-c1 in ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.080 160674 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap47d636a7-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.080 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[524b8678-7ced-47df-ac30-8655196df868]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.083 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[123f2226-5a5d-46c5-8a4e-a74d9cae2366]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.103 160720 DEBUG oslo.privsep.daemon [-] privsep: reply[8f5cc2a8-8ee0-4a8b-b08a-5cf39cdf51c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.120 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[6d086798-55b1-4514-b2a2-246701cf9bcb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.157 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[00904b77-ae9f-4428-9e88-0d9c975f713a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.162 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[4ce041db-b30c-46ec-b4dd-6845d50495e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016104.1646] manager: (tap47d636a7-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/24)
Dec 06 10:15:04 np0005548789.localdomain systemd-udevd[312701]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:15:04 np0005548789.localdomain ceph-mon[298582]: pgmap v118: 177 pgs: 177 active+clean; 304 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 2.6 MiB/s wr, 179 op/s
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.199 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[7d23e7aa-b7f2-4a80-81e4-feb5213d499b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.203 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[a376bb66-b496-4285-a754-b7ed72d3da94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap47d636a7-c1: link becomes ready
Dec 06 10:15:04 np0005548789.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap47d636a7-c0: link becomes ready
Dec 06 10:15:04 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016104.2230] device (tap47d636a7-c0): carrier: link connected
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.230 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[46b7f9f4-3658-43e4-b623-45049db537d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.251 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[bd629d4d-37d7-4633-b88d-f02d59adc186]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap47d636a7-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:94:11:87'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1252240, 'reachable_time': 42492, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312722, 'error': None, 'target': 'ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.270 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[d24c0559-3966-4145-b612-1608bd642564]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe94:1187'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1252240, 'tstamp': 1252240}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312723, 'error': None, 'target': 'ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.291 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5c42f402-3216-43c6-a5d9-cc52cdc23348]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap47d636a7-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:94:11:87'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1252240, 'reachable_time': 42492, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 312724, 'error': None, 'target': 'ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.329 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[7895e3c0-ab53-430c-a14e-3f98205bdd5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.398 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[4501a9c4-a64e-4ebe-9177-bb2fbff76977]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.400 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47d636a7-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.401 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.401 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap47d636a7-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:04 np0005548789.localdomain kernel: device tap47d636a7-c0 entered promiscuous mode
Dec 06 10:15:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:04.404 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:04.407 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.409 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap47d636a7-c0, col_values=(('external_ids', {'iface-id': '8839eeed-ff6b-46d9-b40d-610788617728'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:04 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:04Z|00104|binding|INFO|Releasing lport 8839eeed-ff6b-46d9-b40d-610788617728 from this chassis (sb_readonly=0)
Dec 06 10:15:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:04.410 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:04.421 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.423 160509 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/47d636a7-c520-4320-aa94-bfb41f418584.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/47d636a7-c520-4320-aa94-bfb41f418584.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.424 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5bae47bb-7f6f-4e44-824b-9fe29269dda7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.425 160509 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: global
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     log         /dev/log local0 debug
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     log-tag     haproxy-metadata-proxy-47d636a7-c520-4320-aa94-bfb41f418584
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     user        root
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     group       root
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     maxconn     1024
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     pidfile     /var/lib/neutron/external/pids/47d636a7-c520-4320-aa94-bfb41f418584.pid.haproxy
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     daemon
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: defaults
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     log global
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     mode http
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     option httplog
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     option dontlognull
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     option http-server-close
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     option forwardfor
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     retries                 3
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout http-request    30s
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout connect         30s
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout client          32s
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout server          32s
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout http-keep-alive 30s
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: listen listener
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     bind 169.254.169.254:80
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     server metadata /var/lib/neutron/metadata_proxy
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:     http-request add-header X-OVN-Network-ID 47d636a7-c520-4320-aa94-bfb41f418584
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.426 160509 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584', 'env', 'PROCESS_TAG=haproxy-47d636a7-c520-4320-aa94-bfb41f418584', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/47d636a7-c520-4320-aa94-bfb41f418584.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 06 10:15:04 np0005548789.localdomain sshd[312694]: Received disconnect from 118.219.234.233 port 57196:11: Bye Bye [preauth]
Dec 06 10:15:04 np0005548789.localdomain sshd[312694]: Disconnected from authenticating user root 118.219.234.233 port 57196 [preauth]
Dec 06 10:15:04 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:15:04.670 2 WARNING neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [req-72030b45-f187-4b30-a9d9-59e0042b4b0f req-b4b9ad16-a4fa-4cc7-b4bb-1c52c2b8f48b f52779cce5374723ad2618b5c2916973 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] This port is not SRIOV, skip binding for port e87832d3-ffc3-44e0-9f77-cd2eb6073d62.
Dec 06 10:15:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:04.823 282197 INFO nova.compute.manager [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Post operation of migration started
Dec 06 10:15:04 np0005548789.localdomain podman[312757]: 
Dec 06 10:15:04 np0005548789.localdomain podman[312757]: 2025-12-06 10:15:04.856279572 +0000 UTC m=+0.061749154 container create 30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:15:04 np0005548789.localdomain systemd[1]: Started libpod-conmon-30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0.scope.
Dec 06 10:15:04 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:15:04 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c58caa5621f3279794f7dc107a894db9a252904b5522821832a2bf549b22bd7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:15:04 np0005548789.localdomain podman[312757]: 2025-12-06 10:15:04.821058803 +0000 UTC m=+0.026528545 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 06 10:15:04 np0005548789.localdomain podman[312757]: 2025-12-06 10:15:04.922309754 +0000 UTC m=+0.127779346 container init 30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:15:04 np0005548789.localdomain podman[312757]: 2025-12-06 10:15:04.931617886 +0000 UTC m=+0.137087458 container start 30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:15:04 np0005548789.localdomain neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584[312771]: [NOTICE]   (312775) : New worker (312777) forked
Dec 06 10:15:04 np0005548789.localdomain neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584[312771]: [NOTICE]   (312775) : Loading success.
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.975 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 3b69daca-b91a-4923-9795-2e6a02ee3d59 in datapath 932e7489-8895-41d4-92c6-0d944505e7e6 bound to our chassis
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.978 160509 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 932e7489-8895-41d4-92c6-0d944505e7e6
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.985 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[cc15a672-c359-4620-bcf9-77985b2a7beb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.986 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap932e7489-81 in ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.988 160674 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap932e7489-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.988 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[d724c277-2738-4be8-8f1f-752e0f61ff96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.989 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[04d16f33-2159-4859-9d40-f3d87f10ad3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:04.996 160720 DEBUG oslo.privsep.daemon [-] privsep: reply[56cbbb71-627c-46a2-9a7c-4fba11f33927]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.006 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[0c1c62fe-33fd-43af-952b-a212b3028732]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.019 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[9e5a30b9-ecdc-4b5b-b268-0c907aeba40c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:05 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016105.0248] manager: (tap932e7489-80): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.023 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5ffb6f34-2bba-4318-8ff3-9078ef0e5d80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:05 np0005548789.localdomain systemd-udevd[312716]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.051 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[bc850976-62fd-437c-857e-5c57560fd499]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.053 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[baeda593-b222-417e-b859-68412ebe617f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:05 np0005548789.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap932e7489-80: link becomes ready
Dec 06 10:15:05 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016105.0719] device (tap932e7489-80): carrier: link connected
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.076 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[f01cb6b8-0e9b-465b-9d8c-427b63e7346d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.094 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[3b417f9c-63f1-43af-adb3-ea6f6d5bd4be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap932e7489-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:4b:f3:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1252325, 'reachable_time': 24039, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312796, 'error': None, 'target': 'ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.108 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[3333ba94-05a8-4529-8897-003387f4eeb7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4b:f3ac'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1252325, 'tstamp': 1252325}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312797, 'error': None, 'target': 'ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.125 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[f0df1adc-1544-4968-a782-2f26ac813b82]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap932e7489-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:4b:f3:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1252325, 'reachable_time': 24039, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 312798, 'error': None, 'target': 'ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.152 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[25e86e29-1f10-4ae4-8289-1a1865d7e561]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.217 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[89af31fc-e075-44b7-b743-596a10c31e90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.219 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap932e7489-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.220 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.221 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap932e7489-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:05.224 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:05 np0005548789.localdomain kernel: device tap932e7489-80 entered promiscuous mode
Dec 06 10:15:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:05.227 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.230 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap932e7489-80, col_values=(('external_ids', {'iface-id': '9a87eef5-19db-4fcf-a021-4f61b153af33'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:05.232 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:05 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:05Z|00105|binding|INFO|Releasing lport 9a87eef5-19db-4fcf-a021-4f61b153af33 from this chassis (sb_readonly=0)
Dec 06 10:15:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:05.233 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.234 160509 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/932e7489-8895-41d4-92c6-0d944505e7e6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/932e7489-8895-41d4-92c6-0d944505e7e6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.235 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[86f15ef4-8181-4798-8298-04cfe630b156]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.236 160509 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: global
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     log         /dev/log local0 debug
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     log-tag     haproxy-metadata-proxy-932e7489-8895-41d4-92c6-0d944505e7e6
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     user        root
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     group       root
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     maxconn     1024
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     pidfile     /var/lib/neutron/external/pids/932e7489-8895-41d4-92c6-0d944505e7e6.pid.haproxy
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     daemon
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: defaults
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     log global
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     mode http
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     option httplog
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     option dontlognull
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     option http-server-close
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     option forwardfor
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     retries                 3
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout http-request    30s
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout connect         30s
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout client          32s
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout server          32s
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout http-keep-alive 30s
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: listen listener
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     bind 169.254.169.254:80
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     server metadata /var/lib/neutron/metadata_proxy
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:     http-request add-header X-OVN-Network-ID 932e7489-8895-41d4-92c6-0d944505e7e6
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 06 10:15:05 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:05.237 160509 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6', 'env', 'PROCESS_TAG=haproxy-932e7489-8895-41d4-92c6-0d944505e7e6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/932e7489-8895-41d4-92c6-0d944505e7e6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 06 10:15:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:05.238 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:05 np0005548789.localdomain podman[312830]: 
Dec 06 10:15:05 np0005548789.localdomain podman[312830]: 2025-12-06 10:15:05.700025084 +0000 UTC m=+0.104644466 container create 612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:15:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:15:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:15:05 np0005548789.localdomain systemd[1]: Started libpod-conmon-612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c.scope.
Dec 06 10:15:05 np0005548789.localdomain podman[312830]: 2025-12-06 10:15:05.641525209 +0000 UTC m=+0.046144611 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 06 10:15:05 np0005548789.localdomain systemd[1]: tmp-crun.ebm9JE.mount: Deactivated successfully.
Dec 06 10:15:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:05.760 282197 DEBUG oslo_concurrency.lockutils [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Acquiring lock "refresh_cache-87dc2ce3-2b16-4764-9803-711c2d12c20f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:15:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:05.761 282197 DEBUG oslo_concurrency.lockutils [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Acquired lock "refresh_cache-87dc2ce3-2b16-4764-9803-711c2d12c20f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:15:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:05.761 282197 DEBUG nova.network.neutron [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 06 10:15:05 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:15:05 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0cd54fc14950974dc4be0ab802b5b1b70a4a6778ec88aa2829505a69bc3f2898/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:15:05 np0005548789.localdomain podman[312830]: 2025-12-06 10:15:05.780647159 +0000 UTC m=+0.185266541 container init 612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:15:05 np0005548789.localdomain neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6[312846]: [NOTICE]   (312868) : New worker (312870) forked
Dec 06 10:15:05 np0005548789.localdomain neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6[312846]: [NOTICE]   (312868) : Loading success.
Dec 06 10:15:05 np0005548789.localdomain podman[312830]: 2025-12-06 10:15:05.841388181 +0000 UTC m=+0.246007553 container start 612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:15:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:05.887 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:05 np0005548789.localdomain podman[312844]: 2025-12-06 10:15:05.930845254 +0000 UTC m=+0.183837516 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=)
Dec 06 10:15:05 np0005548789.localdomain podman[312845]: 2025-12-06 10:15:05.840945928 +0000 UTC m=+0.095436066 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible)
Dec 06 10:15:05 np0005548789.localdomain podman[312844]: 2025-12-06 10:15:05.968497647 +0000 UTC m=+0.221489939 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Dec 06 10:15:05 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:15:06 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:15:06.020 2 INFO neutron.agent.securitygroups_rpc [None req-32f5fe5b-2f75-4cad-9292-d5acba05dc94 b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Security group member updated ['4c82b56e-0fc5-4c7f-8922-ceb8236815fd']
Dec 06 10:15:06 np0005548789.localdomain podman[312845]: 2025-12-06 10:15:06.024588568 +0000 UTC m=+0.279078686 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Dec 06 10:15:06 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:15:06 np0005548789.localdomain ceph-mon[298582]: pgmap v119: 177 pgs: 177 active+clean; 304 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 2.7 MiB/s rd, 2.6 MiB/s wr, 178 op/s
Dec 06 10:15:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:06.434 282197 DEBUG nova.network.neutron [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Updating instance_info_cache with network_info: [{"id": "e87832d3-ffc3-44e0-9f77-cd2eb6073d62", "address": "fa:16:3e:0e:f5:37", "network": {"id": "47d636a7-c520-4320-aa94-bfb41f418584", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1313845827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "7897d6398eb64eb29c66df8db792e581", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape87832d3-ff", "ovs_interfaceid": "e87832d3-ffc3-44e0-9f77-cd2eb6073d62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:15:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:06.773 282197 DEBUG oslo_concurrency.lockutils [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Releasing lock "refresh_cache-87dc2ce3-2b16-4764-9803-711c2d12c20f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:15:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:06.804 282197 DEBUG oslo_concurrency.lockutils [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:06.804 282197 DEBUG oslo_concurrency.lockutils [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:06.805 282197 DEBUG oslo_concurrency.lockutils [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:06.812 282197 INFO nova.virt.libvirt.driver [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 06 10:15:06 np0005548789.localdomain virtqemud[203911]: Domain id=3 name='instance-00000007' uuid=87dc2ce3-2b16-4764-9803-711c2d12c20f is tainted: custom-monitor
Dec 06 10:15:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:06.974 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:07.823 282197 INFO nova.virt.libvirt.driver [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 06 10:15:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:07.915 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:15:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:07.919 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}172727ab3d6cbaaa6568bf97b3413b148d1c73ec77830424529759a582bd30ea" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 06 10:15:07 np0005548789.localdomain ceph-mon[298582]: pgmap v120: 177 pgs: 177 active+clean; 304 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 2.7 MiB/s rd, 2.6 MiB/s wr, 178 op/s
Dec 06 10:15:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:08.062 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.116 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 954 Content-Type: application/json Date: Sat, 06 Dec 2025 10:15:07 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-a6bde3db-2216-49af-8ec4-1150061ba601 x-openstack-request-id: req-a6bde3db-2216-49af-8ec4-1150061ba601 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.117 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "3b9dcd46-fa1b-4714-ba2b-665da2f67af6", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/3b9dcd46-fa1b-4714-ba2b-665da2f67af6"}]}, {"id": "72bdd1eb-059b-401d-8f8a-ec7c66937f24", "name": "m1.micro", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/72bdd1eb-059b-401d-8f8a-ec7c66937f24"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/72bdd1eb-059b-401d-8f8a-ec7c66937f24"}]}, {"id": "a0a7498e-22eb-495c-a2e3-89ba9e483bf6", "name": "m1.nano", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/a0a7498e-22eb-495c-a2e3-89ba9e483bf6"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/a0a7498e-22eb-495c-a2e3-89ba9e483bf6"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.117 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-a6bde3db-2216-49af-8ec4-1150061ba601 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.119 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/a0a7498e-22eb-495c-a2e3-89ba9e483bf6 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}172727ab3d6cbaaa6568bf97b3413b148d1c73ec77830424529759a582bd30ea" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.130 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 493 Content-Type: application/json Date: Sat, 06 Dec 2025 10:15:08 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-5a9ae02a-ac3f-43b5-ab54-a8a480dd3fdd x-openstack-request-id: req-5a9ae02a-ac3f-43b5-ab54-a8a480dd3fdd _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.131 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "a0a7498e-22eb-495c-a2e3-89ba9e483bf6", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/a0a7498e-22eb-495c-a2e3-89ba9e483bf6"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/a0a7498e-22eb-495c-a2e3-89ba9e483bf6"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.131 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/a0a7498e-22eb-495c-a2e3-89ba9e483bf6 used request id req-5a9ae02a-ac3f-43b5-ab54-a8a480dd3fdd request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.132 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000007', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7897d6398eb64eb29c66df8db792e581', 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'hostId': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.138 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.142 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 87dc2ce3-2b16-4764-9803-711c2d12c20f / tape87832d3-ff inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.143 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/network.incoming.bytes volume: 1446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9315d243-e01c-426d-b74d-f3afbde4f29f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:15:08.133230', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '70fd4942-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.382597781, 'message_signature': 'cc946e0847f9e7aadbe1de75754ac0aff23f1828a1fb8812275d1d98e02d84a9'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1446, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': 'instance-00000007-87dc2ce3-2b16-4764-9803-711c2d12c20f-tape87832d3-ff', 'timestamp': '2025-12-06T10:15:08.133230', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'tape87832d3-ff', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:f5:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape87832d3-ff'}, 'message_id': '70fdfdd8-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.388408647, 'message_signature': 'bc06b5d66562964ce558b9d94417fdd9a320b4e2828184bc55983d41c06f634f'}]}, 'timestamp': '2025-12-06 10:15:08.143628', '_unique_id': 'a15e3d5b659b496880e613ae5d4ba3d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.146 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.148 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.148 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.149 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/network.outgoing.packets volume: 53 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e235f2d3-3146-49bd-a6ab-64e5a4d3d333', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:15:08.148723', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '70feddde-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.382597781, 'message_signature': 'd5bf00a4d5ed30cf8675e6ba3598597be760a28ae230e2824f496f4cd09a33f1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 53, 'user_id': 
'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': 'instance-00000007-87dc2ce3-2b16-4764-9803-711c2d12c20f-tape87832d3-ff', 'timestamp': '2025-12-06T10:15:08.148723', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'tape87832d3-ff', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:f5:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape87832d3-ff'}, 'message_id': '70fef260-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.388408647, 'message_signature': '72189fca8c756eb2fb2f8caff23463cac63577c887bd9404ddf2bd40a5e7efff'}]}, 'timestamp': '2025-12-06 10:15:08.149910', '_unique_id': 'cc584b8f71c441fc83f038f3431676a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.151 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.152 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.166 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.167 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.179 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.180 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.allocation volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9da37fbe-f106-493b-a85c-1ab57331fd26', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:15:08.152854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7101a2ee-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.402242477, 'message_signature': '7e5f5ca5a066a068387ed2e9a09971aca6c53ef25750ee81602694f90e3f1251'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:15:08.152854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7101ba18-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.402242477, 'message_signature': '4c31d75a9902ea875f718dda05f66370656500a54df419019cb0b342ce8f00d1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-vda', 'timestamp': '2025-12-06T10:15:08.152854', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': 
'6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7103a59e-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.417401107, 'message_signature': 'c5d3c4a422d1f217ca7601ba057f34108977c712f7d8877320db312933b75da3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-sda', 'timestamp': '2025-12-06T10:15:08.152854', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7103be80-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.417401107, 'message_signature': '13801bd793946732d62b554f3b4fce1dd4dc00c71fc4a63b018f864c7699f766'}]}, 'timestamp': '2025-12-06 10:15:08.181284', '_unique_id': 'df59266bc76d41c69ee43c7bedeace4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.182 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.184 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.201 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.219 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/memory.usage volume: 40.44921875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfb78e92-41d0-41bf-8819-0580553dbf3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:15:08.184918', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '7106ecae-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.450827631, 'message_signature': '4203aeb816eecdfafc5ff9db501d5537016e3c73b7784d7c7e32efdce78764b5'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.44921875, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'timestamp': '2025-12-06T10:15:08.184918', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '7109a480-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.468585349, 'message_signature': '63a070114e5f6f65398563e2f38688354b3967c8632da928dce05714e5ddedd9'}]}, 'timestamp': '2025-12-06 10:15:08.219898', '_unique_id': '7b4d6e90b6364e3f81feef1c9428e12b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.220 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.221 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.221 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3323cd67-84ba-47df-9456-1a8f4fc87307', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:15:08.221809', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '7109fdc2-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.382597781, 'message_signature': 'a9781eb69a6084327ad7229bec70b4961616d7e7e49811fc94fcba72a2d6b183'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': 'instance-00000007-87dc2ce3-2b16-4764-9803-711c2d12c20f-tape87832d3-ff', 'timestamp': '2025-12-06T10:15:08.221809', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'tape87832d3-ff', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:f5:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape87832d3-ff'}, 'message_id': '710a0704-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.388408647, 'message_signature': 'eea6561975db61dbe0d3c6b8bfed1c659d80d2b1b9870b90772b48bcc9672c57'}]}, 'timestamp': '2025-12-06 10:15:08.222335', '_unique_id': '82de2371e5e94774a52b9e169d11f23f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.222 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.223 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.223 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1999616987>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1999616987>]
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d8e2951-d645-4f62-9d92-d7a765341cee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:15:08.224016', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '710a5204-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.382597781, 'message_signature': '469eff839e74cfd84ed2a3d0038cb2882dfb49b1e260818cd0cda433600052bc'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': 'instance-00000007-87dc2ce3-2b16-4764-9803-711c2d12c20f-tape87832d3-ff', 'timestamp': '2025-12-06T10:15:08.224016', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'tape87832d3-ff', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:f5:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape87832d3-ff'}, 'message_id': '710a5b28-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.388408647, 'message_signature': 'ae63fffd87a3b18156ab8b0ae2fd35e4ee7c54841d4b501efb65d2ecbf7160dc'}]}, 'timestamp': '2025-12-06 10:15:08.224490', '_unique_id': '9430ce47d8f547d09b2c3bc932942cd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.224 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.225 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.225 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bb42570-80af-470b-b641-78fb3b1f5d67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:15:08.225763', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '710a971e-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.382597781, 'message_signature': 'd530f48e146c40d2c712d9138705b3129d75387b03c2997a1bbf4ff4d650de02'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': 'instance-00000007-87dc2ce3-2b16-4764-9803-711c2d12c20f-tape87832d3-ff', 'timestamp': '2025-12-06T10:15:08.225763', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'tape87832d3-ff', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:f5:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape87832d3-ff'}, 'message_id': '710aa04c-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.388408647, 'message_signature': '7f8acd60ee5ca43e18b9f312be60a864a5739465c5570db82251315e7a2c2103'}]}, 'timestamp': '2025-12-06 10:15:08.226260', '_unique_id': '9104b59ca9614e038477d01f350e6673'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.226 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.227 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.247 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.247 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.274 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.read.requests volume: 182 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.275 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.read.requests volume: 69 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'afe6664f-0f8a-4374-a6d7-d1a96df987e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:15:08.227498', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '710dd9e2-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.47683263, 'message_signature': '410ff09559e30ab159230594068f678c35156e39d12d353a1d727c0e24e4626b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:15:08.227498', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '710de392-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.47683263, 'message_signature': '01eaa1b51e52a836e96e2f868be6af62463a1c364fd233ae1828660476670f26'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 182, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-vda', 'timestamp': '2025-12-06T10:15:08.227498', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': 
'6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '711221f0-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.49694731, 'message_signature': '89b2f6a9c2717c87a3c53a781d6a4b08437c0c79c88b2b5cb4fb8168f70ac80e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 69, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-sda', 'timestamp': '2025-12-06T10:15:08.227498', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '711239e2-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.49694731, 'message_signature': 'a74bea9b9ca9298a26a308a6095767a4a2610b92786fc27fd4b72c155b87e912'}]}, 'timestamp': '2025-12-06 10:15:08.276191', '_unique_id': '0a68e23a1c98467a82a2340f6fdc615f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.277 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.279 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.279 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.280 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4758e61b-e52e-4c9c-aa46-0f5b0d0af980', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:15:08.279693', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '7112dc62-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.382597781, 'message_signature': '66fc6831dff658a2f0412640718292509fd041f99e1dec06dd18cf1924080ef1'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': 'instance-00000007-87dc2ce3-2b16-4764-9803-711c2d12c20f-tape87832d3-ff', 'timestamp': '2025-12-06T10:15:08.279693', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'tape87832d3-ff', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:f5:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape87832d3-ff'}, 'message_id': '7112f1ac-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.388408647, 'message_signature': 'd6dd9e29e9e663bfa4ce6bed0e47db2a6285d37d8e4394d86093b4d683dd7654'}]}, 'timestamp': '2025-12-06 10:15:08.280996', '_unique_id': 'a1fd7b67ff864908892b841d3a567a3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.282 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.284 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.284 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.284 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1999616987>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1999616987>]
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.284 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.285 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.285 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.286 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.write.requests volume: 74 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.286 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81341788-2953-4375-b5c5-d19ab094ef73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:15:08.285114', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7113aab6-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.47683263, 'message_signature': 'c0ff32f61af8991c423deec5da540f3cc2b446eba59cff1307751a99cb631e16'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:15:08.285114', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7113c064-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.47683263, 'message_signature': '050be5b277164fda021c81fac924fd081504e3286a4ea8355209cb31de79c491'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 74, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-vda', 'timestamp': '2025-12-06T10:15:08.285114', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': 
'6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7113d432-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.49694731, 'message_signature': 'd0d0d556247a60dab6e5d9fec9bdeec5185d00c8e81caaedb6c667473c524534'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-sda', 'timestamp': '2025-12-06T10:15:08.285114', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7113e8fa-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.49694731, 'message_signature': '78af484019149e84fa5ee11bf9387ec573dcf5365aa11ad2bbb71d04440279db'}]}, 'timestamp': '2025-12-06 10:15:08.287215', '_unique_id': 'e59a3ed4628146c6934742881cd47282'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.288 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.289 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.290 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.290 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.291 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.read.bytes volume: 3067392 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.291 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.read.bytes volume: 159824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '044624b5-a6b7-4fb6-bd06-8f564bc6ca83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:15:08.290153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '711473ec-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.47683263, 'message_signature': '397cbba4be60d3ba5e4ffd3edea0a8b9f510a1a0a3ffd0d0753bbc03059327d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:15:08.290153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7114881e-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.47683263, 'message_signature': '05f7ff1fbda52417a322c45ad97bef56cb78c9d849b0a0665c59c9be2f06bc36'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3067392, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-vda', 'timestamp': '2025-12-06T10:15:08.290153', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': 
'6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71149a20-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.49694731, 'message_signature': 'aa77ca9e9dddacfd291193814f0724f000c7b949e9ba8735a5214a04afb989c6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 159824, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-sda', 'timestamp': '2025-12-06T10:15:08.290153', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7114ad4e-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.49694731, 'message_signature': 'e49a87f60bdb3401ac344536440e39a8ac81b49ee42f64e972aa224ff4e642cd'}]}, 'timestamp': '2025-12-06 10:15:08.292233', '_unique_id': '2a36dd7e05644eaba67efaad6770f1cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.293 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.295 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.295 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.296 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb5f10c6-2d12-47b8-93db-70735501d786', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:15:08.295426', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '711540ba-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.382597781, 'message_signature': '607600144dd9ea380036ad1a64643b2df9d041bbf8bd9f96e97ebd34684f65fb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': 'instance-00000007-87dc2ce3-2b16-4764-9803-711c2d12c20f-tape87832d3-ff', 'timestamp': '2025-12-06T10:15:08.295426', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'tape87832d3-ff', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:f5:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape87832d3-ff'}, 'message_id': '7115547e-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.388408647, 'message_signature': '1e06d4a8ddc77acbe232ff87c9e25a3e803986140a242808609822d1e9ed4226'}]}, 'timestamp': '2025-12-06 10:15:08.296538', '_unique_id': '2ca9f37365dd4be78125e1ae553be0a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.298 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.299 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.299 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.300 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.300 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.301 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45f3cd03-8231-46e7-a5ec-fec9395ee5e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:15:08.299798', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7115e68c-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.402242477, 'message_signature': 'edc7b8e7c95effb208a8a372a773109f01441f7f14ba2ce1b917e6a36bd30b73'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:15:08.299798', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7115f6b8-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.402242477, 'message_signature': 'ca0fb2b43c0380292f42d2ec0e2b9fd63cf7d6545cf426385c1b750934cf457b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-vda', 'timestamp': '2025-12-06T10:15:08.299798', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': 
'6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71160748-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.417401107, 'message_signature': 'e75ae73c50f5a5d7ea9ec88138be42f854645815df948605e47a35b78eba1c59'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-sda', 'timestamp': '2025-12-06T10:15:08.299798', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7116165c-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.417401107, 'message_signature': 'c5394f7cc9758b8d127827b51e4b3d991e5bbcc6d94bb4b8638a593c5eacf50f'}]}, 'timestamp': '2025-12-06 10:15:08.301469', '_unique_id': '9aa8ae4f45d04d4795f8dcbcac4d3a48'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.302 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.303 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.303 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.304 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1999616987>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1999616987>]
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.304 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.304 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.304 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.305 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.read.latency volume: 184282236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.305 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.read.latency volume: 70306913 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77bc7a07-8c18-45f7-9771-0b978edd233c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:15:08.304497', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71169d20-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.47683263, 'message_signature': 'b4a57d8879192631c983a93254c914b5ccc02338c69cafba38b42e48dac71c7a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:15:08.304497', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7116ae64-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.47683263, 'message_signature': '5dfd2cb318834b211954c8962c9a2e676f5a7ba3b9728bb022a446437fa7dc65'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 184282236, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-vda', 'timestamp': '2025-12-06T10:15:08.304497', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7116bd64-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.49694731, 'message_signature': '635585a8c38812342566f1476e91008b5d4137e5e73fdc6ce8160f626ecd6d17'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 70306913, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-sda', 'timestamp': '2025-12-06T10:15:08.304497', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7116d3d0-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.49694731, 'message_signature': 'fdbe3804adf5df1c270b79fa491c67795e3920eef5407b16f269745892f42855'}]}, 'timestamp': '2025-12-06 10:15:08.306319', '_unique_id': 'b9f73347add6410892a9e6440288d5b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.307 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.308 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.308 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 16190000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.309 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/cpu volume: 1030000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4eebff3-fe5d-45ee-942a-c49c00243eb3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16190000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:15:08.308573', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '71173d98-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.450827631, 'message_signature': 'eacc644887d356b897de5bc11f3e7e334aaca3b816364fef57bbe01df022565c'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1030000000, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'timestamp': '2025-12-06T10:15:08.308573', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '71174d92-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.468585349, 'message_signature': 'e337cc736ab49f5a11aa565d5d1a0805a1b6d00d666e5b7f20d6663431ee4702'}]}, 'timestamp': '2025-12-06 10:15:08.309437', '_unique_id': 'ea4ff056d0c34fe4859393ea51aa78aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.310 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.311 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.311 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.311 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1999616987>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1999616987>]
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.312 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.312 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.312 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.313 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.write.latency volume: 1537084443 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.313 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f9c9df0-575d-4780-a879-efaaab615507', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:15:08.312223', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7117cb0a-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.47683263, 'message_signature': '7d86399a810a506a63809593ee51e86f0a179db7ebab0b52f1f0e615f036e30c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:15:08.312223', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7117dbfe-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.47683263, 'message_signature': '8149b7ce2939975bc8f18e0350d5072768fb64904df3cfcc40fc30811852086a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1537084443, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-vda', 'timestamp': '2025-12-06T10:15:08.312223', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': 
'6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7117eb58-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.49694731, 'message_signature': 'c9d62d50bf9fad8362c8f5795d398893dbc2c493865dea55e725e55ebc75d6be'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-sda', 'timestamp': '2025-12-06T10:15:08.312223', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7117fa4e-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.49694731, 'message_signature': '6a451eae16bfb543514ab01c756c60546dabc0ed1b9ee44bf23a29dd97612609'}]}, 'timestamp': '2025-12-06 10:15:08.313909', '_unique_id': 'b761968316174d2ea71f85d9201c01c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.314 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.316 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.316 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.316 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6600945e-2358-4a7e-81b1-dff9000027be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:15:08.316146', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '711864a2-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.382597781, 'message_signature': 'd6d214b9c7df8b87712477a2d762c7eab8fb2084ac2230bf7432a14d98005057'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': 
'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': 'instance-00000007-87dc2ce3-2b16-4764-9803-711c2d12c20f-tape87832d3-ff', 'timestamp': '2025-12-06T10:15:08.316146', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'tape87832d3-ff', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:f5:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape87832d3-ff'}, 'message_id': '71187640-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.388408647, 'message_signature': '527303367fb2cd028db543f07bf3e21d34e0fbc405c44f81ffcab0d0efb76601'}]}, 'timestamp': '2025-12-06 10:15:08.317049', '_unique_id': '93c775480fcc4364aaee7a734df30f2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.318 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.319 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.319 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.319 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '537aeeac-b0c8-47ca-9823-30f578e533e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:15:08.319205', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '7118dc20-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.382597781, 'message_signature': '7e4308da4775b48edf9b84cbebb2d4ec08187efbd9212f84298afd7366b7fbcd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 
'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': 'instance-00000007-87dc2ce3-2b16-4764-9803-711c2d12c20f-tape87832d3-ff', 'timestamp': '2025-12-06T10:15:08.319205', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'tape87832d3-ff', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:f5:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape87832d3-ff'}, 'message_id': '7118ee0e-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.388408647, 'message_signature': '25ae2aecaa7a57b1404a52f5ad7c267fed461c04fcee022a79c32c53e3dab68c'}]}, 'timestamp': '2025-12-06 10:15:08.320116', '_unique_id': '05e074acb52a45bd8a497301c05bd4db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.321 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.322 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.322 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.322 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.323 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.write.bytes volume: 47214592 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.323 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c364bc2-34f0-40b2-9a81-f2f452ec305e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:15:08.322292', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71195484-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.47683263, 'message_signature': '4a889da976a072ec6c31110d25fa45d7ae75dcaebee570b6d17b36fc11ea8e1c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:15:08.322292', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '711965aa-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.47683263, 'message_signature': '6b27aedfea11accc07eb7c33d3cc53f5f9d071094a2b53e7a79678370ef6de15'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 47214592, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-vda', 'timestamp': '2025-12-06T10:15:08.322292', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': 
'6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '711974d2-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.49694731, 'message_signature': '936c19bdf7c266706711edc9a0a0298fbb558d6f26b9cd3131af22f1ecbab8c9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-sda', 'timestamp': '2025-12-06T10:15:08.322292', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7119838c-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.49694731, 'message_signature': '4ac4e380bf03d6535b07d844a0f77e14cf5307e996fb66a720a845f79bf0d3a6'}]}, 'timestamp': '2025-12-06 10:15:08.323954', '_unique_id': 'a308647c51484bf4833b5fe1cf63b410'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.324 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.326 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.326 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.326 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.327 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.327 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71c79d6d-34c4-4d52-8b0f-c829ee5a7216', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:15:08.326223', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7119ee12-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.402242477, 'message_signature': '9520c7f485948740e98a8d71b7b1fc9dd632d4e67b9919f5d2df6f9b775c0466'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:15:08.326223', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7119fee8-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.402242477, 'message_signature': '92b05b294b3910dc36251d80bd23347d9deae27b68968131bb5145e5d85b196d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-vda', 'timestamp': '2025-12-06T10:15:08.326223', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': 
'6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '711a11e4-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.417401107, 'message_signature': '6249dc4dd098304bb3b13f910d7edf61acbd1411d5c0b594c5d11509857f9592'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f-sda', 'timestamp': '2025-12-06T10:15:08.326223', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'instance-00000007', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '711a276a-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.417401107, 'message_signature': 'eb43741d9a1a0854be86d5eaf8edc208bd64faf1d8c1f29b03785952f7f00438'}]}, 'timestamp': '2025-12-06 10:15:08.328172', '_unique_id': 'e49dc9c2691b4a409d80644cbb2ea7c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.329 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.331 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.331 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.331 12 DEBUG ceilometer.compute.pollsters [-] 87dc2ce3-2b16-4764-9803-711c2d12c20f/network.outgoing.bytes volume: 4184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41ba7b7d-e7d2-4bcf-a522-5f8404e8114b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:15:08.331263', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '711ab4f0-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.382597781, 'message_signature': '2fe461631223c528f110226489a5a5f0b2790fa1fb547a54009816ffc69c3652'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4184, 'user_id': 'ac2e85103fd14829ad4e6df2357da95b', 
'user_name': None, 'project_id': '7897d6398eb64eb29c66df8db792e581', 'project_name': None, 'resource_id': 'instance-00000007-87dc2ce3-2b16-4764-9803-711c2d12c20f-tape87832d3-ff', 'timestamp': '2025-12-06T10:15:08.331263', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1999616987', 'name': 'tape87832d3-ff', 'instance_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'instance_type': 'm1.nano', 'host': 'b262cbaba97a5f617b7f4005a27ec79e86949cad326c374cbf942dbe', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': 'a0a7498e-22eb-495c-a2e3-89ba9e483bf6', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6a944ab6-8965-4055-b7fc-af6e395005ea'}, 'image_ref': '6a944ab6-8965-4055-b7fc-af6e395005ea', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:f5:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape87832d3-ff'}, 'message_id': '711acb3e-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12526.388408647, 'message_signature': '2cdcde42de35c980e246c30010bd20989c7f1c92426a6eac83a75c4571a1f352'}]}, 'timestamp': '2025-12-06 10:15:08.332350', '_unique_id': 'e822da23d2b14f73846cb5e2b3058938'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:15:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:15:08.333 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:15:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:08.833 282197 INFO nova.virt.libvirt.driver [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 06 10:15:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:08.841 282197 DEBUG nova.compute.manager [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:15:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:08.861 282197 DEBUG nova.objects.instance [None req-72030b45-f187-4b30-a9d9-59e0042b4b0f 310ca84346a742a09c8478aa7405cb30 5bd426c09dd743399e71eb5c44db45cb - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 06 10:15:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e110 e110: 6 total, 6 up, 6 in
Dec 06 10:15:09 np0005548789.localdomain systemd[1]: Stopping User Manager for UID 42436...
Dec 06 10:15:09 np0005548789.localdomain systemd[312446]: Activating special unit Exit the Session...
Dec 06 10:15:09 np0005548789.localdomain systemd[312446]: Stopped target Main User Target.
Dec 06 10:15:09 np0005548789.localdomain systemd[312446]: Stopped target Basic System.
Dec 06 10:15:09 np0005548789.localdomain systemd[312446]: Stopped target Paths.
Dec 06 10:15:09 np0005548789.localdomain systemd[312446]: Stopped target Sockets.
Dec 06 10:15:09 np0005548789.localdomain systemd[312446]: Stopped target Timers.
Dec 06 10:15:09 np0005548789.localdomain systemd[312446]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:15:09 np0005548789.localdomain systemd[312446]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 10:15:09 np0005548789.localdomain systemd[312446]: Closed D-Bus User Message Bus Socket.
Dec 06 10:15:09 np0005548789.localdomain systemd[312446]: Stopped Create User's Volatile Files and Directories.
Dec 06 10:15:09 np0005548789.localdomain systemd[312446]: Removed slice User Application Slice.
Dec 06 10:15:09 np0005548789.localdomain systemd[312446]: Reached target Shutdown.
Dec 06 10:15:09 np0005548789.localdomain systemd[312446]: Finished Exit the Session.
Dec 06 10:15:09 np0005548789.localdomain systemd[312446]: Reached target Exit the Session.
Dec 06 10:15:09 np0005548789.localdomain systemd[1]: user@42436.service: Deactivated successfully.
Dec 06 10:15:09 np0005548789.localdomain systemd[1]: Stopped User Manager for UID 42436.
Dec 06 10:15:09 np0005548789.localdomain systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec 06 10:15:09 np0005548789.localdomain systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec 06 10:15:09 np0005548789.localdomain systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec 06 10:15:09 np0005548789.localdomain systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec 06 10:15:09 np0005548789.localdomain systemd[1]: Removed slice User Slice of UID 42436.
Dec 06 10:15:10 np0005548789.localdomain ceph-mon[298582]: osdmap e110: 6 total, 6 up, 6 in
Dec 06 10:15:10 np0005548789.localdomain ceph-mon[298582]: pgmap v122: 177 pgs: 177 active+clean; 306 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 1.2 MiB/s rd, 1.2 MiB/s wr, 144 op/s
Dec 06 10:15:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:15:10 np0005548789.localdomain sshd[312898]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:15:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:10.889 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:10 np0005548789.localdomain podman[312897]: 2025-12-06 10:15:10.95581272 +0000 UTC m=+0.106565444 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:15:10 np0005548789.localdomain podman[312897]: 2025-12-06 10:15:10.975355273 +0000 UTC m=+0.126107996 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:15:10 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:15:11 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e111 e111: 6 total, 6 up, 6 in
Dec 06 10:15:11 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/263018422' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:11 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2133323897' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:11.139 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:10Z, description=, device_id=f44ab79a-f9b8-4237-b1dc-a24e7d22c236, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc02dc0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc023a0>], id=cb8f0dad-295c-4ff7-a2e3-6c05095b4764, ip_allocation=immediate, mac_address=fa:16:3e:da:77:b2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:54Z, description=, dns_domain=, id=deb7774c-e96b-4e7f-88d7-ed9d740915f4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1078460514-network, port_security_enabled=True, project_id=da995d8e002548889747013c0eeca935, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20432, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=633, status=ACTIVE, subnets=['21dfbaea-2209-4e97-94d1-e29a4f3ba83b'], tags=[], tenant_id=da995d8e002548889747013c0eeca935, updated_at=2025-12-06T10:14:55Z, vlan_transparent=None, network_id=deb7774c-e96b-4e7f-88d7-ed9d740915f4, port_security_enabled=False, project_id=da995d8e002548889747013c0eeca935, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=663, status=DOWN, tags=[], tenant_id=da995d8e002548889747013c0eeca935, updated_at=2025-12-06T10:15:10Z on network deb7774c-e96b-4e7f-88d7-ed9d740915f4
Dec 06 10:15:11 np0005548789.localdomain systemd[1]: tmp-crun.cnmPXV.mount: Deactivated successfully.
Dec 06 10:15:11 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 8 addresses
Dec 06 10:15:11 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:15:11 np0005548789.localdomain podman[312934]: 2025-12-06 10:15:11.154194888 +0000 UTC m=+0.075098139 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:15:11 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:15:11 np0005548789.localdomain dnsmasq[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/addn_hosts - 1 addresses
Dec 06 10:15:11 np0005548789.localdomain dnsmasq-dhcp[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/host
Dec 06 10:15:11 np0005548789.localdomain dnsmasq-dhcp[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/opts
Dec 06 10:15:11 np0005548789.localdomain podman[312970]: 2025-12-06 10:15:11.448186995 +0000 UTC m=+0.063739604 container kill 8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deb7774c-e96b-4e7f-88d7-ed9d740915f4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:15:11 np0005548789.localdomain sshd[312898]: Received disconnect from 64.227.102.57 port 40852:11: Bye Bye [preauth]
Dec 06 10:15:11 np0005548789.localdomain sshd[312898]: Disconnected from authenticating user root 64.227.102.57 port 40852 [preauth]
Dec 06 10:15:11 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:11Z|00106|binding|INFO|Releasing lport 9a87eef5-19db-4fcf-a021-4f61b153af33 from this chassis (sb_readonly=0)
Dec 06 10:15:11 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:11Z|00107|binding|INFO|Releasing lport 8839eeed-ff6b-46d9-b40d-610788617728 from this chassis (sb_readonly=0)
Dec 06 10:15:11 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:11Z|00108|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:15:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:11.557 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:11.768 263652 INFO neutron.agent.dhcp.agent [None req-eacbe306-a8d4-4e64-a6e2-7cd18882a09b - - - - - -] DHCP configuration for ports {'cb8f0dad-295c-4ff7-a2e3-6c05095b4764'} is completed
Dec 06 10:15:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:12.012 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:12 np0005548789.localdomain ceph-mon[298582]: pgmap v124: 177 pgs: 177 active+clean; 306 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 793 KiB/s rd, 64 KiB/s wr, 70 op/s
Dec 06 10:15:12 np0005548789.localdomain ceph-mon[298582]: osdmap e111: 6 total, 6 up, 6 in
Dec 06 10:15:12 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e112 e112: 6 total, 6 up, 6 in
Dec 06 10:15:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:12.235 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:12 np0005548789.localdomain dnsmasq[311859]: exiting on receipt of SIGTERM
Dec 06 10:15:12 np0005548789.localdomain podman[313009]: 2025-12-06 10:15:12.254939506 +0000 UTC m=+0.060338352 container kill dbf1d79cf37c9a95da8c3fbfd17ed1db394008f8d27c162fbd5d4fc5de50fe81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-df3c5fcc-9cd4-4d33-9970-a165c712aad3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 10:15:12 np0005548789.localdomain systemd[1]: libpod-dbf1d79cf37c9a95da8c3fbfd17ed1db394008f8d27c162fbd5d4fc5de50fe81.scope: Deactivated successfully.
Dec 06 10:15:12 np0005548789.localdomain podman[313023]: 2025-12-06 10:15:12.329979511 +0000 UTC m=+0.055106042 container died dbf1d79cf37c9a95da8c3fbfd17ed1db394008f8d27c162fbd5d4fc5de50fe81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-df3c5fcc-9cd4-4d33-9970-a165c712aad3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:15:12 np0005548789.localdomain systemd[1]: tmp-crun.MbTTl6.mount: Deactivated successfully.
Dec 06 10:15:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dbf1d79cf37c9a95da8c3fbfd17ed1db394008f8d27c162fbd5d4fc5de50fe81-userdata-shm.mount: Deactivated successfully.
Dec 06 10:15:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a826d515f5bdfea048cffebccb4edfc28363d9139b831b1071c42234067ec609-merged.mount: Deactivated successfully.
Dec 06 10:15:12 np0005548789.localdomain podman[313023]: 2025-12-06 10:15:12.381959398 +0000 UTC m=+0.107085899 container remove dbf1d79cf37c9a95da8c3fbfd17ed1db394008f8d27c162fbd5d4fc5de50fe81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-df3c5fcc-9cd4-4d33-9970-a165c712aad3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:15:12 np0005548789.localdomain systemd[1]: libpod-conmon-dbf1d79cf37c9a95da8c3fbfd17ed1db394008f8d27c162fbd5d4fc5de50fe81.scope: Deactivated successfully.
Dec 06 10:15:12 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:12.429 263652 INFO neutron.agent.dhcp.agent [None req-73cb6db9-d341-4c31-a329-722c7dea2032 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:15:12 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:12.591 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:15:12 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:12.996 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:10Z, description=, device_id=f44ab79a-f9b8-4237-b1dc-a24e7d22c236, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc2f4c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc2f910>], id=cb8f0dad-295c-4ff7-a2e3-6c05095b4764, ip_allocation=immediate, mac_address=fa:16:3e:da:77:b2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:54Z, description=, dns_domain=, id=deb7774c-e96b-4e7f-88d7-ed9d740915f4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1078460514-network, port_security_enabled=True, project_id=da995d8e002548889747013c0eeca935, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20432, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=633, status=ACTIVE, subnets=['21dfbaea-2209-4e97-94d1-e29a4f3ba83b'], tags=[], tenant_id=da995d8e002548889747013c0eeca935, updated_at=2025-12-06T10:14:55Z, vlan_transparent=None, network_id=deb7774c-e96b-4e7f-88d7-ed9d740915f4, port_security_enabled=False, project_id=da995d8e002548889747013c0eeca935, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=663, status=DOWN, tags=[], tenant_id=da995d8e002548889747013c0eeca935, updated_at=2025-12-06T10:15:10Z on network deb7774c-e96b-4e7f-88d7-ed9d740915f4
Dec 06 10:15:13 np0005548789.localdomain ceph-mon[298582]: osdmap e112: 6 total, 6 up, 6 in
Dec 06 10:15:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:13 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2ddf3c5fcc\x2d9cd4\x2d4d33\x2d9970\x2da165c712aad3.mount: Deactivated successfully.
Dec 06 10:15:13 np0005548789.localdomain dnsmasq[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/addn_hosts - 1 addresses
Dec 06 10:15:13 np0005548789.localdomain dnsmasq-dhcp[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/host
Dec 06 10:15:13 np0005548789.localdomain podman[313067]: 2025-12-06 10:15:13.319168596 +0000 UTC m=+0.054625598 container kill 8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deb7774c-e96b-4e7f-88d7-ed9d740915f4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:15:13 np0005548789.localdomain dnsmasq-dhcp[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/opts
Dec 06 10:15:13 np0005548789.localdomain systemd[1]: tmp-crun.zc8DCq.mount: Deactivated successfully.
Dec 06 10:15:13 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:13.618 263652 INFO neutron.agent.dhcp.agent [None req-957ed8d5-3292-44c7-9943-9e84fb378784 - - - - - -] DHCP configuration for ports {'cb8f0dad-295c-4ff7-a2e3-6c05095b4764'} is completed
Dec 06 10:15:14 np0005548789.localdomain dnsmasq[312123]: read /var/lib/neutron/dhcp/feb354e1-97d5-4c74-804a-eeb06e5bb155/addn_hosts - 0 addresses
Dec 06 10:15:14 np0005548789.localdomain dnsmasq-dhcp[312123]: read /var/lib/neutron/dhcp/feb354e1-97d5-4c74-804a-eeb06e5bb155/host
Dec 06 10:15:14 np0005548789.localdomain podman[313106]: 2025-12-06 10:15:14.107345962 +0000 UTC m=+0.069940132 container kill bf1840969a38cdc03ff2f446425ad666bbcdfe5ae29c7d80e81b374fc1a9dff9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-feb354e1-97d5-4c74-804a-eeb06e5bb155, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:15:14 np0005548789.localdomain dnsmasq-dhcp[312123]: read /var/lib/neutron/dhcp/feb354e1-97d5-4c74-804a-eeb06e5bb155/opts
Dec 06 10:15:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:15:14 np0005548789.localdomain ceph-mon[298582]: pgmap v126: 177 pgs: 177 active+clean; 387 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 8.9 MiB/s rd, 7.8 MiB/s wr, 266 op/s
Dec 06 10:15:14 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2629540065' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:14 np0005548789.localdomain podman[313119]: 2025-12-06 10:15:14.22200444 +0000 UTC m=+0.085195716 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:15:14 np0005548789.localdomain podman[313119]: 2025-12-06 10:15:14.257522378 +0000 UTC m=+0.120713704 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:15:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 10:15:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 10:15:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 10:15:14 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:15:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:14.376 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:14 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:14Z|00109|binding|INFO|Releasing lport 2f6c7dc0-af46-4cc2-99f3-f46a11be455c from this chassis (sb_readonly=0)
Dec 06 10:15:14 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:14Z|00110|binding|INFO|Setting lport 2f6c7dc0-af46-4cc2-99f3-f46a11be455c down in Southbound
Dec 06 10:15:14 np0005548789.localdomain kernel: device tap2f6c7dc0-af left promiscuous mode
Dec 06 10:15:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:14.396 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:14.695 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-feb354e1-97d5-4c74-804a-eeb06e5bb155', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-feb354e1-97d5-4c74-804a-eeb06e5bb155', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4185da56d12649bc8653dd9db208c0a0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd335efc-b05b-4aaa-a30a-c891a594ccf4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=2f6c7dc0-af46-4cc2-99f3-f46a11be455c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:14.698 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 2f6c7dc0-af46-4cc2-99f3-f46a11be455c in datapath feb354e1-97d5-4c74-804a-eeb06e5bb155 unbound from our chassis
Dec 06 10:15:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:14.703 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network feb354e1-97d5-4c74-804a-eeb06e5bb155, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:14.704 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[0a304e23-9df4-497b-a015-4d51954fc2e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:15 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/975209313' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:15.894 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:16 np0005548789.localdomain ceph-mon[298582]: pgmap v127: 177 pgs: 177 active+clean; 387 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.8 MiB/s rd, 7.7 MiB/s wr, 171 op/s
Dec 06 10:15:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:15:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:15:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:15:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:15:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:15:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:15:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:15:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:15:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:15:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:15:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:15:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:15:16 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e113 e113: 6 total, 6 up, 6 in
Dec 06 10:15:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:17.015 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:17 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2224318560' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:17 np0005548789.localdomain ceph-mon[298582]: osdmap e113: 6 total, 6 up, 6 in
Dec 06 10:15:17 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:15:17.545 2 INFO neutron.agent.securitygroups_rpc [None req-5e443fd1-82aa-48be-b4ff-976554ebf448 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Security group rule updated ['581a4637-eff2-45f4-92f3-d575b736a840']
Dec 06 10:15:17 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:17Z|00111|binding|INFO|Releasing lport 9a87eef5-19db-4fcf-a021-4f61b153af33 from this chassis (sb_readonly=0)
Dec 06 10:15:17 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:17Z|00112|binding|INFO|Releasing lport 8839eeed-ff6b-46d9-b40d-610788617728 from this chassis (sb_readonly=0)
Dec 06 10:15:17 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:17Z|00113|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:15:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:15:17 np0005548789.localdomain podman[313169]: 2025-12-06 10:15:17.741618015 +0000 UTC m=+0.088019190 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 06 10:15:17 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 7 addresses
Dec 06 10:15:17 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:15:17 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:15:17 np0005548789.localdomain systemd[1]: tmp-crun.NGrjuz.mount: Deactivated successfully.
Dec 06 10:15:17 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:15:17.771 2 INFO neutron.agent.securitygroups_rpc [None req-54187745-6fe9-48d8-bbb3-7e399880134e da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Security group rule updated ['581a4637-eff2-45f4-92f3-d575b736a840']
Dec 06 10:15:17 np0005548789.localdomain systemd[1]: tmp-crun.EgQVOy.mount: Deactivated successfully.
Dec 06 10:15:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:17.797 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:17 np0005548789.localdomain podman[313177]: 2025-12-06 10:15:17.801619566 +0000 UTC m=+0.112951938 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:15:17 np0005548789.localdomain podman[313177]: 2025-12-06 10:15:17.861825412 +0000 UTC m=+0.173157804 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:15:17 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:15:18 np0005548789.localdomain dnsmasq[312123]: exiting on receipt of SIGTERM
Dec 06 10:15:18 np0005548789.localdomain podman[313228]: 2025-12-06 10:15:18.137450322 +0000 UTC m=+0.040787258 container kill bf1840969a38cdc03ff2f446425ad666bbcdfe5ae29c7d80e81b374fc1a9dff9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-feb354e1-97d5-4c74-804a-eeb06e5bb155, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:15:18 np0005548789.localdomain systemd[1]: libpod-bf1840969a38cdc03ff2f446425ad666bbcdfe5ae29c7d80e81b374fc1a9dff9.scope: Deactivated successfully.
Dec 06 10:15:18 np0005548789.localdomain ceph-mon[298582]: pgmap v129: 177 pgs: 177 active+clean; 387 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.8 MiB/s rd, 7.8 MiB/s wr, 173 op/s
Dec 06 10:15:18 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/4212878909' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:18 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/212981679' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:18 np0005548789.localdomain podman[313242]: 2025-12-06 10:15:18.205179607 +0000 UTC m=+0.051959828 container died bf1840969a38cdc03ff2f446425ad666bbcdfe5ae29c7d80e81b374fc1a9dff9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-feb354e1-97d5-4c74-804a-eeb06e5bb155, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:18 np0005548789.localdomain podman[313242]: 2025-12-06 10:15:18.248807519 +0000 UTC m=+0.095587750 container cleanup bf1840969a38cdc03ff2f446425ad666bbcdfe5ae29c7d80e81b374fc1a9dff9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-feb354e1-97d5-4c74-804a-eeb06e5bb155, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:15:18 np0005548789.localdomain systemd[1]: libpod-conmon-bf1840969a38cdc03ff2f446425ad666bbcdfe5ae29c7d80e81b374fc1a9dff9.scope: Deactivated successfully.
Dec 06 10:15:18 np0005548789.localdomain podman[313243]: 2025-12-06 10:15:18.287329138 +0000 UTC m=+0.129757526 container remove bf1840969a38cdc03ff2f446425ad666bbcdfe5ae29c7d80e81b374fc1a9dff9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-feb354e1-97d5-4c74-804a-eeb06e5bb155, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:15:18 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:18.313 263652 INFO neutron.agent.dhcp.agent [None req-b0c6e341-e194-4e94-87e0-7a3d2080e559 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:15:18 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:18.439 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:15:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:18.655 282197 DEBUG nova.compute.manager [req-75910a2f-80eb-488b-b75f-d1d834665e19 req-136732f4-6e5c-4f56-8969-2d1ed0f41471 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Received event network-vif-plugged-e87832d3-ffc3-44e0-9f77-cd2eb6073d62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 10:15:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:18.655 282197 DEBUG oslo_concurrency.lockutils [req-75910a2f-80eb-488b-b75f-d1d834665e19 req-136732f4-6e5c-4f56-8969-2d1ed0f41471 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "87dc2ce3-2b16-4764-9803-711c2d12c20f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:18.656 282197 DEBUG oslo_concurrency.lockutils [req-75910a2f-80eb-488b-b75f-d1d834665e19 req-136732f4-6e5c-4f56-8969-2d1ed0f41471 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "87dc2ce3-2b16-4764-9803-711c2d12c20f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:18.656 282197 DEBUG oslo_concurrency.lockutils [req-75910a2f-80eb-488b-b75f-d1d834665e19 req-136732f4-6e5c-4f56-8969-2d1ed0f41471 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "87dc2ce3-2b16-4764-9803-711c2d12c20f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:18.656 282197 DEBUG nova.compute.manager [req-75910a2f-80eb-488b-b75f-d1d834665e19 req-136732f4-6e5c-4f56-8969-2d1ed0f41471 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] No waiting events found dispatching network-vif-plugged-e87832d3-ffc3-44e0-9f77-cd2eb6073d62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 06 10:15:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:18.657 282197 WARNING nova.compute.manager [req-75910a2f-80eb-488b-b75f-d1d834665e19 req-136732f4-6e5c-4f56-8969-2d1ed0f41471 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Received unexpected event network-vif-plugged-e87832d3-ffc3-44e0-9f77-cd2eb6073d62 for instance with vm_state active and task_state None.
Dec 06 10:15:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:18.657 282197 DEBUG nova.compute.manager [req-75910a2f-80eb-488b-b75f-d1d834665e19 req-136732f4-6e5c-4f56-8969-2d1ed0f41471 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Received event network-vif-plugged-e87832d3-ffc3-44e0-9f77-cd2eb6073d62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 10:15:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:18.657 282197 DEBUG oslo_concurrency.lockutils [req-75910a2f-80eb-488b-b75f-d1d834665e19 req-136732f4-6e5c-4f56-8969-2d1ed0f41471 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "87dc2ce3-2b16-4764-9803-711c2d12c20f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:18.658 282197 DEBUG oslo_concurrency.lockutils [req-75910a2f-80eb-488b-b75f-d1d834665e19 req-136732f4-6e5c-4f56-8969-2d1ed0f41471 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "87dc2ce3-2b16-4764-9803-711c2d12c20f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:18.658 282197 DEBUG oslo_concurrency.lockutils [req-75910a2f-80eb-488b-b75f-d1d834665e19 req-136732f4-6e5c-4f56-8969-2d1ed0f41471 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "87dc2ce3-2b16-4764-9803-711c2d12c20f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:18.658 282197 DEBUG nova.compute.manager [req-75910a2f-80eb-488b-b75f-d1d834665e19 req-136732f4-6e5c-4f56-8969-2d1ed0f41471 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] No waiting events found dispatching network-vif-plugged-e87832d3-ffc3-44e0-9f77-cd2eb6073d62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 06 10:15:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:18.659 282197 WARNING nova.compute.manager [req-75910a2f-80eb-488b-b75f-d1d834665e19 req-136732f4-6e5c-4f56-8969-2d1ed0f41471 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Received unexpected event network-vif-plugged-e87832d3-ffc3-44e0-9f77-cd2eb6073d62 for instance with vm_state active and task_state None.
Dec 06 10:15:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-17d5803262906bbfc10a2359d065da806a9d9644144fa240df1cddd30ea542d6-merged.mount: Deactivated successfully.
Dec 06 10:15:18 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bf1840969a38cdc03ff2f446425ad666bbcdfe5ae29c7d80e81b374fc1a9dff9-userdata-shm.mount: Deactivated successfully.
Dec 06 10:15:18 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2dfeb354e1\x2d97d5\x2d4c74\x2d804a\x2deeb06e5bb155.mount: Deactivated successfully.
Dec 06 10:15:19 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/412049606' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:20 np0005548789.localdomain ceph-mon[298582]: pgmap v130: 177 pgs: 177 active+clean; 352 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 9.0 MiB/s rd, 8.5 MiB/s wr, 280 op/s
Dec 06 10:15:20 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/46663597' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:20.703 282197 DEBUG nova.compute.manager [req-05a2636c-fccb-4682-9d32-0ff3b66d77a7 req-11adf25e-0fd3-4002-8f7d-42187a1c57d2 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Received event network-vif-plugged-e87832d3-ffc3-44e0-9f77-cd2eb6073d62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 10:15:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:20.703 282197 DEBUG oslo_concurrency.lockutils [req-05a2636c-fccb-4682-9d32-0ff3b66d77a7 req-11adf25e-0fd3-4002-8f7d-42187a1c57d2 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "87dc2ce3-2b16-4764-9803-711c2d12c20f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:20.704 282197 DEBUG oslo_concurrency.lockutils [req-05a2636c-fccb-4682-9d32-0ff3b66d77a7 req-11adf25e-0fd3-4002-8f7d-42187a1c57d2 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "87dc2ce3-2b16-4764-9803-711c2d12c20f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:20.705 282197 DEBUG oslo_concurrency.lockutils [req-05a2636c-fccb-4682-9d32-0ff3b66d77a7 req-11adf25e-0fd3-4002-8f7d-42187a1c57d2 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "87dc2ce3-2b16-4764-9803-711c2d12c20f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:20.705 282197 DEBUG nova.compute.manager [req-05a2636c-fccb-4682-9d32-0ff3b66d77a7 req-11adf25e-0fd3-4002-8f7d-42187a1c57d2 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] No waiting events found dispatching network-vif-plugged-e87832d3-ffc3-44e0-9f77-cd2eb6073d62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 06 10:15:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:20.706 282197 WARNING nova.compute.manager [req-05a2636c-fccb-4682-9d32-0ff3b66d77a7 req-11adf25e-0fd3-4002-8f7d-42187a1c57d2 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Received unexpected event network-vif-plugged-e87832d3-ffc3-44e0-9f77-cd2eb6073d62 for instance with vm_state active and task_state None.
Dec 06 10:15:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:20.942 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:21 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e114 e114: 6 total, 6 up, 6 in
Dec 06 10:15:22 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 6 addresses
Dec 06 10:15:22 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:15:22 np0005548789.localdomain podman[313287]: 2025-12-06 10:15:22.004082024 +0000 UTC m=+0.051521924 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:22 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:15:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:22.053 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:22 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:22Z|00114|binding|INFO|Releasing lport 9a87eef5-19db-4fcf-a021-4f61b153af33 from this chassis (sb_readonly=0)
Dec 06 10:15:22 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:22Z|00115|binding|INFO|Releasing lport 8839eeed-ff6b-46d9-b40d-610788617728 from this chassis (sb_readonly=0)
Dec 06 10:15:22 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:22Z|00116|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:15:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:22.176 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:22 np0005548789.localdomain ceph-mon[298582]: pgmap v131: 177 pgs: 177 active+clean; 352 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 8.0 MiB/s rd, 7.6 MiB/s wr, 250 op/s
Dec 06 10:15:22 np0005548789.localdomain ceph-mon[298582]: osdmap e114: 6 total, 6 up, 6 in
Dec 06 10:15:22 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1244880858' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:22 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:15:22.678 2 INFO neutron.agent.securitygroups_rpc [req-32f9c27c-7e39-487b-9f96-37ea07c2a545 req-64092713-96b8-4823-87de-00cf06a3e614 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Security group member updated ['581a4637-eff2-45f4-92f3-d575b736a840']
Dec 06 10:15:22 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:22.733 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:22Z, description=, device_id=89419cdc-1b37-4fdd-ad4b-013514e141a9, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc5ebb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc5ec40>], id=22b2d742-fd5b-4bf4-898c-5da61dccc8af, ip_allocation=immediate, mac_address=fa:16:3e:df:0c:e8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:14:54Z, description=, dns_domain=, id=deb7774c-e96b-4e7f-88d7-ed9d740915f4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1078460514-network, port_security_enabled=True, project_id=da995d8e002548889747013c0eeca935, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20432, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=633, status=ACTIVE, subnets=['21dfbaea-2209-4e97-94d1-e29a4f3ba83b'], tags=[], tenant_id=da995d8e002548889747013c0eeca935, updated_at=2025-12-06T10:14:55Z, vlan_transparent=None, network_id=deb7774c-e96b-4e7f-88d7-ed9d740915f4, port_security_enabled=True, project_id=da995d8e002548889747013c0eeca935, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['581a4637-eff2-45f4-92f3-d575b736a840'], standard_attr_id=711, status=DOWN, tags=[], tenant_id=da995d8e002548889747013c0eeca935, updated_at=2025-12-06T10:15:22Z on network deb7774c-e96b-4e7f-88d7-ed9d740915f4
Dec 06 10:15:22 np0005548789.localdomain sshd[313323]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:15:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:22.917 282197 DEBUG oslo_concurrency.lockutils [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Acquiring lock "87dc2ce3-2b16-4764-9803-711c2d12c20f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:22.918 282197 DEBUG oslo_concurrency.lockutils [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Lock "87dc2ce3-2b16-4764-9803-711c2d12c20f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:22.918 282197 DEBUG oslo_concurrency.lockutils [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Acquiring lock "87dc2ce3-2b16-4764-9803-711c2d12c20f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:22.919 282197 DEBUG oslo_concurrency.lockutils [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Lock "87dc2ce3-2b16-4764-9803-711c2d12c20f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:22.920 282197 DEBUG oslo_concurrency.lockutils [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Lock "87dc2ce3-2b16-4764-9803-711c2d12c20f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:22.921 282197 INFO nova.compute.manager [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Terminating instance
Dec 06 10:15:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:22.923 282197 DEBUG nova.compute.manager [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 06 10:15:22 np0005548789.localdomain dnsmasq[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/addn_hosts - 2 addresses
Dec 06 10:15:22 np0005548789.localdomain dnsmasq-dhcp[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/host
Dec 06 10:15:22 np0005548789.localdomain podman[313326]: 2025-12-06 10:15:22.978932463 +0000 UTC m=+0.073935065 container kill 8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deb7774c-e96b-4e7f-88d7-ed9d740915f4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:22 np0005548789.localdomain dnsmasq-dhcp[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/opts
Dec 06 10:15:22 np0005548789.localdomain systemd[1]: tmp-crun.02v3Ck.mount: Deactivated successfully.
Dec 06 10:15:22 np0005548789.localdomain kernel: device tape87832d3-ff left promiscuous mode
Dec 06 10:15:22 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016122.9862] device (tape87832d3-ff): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Dec 06 10:15:22 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:22Z|00117|binding|INFO|Releasing lport e87832d3-ffc3-44e0-9f77-cd2eb6073d62 from this chassis (sb_readonly=0)
Dec 06 10:15:22 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:22Z|00118|binding|INFO|Setting lport e87832d3-ffc3-44e0-9f77-cd2eb6073d62 down in Southbound
Dec 06 10:15:22 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:22Z|00119|binding|INFO|Releasing lport 3b69daca-b91a-4923-9795-2e6a02ee3d59 from this chassis (sb_readonly=0)
Dec 06 10:15:22 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:22Z|00120|binding|INFO|Setting lport 3b69daca-b91a-4923-9795-2e6a02ee3d59 down in Southbound
Dec 06 10:15:22 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:22Z|00121|binding|INFO|Removing iface tape87832d3-ff ovn-installed in OVS
Dec 06 10:15:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:22.994 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.002 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:f5:37 10.100.0.14'], port_security=['fa:16:3e:0e:f5:37 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-876689022', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '87dc2ce3-2b16-4764-9803-711c2d12c20f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-47d636a7-c520-4320-aa94-bfb41f418584', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-876689022', 'neutron:project_id': '7897d6398eb64eb29c66df8db792e581', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'bfad329a-0ea3-4b02-8e91-9d15749f8c9b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6898c302-0153-460c-9cb1-4c62ebc9ff31, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=e87832d3-ffc3-44e0-9f77-cd2eb6073d62) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:23Z|00122|binding|INFO|Releasing lport 9a87eef5-19db-4fcf-a021-4f61b153af33 from this chassis (sb_readonly=0)
Dec 06 10:15:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:23Z|00123|binding|INFO|Releasing lport 8839eeed-ff6b-46d9-b40d-610788617728 from this chassis (sb_readonly=0)
Dec 06 10:15:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:23Z|00124|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.005 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:e1:a6 19.80.0.214'], port_security=['fa:16:3e:a8:e1:a6 19.80.0.214'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['e87832d3-ffc3-44e0-9f77-cd2eb6073d62'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-546955816', 'neutron:cidrs': '19.80.0.214/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-932e7489-8895-41d4-92c6-0d944505e7e6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-546955816', 'neutron:project_id': '7897d6398eb64eb29c66df8db792e581', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'bfad329a-0ea3-4b02-8e91-9d15749f8c9b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=f9bb405c-aea0-4a81-a300-475f8e1e8050, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=3b69daca-b91a-4923-9795-2e6a02ee3d59) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.007 160509 INFO neutron.agent.ovn.metadata.agent [-] Port e87832d3-ffc3-44e0-9f77-cd2eb6073d62 in datapath 47d636a7-c520-4320-aa94-bfb41f418584 unbound from our chassis
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.011 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 47d636a7-c520-4320-aa94-bfb41f418584, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.012 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[57c5a341-953a-4e6f-afb0-66a6ac7ab0cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.013 160509 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584 namespace which is not needed anymore
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.025 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:23 np0005548789.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Dec 06 10:15:23 np0005548789.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 2.800s CPU time.
Dec 06 10:15:23 np0005548789.localdomain systemd-machined[84444]: Machine qemu-3-instance-00000007 terminated.
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.033 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.160 282197 INFO nova.virt.libvirt.driver [-] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Instance destroyed successfully.
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.161 282197 DEBUG nova.objects.instance [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Lazy-loading 'resources' on Instance uuid 87dc2ce3-2b16-4764-9803-711c2d12c20f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.172 282197 DEBUG nova.virt.libvirt.vif [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-06T10:14:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1999616987',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005548789.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-1999616987',id=7,image_ref='6a944ab6-8965-4055-b7fc-af6e395005ea',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-06T10:14:45Z,launched_on='np0005548790.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005548789.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7897d6398eb64eb29c66df8db792e581',ramdisk_id='',reservation_id='r-tcv45ne4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6a944ab6-8965-4055-b7fc-af6e395005ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-265776820',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-265776820-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T10:15:08Z,user_data=None,user_id='ac2e85103fd14829ad4e6df2357da95b',uuid=87dc2ce3-2b16-4764-9803-711c2d12c20f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e87832d3-ffc3-44e0-9f77-cd2eb6073d62", "address": "fa:16:3e:0e:f5:37", "network": {"id": "47d636a7-c520-4320-aa94-bfb41f418584", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1313845827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "7897d6398eb64eb29c66df8db792e581", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape87832d3-ff", "ovs_interfaceid": "e87832d3-ffc3-44e0-9f77-cd2eb6073d62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.173 282197 DEBUG nova.network.os_vif_util [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Converting VIF {"id": "e87832d3-ffc3-44e0-9f77-cd2eb6073d62", "address": "fa:16:3e:0e:f5:37", "network": {"id": "47d636a7-c520-4320-aa94-bfb41f418584", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1313845827-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "7897d6398eb64eb29c66df8db792e581", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape87832d3-ff", "ovs_interfaceid": "e87832d3-ffc3-44e0-9f77-cd2eb6073d62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.174 282197 DEBUG nova.network.os_vif_util [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:f5:37,bridge_name='br-int',has_traffic_filtering=True,id=e87832d3-ffc3-44e0-9f77-cd2eb6073d62,network=Network(47d636a7-c520-4320-aa94-bfb41f418584),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape87832d3-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.175 282197 DEBUG os_vif [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:f5:37,bridge_name='br-int',has_traffic_filtering=True,id=e87832d3-ffc3-44e0-9f77-cd2eb6073d62,network=Network(47d636a7-c520-4320-aa94-bfb41f418584),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape87832d3-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.177 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.177 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape87832d3-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:23 np0005548789.localdomain systemd[1]: tmp-crun.vIdInn.mount: Deactivated successfully.
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.215 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:23 np0005548789.localdomain neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584[312771]: [NOTICE]   (312775) : haproxy version is 2.8.14-c23fe91
Dec 06 10:15:23 np0005548789.localdomain neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584[312771]: [NOTICE]   (312775) : path to executable is /usr/sbin/haproxy
Dec 06 10:15:23 np0005548789.localdomain neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584[312771]: [WARNING]  (312775) : Exiting Master process...
Dec 06 10:15:23 np0005548789.localdomain neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584[312771]: [WARNING]  (312775) : Exiting Master process...
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.219 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:15:23 np0005548789.localdomain neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584[312771]: [ALERT]    (312775) : Current worker (312777) exited with code 143 (Terminated)
Dec 06 10:15:23 np0005548789.localdomain neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584[312771]: [WARNING]  (312775) : All workers exited. Exiting... (0)
Dec 06 10:15:23 np0005548789.localdomain systemd[1]: libpod-30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0.scope: Deactivated successfully.
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.221 282197 INFO os_vif [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:f5:37,bridge_name='br-int',has_traffic_filtering=True,id=e87832d3-ffc3-44e0-9f77-cd2eb6073d62,network=Network(47d636a7-c520-4320-aa94-bfb41f418584),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tape87832d3-ff')
Dec 06 10:15:23 np0005548789.localdomain podman[313368]: 2025-12-06 10:15:23.23205495 +0000 UTC m=+0.113509384 container died 30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:15:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:23 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:23.265 263652 INFO neutron.agent.dhcp.agent [None req-300ea3ab-a6b2-4b27-9192-3b7890bf5a3c - - - - - -] DHCP configuration for ports {'22b2d742-fd5b-4bf4-898c-5da61dccc8af'} is completed
Dec 06 10:15:23 np0005548789.localdomain podman[313368]: 2025-12-06 10:15:23.268232487 +0000 UTC m=+0.149686911 container cleanup 30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:15:23 np0005548789.localdomain podman[313395]: 2025-12-06 10:15:23.296255277 +0000 UTC m=+0.061360722 container cleanup 30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:15:23 np0005548789.localdomain systemd[1]: libpod-conmon-30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0.scope: Deactivated successfully.
Dec 06 10:15:23 np0005548789.localdomain podman[313424]: 2025-12-06 10:15:23.369497629 +0000 UTC m=+0.075172812 container remove 30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.372 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[066bf514-db4d-4353-9c59-c3f7dd825bda]: (4, ('Sat Dec  6 10:15:23 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584 (30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0)\n30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0\nSat Dec  6 10:15:23 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584 (30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0)\n30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.374 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[04278edd-3bf8-415b-92f6-4cbe64cc7d60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.375 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47d636a7-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.376 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:23 np0005548789.localdomain kernel: device tap47d636a7-c0 left promiscuous mode
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.378 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.384 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.384 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[b0f8c986-4794-409a-81a0-56787e2cb568]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.401 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[e0335da9-4a90-4feb-bf17-55ed0250bf24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.402 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[c5062c76-d79c-4ca5-81c9-f23cc8f29da8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.416 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[42c5bba4-a925-47c1-afa6-e3c53692e6c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1252232, 'reachable_time': 40375, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 313445, 'error': None, 'target': 'ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.418 160720 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-47d636a7-c520-4320-aa94-bfb41f418584 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.418 160720 DEBUG oslo.privsep.daemon [-] privsep: reply[fb66873e-ed04-4a51-816a-68fad6626b4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.419 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 3b69daca-b91a-4923-9795-2e6a02ee3d59 in datapath 932e7489-8895-41d4-92c6-0d944505e7e6 unbound from our chassis
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.420 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 932e7489-8895-41d4-92c6-0d944505e7e6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.421 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[c17a9cf2-977e-499b-9a81-f17fda8ee91a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.421 160509 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6 namespace which is not needed anymore
Dec 06 10:15:23 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:23.484 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005548788.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:22Z, description=, device_id=89419cdc-1b37-4fdd-ad4b-013514e141a9, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fdb5bb0>], dns_domain=, dns_name=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fdb58b0>], id=22b2d742-fd5b-4bf4-898c-5da61dccc8af, ip_allocation=immediate, mac_address=fa:16:3e:df:0c:e8, name=, network_id=deb7774c-e96b-4e7f-88d7-ed9d740915f4, port_security_enabled=True, project_id=da995d8e002548889747013c0eeca935, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['581a4637-eff2-45f4-92f3-d575b736a840'], standard_attr_id=711, status=DOWN, tags=[], tenant_id=da995d8e002548889747013c0eeca935, updated_at=2025-12-06T10:15:23Z on network deb7774c-e96b-4e7f-88d7-ed9d740915f4
Dec 06 10:15:23 np0005548789.localdomain neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6[312846]: [NOTICE]   (312868) : haproxy version is 2.8.14-c23fe91
Dec 06 10:15:23 np0005548789.localdomain neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6[312846]: [NOTICE]   (312868) : path to executable is /usr/sbin/haproxy
Dec 06 10:15:23 np0005548789.localdomain neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6[312846]: [WARNING]  (312868) : Exiting Master process...
Dec 06 10:15:23 np0005548789.localdomain neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6[312846]: [ALERT]    (312868) : Current worker (312870) exited with code 143 (Terminated)
Dec 06 10:15:23 np0005548789.localdomain neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6[312846]: [WARNING]  (312868) : All workers exited. Exiting... (0)
Dec 06 10:15:23 np0005548789.localdomain systemd[1]: libpod-612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c.scope: Deactivated successfully.
Dec 06 10:15:23 np0005548789.localdomain dnsmasq[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/addn_hosts - 2 addresses
Dec 06 10:15:23 np0005548789.localdomain dnsmasq-dhcp[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/host
Dec 06 10:15:23 np0005548789.localdomain dnsmasq-dhcp[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/opts
Dec 06 10:15:23 np0005548789.localdomain podman[313503]: 2025-12-06 10:15:23.655199104 +0000 UTC m=+0.045975545 container kill 8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deb7774c-e96b-4e7f-88d7-ed9d740915f4, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:15:23 np0005548789.localdomain podman[313463]: 2025-12-06 10:15:23.686508134 +0000 UTC m=+0.167790910 container died 612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:15:23 np0005548789.localdomain podman[313463]: 2025-12-06 10:15:23.709264144 +0000 UTC m=+0.190546900 container cleanup 612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:15:23 np0005548789.localdomain podman[313492]: 2025-12-06 10:15:23.718983929 +0000 UTC m=+0.132482889 container cleanup 612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:15:23 np0005548789.localdomain systemd[1]: libpod-conmon-612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c.scope: Deactivated successfully.
Dec 06 10:15:23 np0005548789.localdomain podman[313529]: 2025-12-06 10:15:23.781493815 +0000 UTC m=+0.058974810 container remove 612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.787 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[d9bd1c63-7c1e-4bab-b769-4c1e8962bce3]: (4, ('Sat Dec  6 10:15:23 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6 (612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c)\n612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c\nSat Dec  6 10:15:23 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6 (612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c)\n612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.788 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[38186457-3592-4611-819b-cba29bd8de15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.788 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap932e7489-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.790 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:23 np0005548789.localdomain kernel: device tap932e7489-80 left promiscuous mode
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.795 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.797 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[b106c2da-f411-46e5-9050-c7edbb0c673b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.812 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[f2befca4-9408-4940-92c2-45538027c411]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.812 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[27dbd2b5-aedb-439d-ae91-9ecbb11553c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.824 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[07eb7743-0f5d-4ba9-9860-60e22b56bed2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1252319, 'reachable_time': 23571, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 313549, 'error': None, 'target': 'ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.825 160720 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-932e7489-8895-41d4-92c6-0d944505e7e6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 06 10:15:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:23.825 160720 DEBUG oslo.privsep.daemon [-] privsep: reply[7a531c97-87bb-4928-947a-8c5ad60f041f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.838 282197 INFO nova.virt.libvirt.driver [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Deleting instance files /var/lib/nova/instances/87dc2ce3-2b16-4764-9803-711c2d12c20f_del
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.839 282197 INFO nova.virt.libvirt.driver [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Deletion of /var/lib/nova/instances/87dc2ce3-2b16-4764-9803-711c2d12c20f_del complete
Dec 06 10:15:23 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:23.885 263652 INFO neutron.agent.dhcp.agent [None req-efd4d040-c2ef-4514-8d3a-d6defd2372ce - - - - - -] DHCP configuration for ports {'22b2d742-fd5b-4bf4-898c-5da61dccc8af'} is completed
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.891 282197 INFO nova.compute.manager [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Took 0.97 seconds to destroy the instance on the hypervisor.
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.892 282197 DEBUG oslo.service.loopingcall [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.892 282197 DEBUG nova.compute.manager [-] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 06 10:15:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:23.893 282197 DEBUG nova.network.neutron [-] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 06 10:15:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:15:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:15:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:15:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157928 "" "Go-http-client/1.1"
Dec 06 10:15:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:15:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19735 "" "Go-http-client/1.1"
Dec 06 10:15:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-0cd54fc14950974dc4be0ab802b5b1b70a4a6778ec88aa2829505a69bc3f2898-merged.mount: Deactivated successfully.
Dec 06 10:15:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-612bd559f2204770efc500c44b1ad1302ada83654a900876cb5eb3da5fa5d50c-userdata-shm.mount: Deactivated successfully.
Dec 06 10:15:24 np0005548789.localdomain systemd[1]: run-netns-ovnmeta\x2d932e7489\x2d8895\x2d41d4\x2d92c6\x2d0d944505e7e6.mount: Deactivated successfully.
Dec 06 10:15:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-9c58caa5621f3279794f7dc107a894db9a252904b5522821832a2bf549b22bd7-merged.mount: Deactivated successfully.
Dec 06 10:15:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-30f8df0ce350363c4f7f35e7678b3d71ec43583d72e5615def6c2519fa7edac0-userdata-shm.mount: Deactivated successfully.
Dec 06 10:15:24 np0005548789.localdomain systemd[1]: run-netns-ovnmeta\x2d47d636a7\x2dc520\x2d4320\x2daa94\x2dbfb41f418584.mount: Deactivated successfully.
Dec 06 10:15:24 np0005548789.localdomain ceph-mon[298582]: pgmap v133: 177 pgs: 177 active+clean; 399 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.4 MiB/s rd, 11 MiB/s wr, 345 op/s
Dec 06 10:15:25 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1760790147' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:25 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2447503790' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:25.440 282197 DEBUG nova.network.neutron [-] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:15:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:25.456 282197 INFO nova.compute.manager [-] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Took 1.56 seconds to deallocate network for instance.
Dec 06 10:15:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:25.518 282197 DEBUG oslo_concurrency.lockutils [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:25.519 282197 DEBUG oslo_concurrency.lockutils [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:25.522 282197 DEBUG oslo_concurrency.lockutils [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:25 np0005548789.localdomain sshd[313323]: Received disconnect from 179.33.210.213 port 41252:11: Bye Bye [preauth]
Dec 06 10:15:25 np0005548789.localdomain sshd[313323]: Disconnected from authenticating user root 179.33.210.213 port 41252 [preauth]
Dec 06 10:15:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:25.562 282197 INFO nova.scheduler.client.report [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Deleted allocations for instance 87dc2ce3-2b16-4764-9803-711c2d12c20f
Dec 06 10:15:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:25.638 282197 DEBUG oslo_concurrency.lockutils [None req-9d46d05d-bff5-4191-8d08-2ac7960589d8 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Lock "87dc2ce3-2b16-4764-9803-711c2d12c20f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.720s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:26 np0005548789.localdomain ceph-mon[298582]: pgmap v134: 177 pgs: 177 active+clean; 399 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.4 MiB/s rd, 11 MiB/s wr, 343 op/s
Dec 06 10:15:26 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2463120775' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:15:26 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:15:26.979 2 INFO neutron.agent.securitygroups_rpc [None req-4bf7090f-619c-441c-8a74-44ff051b2a47 ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Security group member updated ['bfad329a-0ea3-4b02-8e91-9d15749f8c9b']
Dec 06 10:15:26 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 e115: 6 total, 6 up, 6 in
Dec 06 10:15:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:27.057 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:27 np0005548789.localdomain ceph-mon[298582]: osdmap e115: 6 total, 6 up, 6 in
Dec 06 10:15:27 np0005548789.localdomain ceph-mon[298582]: pgmap v136: 177 pgs: 177 active+clean; 399 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 4.3 MiB/s rd, 8.5 MiB/s wr, 195 op/s
Dec 06 10:15:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:28.217 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:28 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:15:28.683 2 INFO neutron.agent.securitygroups_rpc [None req-9fa949f8-0732-40f0-9fd9-bacbdfb578db ac2e85103fd14829ad4e6df2357da95b 7897d6398eb64eb29c66df8db792e581 - - default default] Security group member updated ['bfad329a-0ea3-4b02-8e91-9d15749f8c9b']
Dec 06 10:15:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:28.865 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:28Z, description=, device_id=af0f743c-b34f-4641-9bca-6f879d4af6de, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fe2ff70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbd39d0>], id=72f817fe-8a65-4586-937f-6a6314c57627, ip_allocation=immediate, mac_address=fa:16:3e:e5:9e:66, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=764, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:15:28Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:15:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:28.899 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:28.900 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:28.902 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:15:29 np0005548789.localdomain podman[313567]: 2025-12-06 10:15:29.076382397 +0000 UTC m=+0.045944164 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:29 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 7 addresses
Dec 06 10:15:29 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:15:29 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:15:29 np0005548789.localdomain systemd[1]: tmp-crun.I6aJPS.mount: Deactivated successfully.
Dec 06 10:15:29 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:29.247 263652 INFO neutron.agent.dhcp.agent [None req-0a3d6d5f-3c2f-4934-b687-092cf9c4a6cd - - - - - -] DHCP configuration for ports {'72f817fe-8a65-4586-937f-6a6314c57627'} is completed
Dec 06 10:15:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:29.345 282197 DEBUG nova.virt.libvirt.driver [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Creating tmpfile /var/lib/nova/instances/tmpm9_iowog to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Dec 06 10:15:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:29.346 282197 DEBUG nova.compute.manager [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm9_iowog',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Dec 06 10:15:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:29.602 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:15:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:15:29 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:29.904 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:29 np0005548789.localdomain podman[313588]: 2025-12-06 10:15:29.934362692 +0000 UTC m=+0.084900337 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:15:29 np0005548789.localdomain podman[313588]: 2025-12-06 10:15:29.94320561 +0000 UTC m=+0.093743255 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:15:29 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:15:30 np0005548789.localdomain podman[313587]: 2025-12-06 10:15:30.036323034 +0000 UTC m=+0.192104788 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:15:30 np0005548789.localdomain podman[313587]: 2025-12-06 10:15:30.069344796 +0000 UTC m=+0.225126520 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Dec 06 10:15:30 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:15:30 np0005548789.localdomain ceph-mon[298582]: pgmap v137: 177 pgs: 177 active+clean; 238 MiB data, 916 MiB used, 41 GiB / 42 GiB avail; 5.5 MiB/s rd, 8.5 MiB/s wr, 336 op/s
Dec 06 10:15:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:30.262 282197 DEBUG nova.compute.manager [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm9_iowog',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='ed40901b-0bfc-426a-bf70-48d87ce95aa6',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Dec 06 10:15:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:30.285 282197 DEBUG oslo_concurrency.lockutils [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] Acquiring lock "refresh_cache-ed40901b-0bfc-426a-bf70-48d87ce95aa6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:15:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:30.285 282197 DEBUG oslo_concurrency.lockutils [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] Acquired lock "refresh_cache-ed40901b-0bfc-426a-bf70-48d87ce95aa6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:15:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:30.286 282197 DEBUG nova.network.neutron [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.099 282197 DEBUG nova.network.neutron [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Updating instance_info_cache with network_info: [{"id": "feb6a13d-305a-4541-a50e-4988833ecf82", "address": "fa:16:3e:e5:ea:4a", "network": {"id": "45604602-bc87-4608-9881-9568cbf90870", "bridge": "br-int", "label": "tempest-LiveMigrationTest-802114316-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "9167331b2c424ef6961b096b551f8434", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfeb6a13d-30", "ovs_interfaceid": "feb6a13d-305a-4541-a50e-4988833ecf82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.121 282197 DEBUG oslo_concurrency.lockutils [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] Releasing lock "refresh_cache-ed40901b-0bfc-426a-bf70-48d87ce95aa6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.124 282197 DEBUG nova.virt.libvirt.driver [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm9_iowog',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='ed40901b-0bfc-426a-bf70-48d87ce95aa6',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.124 282197 DEBUG nova.virt.libvirt.driver [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Creating instance directory: /var/lib/nova/instances/ed40901b-0bfc-426a-bf70-48d87ce95aa6 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.125 282197 DEBUG nova.virt.libvirt.driver [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Ensure instance console log exists: /var/lib/nova/instances/ed40901b-0bfc-426a-bf70-48d87ce95aa6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.126 282197 DEBUG nova.virt.libvirt.driver [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.127 282197 DEBUG nova.virt.libvirt.vif [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-06T10:15:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-571789410',display_name='tempest-LiveMigrationTest-server-571789410',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005548790.localdomain',hostname='tempest-livemigrationtest-server-571789410',id=8,image_ref='6a944ab6-8965-4055-b7fc-af6e395005ea',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-06T10:15:26Z,launched_on='np0005548790.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005548790.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9167331b2c424ef6961b096b551f8434',ramdisk_id='',reservation_id='r-9204byw5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6a944ab6-8965-4055-b7fc-af6e395005ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1593322913',owner_user_name='tempest-LiveMigrationTest-1593322913-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T10:15:26Z,user_data=None,user_id='b25d9e5ec9eb4368a764482a325b9dda',uuid=ed40901b-0bfc-426a-bf70-48d87ce95aa6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "feb6a13d-305a-4541-a50e-4988833ecf82", "address": "fa:16:3e:e5:ea:4a", "network": {"id": "45604602-bc87-4608-9881-9568cbf90870", "bridge": "br-int", "label": "tempest-LiveMigrationTest-802114316-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "9167331b2c424ef6961b096b551f8434", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapfeb6a13d-30", "ovs_interfaceid": "feb6a13d-305a-4541-a50e-4988833ecf82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.128 282197 DEBUG nova.network.os_vif_util [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] Converting VIF {"id": "feb6a13d-305a-4541-a50e-4988833ecf82", "address": "fa:16:3e:e5:ea:4a", "network": {"id": "45604602-bc87-4608-9881-9568cbf90870", "bridge": "br-int", "label": "tempest-LiveMigrationTest-802114316-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "9167331b2c424ef6961b096b551f8434", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapfeb6a13d-30", "ovs_interfaceid": "feb6a13d-305a-4541-a50e-4988833ecf82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.129 282197 DEBUG nova.network.os_vif_util [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ea:4a,bridge_name='br-int',has_traffic_filtering=True,id=feb6a13d-305a-4541-a50e-4988833ecf82,network=Network(45604602-bc87-4608-9881-9568cbf90870),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfeb6a13d-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.129 282197 DEBUG os_vif [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ea:4a,bridge_name='br-int',has_traffic_filtering=True,id=feb6a13d-305a-4541-a50e-4988833ecf82,network=Network(45604602-bc87-4608-9881-9568cbf90870),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfeb6a13d-30') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.130 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.131 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.131 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.136 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.137 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfeb6a13d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.138 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfeb6a13d-30, col_values=(('external_ids', {'iface-id': 'feb6a13d-305a-4541-a50e-4988833ecf82', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:ea:4a', 'vm-uuid': 'ed40901b-0bfc-426a-bf70-48d87ce95aa6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.143 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.147 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.148 282197 INFO os_vif [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ea:4a,bridge_name='br-int',has_traffic_filtering=True,id=feb6a13d-305a-4541-a50e-4988833ecf82,network=Network(45604602-bc87-4608-9881-9568cbf90870),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfeb6a13d-30')
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.148 282197 DEBUG nova.virt.libvirt.driver [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.149 282197 DEBUG nova.compute.manager [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm9_iowog',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='ed40901b-0bfc-426a-bf70-48d87ce95aa6',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.200 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.201 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.201 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.202 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.202 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:31 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:15:31 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/936145217' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.626 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.719 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.720 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.960 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.962 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11346MB free_disk=41.71154022216797GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.962 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:31.962 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:32.009 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Migration for instance ed40901b-0bfc-426a-bf70-48d87ce95aa6 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Dec 06 10:15:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:32.031 282197 INFO nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Updating resource usage from migration 0c4fb838-191a-43fb-92ad-31ab3b6d11ce
Dec 06 10:15:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:32.031 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Starting to track incoming migration 0c4fb838-191a-43fb-92ad-31ab3b6d11ce with flavor a0a7498e-22eb-495c-a2e3-89ba9e483bf6 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Dec 06 10:15:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:32.061 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:32.078 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:15:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:32.096 282197 WARNING nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance ed40901b-0bfc-426a-bf70-48d87ce95aa6 has been moved to another host np0005548790.localdomain(np0005548790.localdomain). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Dec 06 10:15:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:32.097 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:15:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:32.097 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1152MB phys_disk=41GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:15:32 np0005548789.localdomain ceph-mon[298582]: pgmap v138: 177 pgs: 177 active+clean; 238 MiB data, 916 MiB used, 41 GiB / 42 GiB avail; 4.5 MiB/s rd, 7.0 MiB/s wr, 275 op/s
Dec 06 10:15:32 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/936145217' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:32.193 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:15:32 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/843309173' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:32.693 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:32.701 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:15:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:32.718 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:15:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:32.766 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:15:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:32.766 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.804s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:33 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/843309173' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:34 np0005548789.localdomain ceph-mon[298582]: pgmap v139: 177 pgs: 177 active+clean; 238 MiB data, 871 MiB used, 41 GiB / 42 GiB avail; 4.6 MiB/s rd, 20 KiB/s wr, 234 op/s
Dec 06 10:15:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:34.225 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:33Z, description=, device_id=e763caa9-7ac3-434d-b131-2742f1c4d17b, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbdcc70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbdca60>], id=17d01ee3-d0a0-42f3-8c73-1578e34c0b4f, ip_allocation=immediate, mac_address=fa:16:3e:71:8d:2e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=796, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:15:33Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:15:34 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 8 addresses
Dec 06 10:15:34 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:15:34 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:15:34 np0005548789.localdomain podman[313691]: 2025-12-06 10:15:34.458579789 +0000 UTC m=+0.048064980 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:15:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:34.506 282197 DEBUG nova.network.neutron [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Port feb6a13d-305a-4541-a50e-4988833ecf82 updated with migration profile {'migrating_to': 'np0005548789.localdomain'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Dec 06 10:15:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:34.508 282197 DEBUG nova.compute.manager [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm9_iowog',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='ed40901b-0bfc-426a-bf70-48d87ce95aa6',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Dec 06 10:15:34 np0005548789.localdomain sshd[313713]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:15:34 np0005548789.localdomain sshd[313713]: Accepted publickey for nova from 172.17.0.108 port 41342 ssh2: ECDSA SHA256:d3QEZWuD7sjgJDZ2zlkF0Iu+WveFEzqnvMCo/RH6ucs
Dec 06 10:15:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:34.743 263652 INFO neutron.agent.dhcp.agent [None req-ec0ca4b4-e71e-4b88-acd1-954cbbb82b6b - - - - - -] DHCP configuration for ports {'17d01ee3-d0a0-42f3-8c73-1578e34c0b4f'} is completed
Dec 06 10:15:34 np0005548789.localdomain systemd[1]: Created slice User Slice of UID 42436.
Dec 06 10:15:34 np0005548789.localdomain systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec 06 10:15:34 np0005548789.localdomain systemd-logind[766]: New session 77 of user nova.
Dec 06 10:15:34 np0005548789.localdomain systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec 06 10:15:34 np0005548789.localdomain systemd[1]: Starting User Manager for UID 42436...
Dec 06 10:15:34 np0005548789.localdomain systemd[313717]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by (uid=0)
Dec 06 10:15:34 np0005548789.localdomain systemd[313717]: Queued start job for default target Main User Target.
Dec 06 10:15:34 np0005548789.localdomain systemd[313717]: Created slice User Application Slice.
Dec 06 10:15:34 np0005548789.localdomain systemd[313717]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:15:34 np0005548789.localdomain systemd[313717]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 10:15:34 np0005548789.localdomain systemd[313717]: Reached target Paths.
Dec 06 10:15:34 np0005548789.localdomain systemd[313717]: Reached target Timers.
Dec 06 10:15:34 np0005548789.localdomain systemd[313717]: Starting D-Bus User Message Bus Socket...
Dec 06 10:15:34 np0005548789.localdomain systemd[313717]: Starting Create User's Volatile Files and Directories...
Dec 06 10:15:35 np0005548789.localdomain systemd[313717]: Listening on D-Bus User Message Bus Socket.
Dec 06 10:15:35 np0005548789.localdomain systemd[313717]: Finished Create User's Volatile Files and Directories.
Dec 06 10:15:35 np0005548789.localdomain systemd[313717]: Reached target Sockets.
Dec 06 10:15:35 np0005548789.localdomain systemd[313717]: Reached target Basic System.
Dec 06 10:15:35 np0005548789.localdomain systemd[313717]: Reached target Main User Target.
Dec 06 10:15:35 np0005548789.localdomain systemd[313717]: Startup finished in 161ms.
Dec 06 10:15:35 np0005548789.localdomain systemd[1]: Started User Manager for UID 42436.
Dec 06 10:15:35 np0005548789.localdomain systemd[1]: Started Session 77 of User nova.
Dec 06 10:15:35 np0005548789.localdomain sshd[313713]: pam_unix(sshd:session): session opened for user nova(uid=42436) by (uid=0)
Dec 06 10:15:35 np0005548789.localdomain kernel: device tapfeb6a13d-30 entered promiscuous mode
Dec 06 10:15:35 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016135.2126] manager: (tapfeb6a13d-30): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Dec 06 10:15:35 np0005548789.localdomain systemd-udevd[313775]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:15:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:35Z|00125|binding|INFO|Claiming lport feb6a13d-305a-4541-a50e-4988833ecf82 for this additional chassis.
Dec 06 10:15:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:35Z|00126|binding|INFO|feb6a13d-305a-4541-a50e-4988833ecf82: Claiming fa:16:3e:e5:ea:4a 10.100.0.10
Dec 06 10:15:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:35Z|00127|binding|INFO|Claiming lport 99b309b3-9e3d-4a23-b110-d99707c2eb4e for this additional chassis.
Dec 06 10:15:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:35Z|00128|binding|INFO|99b309b3-9e3d-4a23-b110-d99707c2eb4e: Claiming fa:16:3e:11:27:4d 19.80.0.152
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.219 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:35 np0005548789.localdomain podman[313753]: 2025-12-06 10:15:35.235247167 +0000 UTC m=+0.074443000 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:15:35 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 7 addresses
Dec 06 10:15:35 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:15:35 np0005548789.localdomain systemd[1]: tmp-crun.GdsLui.mount: Deactivated successfully.
Dec 06 10:15:35 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:15:35 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016135.2444] device (tapfeb6a13d-30): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 06 10:15:35 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016135.2448] device (tapfeb6a13d-30): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.256 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:35 np0005548789.localdomain systemd-machined[84444]: New machine qemu-4-instance-00000008.
Dec 06 10:15:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:35Z|00129|binding|INFO|Setting lport feb6a13d-305a-4541-a50e-4988833ecf82 ovn-installed in OVS
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.265 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.265 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:35 np0005548789.localdomain systemd[1]: Started Virtual Machine qemu-4-instance-00000008.
Dec 06 10:15:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:35Z|00130|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.490 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.594 282197 DEBUG nova.virt.driver [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Emitting event <LifecycleEvent: 1765016135.5938025, ed40901b-0bfc-426a-bf70-48d87ce95aa6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.594 282197 INFO nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] VM Started (Lifecycle Event)
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.643 282197 DEBUG nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.766 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.767 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.768 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.768 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.898 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.899 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.899 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:15:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:35.900 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:15:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:36.141 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:36.175 282197 DEBUG nova.virt.driver [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] Emitting event <LifecycleEvent: 1765016136.1753488, ed40901b-0bfc-426a-bf70-48d87ce95aa6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:15:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:36.176 282197 INFO nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] VM Resumed (Lifecycle Event)
Dec 06 10:15:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:36.195 282197 DEBUG nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:15:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:36.200 282197 DEBUG nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 06 10:15:36 np0005548789.localdomain ceph-mon[298582]: pgmap v140: 177 pgs: 177 active+clean; 238 MiB data, 871 MiB used, 41 GiB / 42 GiB avail; 4.6 MiB/s rd, 20 KiB/s wr, 234 op/s
Dec 06 10:15:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:36.226 282197 INFO nova.compute.manager [None req-c2cff4e5-3d1d-47f9-bef4-e7abce6c5152 - - - - - -] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] During the sync_power process the instance has moved from host np0005548790.localdomain to host np0005548789.localdomain
Dec 06 10:15:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:36.334 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:36 np0005548789.localdomain sshd[313732]: Received disconnect from 172.17.0.108 port 41342:11: disconnected by user
Dec 06 10:15:36 np0005548789.localdomain sshd[313732]: Disconnected from user nova 172.17.0.108 port 41342
Dec 06 10:15:36 np0005548789.localdomain sshd[313713]: pam_unix(sshd:session): session closed for user nova
Dec 06 10:15:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:15:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:15:36 np0005548789.localdomain systemd[1]: session-77.scope: Deactivated successfully.
Dec 06 10:15:36 np0005548789.localdomain systemd-logind[766]: Session 77 logged out. Waiting for processes to exit.
Dec 06 10:15:36 np0005548789.localdomain systemd-logind[766]: Removed session 77.
Dec 06 10:15:36 np0005548789.localdomain systemd[1]: tmp-crun.ua8FNR.mount: Deactivated successfully.
Dec 06 10:15:36 np0005548789.localdomain podman[313839]: 2025-12-06 10:15:36.569036883 +0000 UTC m=+0.113942048 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:15:36 np0005548789.localdomain podman[313840]: 2025-12-06 10:15:36.626899788 +0000 UTC m=+0.170694698 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:15:36 np0005548789.localdomain podman[313840]: 2025-12-06 10:15:36.639270363 +0000 UTC m=+0.183065293 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 10:15:36 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:15:36 np0005548789.localdomain podman[313839]: 2025-12-06 10:15:36.650597246 +0000 UTC m=+0.195502361 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., release=1755695350)
Dec 06 10:15:36 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:15:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:36.753 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:15:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:36.788 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:15:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:36.788 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:15:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:36.790 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:36.791 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:37.066 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:37.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:37.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:37.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:37.184 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:15:37 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:37Z|00131|binding|INFO|Claiming lport feb6a13d-305a-4541-a50e-4988833ecf82 for this chassis.
Dec 06 10:15:37 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:37Z|00132|binding|INFO|feb6a13d-305a-4541-a50e-4988833ecf82: Claiming fa:16:3e:e5:ea:4a 10.100.0.10
Dec 06 10:15:37 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:37Z|00133|binding|INFO|Claiming lport 99b309b3-9e3d-4a23-b110-d99707c2eb4e for this chassis.
Dec 06 10:15:37 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:37Z|00134|binding|INFO|99b309b3-9e3d-4a23-b110-d99707c2eb4e: Claiming fa:16:3e:11:27:4d 19.80.0.152
Dec 06 10:15:37 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:37Z|00135|binding|INFO|Setting lport feb6a13d-305a-4541-a50e-4988833ecf82 up in Southbound
Dec 06 10:15:37 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:37Z|00136|binding|INFO|Setting lport 99b309b3-9e3d-4a23-b110-d99707c2eb4e up in Southbound
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.392 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:27:4d 19.80.0.152'], port_security=['fa:16:3e:11:27:4d 19.80.0.152'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['feb6a13d-305a-4541-a50e-4988833ecf82'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-2060007817', 'neutron:cidrs': '19.80.0.152/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19043ea6-c6b2-4272-aa60-1b11a7b5bd93', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-2060007817', 'neutron:project_id': '9167331b2c424ef6961b096b551f8434', 'neutron:revision_number': '3', 'neutron:security_group_ids': '4c82b56e-0fc5-4c7f-8922-ceb8236815fd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=927c8639-172d-4240-b8a1-85db1fd6c03d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=99b309b3-9e3d-4a23-b110-d99707c2eb4e) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.396 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:ea:4a 10.100.0.10'], port_security=['fa:16:3e:e5:ea:4a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1146072664', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ed40901b-0bfc-426a-bf70-48d87ce95aa6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45604602-bc87-4608-9881-9568cbf90870', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1146072664', 'neutron:project_id': '9167331b2c424ef6961b096b551f8434', 'neutron:revision_number': '9', 'neutron:security_group_ids': '4c82b56e-0fc5-4c7f-8922-ceb8236815fd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548790.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d40d335f-7e85-43c3-894d-993c12735497, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=feb6a13d-305a-4541-a50e-4988833ecf82) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.398 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 99b309b3-9e3d-4a23-b110-d99707c2eb4e in datapath 19043ea6-c6b2-4272-aa60-1b11a7b5bd93 bound to our chassis
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.402 160509 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19043ea6-c6b2-4272-aa60-1b11a7b5bd93
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.413 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[dd044226-6764-49b6-90b1-1e2a8b665e75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.414 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19043ea6-c1 in ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.418 160674 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19043ea6-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.418 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[eb03a6ff-8bea-47eb-adc1-e85dafc84aee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:15:37.419 2 WARNING neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c req-d809e72f-e65e-4220-83cc-53ce9206b29d f52779cce5374723ad2618b5c2916973 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] This port is not SRIOV, skip binding for port feb6a13d-305a-4541-a50e-4988833ecf82.
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.420 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[0d23b896-8c87-4b97-b349-a11a495a99e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.427 160720 DEBUG oslo.privsep.daemon [-] privsep: reply[9bbcfd87-2af0-4b11-a5f9-47315b31e15d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.439 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[0ac2b6fa-cc9b-4d14-981a-a7cac84b657a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.462 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[6deb00f4-3275-4551-a0c8-1089de73851d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016137.4712] manager: (tap19043ea6-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Dec 06 10:15:37 np0005548789.localdomain systemd-udevd[313781]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.473 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[c0de9c5b-eba1-4340-8bab-ab8b939fb63f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.500 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[e084804a-56b1-480d-b98d-b6f481fd53ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.503 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[ed3776b4-3d6a-4916-afa4-e4e4659f9bac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap19043ea6-c1: link becomes ready
Dec 06 10:15:37 np0005548789.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap19043ea6-c0: link becomes ready
Dec 06 10:15:37 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016137.5268] device (tap19043ea6-c0): carrier: link connected
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.534 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[7282731e-2dfa-4930-a30e-043dfa91c702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain systemd[1]: tmp-crun.vsRuIR.mount: Deactivated successfully.
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.552 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe5f017-e17f-4095-bd7d-51a951ffd2e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19043ea6-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:00:81:15'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1255570, 'reachable_time': 35481, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 313900, 'error': None, 'target': 'ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:37.559 282197 INFO nova.compute.manager [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Post operation of migration started
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.571 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[c7cdce00-3e18-45f2-a327-dece9aa64e86]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:8115'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1255570, 'tstamp': 1255570}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 313901, 'error': None, 'target': 'ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.584 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[574a2186-9941-4ec2-9f73-fbf88d0c7fc3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19043ea6-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:00:81:15'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1255570, 'reachable_time': 35481, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 313902, 'error': None, 'target': 'ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.604 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[aa42e77d-a2f0-4f8e-9b41-809e05052d5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.644 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[146c87d7-8efd-4281-91cd-2e823e907e30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.647 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19043ea6-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.647 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.648 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19043ea6-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:37.702 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:37 np0005548789.localdomain kernel: device tap19043ea6-c0 entered promiscuous mode
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.706 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19043ea6-c0, col_values=(('external_ids', {'iface-id': 'b960e3cf-838e-4b32-93f1-7da76cedadcc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:37.707 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:37 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:37Z|00137|binding|INFO|Releasing lport b960e3cf-838e-4b32-93f1-7da76cedadcc from this chassis (sb_readonly=0)
Dec 06 10:15:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:37.712 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.712 160509 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19043ea6-c6b2-4272-aa60-1b11a7b5bd93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19043ea6-c6b2-4272-aa60-1b11a7b5bd93.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.715 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[1e7d9d15-85eb-4a8a-aefe-7a566ed48a0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:37.716 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.717 160509 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: global
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     log         /dev/log local0 debug
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     log-tag     haproxy-metadata-proxy-19043ea6-c6b2-4272-aa60-1b11a7b5bd93
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     user        root
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     group       root
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     maxconn     1024
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     pidfile     /var/lib/neutron/external/pids/19043ea6-c6b2-4272-aa60-1b11a7b5bd93.pid.haproxy
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     daemon
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: defaults
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     log global
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     mode http
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     option httplog
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     option dontlognull
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     option http-server-close
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     option forwardfor
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     retries                 3
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout http-request    30s
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout connect         30s
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout client          32s
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout server          32s
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout http-keep-alive 30s
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: listen listener
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     bind 169.254.169.254:80
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     server metadata /var/lib/neutron/metadata_proxy
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:     http-request add-header X-OVN-Network-ID 19043ea6-c6b2-4272-aa60-1b11a7b5bd93
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 06 10:15:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:37.718 160509 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93', 'env', 'PROCESS_TAG=haproxy-19043ea6-c6b2-4272-aa60-1b11a7b5bd93', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19043ea6-c6b2-4272-aa60-1b11a7b5bd93.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 06 10:15:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:37.725 282197 DEBUG oslo_concurrency.lockutils [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] Acquiring lock "refresh_cache-ed40901b-0bfc-426a-bf70-48d87ce95aa6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:15:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:37.725 282197 DEBUG oslo_concurrency.lockutils [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] Acquired lock "refresh_cache-ed40901b-0bfc-426a-bf70-48d87ce95aa6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:15:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:37.725 282197 DEBUG nova.network.neutron [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 06 10:15:38 np0005548789.localdomain ceph-mon[298582]: pgmap v141: 177 pgs: 177 active+clean; 238 MiB data, 871 MiB used, 41 GiB / 42 GiB avail; 4.6 MiB/s rd, 20 KiB/s wr, 232 op/s
Dec 06 10:15:38 np0005548789.localdomain podman[313935]: 
Dec 06 10:15:38 np0005548789.localdomain podman[313935]: 2025-12-06 10:15:38.070600538 +0000 UTC m=+0.068523800 container create 57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:15:38 np0005548789.localdomain systemd[1]: Started libpod-conmon-57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab.scope.
Dec 06 10:15:38 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:15:38 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/970f665873ff889fc4ce87e8eb815e45fa33cad2aebf50d32a77643cc655aa94/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:15:38 np0005548789.localdomain podman[313935]: 2025-12-06 10:15:38.038859225 +0000 UTC m=+0.036782517 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 06 10:15:38 np0005548789.localdomain podman[313935]: 2025-12-06 10:15:38.144604352 +0000 UTC m=+0.142527604 container init 57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:15:38 np0005548789.localdomain podman[313935]: 2025-12-06 10:15:38.153353447 +0000 UTC m=+0.151276739 container start 57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:38.158 282197 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765016123.1571443, 87dc2ce3-2b16-4764-9803-711c2d12c20f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:15:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:38.159 282197 INFO nova.compute.manager [-] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] VM Stopped (Lifecycle Event)
Dec 06 10:15:38 np0005548789.localdomain neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93[313949]: [NOTICE]   (313953) : New worker (313955) forked
Dec 06 10:15:38 np0005548789.localdomain neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93[313949]: [NOTICE]   (313953) : Loading success.
Dec 06 10:15:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:38.184 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.198 160509 INFO neutron.agent.ovn.metadata.agent [-] Port feb6a13d-305a-4541-a50e-4988833ecf82 in datapath 45604602-bc87-4608-9881-9568cbf90870 bound to our chassis
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.204 160509 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 45604602-bc87-4608-9881-9568cbf90870
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.214 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[4c075fae-73f4-474a-93cd-9655c07fd3c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.215 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap45604602-b1 in ovnmeta-45604602-bc87-4608-9881-9568cbf90870 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.217 160674 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap45604602-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.217 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[2e0351ee-5481-4ad9-b196-684d9121f87a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.218 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[1d798907-195a-4cc3-9f04-93c3db6bc9a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.225 160720 DEBUG oslo.privsep.daemon [-] privsep: reply[42b00b73-7e86-4efd-b02b-663b97d0d245]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:38.234 282197 DEBUG nova.compute.manager [None req-3af31981-be92-4ad3-b33a-abddd1f5b0a5 - - - - - -] [instance: 87dc2ce3-2b16-4764-9803-711c2d12c20f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:15:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.249 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[ea929242-b598-45af-9947-037068cf05fb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.281 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[8ddd50a1-bb38-414d-a070-0aab29637304]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016138.2903] manager: (tap45604602-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/28)
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.290 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[9a126116-86a2-4c51-ae15-e46540d40c37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain systemd-udevd[313887]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.325 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[9f89b17c-5397-49be-b7cb-8556b7cd8eaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.329 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[aee4fd69-cbdd-44e1-a208-20b1599785fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap45604602-b0: link becomes ready
Dec 06 10:15:38 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016138.3521] device (tap45604602-b0): carrier: link connected
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.357 160700 DEBUG oslo.privsep.daemon [-] privsep: reply[8808a77c-8d61-4809-90df-f8055aadbb06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.371 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[02df119f-47d4-400c-acd0-7b8c69906b43]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap45604602-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:48:e6:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1255653, 'reachable_time': 20864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 313974, 'error': None, 'target': 'ovnmeta-45604602-bc87-4608-9881-9568cbf90870', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.384 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5351d3b6-d616-47b0-b78c-b0876968feec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe48:e68f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1255653, 'tstamp': 1255653}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 313975, 'error': None, 'target': 'ovnmeta-45604602-bc87-4608-9881-9568cbf90870', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.400 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[95b1650c-f572-4687-82a1-2de79e9411fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap45604602-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:48:e6:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1255653, 'reachable_time': 20864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 313976, 'error': None, 'target': 'ovnmeta-45604602-bc87-4608-9881-9568cbf90870', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.429 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[d16bad81-6f0d-4d56-b034-98b8aeaaeb65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.486 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[06c9c145-7cd0-4105-a878-cff4826a03bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.488 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap45604602-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.489 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.489 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap45604602-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:38.492 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:38 np0005548789.localdomain kernel: device tap45604602-b0 entered promiscuous mode
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.496 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap45604602-b0, col_values=(('external_ids', {'iface-id': 'd57132cf-ea52-419a-82d6-37dcdb5dd89a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:38.497 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:38 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:38Z|00138|binding|INFO|Releasing lport d57132cf-ea52-419a-82d6-37dcdb5dd89a from this chassis (sb_readonly=0)
Dec 06 10:15:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:38.509 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.509 160509 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/45604602-bc87-4608-9881-9568cbf90870.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/45604602-bc87-4608-9881-9568cbf90870.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.510 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[267cb533-ceed-4912-ab2c-6de95e9f713d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.511 160509 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: global
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     log         /dev/log local0 debug
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     log-tag     haproxy-metadata-proxy-45604602-bc87-4608-9881-9568cbf90870
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     user        root
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     group       root
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     maxconn     1024
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     pidfile     /var/lib/neutron/external/pids/45604602-bc87-4608-9881-9568cbf90870.pid.haproxy
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     daemon
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: defaults
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     log global
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     mode http
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     option httplog
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     option dontlognull
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     option http-server-close
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     option forwardfor
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     retries                 3
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout http-request    30s
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout connect         30s
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout client          32s
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout server          32s
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     timeout http-keep-alive 30s
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: listen listener
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     bind 169.254.169.254:80
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     server metadata /var/lib/neutron/metadata_proxy
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:     http-request add-header X-OVN-Network-ID 45604602-bc87-4608-9881-9568cbf90870
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 06 10:15:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:38.512 160509 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-45604602-bc87-4608-9881-9568cbf90870', 'env', 'PROCESS_TAG=haproxy-45604602-bc87-4608-9881-9568cbf90870', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/45604602-bc87-4608-9881-9568cbf90870.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 06 10:15:38 np0005548789.localdomain systemd[1]: tmp-crun.hUFR5P.mount: Deactivated successfully.
Dec 06 10:15:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:38.708 282197 DEBUG nova.network.neutron [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Updating instance_info_cache with network_info: [{"id": "feb6a13d-305a-4541-a50e-4988833ecf82", "address": "fa:16:3e:e5:ea:4a", "network": {"id": "45604602-bc87-4608-9881-9568cbf90870", "bridge": "br-int", "label": "tempest-LiveMigrationTest-802114316-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "9167331b2c424ef6961b096b551f8434", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfeb6a13d-30", "ovs_interfaceid": "feb6a13d-305a-4541-a50e-4988833ecf82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:15:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:38.732 282197 DEBUG oslo_concurrency.lockutils [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] Releasing lock "refresh_cache-ed40901b-0bfc-426a-bf70-48d87ce95aa6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:15:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:38.747 282197 DEBUG oslo_concurrency.lockutils [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:38.748 282197 DEBUG oslo_concurrency.lockutils [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:38.749 282197 DEBUG oslo_concurrency.lockutils [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:38.754 282197 INFO nova.virt.libvirt.driver [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 06 10:15:38 np0005548789.localdomain virtqemud[203911]: Domain id=4 name='instance-00000008' uuid=ed40901b-0bfc-426a-bf70-48d87ce95aa6 is tainted: custom-monitor
Dec 06 10:15:38 np0005548789.localdomain podman[314008]: 
Dec 06 10:15:38 np0005548789.localdomain podman[314008]: 2025-12-06 10:15:38.932832801 +0000 UTC m=+0.079632977 container create 5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:15:38 np0005548789.localdomain systemd[1]: Started libpod-conmon-5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714.scope.
Dec 06 10:15:38 np0005548789.localdomain systemd[1]: tmp-crun.vFgCCB.mount: Deactivated successfully.
Dec 06 10:15:39 np0005548789.localdomain podman[314008]: 2025-12-06 10:15:38.900904502 +0000 UTC m=+0.047704708 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 06 10:15:39 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:15:39 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d7eab40805a009327f68fd560cfacb738e0b20b8a1c52a765c6668a441db2f8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:15:39 np0005548789.localdomain podman[314008]: 2025-12-06 10:15:39.024997656 +0000 UTC m=+0.171797882 container init 5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:15:39 np0005548789.localdomain podman[314008]: 2025-12-06 10:15:39.036276358 +0000 UTC m=+0.183076534 container start 5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:15:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2236174487' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:15:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2236174487' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:15:39 np0005548789.localdomain neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870[314022]: [NOTICE]   (314026) : New worker (314028) forked
Dec 06 10:15:39 np0005548789.localdomain neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870[314022]: [NOTICE]   (314026) : Loading success.
Dec 06 10:15:39 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:15:39.489 2 INFO neutron.agent.securitygroups_rpc [None req-e5d6490d-2b46-4f4e-92e1-5479a93607f8 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:15:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:39.763 282197 INFO nova.virt.libvirt.driver [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 06 10:15:40 np0005548789.localdomain ceph-mon[298582]: pgmap v142: 177 pgs: 177 active+clean; 238 MiB data, 871 MiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 17 KiB/s wr, 200 op/s
Dec 06 10:15:40 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1736992612' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:40Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e5:ea:4a 10.100.0.10
Dec 06 10:15:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:40Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e5:ea:4a 10.100.0.10
Dec 06 10:15:40 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:15:40.223 2 INFO neutron.agent.securitygroups_rpc [None req-806a1120-e80b-4f72-b62c-6adbb0e69b26 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:15:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:40.770 282197 INFO nova.virt.libvirt.driver [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 06 10:15:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:40.776 282197 DEBUG nova.compute.manager [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:15:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:40.796 282197 DEBUG nova.objects.instance [None req-0fd7e56b-8c90-4a54-9ae0-5e0dfc07787c c0f82d42124043bcb076ae248bc35f73 d60454a44a4b4482bf705ee4e3667605 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec 06 10:15:41 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2671465232' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:41.143 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:41 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 6 addresses
Dec 06 10:15:41 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:15:41 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:15:41 np0005548789.localdomain podman[314053]: 2025-12-06 10:15:41.169677068 +0000 UTC m=+0.050947997 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:15:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:15:41 np0005548789.localdomain podman[314066]: 2025-12-06 10:15:41.272971541 +0000 UTC m=+0.083287408 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 06 10:15:41 np0005548789.localdomain podman[314066]: 2025-12-06 10:15:41.282257202 +0000 UTC m=+0.092573059 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:15:41 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:15:41 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:41Z|00139|binding|INFO|Releasing lport b960e3cf-838e-4b32-93f1-7da76cedadcc from this chassis (sb_readonly=0)
Dec 06 10:15:41 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:41Z|00140|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:15:41 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:41Z|00141|binding|INFO|Releasing lport d57132cf-ea52-419a-82d6-37dcdb5dd89a from this chassis (sb_readonly=0)
Dec 06 10:15:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:41.531 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:42.071 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:42 np0005548789.localdomain ceph-mon[298582]: pgmap v143: 177 pgs: 177 active+clean; 238 MiB data, 871 MiB used, 41 GiB / 42 GiB avail; 3.0 MiB/s rd, 106 op/s
Dec 06 10:15:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2676620455' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:42.347 282197 DEBUG nova.compute.manager [req-58a47339-d9a8-4d94-937f-4ff6614182b8 req-bc59a15f-4e32-4e8a-878c-b84e2fb1a9d5 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Received event network-vif-plugged-feb6a13d-305a-4541-a50e-4988833ecf82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 10:15:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:42.348 282197 DEBUG oslo_concurrency.lockutils [req-58a47339-d9a8-4d94-937f-4ff6614182b8 req-bc59a15f-4e32-4e8a-878c-b84e2fb1a9d5 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:42.348 282197 DEBUG oslo_concurrency.lockutils [req-58a47339-d9a8-4d94-937f-4ff6614182b8 req-bc59a15f-4e32-4e8a-878c-b84e2fb1a9d5 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:42.349 282197 DEBUG oslo_concurrency.lockutils [req-58a47339-d9a8-4d94-937f-4ff6614182b8 req-bc59a15f-4e32-4e8a-878c-b84e2fb1a9d5 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:42.350 282197 DEBUG nova.compute.manager [req-58a47339-d9a8-4d94-937f-4ff6614182b8 req-bc59a15f-4e32-4e8a-878c-b84e2fb1a9d5 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] No waiting events found dispatching network-vif-plugged-feb6a13d-305a-4541-a50e-4988833ecf82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 06 10:15:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:42.350 282197 WARNING nova.compute.manager [req-58a47339-d9a8-4d94-937f-4ff6614182b8 req-bc59a15f-4e32-4e8a-878c-b84e2fb1a9d5 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Received unexpected event network-vif-plugged-feb6a13d-305a-4541-a50e-4988833ecf82 for instance with vm_state active and task_state None.
Dec 06 10:15:42 np0005548789.localdomain podman[314107]: 2025-12-06 10:15:42.498936117 +0000 UTC m=+0.059464515 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:42 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 5 addresses
Dec 06 10:15:42 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:15:42 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:15:42 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:42Z|00142|binding|INFO|Releasing lport b960e3cf-838e-4b32-93f1-7da76cedadcc from this chassis (sb_readonly=0)
Dec 06 10:15:42 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:42Z|00143|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:15:42 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:42Z|00144|binding|INFO|Releasing lport d57132cf-ea52-419a-82d6-37dcdb5dd89a from this chassis (sb_readonly=0)
Dec 06 10:15:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:42.728 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1782327393' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/4064141538' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3377775268' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:44 np0005548789.localdomain ceph-mon[298582]: pgmap v144: 177 pgs: 177 active+clean; 304 MiB data, 1021 MiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 4.3 MiB/s wr, 233 op/s
Dec 06 10:15:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:44.457 282197 DEBUG nova.compute.manager [req-d1a6df8e-7261-4618-bfd1-aa863fcadd05 req-95ceee35-40dc-449a-9810-2971100999c9 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Received event network-vif-plugged-feb6a13d-305a-4541-a50e-4988833ecf82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 10:15:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:44.459 282197 DEBUG oslo_concurrency.lockutils [req-d1a6df8e-7261-4618-bfd1-aa863fcadd05 req-95ceee35-40dc-449a-9810-2971100999c9 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:44.459 282197 DEBUG oslo_concurrency.lockutils [req-d1a6df8e-7261-4618-bfd1-aa863fcadd05 req-95ceee35-40dc-449a-9810-2971100999c9 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:44.460 282197 DEBUG oslo_concurrency.lockutils [req-d1a6df8e-7261-4618-bfd1-aa863fcadd05 req-95ceee35-40dc-449a-9810-2971100999c9 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:44.460 282197 DEBUG nova.compute.manager [req-d1a6df8e-7261-4618-bfd1-aa863fcadd05 req-95ceee35-40dc-449a-9810-2971100999c9 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] No waiting events found dispatching network-vif-plugged-feb6a13d-305a-4541-a50e-4988833ecf82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 06 10:15:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:44.461 282197 WARNING nova.compute.manager [req-d1a6df8e-7261-4618-bfd1-aa863fcadd05 req-95ceee35-40dc-449a-9810-2971100999c9 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Received unexpected event network-vif-plugged-feb6a13d-305a-4541-a50e-4988833ecf82 for instance with vm_state active and task_state None.
Dec 06 10:15:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:44.473 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:44 np0005548789.localdomain kernel: device tape1277966-bb left promiscuous mode
Dec 06 10:15:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:44Z|00145|binding|INFO|Releasing lport e1277966-bb4e-4c31-a08b-185a772cbf5b from this chassis (sb_readonly=0)
Dec 06 10:15:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:44Z|00146|binding|INFO|Setting lport e1277966-bb4e-4c31-a08b-185a772cbf5b down in Southbound
Dec 06 10:15:44 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:44.485 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.172/24', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-8e238f59-5792-4ff4-95af-f993c8e9e14f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e238f59-5792-4ff4-95af-f993c8e9e14f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ae43cb4c-3e04-441f-9177-31d5e45dfad9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=e1277966-bb4e-4c31-a08b-185a772cbf5b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:44 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:44.486 160509 INFO neutron.agent.ovn.metadata.agent [-] Port e1277966-bb4e-4c31-a08b-185a772cbf5b in datapath 8e238f59-5792-4ff4-95af-f993c8e9e14f unbound from our chassis
Dec 06 10:15:44 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:44.488 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e238f59-5792-4ff4-95af-f993c8e9e14f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:44 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:44.489 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[04b6805d-2f86-41e3-8e86-744721b336b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:44.502 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:15:44 np0005548789.localdomain systemd[1]: tmp-crun.VJ6mAI.mount: Deactivated successfully.
Dec 06 10:15:44 np0005548789.localdomain podman[314130]: 2025-12-06 10:15:44.940647278 +0000 UTC m=+0.094980752 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:15:44 np0005548789.localdomain podman[314130]: 2025-12-06 10:15:44.949161866 +0000 UTC m=+0.103495340 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:15:44 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:15:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:46.162 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:46 np0005548789.localdomain ceph-mon[298582]: pgmap v145: 177 pgs: 177 active+clean; 304 MiB data, 1021 MiB used, 41 GiB / 42 GiB avail; 682 KiB/s rd, 4.3 MiB/s wr, 132 op/s
Dec 06 10:15:46 np0005548789.localdomain systemd[1]: Stopping User Manager for UID 42436...
Dec 06 10:15:46 np0005548789.localdomain systemd[313717]: Activating special unit Exit the Session...
Dec 06 10:15:46 np0005548789.localdomain systemd[313717]: Stopped target Main User Target.
Dec 06 10:15:46 np0005548789.localdomain systemd[313717]: Stopped target Basic System.
Dec 06 10:15:46 np0005548789.localdomain systemd[313717]: Stopped target Paths.
Dec 06 10:15:46 np0005548789.localdomain systemd[313717]: Stopped target Sockets.
Dec 06 10:15:46 np0005548789.localdomain systemd[313717]: Stopped target Timers.
Dec 06 10:15:46 np0005548789.localdomain systemd[313717]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 06 10:15:46 np0005548789.localdomain systemd[313717]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 10:15:46 np0005548789.localdomain systemd[313717]: Closed D-Bus User Message Bus Socket.
Dec 06 10:15:46 np0005548789.localdomain systemd[313717]: Stopped Create User's Volatile Files and Directories.
Dec 06 10:15:46 np0005548789.localdomain systemd[313717]: Removed slice User Application Slice.
Dec 06 10:15:46 np0005548789.localdomain systemd[313717]: Reached target Shutdown.
Dec 06 10:15:46 np0005548789.localdomain systemd[313717]: Finished Exit the Session.
Dec 06 10:15:46 np0005548789.localdomain systemd[313717]: Reached target Exit the Session.
Dec 06 10:15:46 np0005548789.localdomain systemd[1]: user@42436.service: Deactivated successfully.
Dec 06 10:15:46 np0005548789.localdomain systemd[1]: Stopped User Manager for UID 42436.
Dec 06 10:15:46 np0005548789.localdomain systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec 06 10:15:46 np0005548789.localdomain systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec 06 10:15:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:15:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:15:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:15:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:15:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:15:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:15:46 np0005548789.localdomain systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec 06 10:15:46 np0005548789.localdomain systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec 06 10:15:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:15:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:15:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:15:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:15:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:15:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:15:46 np0005548789.localdomain systemd[1]: Removed slice User Slice of UID 42436.
Dec 06 10:15:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:47.073 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:47.305 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:47.305 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:47.306 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:47 np0005548789.localdomain sshd[314156]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:15:48 np0005548789.localdomain ceph-mon[298582]: pgmap v146: 177 pgs: 177 active+clean; 304 MiB data, 1021 MiB used, 41 GiB / 42 GiB avail; 682 KiB/s rd, 4.3 MiB/s wr, 132 op/s
Dec 06 10:15:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.495 282197 DEBUG oslo_concurrency.lockutils [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Acquiring lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.498 282197 DEBUG oslo_concurrency.lockutils [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.498 282197 DEBUG oslo_concurrency.lockutils [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Acquiring lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.499 282197 DEBUG oslo_concurrency.lockutils [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.499 282197 DEBUG oslo_concurrency.lockutils [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.501 282197 INFO nova.compute.manager [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Terminating instance
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.502 282197 DEBUG nova.compute.manager [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 06 10:15:48 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:15:48.509 2 INFO neutron.agent.securitygroups_rpc [None req-0ed3e916-bdef-45c7-9c1d-50729e74f02a 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:15:48 np0005548789.localdomain kernel: device tapfeb6a13d-30 left promiscuous mode
Dec 06 10:15:48 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016148.6118] device (tapfeb6a13d-30): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.622 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:48 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:48Z|00147|binding|INFO|Releasing lport feb6a13d-305a-4541-a50e-4988833ecf82 from this chassis (sb_readonly=0)
Dec 06 10:15:48 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:48Z|00148|binding|INFO|Setting lport feb6a13d-305a-4541-a50e-4988833ecf82 down in Southbound
Dec 06 10:15:48 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:48Z|00149|binding|INFO|Releasing lport 99b309b3-9e3d-4a23-b110-d99707c2eb4e from this chassis (sb_readonly=0)
Dec 06 10:15:48 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:48Z|00150|binding|INFO|Setting lport 99b309b3-9e3d-4a23-b110-d99707c2eb4e down in Southbound
Dec 06 10:15:48 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:48Z|00151|binding|INFO|Removing iface tapfeb6a13d-30 ovn-installed in OVS
Dec 06 10:15:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.630 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:48.636 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:27:4d 19.80.0.152'], port_security=['fa:16:3e:11:27:4d 19.80.0.152'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['feb6a13d-305a-4541-a50e-4988833ecf82'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-2060007817', 'neutron:cidrs': '19.80.0.152/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19043ea6-c6b2-4272-aa60-1b11a7b5bd93', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-2060007817', 'neutron:project_id': '9167331b2c424ef6961b096b551f8434', 'neutron:revision_number': '5', 'neutron:security_group_ids': '4c82b56e-0fc5-4c7f-8922-ceb8236815fd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=927c8639-172d-4240-b8a1-85db1fd6c03d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=99b309b3-9e3d-4a23-b110-d99707c2eb4e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:48.638 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:ea:4a 10.100.0.10'], port_security=['fa:16:3e:e5:ea:4a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1146072664', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ed40901b-0bfc-426a-bf70-48d87ce95aa6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45604602-bc87-4608-9881-9568cbf90870', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1146072664', 'neutron:project_id': '9167331b2c424ef6961b096b551f8434', 'neutron:revision_number': '12', 'neutron:security_group_ids': '4c82b56e-0fc5-4c7f-8922-ceb8236815fd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d40d335f-7e85-43c3-894d-993c12735497, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=feb6a13d-305a-4541-a50e-4988833ecf82) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:48.639 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 99b309b3-9e3d-4a23-b110-d99707c2eb4e in datapath 19043ea6-c6b2-4272-aa60-1b11a7b5bd93 unbound from our chassis
Dec 06 10:15:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:48.641 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19043ea6-c6b2-4272-aa60-1b11a7b5bd93, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:48 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:48Z|00152|binding|INFO|Releasing lport b960e3cf-838e-4b32-93f1-7da76cedadcc from this chassis (sb_readonly=0)
Dec 06 10:15:48 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:48Z|00153|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:15:48 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:48Z|00154|binding|INFO|Releasing lport d57132cf-ea52-419a-82d6-37dcdb5dd89a from this chassis (sb_readonly=0)
Dec 06 10:15:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:48.642 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[653e203b-8ef4-435e-9953-923dec3d6a2f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:48.644 160509 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93 namespace which is not needed anymore
Dec 06 10:15:48 np0005548789.localdomain systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Deactivated successfully.
Dec 06 10:15:48 np0005548789.localdomain systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Consumed 4.458s CPU time.
Dec 06 10:15:48 np0005548789.localdomain systemd-machined[84444]: Machine qemu-4-instance-00000008 terminated.
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.665 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.674 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.722 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:48 np0005548789.localdomain systemd[1]: tmp-crun.kKuSWH.mount: Deactivated successfully.
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.729 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:48 np0005548789.localdomain podman[314163]: 2025-12-06 10:15:48.747652849 +0000 UTC m=+0.104348385 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.753 282197 INFO nova.virt.libvirt.driver [-] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Instance destroyed successfully.
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.754 282197 DEBUG nova.objects.instance [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Lazy-loading 'resources' on Instance uuid ed40901b-0bfc-426a-bf70-48d87ce95aa6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.770 282197 DEBUG nova.virt.libvirt.vif [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-06T10:15:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-571789410',display_name='tempest-LiveMigrationTest-server-571789410',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005548789.localdomain',hostname='tempest-livemigrationtest-server-571789410',id=8,image_ref='6a944ab6-8965-4055-b7fc-af6e395005ea',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-06T10:15:26Z,launched_on='np0005548790.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005548789.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9167331b2c424ef6961b096b551f8434',ramdisk_id='',reservation_id='r-9204byw5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6a944ab6-8965-4055-b7fc-af6e395005ea',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-1593322913',owner_user_name='tempest-LiveMigrationTest-1593322913-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-06T10:15:40Z,user_data=None,user_id='b25d9e5ec9eb4368a764482a325b9dda',uuid=ed40901b-0bfc-426a-bf70-48d87ce95aa6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "feb6a13d-305a-4541-a50e-4988833ecf82", "address": "fa:16:3e:e5:ea:4a", "network": {"id": "45604602-bc87-4608-9881-9568cbf90870", "bridge": "br-int", "label": "tempest-LiveMigrationTest-802114316-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "9167331b2c424ef6961b096b551f8434", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfeb6a13d-30", "ovs_interfaceid": "feb6a13d-305a-4541-a50e-4988833ecf82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.770 282197 DEBUG nova.network.os_vif_util [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Converting VIF {"id": "feb6a13d-305a-4541-a50e-4988833ecf82", "address": "fa:16:3e:e5:ea:4a", "network": {"id": "45604602-bc87-4608-9881-9568cbf90870", "bridge": "br-int", "label": "tempest-LiveMigrationTest-802114316-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "9167331b2c424ef6961b096b551f8434", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfeb6a13d-30", "ovs_interfaceid": "feb6a13d-305a-4541-a50e-4988833ecf82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.771 282197 DEBUG nova.network.os_vif_util [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ea:4a,bridge_name='br-int',has_traffic_filtering=True,id=feb6a13d-305a-4541-a50e-4988833ecf82,network=Network(45604602-bc87-4608-9881-9568cbf90870),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfeb6a13d-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.772 282197 DEBUG os_vif [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ea:4a,bridge_name='br-int',has_traffic_filtering=True,id=feb6a13d-305a-4541-a50e-4988833ecf82,network=Network(45604602-bc87-4608-9881-9568cbf90870),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfeb6a13d-30') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.773 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.774 282197 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfeb6a13d-30, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.776 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.778 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:15:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:48.780 282197 INFO os_vif [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:ea:4a,bridge_name='br-int',has_traffic_filtering=True,id=feb6a13d-305a-4541-a50e-4988833ecf82,network=Network(45604602-bc87-4608-9881-9568cbf90870),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapfeb6a13d-30')
Dec 06 10:15:48 np0005548789.localdomain podman[314163]: 2025-12-06 10:15:48.80170951 +0000 UTC m=+0.158404976 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:15:48 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:15:48 np0005548789.localdomain neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93[313949]: [NOTICE]   (313953) : haproxy version is 2.8.14-c23fe91
Dec 06 10:15:48 np0005548789.localdomain neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93[313949]: [NOTICE]   (313953) : path to executable is /usr/sbin/haproxy
Dec 06 10:15:48 np0005548789.localdomain neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93[313949]: [WARNING]  (313953) : Exiting Master process...
Dec 06 10:15:48 np0005548789.localdomain neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93[313949]: [WARNING]  (313953) : Exiting Master process...
Dec 06 10:15:48 np0005548789.localdomain systemd[1]: tmp-crun.9fuanR.mount: Deactivated successfully.
Dec 06 10:15:48 np0005548789.localdomain neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93[313949]: [ALERT]    (313953) : Current worker (313955) exited with code 143 (Terminated)
Dec 06 10:15:48 np0005548789.localdomain neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93[313949]: [WARNING]  (313953) : All workers exited. Exiting... (0)
Dec 06 10:15:48 np0005548789.localdomain systemd[1]: libpod-57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab.scope: Deactivated successfully.
Dec 06 10:15:48 np0005548789.localdomain podman[314211]: 2025-12-06 10:15:48.897907477 +0000 UTC m=+0.131397856 container died 57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:15:48 np0005548789.localdomain podman[314211]: 2025-12-06 10:15:48.946016526 +0000 UTC m=+0.179506895 container cleanup 57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:15:48 np0005548789.localdomain podman[314247]: 2025-12-06 10:15:48.980354408 +0000 UTC m=+0.072290463 container cleanup 57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:48 np0005548789.localdomain systemd[1]: libpod-conmon-57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab.scope: Deactivated successfully.
Dec 06 10:15:49 np0005548789.localdomain podman[314263]: 2025-12-06 10:15:49.03480908 +0000 UTC m=+0.072155690 container remove 57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.043 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[dfcabb9a-a546-425f-8463-9d2a28434cc8]: (4, ('Sat Dec  6 10:15:48 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93 (57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab)\n57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab\nSat Dec  6 10:15:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93 (57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab)\n57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.045 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd7c4d6-3fcb-44f6-a42b-44721b04dc54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.046 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19043ea6-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:49 np0005548789.localdomain kernel: device tap19043ea6-c0 left promiscuous mode
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.051 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.060 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.063 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[b2abb294-e1e0-43db-89a8-2010de181859]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.077 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[f6746242-1be8-4dd4-a4b3-adccdfbe91de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.078 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[ec7adbe4-9189-459e-ba4d-93723da4a479]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.092 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[7bc0fcc5-35eb-4ec1-96a4-86ec39d8bbd5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1255563, 'reachable_time': 40135, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 314278, 'error': None, 'target': 'ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.095 160720 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19043ea6-c6b2-4272-aa60-1b11a7b5bd93 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.095 160720 DEBUG oslo.privsep.daemon [-] privsep: reply[5a317dbf-cadf-4aeb-a790-11b4ea41a99c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.096 160509 INFO neutron.agent.ovn.metadata.agent [-] Port feb6a13d-305a-4541-a50e-4988833ecf82 in datapath 45604602-bc87-4608-9881-9568cbf90870 unbound from our chassis
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.100 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 45604602-bc87-4608-9881-9568cbf90870, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.101 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[3290096a-8cc8-4526-a355-218d3b761869]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.102 160509 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-45604602-bc87-4608-9881-9568cbf90870 namespace which is not needed anymore
Dec 06 10:15:49 np0005548789.localdomain sshd[314156]: Received disconnect from 154.113.10.34 port 36574:11: Bye Bye [preauth]
Dec 06 10:15:49 np0005548789.localdomain sshd[314156]: Disconnected from authenticating user root 154.113.10.34 port 36574 [preauth]
Dec 06 10:15:49 np0005548789.localdomain neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870[314022]: [NOTICE]   (314026) : haproxy version is 2.8.14-c23fe91
Dec 06 10:15:49 np0005548789.localdomain neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870[314022]: [NOTICE]   (314026) : path to executable is /usr/sbin/haproxy
Dec 06 10:15:49 np0005548789.localdomain neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870[314022]: [WARNING]  (314026) : Exiting Master process...
Dec 06 10:15:49 np0005548789.localdomain neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870[314022]: [ALERT]    (314026) : Current worker (314028) exited with code 143 (Terminated)
Dec 06 10:15:49 np0005548789.localdomain neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870[314022]: [WARNING]  (314026) : All workers exited. Exiting... (0)
Dec 06 10:15:49 np0005548789.localdomain systemd[1]: libpod-5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714.scope: Deactivated successfully.
Dec 06 10:15:49 np0005548789.localdomain podman[314297]: 2025-12-06 10:15:49.292332661 +0000 UTC m=+0.066873029 container died 5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:15:49 np0005548789.localdomain podman[314297]: 2025-12-06 10:15:49.327630281 +0000 UTC m=+0.102170589 container cleanup 5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.352 282197 INFO nova.virt.libvirt.driver [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Deleting instance files /var/lib/nova/instances/ed40901b-0bfc-426a-bf70-48d87ce95aa6_del
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.354 282197 INFO nova.virt.libvirt.driver [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Deletion of /var/lib/nova/instances/ed40901b-0bfc-426a-bf70-48d87ce95aa6_del complete
Dec 06 10:15:49 np0005548789.localdomain podman[314311]: 2025-12-06 10:15:49.37705223 +0000 UTC m=+0.077328336 container cleanup 5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:49 np0005548789.localdomain systemd[1]: libpod-conmon-5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714.scope: Deactivated successfully.
Dec 06 10:15:49 np0005548789.localdomain podman[314325]: 2025-12-06 10:15:49.405795842 +0000 UTC m=+0.062433224 container remove 5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.409 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[352449f8-b931-49ff-b4e6-d048a3e34c9a]: (4, ('Sat Dec  6 10:15:49 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870 (5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714)\n5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714\nSat Dec  6 10:15:49 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-45604602-bc87-4608-9881-9568cbf90870 (5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714)\n5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.411 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[d7156475-1e3a-4f8f-be03-d8fe6132f836]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.412 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap45604602-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.414 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:49 np0005548789.localdomain kernel: device tap45604602-b0 left promiscuous mode
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.416 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.423 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.425 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[150104c9-9879-46d4-b5b6-a3a122c1384c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.442 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[f76337b7-11cf-46fd-8802-6b6f0b120d91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.444 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[9b9d81af-444f-4857-90bd-825582b9e7ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.463 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5693bac5-ec10-4e7d-85ae-04d7ea4750d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1255645, 'reachable_time': 44259, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 314345, 'error': None, 'target': 'ovnmeta-45604602-bc87-4608-9881-9568cbf90870', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.466 160720 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-45604602-bc87-4608-9881-9568cbf90870 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.466 160720 DEBUG oslo.privsep.daemon [-] privsep: reply[9c7c0616-1224-4c6f-aeae-47f261c0acc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.473 282197 INFO nova.compute.manager [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Took 0.97 seconds to destroy the instance on the hypervisor.
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.474 282197 DEBUG oslo.service.loopingcall [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.475 282197 DEBUG nova.compute.manager [-] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.475 282197 DEBUG nova.network.neutron [-] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 06 10:15:49 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:49.528 263652 INFO neutron.agent.linux.ip_lib [None req-021eeeb8-15de-4467-bbe5-42cc245776a5 - - - - - -] Device tap1d53082e-11 cannot be used as it has no MAC address
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.552 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:49 np0005548789.localdomain kernel: device tap1d53082e-11 entered promiscuous mode
Dec 06 10:15:49 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016149.5585] manager: (tap1d53082e-11): new Generic device (/org/freedesktop/NetworkManager/Devices/29)
Dec 06 10:15:49 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:49Z|00155|binding|INFO|Claiming lport 1d53082e-11ae-49e3-9448-7b2e1b2ec267 for this chassis.
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.559 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:49 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:49Z|00156|binding|INFO|1d53082e-11ae-49e3-9448-7b2e1b2ec267: Claiming unknown
Dec 06 10:15:49 np0005548789.localdomain systemd-udevd[314160]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.569 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-7bcb9995-c8be-445e-890a-c8635f090fa6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7bcb9995-c8be-445e-890a-c8635f090fa6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '44e6bb9426fc43a084f983db0bd7f0ad', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6349ccef-9387-4e01-b0b2-fbf339bbd83f, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=1d53082e-11ae-49e3-9448-7b2e1b2ec267) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.576 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 1d53082e-11ae-49e3-9448-7b2e1b2ec267 in datapath 7bcb9995-c8be-445e-890a-c8635f090fa6 bound to our chassis
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.578 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7bcb9995-c8be-445e-890a-c8635f090fa6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:15:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:49.581 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[239ef574-f07a-46c6-850f-51ab86809bbf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:49 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:49Z|00157|binding|INFO|Setting lport 1d53082e-11ae-49e3-9448-7b2e1b2ec267 ovn-installed in OVS
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.603 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:49 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:49Z|00158|binding|INFO|Setting lport 1d53082e-11ae-49e3-9448-7b2e1b2ec267 up in Southbound
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.605 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.669 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:49.680 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-8d7eab40805a009327f68fd560cfacb738e0b20b8a1c52a765c6668a441db2f8-merged.mount: Deactivated successfully.
Dec 06 10:15:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5178c8fa98670b4c6c4f4039300dbbe2725e333d5ba69ae0dc4e1e9f3ca6a714-userdata-shm.mount: Deactivated successfully.
Dec 06 10:15:49 np0005548789.localdomain systemd[1]: run-netns-ovnmeta\x2d45604602\x2dbc87\x2d4608\x2d9881\x2d9568cbf90870.mount: Deactivated successfully.
Dec 06 10:15:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-970f665873ff889fc4ce87e8eb815e45fa33cad2aebf50d32a77643cc655aa94-merged.mount: Deactivated successfully.
Dec 06 10:15:49 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-57bafd33260bc96ff76c85f7693ba9786f043d552185f6da913acb510f4f60ab-userdata-shm.mount: Deactivated successfully.
Dec 06 10:15:49 np0005548789.localdomain systemd[1]: run-netns-ovnmeta\x2d19043ea6\x2dc6b2\x2d4272\x2daa60\x2d1b11a7b5bd93.mount: Deactivated successfully.
Dec 06 10:15:49 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:15:49.994 2 INFO neutron.agent.securitygroups_rpc [req-da70e705-23ca-45d2-aa6c-68d8abc979e1 req-8ff8aa52-7146-4285-bb8a-51bbd99a36a5 da7bbd24eb95438897585b10577ea2e0 da995d8e002548889747013c0eeca935 - - default default] Security group member updated ['581a4637-eff2-45f4-92f3-d575b736a840']
Dec 06 10:15:50 np0005548789.localdomain ceph-mon[298582]: pgmap v147: 177 pgs: 177 active+clean; 304 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 683 KiB/s rd, 4.3 MiB/s wr, 134 op/s
Dec 06 10:15:50 np0005548789.localdomain dnsmasq[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/addn_hosts - 1 addresses
Dec 06 10:15:50 np0005548789.localdomain dnsmasq-dhcp[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/host
Dec 06 10:15:50 np0005548789.localdomain dnsmasq-dhcp[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/opts
Dec 06 10:15:50 np0005548789.localdomain podman[314401]: 2025-12-06 10:15:50.290099955 +0000 UTC m=+0.060258680 container kill 8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deb7774c-e96b-4e7f-88d7-ed9d740915f4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:15:50 np0005548789.localdomain podman[314443]: 
Dec 06 10:15:50 np0005548789.localdomain podman[314443]: 2025-12-06 10:15:50.517664177 +0000 UTC m=+0.081633887 container create 7deecf5c59b0f04bb3525d11262bd98a6e888a03a84db6c0d01f340a61932760 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7bcb9995-c8be-445e-890a-c8635f090fa6, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:15:50 np0005548789.localdomain systemd[1]: Started libpod-conmon-7deecf5c59b0f04bb3525d11262bd98a6e888a03a84db6c0d01f340a61932760.scope.
Dec 06 10:15:50 np0005548789.localdomain podman[314443]: 2025-12-06 10:15:50.472305101 +0000 UTC m=+0.036274831 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:15:50 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:15:50 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c28922d979d8687baf0d17c061c99ccd1f7b4506833ab64cdd748cd838b58f4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:15:50 np0005548789.localdomain podman[314443]: 2025-12-06 10:15:50.615039951 +0000 UTC m=+0.179009651 container init 7deecf5c59b0f04bb3525d11262bd98a6e888a03a84db6c0d01f340a61932760 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7bcb9995-c8be-445e-890a-c8635f090fa6, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:15:50 np0005548789.localdomain podman[314443]: 2025-12-06 10:15:50.625740155 +0000 UTC m=+0.189709825 container start 7deecf5c59b0f04bb3525d11262bd98a6e888a03a84db6c0d01f340a61932760 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7bcb9995-c8be-445e-890a-c8635f090fa6, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:15:50 np0005548789.localdomain dnsmasq[314461]: started, version 2.85 cachesize 150
Dec 06 10:15:50 np0005548789.localdomain dnsmasq[314461]: DNS service limited to local subnets
Dec 06 10:15:50 np0005548789.localdomain dnsmasq[314461]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:15:50 np0005548789.localdomain dnsmasq[314461]: warning: no upstream servers configured
Dec 06 10:15:50 np0005548789.localdomain dnsmasq-dhcp[314461]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:15:50 np0005548789.localdomain dnsmasq[314461]: read /var/lib/neutron/dhcp/7bcb9995-c8be-445e-890a-c8635f090fa6/addn_hosts - 0 addresses
Dec 06 10:15:50 np0005548789.localdomain dnsmasq-dhcp[314461]: read /var/lib/neutron/dhcp/7bcb9995-c8be-445e-890a-c8635f090fa6/host
Dec 06 10:15:50 np0005548789.localdomain dnsmasq-dhcp[314461]: read /var/lib/neutron/dhcp/7bcb9995-c8be-445e-890a-c8635f090fa6/opts
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.643 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:50Z, description=, device_id=dafd896d-42a7-4e64-be65-9942f12d900d, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbf4670>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc33970>], id=bbbf4983-178c-402a-8c72-520f40e6ea28, ip_allocation=immediate, mac_address=fa:16:3e:f7:d6:18, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=869, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:15:50Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:15:50 np0005548789.localdomain systemd[1]: tmp-crun.Z0Em85.mount: Deactivated successfully.
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.735 263652 INFO neutron.agent.dhcp.agent [None req-9b66ce23-e78c-4b59-84cc-5ec79f319ecc - - - - - -] DHCP configuration for ports {'ada56b72-5c1e-433b-9bad-d65b17c1775a'} is completed
Dec 06 10:15:50 np0005548789.localdomain dnsmasq[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 6 addresses
Dec 06 10:15:50 np0005548789.localdomain podman[314478]: 2025-12-06 10:15:50.86493105 +0000 UTC m=+0.056432142 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:15:50 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:15:50 np0005548789.localdomain dnsmasq-dhcp[263859]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent [None req-d149bc2e-8ff4-4340-9073-147bbd6df0a3 - - - - - -] Unable to reload_allocations dhcp for 8e238f59-5792-4ff4-95af-f993c8e9e14f.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tape1277966-bb not found in namespace qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f.
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     return fut.result()
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     raise self._exception
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tape1277966-bb not found in namespace qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f.
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.887 263652 ERROR neutron.agent.dhcp.agent 
Dec 06 10:15:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:50.894 263652 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Dec 06 10:15:51 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:51.020 263652 INFO neutron.agent.dhcp.agent [None req-7636e4dd-e0d1-4665-8024-29e6984d21f3 - - - - - -] DHCP configuration for ports {'bbbf4983-178c-402a-8c72-520f40e6ea28'} is completed
Dec 06 10:15:51 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:51.287 263652 INFO neutron.agent.dhcp.agent [None req-90b892c8-0506-4dbe-af3a-d4a70d2244e9 - - - - - -] All active networks have been fetched through RPC.
Dec 06 10:15:51 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:51.289 263652 INFO neutron.agent.dhcp.agent [-] Starting network 8e238f59-5792-4ff4-95af-f993c8e9e14f dhcp configuration
Dec 06 10:15:51 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:51.294 263652 INFO neutron.agent.dhcp.agent [-] Starting network f095d28f-14aa-4e63-9d6e-f230615c3946 dhcp configuration
Dec 06 10:15:51 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:51.295 263652 INFO neutron.agent.dhcp.agent [-] Finished network f095d28f-14aa-4e63-9d6e-f230615c3946 dhcp configuration
Dec 06 10:15:51 np0005548789.localdomain dnsmasq[263859]: exiting on receipt of SIGTERM
Dec 06 10:15:51 np0005548789.localdomain systemd[1]: libpod-e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7.scope: Deactivated successfully.
Dec 06 10:15:51 np0005548789.localdomain podman[314508]: 2025-12-06 10:15:51.434855307 +0000 UTC m=+0.052757451 container kill e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 10:15:51 np0005548789.localdomain podman[314528]: 2025-12-06 10:15:51.504071526 +0000 UTC m=+0.049546553 container died e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:15:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7-userdata-shm.mount: Deactivated successfully.
Dec 06 10:15:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:51.554 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:51 np0005548789.localdomain podman[314528]: 2025-12-06 10:15:51.558460026 +0000 UTC m=+0.103935003 container remove e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:15:51 np0005548789.localdomain systemd[1]: libpod-conmon-e4e5ac05442955f65040fee10f3aa342d80518f3e4a072b3d79fe4302f513cf7.scope: Deactivated successfully.
Dec 06 10:15:51 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:51.609 263652 INFO neutron.agent.linux.ip_lib [-] Device tape1277966-bb cannot be used as it has no MAC address
Dec 06 10:15:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:51.633 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:51 np0005548789.localdomain kernel: device tape1277966-bb entered promiscuous mode
Dec 06 10:15:51 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:51Z|00159|binding|INFO|Claiming lport e1277966-bb4e-4c31-a08b-185a772cbf5b for this chassis.
Dec 06 10:15:51 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016151.6425] manager: (tape1277966-bb): new Generic device (/org/freedesktop/NetworkManager/Devices/30)
Dec 06 10:15:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:51.643 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:51 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:51Z|00160|binding|INFO|e1277966-bb4e-4c31-a08b-185a772cbf5b: Claiming unknown
Dec 06 10:15:51 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:51Z|00161|binding|INFO|Setting lport e1277966-bb4e-4c31-a08b-185a772cbf5b ovn-installed in OVS
Dec 06 10:15:51 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:51Z|00162|binding|INFO|Setting lport e1277966-bb4e-4c31-a08b-185a772cbf5b up in Southbound
Dec 06 10:15:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:51.653 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:51.655 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.172/24', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-8e238f59-5792-4ff4-95af-f993c8e9e14f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e238f59-5792-4ff4-95af-f993c8e9e14f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ae43cb4c-3e04-441f-9177-31d5e45dfad9, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=e1277966-bb4e-4c31-a08b-185a772cbf5b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:51.656 160509 INFO neutron.agent.ovn.metadata.agent [-] Port e1277966-bb4e-4c31-a08b-185a772cbf5b in datapath 8e238f59-5792-4ff4-95af-f993c8e9e14f bound to our chassis
Dec 06 10:15:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:51.660 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port e972a0a4-c434-4624-85e8-2a72a8f17075 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:15:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:51.660 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e238f59-5792-4ff4-95af-f993c8e9e14f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:51.662 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[b373f804-ed1b-4d2b-8151-56e439f10d07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:51 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape1277966-bb: No such device
Dec 06 10:15:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:51.679 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:51 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape1277966-bb: No such device
Dec 06 10:15:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:51.685 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:51 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape1277966-bb: No such device
Dec 06 10:15:51 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape1277966-bb: No such device
Dec 06 10:15:51 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape1277966-bb: No such device
Dec 06 10:15:51 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape1277966-bb: No such device
Dec 06 10:15:51 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape1277966-bb: No such device
Dec 06 10:15:51 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape1277966-bb: No such device
Dec 06 10:15:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:51.719 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-b9a1d5715f59223f418c72b8a9e9ba377238db2c64e0646a31ed33f6c0f74cb2-merged.mount: Deactivated successfully.
Dec 06 10:15:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:51.745 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:52.076 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:52 np0005548789.localdomain ceph-mon[298582]: pgmap v148: 177 pgs: 177 active+clean; 304 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 679 KiB/s rd, 4.3 MiB/s wr, 129 op/s
Dec 06 10:15:52 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1067841479' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:15:52 np0005548789.localdomain podman[314618]: 
Dec 06 10:15:52 np0005548789.localdomain podman[314618]: 2025-12-06 10:15:52.533454689 +0000 UTC m=+0.093770645 container create dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:15:52 np0005548789.localdomain systemd[1]: Started libpod-conmon-dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f.scope.
Dec 06 10:15:52 np0005548789.localdomain systemd[1]: tmp-crun.ZGIMcA.mount: Deactivated successfully.
Dec 06 10:15:52 np0005548789.localdomain podman[314618]: 2025-12-06 10:15:52.489888347 +0000 UTC m=+0.050204353 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:15:52 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:15:52 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb03c4cde9b0f29ff04b093db84c97dd4b49f1d4ea32d27ad712695572fe9220/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:15:52 np0005548789.localdomain podman[314618]: 2025-12-06 10:15:52.616664022 +0000 UTC m=+0.176979978 container init dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:15:52 np0005548789.localdomain podman[314618]: 2025-12-06 10:15:52.625702297 +0000 UTC m=+0.186018253 container start dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 10:15:52 np0005548789.localdomain dnsmasq[314636]: started, version 2.85 cachesize 150
Dec 06 10:15:52 np0005548789.localdomain dnsmasq[314636]: DNS service limited to local subnets
Dec 06 10:15:52 np0005548789.localdomain dnsmasq[314636]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:15:52 np0005548789.localdomain dnsmasq[314636]: warning: no upstream servers configured
Dec 06 10:15:52 np0005548789.localdomain dnsmasq-dhcp[314636]: DHCP, static leases only on 192.168.122.0, lease time 1d
Dec 06 10:15:52 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 6 addresses
Dec 06 10:15:52 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:15:52 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:15:52 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:52.695 263652 INFO neutron.agent.dhcp.agent [None req-02d524e0-d630-4406-b3b5-6b17be357144 - - - - - -] Finished network 8e238f59-5792-4ff4-95af-f993c8e9e14f dhcp configuration
Dec 06 10:15:52 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:52.696 263652 INFO neutron.agent.dhcp.agent [None req-90b892c8-0506-4dbe-af3a-d4a70d2244e9 - - - - - -] Synchronizing state complete
Dec 06 10:15:52 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:52.697 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:52Z, description=, device_id=e94155bd-29ed-456c-8250-93318746e895, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbf4e50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbf44c0>], id=7b541b0c-c8b5-4bf5-a92b-45c17ae95d79, ip_allocation=immediate, mac_address=fa:16:3e:b9:63:28, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=873, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:15:52Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:15:52 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 7 addresses
Dec 06 10:15:52 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:15:52 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:15:52 np0005548789.localdomain podman[314655]: 2025-12-06 10:15:52.87570944 +0000 UTC m=+0.046031767 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:15:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:53.153 282197 DEBUG nova.compute.manager [req-fa4866e3-05c4-48b1-ae31-89d7c470a335 req-840f4c4c-ea6c-44aa-8e15-f08277ba9082 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Received event network-vif-plugged-feb6a13d-305a-4541-a50e-4988833ecf82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 06 10:15:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:53.153 282197 DEBUG oslo_concurrency.lockutils [req-fa4866e3-05c4-48b1-ae31-89d7c470a335 req-840f4c4c-ea6c-44aa-8e15-f08277ba9082 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Acquiring lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:53.153 282197 DEBUG oslo_concurrency.lockutils [req-fa4866e3-05c4-48b1-ae31-89d7c470a335 req-840f4c4c-ea6c-44aa-8e15-f08277ba9082 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:53.155 282197 DEBUG oslo_concurrency.lockutils [req-fa4866e3-05c4-48b1-ae31-89d7c470a335 req-840f4c4c-ea6c-44aa-8e15-f08277ba9082 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] Lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:53.155 282197 DEBUG nova.compute.manager [req-fa4866e3-05c4-48b1-ae31-89d7c470a335 req-840f4c4c-ea6c-44aa-8e15-f08277ba9082 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] No waiting events found dispatching network-vif-plugged-feb6a13d-305a-4541-a50e-4988833ecf82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 06 10:15:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:53.155 282197 WARNING nova.compute.manager [req-fa4866e3-05c4-48b1-ae31-89d7c470a335 req-840f4c4c-ea6c-44aa-8e15-f08277ba9082 0b60e97090454089ad6b8f4b898d0040 660705fe1cbe4111b6f2c99ce8d05c8c - - default default] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Received unexpected event network-vif-plugged-feb6a13d-305a-4541-a50e-4988833ecf82 for instance with vm_state active and task_state deleting.
Dec 06 10:15:53 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:53.183 263652 INFO neutron.agent.dhcp.agent [None req-8fe7eaf0-b599-4ad5-96d6-978781532ab9 - - - - - -] DHCP configuration for ports {'49b140a4-d9f8-482f-b1ba-2b28b09c2e14', '17d01ee3-d0a0-42f3-8c73-1578e34c0b4f', '3f202222-16a8-4488-bcc9-0691af80a9ba', 'bbbf4983-178c-402a-8c72-520f40e6ea28', '75f7252a-6b17-46d4-b761-60a0a33ef03b', '5f9b5a36-6f9d-4432-a50f-3ba7cd01f2c4', '03184373-6102-4573-83e2-c438dfc086ce', 'e1277966-bb4e-4c31-a08b-185a772cbf5b', '55ddb56c-afe2-4248-b1cd-f45aef0a3725', '6e17a10f-dbbc-42b2-aeeb-b43e917b0e3c', '8fd47356-f471-4742-820f-2e8ea70c8e0e'} is completed
Dec 06 10:15:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:53 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:53.355 263652 INFO neutron.agent.dhcp.agent [None req-c8b16c74-a807-4bc4-a43d-fa77b4f76723 - - - - - -] DHCP configuration for ports {'7b541b0c-c8b5-4bf5-a92b-45c17ae95d79'} is completed
Dec 06 10:15:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:53.776 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:53.844 282197 DEBUG nova.network.neutron [-] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:15:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:53.866 282197 INFO nova.compute.manager [-] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Took 4.39 seconds to deallocate network for instance.
Dec 06 10:15:53 np0005548789.localdomain sudo[314676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:15:53 np0005548789.localdomain sudo[314676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:15:53 np0005548789.localdomain sudo[314676]: pam_unix(sudo:session): session closed for user root
Dec 06 10:15:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:15:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:15:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:15:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159751 "" "Go-http-client/1.1"
Dec 06 10:15:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:15:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20213 "" "Go-http-client/1.1"
Dec 06 10:15:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:53.965 282197 DEBUG oslo_concurrency.lockutils [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:53.967 282197 DEBUG oslo_concurrency.lockutils [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:53.970 282197 DEBUG oslo_concurrency.lockutils [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:53 np0005548789.localdomain sudo[314694]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:15:53 np0005548789.localdomain sudo[314694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:15:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:54.037 282197 INFO nova.scheduler.client.report [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Deleted allocations for instance ed40901b-0bfc-426a-bf70-48d87ce95aa6
Dec 06 10:15:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:54.156 282197 DEBUG oslo_concurrency.lockutils [None req-e61abce1-6552-4ff7-89fe-964ccbdef70d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Lock "ed40901b-0bfc-426a-bf70-48d87ce95aa6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:54 np0005548789.localdomain ceph-mon[298582]: pgmap v149: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 716 KiB/s rd, 4.3 MiB/s wr, 184 op/s
Dec 06 10:15:54 np0005548789.localdomain sudo[314694]: pam_unix(sudo:session): session closed for user root
Dec 06 10:15:54 np0005548789.localdomain podman[314760]: 2025-12-06 10:15:54.688229177 +0000 UTC m=+0.042163140 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 10:15:54 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 7 addresses
Dec 06 10:15:54 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:15:54 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:15:54 np0005548789.localdomain sudo[314782]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:15:54 np0005548789.localdomain sudo[314782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:15:54 np0005548789.localdomain sudo[314782]: pam_unix(sudo:session): session closed for user root
Dec 06 10:15:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:55.025 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:55 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:15:55 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:15:55 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:15:55 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:15:55 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 7 addresses
Dec 06 10:15:55 np0005548789.localdomain podman[314816]: 2025-12-06 10:15:55.657526447 +0000 UTC m=+0.041360335 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:15:55 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:15:55 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:15:56 np0005548789.localdomain ceph-mon[298582]: pgmap v150: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 28 KiB/s wr, 56 op/s
Dec 06 10:15:56 np0005548789.localdomain podman[314856]: 2025-12-06 10:15:56.266725145 +0000 UTC m=+0.051594216 container kill 8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deb7774c-e96b-4e7f-88d7-ed9d740915f4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:56 np0005548789.localdomain dnsmasq[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/addn_hosts - 0 addresses
Dec 06 10:15:56 np0005548789.localdomain dnsmasq-dhcp[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/host
Dec 06 10:15:56 np0005548789.localdomain dnsmasq-dhcp[312566]: read /var/lib/neutron/dhcp/deb7774c-e96b-4e7f-88d7-ed9d740915f4/opts
Dec 06 10:15:56 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:15:56.496 2 INFO neutron.agent.securitygroups_rpc [None req-97ddf7c5-61a2-4ea7-a37a-afceb032745e 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:15:56 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:56.549 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:15:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:56.668 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:56Z|00163|binding|INFO|Releasing lport ff588d77-fd65-43a9-bd18-9402d0aef61a from this chassis (sb_readonly=0)
Dec 06 10:15:56 np0005548789.localdomain kernel: device tapff588d77-fd left promiscuous mode
Dec 06 10:15:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:56Z|00164|binding|INFO|Setting lport ff588d77-fd65-43a9-bd18-9402d0aef61a down in Southbound
Dec 06 10:15:56 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:56.674 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:56Z, description=, device_id=e94155bd-29ed-456c-8250-93318746e895, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd71940>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd71100>], id=e452c5fc-e3cc-46cd-9292-74c6f34d2647, ip_allocation=immediate, mac_address=fa:16:3e:82:3c:c4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:15:47Z, description=, dns_domain=, id=7bcb9995-c8be-445e-890a-c8635f090fa6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsNegativeTestJSON-219944885-network, port_security_enabled=True, project_id=44e6bb9426fc43a084f983db0bd7f0ad, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=14244, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=846, status=ACTIVE, subnets=['f922be8a-8295-4360-8d2b-6f7f6ff5fc6d'], tags=[], tenant_id=44e6bb9426fc43a084f983db0bd7f0ad, updated_at=2025-12-06T10:15:48Z, vlan_transparent=None, network_id=7bcb9995-c8be-445e-890a-c8635f090fa6, port_security_enabled=False, project_id=44e6bb9426fc43a084f983db0bd7f0ad, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=883, status=DOWN, tags=[], tenant_id=44e6bb9426fc43a084f983db0bd7f0ad, updated_at=2025-12-06T10:15:56Z on network 7bcb9995-c8be-445e-890a-c8635f090fa6
Dec 06 10:15:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:15:56Z|00165|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:15:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:56.686 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-deb7774c-e96b-4e7f-88d7-ed9d740915f4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-deb7774c-e96b-4e7f-88d7-ed9d740915f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'da995d8e002548889747013c0eeca935', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9cc41455-e125-49b5-8c35-a9f7e38c8e70, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=ff588d77-fd65-43a9-bd18-9402d0aef61a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:15:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:56.687 160509 INFO neutron.agent.ovn.metadata.agent [-] Port ff588d77-fd65-43a9-bd18-9402d0aef61a in datapath deb7774c-e96b-4e7f-88d7-ed9d740915f4 unbound from our chassis
Dec 06 10:15:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:56.691 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network deb7774c-e96b-4e7f-88d7-ed9d740915f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:15:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:15:56.691 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[c9c02ccf-3b2f-47f1-899c-ca37c61b4ee5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:15:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:56.692 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:56.700 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:56.731 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:56 np0005548789.localdomain dnsmasq[314461]: read /var/lib/neutron/dhcp/7bcb9995-c8be-445e-890a-c8635f090fa6/addn_hosts - 1 addresses
Dec 06 10:15:56 np0005548789.localdomain dnsmasq-dhcp[314461]: read /var/lib/neutron/dhcp/7bcb9995-c8be-445e-890a-c8635f090fa6/host
Dec 06 10:15:56 np0005548789.localdomain podman[314896]: 2025-12-06 10:15:56.85505378 +0000 UTC m=+0.051653578 container kill 7deecf5c59b0f04bb3525d11262bd98a6e888a03a84db6c0d01f340a61932760 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7bcb9995-c8be-445e-890a-c8635f090fa6, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:56 np0005548789.localdomain dnsmasq-dhcp[314461]: read /var/lib/neutron/dhcp/7bcb9995-c8be-445e-890a-c8635f090fa6/opts
Dec 06 10:15:56 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:15:56.991 2 INFO neutron.agent.securitygroups_rpc [None req-e28bc6dc-5f9c-4334-81fe-cd06724fee5d b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Security group member updated ['4c82b56e-0fc5-4c7f-8922-ceb8236815fd']
Dec 06 10:15:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:57.024 263652 INFO neutron.agent.dhcp.agent [None req-d2591ab1-8127-4135-bbd3-4b064b18b219 - - - - - -] DHCP configuration for ports {'e452c5fc-e3cc-46cd-9292-74c6f34d2647'} is completed
Dec 06 10:15:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:57.078 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:15:58 np0005548789.localdomain ceph-mon[298582]: pgmap v151: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 28 KiB/s wr, 56 op/s
Dec 06 10:15:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:15:58 np0005548789.localdomain sshd[314916]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:15:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:15:58.779 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:15:59 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:15:59.787 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:15:56Z, description=, device_id=e94155bd-29ed-456c-8250-93318746e895, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd716a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd01700>], id=e452c5fc-e3cc-46cd-9292-74c6f34d2647, ip_allocation=immediate, mac_address=fa:16:3e:82:3c:c4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:15:47Z, description=, dns_domain=, id=7bcb9995-c8be-445e-890a-c8635f090fa6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsNegativeTestJSON-219944885-network, port_security_enabled=True, project_id=44e6bb9426fc43a084f983db0bd7f0ad, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=14244, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=846, status=ACTIVE, subnets=['f922be8a-8295-4360-8d2b-6f7f6ff5fc6d'], tags=[], tenant_id=44e6bb9426fc43a084f983db0bd7f0ad, updated_at=2025-12-06T10:15:48Z, vlan_transparent=None, network_id=7bcb9995-c8be-445e-890a-c8635f090fa6, port_security_enabled=False, project_id=44e6bb9426fc43a084f983db0bd7f0ad, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=883, status=DOWN, tags=[], tenant_id=44e6bb9426fc43a084f983db0bd7f0ad, updated_at=2025-12-06T10:15:56Z on network 7bcb9995-c8be-445e-890a-c8635f090fa6
Dec 06 10:15:59 np0005548789.localdomain sshd[314916]: Received disconnect from 14.194.101.210 port 46996:11: Bye Bye [preauth]
Dec 06 10:15:59 np0005548789.localdomain sshd[314916]: Disconnected from authenticating user root 14.194.101.210 port 46996 [preauth]
Dec 06 10:15:59 np0005548789.localdomain dnsmasq[314461]: read /var/lib/neutron/dhcp/7bcb9995-c8be-445e-890a-c8635f090fa6/addn_hosts - 1 addresses
Dec 06 10:15:59 np0005548789.localdomain dnsmasq-dhcp[314461]: read /var/lib/neutron/dhcp/7bcb9995-c8be-445e-890a-c8635f090fa6/host
Dec 06 10:15:59 np0005548789.localdomain dnsmasq-dhcp[314461]: read /var/lib/neutron/dhcp/7bcb9995-c8be-445e-890a-c8635f090fa6/opts
Dec 06 10:15:59 np0005548789.localdomain podman[314935]: 2025-12-06 10:15:59.991923525 +0000 UTC m=+0.064448335 container kill 7deecf5c59b0f04bb3525d11262bd98a6e888a03a84db6c0d01f340a61932760 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7bcb9995-c8be-445e-890a-c8635f090fa6, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:15:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:16:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:16:00 np0005548789.localdomain podman[314947]: 2025-12-06 10:16:00.113101321 +0000 UTC m=+0.104759358 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:16:00 np0005548789.localdomain podman[314947]: 2025-12-06 10:16:00.120163785 +0000 UTC m=+0.111821862 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:16:00 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:16:00 np0005548789.localdomain ceph-mon[298582]: pgmap v152: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 28 KiB/s wr, 56 op/s
Dec 06 10:16:00 np0005548789.localdomain podman[314969]: 2025-12-06 10:16:00.20207375 +0000 UTC m=+0.072943084 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:16:00 np0005548789.localdomain podman[314969]: 2025-12-06 10:16:00.206846285 +0000 UTC m=+0.077715609 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:00 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:16:00 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:00.340 263652 INFO neutron.agent.dhcp.agent [None req-9abaec00-89f9-49f2-9312-d0303475dcd3 - - - - - -] DHCP configuration for ports {'e452c5fc-e3cc-46cd-9292-74c6f34d2647'} is completed
Dec 06 10:16:00 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:16:00.364 2 INFO neutron.agent.securitygroups_rpc [None req-96bdfd29-c14f-4ef8-b3b0-32d637d65e93 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:16:01 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:16:01.607 2 INFO neutron.agent.securitygroups_rpc [None req-7d84f32e-96fa-49ab-97a7-a8cf557247b9 b25d9e5ec9eb4368a764482a325b9dda 9167331b2c424ef6961b096b551f8434 - - default default] Security group member updated ['4c82b56e-0fc5-4c7f-8922-ceb8236815fd']
Dec 06 10:16:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:02.082 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:02 np0005548789.localdomain ceph-mon[298582]: pgmap v153: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Dec 06 10:16:02 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:02.787 263652 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.dhcp_release_cmd', '--privsep_sock_path', '/tmp/tmpcr8m7m0g/privsep.sock']
Dec 06 10:16:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:03 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:03Z|00166|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:16:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:03.300 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:03 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:03.436 263652 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 06 10:16:03 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:03.315 314999 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 10:16:03 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:03.318 314999 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 10:16:03 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:03.320 314999 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 06 10:16:03 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:03.320 314999 INFO oslo.privsep.daemon [-] privsep daemon running as pid 314999
Dec 06 10:16:03 np0005548789.localdomain dnsmasq-dhcp[314636]: DHCPRELEASE(tape1277966-bb) 192.168.122.197 fa:16:3e:f7:d6:18
Dec 06 10:16:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:03.748 282197 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765016148.7465506, ed40901b-0bfc-426a-bf70-48d87ce95aa6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 06 10:16:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:03.748 282197 INFO nova.compute.manager [-] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] VM Stopped (Lifecycle Event)
Dec 06 10:16:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:03.776 282197 DEBUG nova.compute.manager [None req-212ba95c-4e57-49cd-a9fd-9640f0e0cec8 - - - - - -] [instance: ed40901b-0bfc-426a-bf70-48d87ce95aa6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 06 10:16:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:03.811 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:04 np0005548789.localdomain ceph-mon[298582]: pgmap v154: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 2.3 KiB/s wr, 55 op/s
Dec 06 10:16:04 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 6 addresses
Dec 06 10:16:04 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:16:04 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:16:04 np0005548789.localdomain podman[315019]: 2025-12-06 10:16:04.406665392 +0000 UTC m=+0.070602392 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:16:04 np0005548789.localdomain dnsmasq-dhcp[314636]: DHCPRELEASE(tape1277966-bb) 192.168.122.189 fa:16:3e:ad:2b:28
Dec 06 10:16:05 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 5 addresses
Dec 06 10:16:05 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:16:05 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:16:05 np0005548789.localdomain podman[315057]: 2025-12-06 10:16:05.100461396 +0000 UTC m=+0.057289899 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:16:05 np0005548789.localdomain dnsmasq[312566]: exiting on receipt of SIGTERM
Dec 06 10:16:05 np0005548789.localdomain podman[315093]: 2025-12-06 10:16:05.327975366 +0000 UTC m=+0.039868190 container kill 8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deb7774c-e96b-4e7f-88d7-ed9d740915f4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:16:05 np0005548789.localdomain systemd[1]: libpod-8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362.scope: Deactivated successfully.
Dec 06 10:16:05 np0005548789.localdomain podman[315107]: 2025-12-06 10:16:05.371483046 +0000 UTC m=+0.035915300 container died 8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deb7774c-e96b-4e7f-88d7-ed9d740915f4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:16:05 np0005548789.localdomain podman[315107]: 2025-12-06 10:16:05.401257309 +0000 UTC m=+0.065689513 container cleanup 8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deb7774c-e96b-4e7f-88d7-ed9d740915f4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:16:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-06549e5dbf4ea1c819a27ad89b0090c0fd564fb1fbcc2e1eabbb66d37085811c-merged.mount: Deactivated successfully.
Dec 06 10:16:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362-userdata-shm.mount: Deactivated successfully.
Dec 06 10:16:05 np0005548789.localdomain systemd[1]: libpod-conmon-8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362.scope: Deactivated successfully.
Dec 06 10:16:05 np0005548789.localdomain podman[315109]: 2025-12-06 10:16:05.455429842 +0000 UTC m=+0.114009569 container remove 8a8b7a6a9724101bff1398ade8c854164d1816271ca6c4f86a12732f70229362 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-deb7774c-e96b-4e7f-88d7-ed9d740915f4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:16:05 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:05.549 263652 INFO neutron.agent.dhcp.agent [None req-0128c940-9b2a-4d6e-83de-f01e7db06316 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:05 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2ddeb7774c\x2de96b\x2d4e7f\x2d88d7\x2ded9d740915f4.mount: Deactivated successfully.
Dec 06 10:16:05 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:05.773 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:05 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:16:05.822 2 INFO neutron.agent.securitygroups_rpc [None req-be960e3b-e920-4ec4-8e87-e409a0af324a 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:16:05 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:05.874 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:06 np0005548789.localdomain ceph-mon[298582]: pgmap v155: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:16:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:16:06 np0005548789.localdomain systemd[1]: tmp-crun.qfFFGU.mount: Deactivated successfully.
Dec 06 10:16:06 np0005548789.localdomain podman[315132]: 2025-12-06 10:16:06.93004223 +0000 UTC m=+0.091053614 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, distribution-scope=public)
Dec 06 10:16:06 np0005548789.localdomain podman[315132]: 2025-12-06 10:16:06.971223968 +0000 UTC m=+0.132235362 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:16:06 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:16:06 np0005548789.localdomain podman[315133]: 2025-12-06 10:16:06.989264976 +0000 UTC m=+0.146812274 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 06 10:16:07 np0005548789.localdomain podman[315133]: 2025-12-06 10:16:07.028272249 +0000 UTC m=+0.185819607 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:16:07 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:16:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:07.085 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:08 np0005548789.localdomain ceph-mon[298582]: pgmap v156: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:08 np0005548789.localdomain dnsmasq[314461]: read /var/lib/neutron/dhcp/7bcb9995-c8be-445e-890a-c8635f090fa6/addn_hosts - 0 addresses
Dec 06 10:16:08 np0005548789.localdomain dnsmasq-dhcp[314461]: read /var/lib/neutron/dhcp/7bcb9995-c8be-445e-890a-c8635f090fa6/host
Dec 06 10:16:08 np0005548789.localdomain podman[315188]: 2025-12-06 10:16:08.566881997 +0000 UTC m=+0.056187635 container kill 7deecf5c59b0f04bb3525d11262bd98a6e888a03a84db6c0d01f340a61932760 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7bcb9995-c8be-445e-890a-c8635f090fa6, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:16:08 np0005548789.localdomain dnsmasq-dhcp[314461]: read /var/lib/neutron/dhcp/7bcb9995-c8be-445e-890a-c8635f090fa6/opts
Dec 06 10:16:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:08.812 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:08.991 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:08 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:08Z|00167|binding|INFO|Releasing lport 1d53082e-11ae-49e3-9448-7b2e1b2ec267 from this chassis (sb_readonly=0)
Dec 06 10:16:08 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:08Z|00168|binding|INFO|Setting lport 1d53082e-11ae-49e3-9448-7b2e1b2ec267 down in Southbound
Dec 06 10:16:08 np0005548789.localdomain kernel: device tap1d53082e-11 left promiscuous mode
Dec 06 10:16:09 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:09.007 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-7bcb9995-c8be-445e-890a-c8635f090fa6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7bcb9995-c8be-445e-890a-c8635f090fa6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '44e6bb9426fc43a084f983db0bd7f0ad', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6349ccef-9387-4e01-b0b2-fbf339bbd83f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=1d53082e-11ae-49e3-9448-7b2e1b2ec267) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:09 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:09.009 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 1d53082e-11ae-49e3-9448-7b2e1b2ec267 in datapath 7bcb9995-c8be-445e-890a-c8635f090fa6 unbound from our chassis
Dec 06 10:16:09 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:09.014 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7bcb9995-c8be-445e-890a-c8635f090fa6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:16:09 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:09.015 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[416b626d-1018-4208-af77-7034a0749a69]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:16:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:09.024 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:10 np0005548789.localdomain ceph-mon[298582]: pgmap v157: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:10 np0005548789.localdomain dnsmasq-dhcp[314636]: DHCPRELEASE(tape1277966-bb) 192.168.122.192 fa:16:3e:80:e4:79
Dec 06 10:16:10 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 4 addresses
Dec 06 10:16:10 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:16:10 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:16:10 np0005548789.localdomain podman[315230]: 2025-12-06 10:16:10.918826296 +0000 UTC m=+0.065735845 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:10Z|00169|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:16:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:11.013 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:11.325 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:11.326 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:11.327 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:16:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:16:11 np0005548789.localdomain podman[315250]: 2025-12-06 10:16:11.929540602 +0000 UTC m=+0.086552925 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 06 10:16:11 np0005548789.localdomain podman[315250]: 2025-12-06 10:16:11.938899697 +0000 UTC m=+0.095911970 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 10:16:11 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:16:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:12.110 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:12 np0005548789.localdomain ceph-mon[298582]: pgmap v158: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:12 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:16:12 np0005548789.localdomain podman[315286]: 2025-12-06 10:16:12.427928949 +0000 UTC m=+0.046383767 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:16:12 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:16:12 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:16:12 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:12Z|00170|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:16:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:12.613 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:12 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:12.881 263652 INFO neutron.agent.linux.ip_lib [None req-993d3ab0-a589-4781-996c-7ffeb36d7b9b - - - - - -] Device tapda02d3d2-69 cannot be used as it has no MAC address
Dec 06 10:16:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:12.904 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:12 np0005548789.localdomain kernel: device tapda02d3d2-69 entered promiscuous mode
Dec 06 10:16:12 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016172.9149] manager: (tapda02d3d2-69): new Generic device (/org/freedesktop/NetworkManager/Devices/31)
Dec 06 10:16:12 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:12Z|00171|binding|INFO|Claiming lport da02d3d2-692f-455e-be00-1cf20526dba9 for this chassis.
Dec 06 10:16:12 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:12Z|00172|binding|INFO|da02d3d2-692f-455e-be00-1cf20526dba9: Claiming unknown
Dec 06 10:16:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:12.916 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:12 np0005548789.localdomain systemd-udevd[315316]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:16:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:12.927 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-a1e70fff-f7c1-4a44-8853-ff024a9f780b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1e70fff-f7c1-4a44-8853-ff024a9f780b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7435808e897043e08b27fd5dcaabc003', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ccc8a57e-8463-42c0-9469-317af07ded18, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=da02d3d2-692f-455e-be00-1cf20526dba9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:12.928 160509 INFO neutron.agent.ovn.metadata.agent [-] Port da02d3d2-692f-455e-be00-1cf20526dba9 in datapath a1e70fff-f7c1-4a44-8853-ff024a9f780b bound to our chassis
Dec 06 10:16:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:12.928 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a1e70fff-f7c1-4a44-8853-ff024a9f780b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:16:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:12.929 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[9b982680-a8b4-44cf-89e4-254f5b44f0a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:16:12 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapda02d3d2-69: No such device
Dec 06 10:16:12 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:12Z|00173|binding|INFO|Setting lport da02d3d2-692f-455e-be00-1cf20526dba9 ovn-installed in OVS
Dec 06 10:16:12 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:12Z|00174|binding|INFO|Setting lport da02d3d2-692f-455e-be00-1cf20526dba9 up in Southbound
Dec 06 10:16:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:12.957 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:12 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapda02d3d2-69: No such device
Dec 06 10:16:12 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapda02d3d2-69: No such device
Dec 06 10:16:12 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapda02d3d2-69: No such device
Dec 06 10:16:12 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapda02d3d2-69: No such device
Dec 06 10:16:12 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapda02d3d2-69: No such device
Dec 06 10:16:12 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapda02d3d2-69: No such device
Dec 06 10:16:12 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapda02d3d2-69: No such device
Dec 06 10:16:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:12.997 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:13.024 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:13.815 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:13 np0005548789.localdomain podman[315387]: 
Dec 06 10:16:13 np0005548789.localdomain podman[315387]: 2025-12-06 10:16:13.922937705 +0000 UTC m=+0.071044306 container create ab9a2c7f430a211bca8b7e65add9f1fb30eab1d81430fd08f590978abe7e2bd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a1e70fff-f7c1-4a44-8853-ff024a9f780b, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:16:13 np0005548789.localdomain systemd[1]: Started libpod-conmon-ab9a2c7f430a211bca8b7e65add9f1fb30eab1d81430fd08f590978abe7e2bd2.scope.
Dec 06 10:16:13 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:16:13 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10ea93285ee6bf33fabd8991e6c84ec56f594aa69c471e85c2d0afdfeabfc6e8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:16:13 np0005548789.localdomain podman[315387]: 2025-12-06 10:16:13.89144515 +0000 UTC m=+0.039551781 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:16:13 np0005548789.localdomain podman[315387]: 2025-12-06 10:16:13.99926759 +0000 UTC m=+0.147374191 container init ab9a2c7f430a211bca8b7e65add9f1fb30eab1d81430fd08f590978abe7e2bd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a1e70fff-f7c1-4a44-8853-ff024a9f780b, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:16:14 np0005548789.localdomain podman[315387]: 2025-12-06 10:16:14.008932954 +0000 UTC m=+0.157039555 container start ab9a2c7f430a211bca8b7e65add9f1fb30eab1d81430fd08f590978abe7e2bd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a1e70fff-f7c1-4a44-8853-ff024a9f780b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:16:14 np0005548789.localdomain dnsmasq[315430]: started, version 2.85 cachesize 150
Dec 06 10:16:14 np0005548789.localdomain dnsmasq[315430]: DNS service limited to local subnets
Dec 06 10:16:14 np0005548789.localdomain dnsmasq[315430]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:16:14 np0005548789.localdomain dnsmasq[315430]: warning: no upstream servers configured
Dec 06 10:16:14 np0005548789.localdomain dnsmasq-dhcp[315430]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:16:14 np0005548789.localdomain dnsmasq[315430]: read /var/lib/neutron/dhcp/a1e70fff-f7c1-4a44-8853-ff024a9f780b/addn_hosts - 0 addresses
Dec 06 10:16:14 np0005548789.localdomain dnsmasq-dhcp[315430]: read /var/lib/neutron/dhcp/a1e70fff-f7c1-4a44-8853-ff024a9f780b/host
Dec 06 10:16:14 np0005548789.localdomain dnsmasq-dhcp[315430]: read /var/lib/neutron/dhcp/a1e70fff-f7c1-4a44-8853-ff024a9f780b/opts
Dec 06 10:16:14 np0005548789.localdomain dnsmasq[314461]: exiting on receipt of SIGTERM
Dec 06 10:16:14 np0005548789.localdomain podman[315422]: 2025-12-06 10:16:14.062624212 +0000 UTC m=+0.063289191 container kill 7deecf5c59b0f04bb3525d11262bd98a6e888a03a84db6c0d01f340a61932760 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7bcb9995-c8be-445e-890a-c8635f090fa6, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:16:14 np0005548789.localdomain systemd[1]: libpod-7deecf5c59b0f04bb3525d11262bd98a6e888a03a84db6c0d01f340a61932760.scope: Deactivated successfully.
Dec 06 10:16:14 np0005548789.localdomain podman[315436]: 2025-12-06 10:16:14.115687682 +0000 UTC m=+0.043883363 container died 7deecf5c59b0f04bb3525d11262bd98a6e888a03a84db6c0d01f340a61932760 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7bcb9995-c8be-445e-890a-c8635f090fa6, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:16:14 np0005548789.localdomain podman[315436]: 2025-12-06 10:16:14.157308994 +0000 UTC m=+0.085504615 container cleanup 7deecf5c59b0f04bb3525d11262bd98a6e888a03a84db6c0d01f340a61932760 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7bcb9995-c8be-445e-890a-c8635f090fa6, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:16:14 np0005548789.localdomain systemd[1]: libpod-conmon-7deecf5c59b0f04bb3525d11262bd98a6e888a03a84db6c0d01f340a61932760.scope: Deactivated successfully.
Dec 06 10:16:14 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:14.177 263652 INFO neutron.agent.dhcp.agent [None req-0bf1d0b3-8dd9-4d39-8c2d-e5a6c4c6cf25 - - - - - -] DHCP configuration for ports {'c8ac7c67-b7ec-4bf1-ad7a-a9af2fd0e8bd'} is completed
Dec 06 10:16:14 np0005548789.localdomain ceph-mon[298582]: pgmap v159: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:14 np0005548789.localdomain podman[315443]: 2025-12-06 10:16:14.213656943 +0000 UTC m=+0.127857639 container remove 7deecf5c59b0f04bb3525d11262bd98a6e888a03a84db6c0d01f340a61932760 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7bcb9995-c8be-445e-890a-c8635f090fa6, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 10:16:14 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:14.240 263652 INFO neutron.agent.dhcp.agent [None req-939a5336-603b-4411-b162-221e78243ab7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:14 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:14.241 263652 INFO neutron.agent.dhcp.agent [None req-939a5336-603b-4411-b162-221e78243ab7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 10:16:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 10:16:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 10:16:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 10:16:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 10:16:14 np0005548789.localdomain snmpd[67279]: empty variable list in _query
Dec 06 10:16:14 np0005548789.localdomain sshd[315467]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:16:14 np0005548789.localdomain dnsmasq-dhcp[314636]: DHCPRELEASE(tape1277966-bb) 192.168.122.248 fa:16:3e:6f:70:a0
Dec 06 10:16:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-5c28922d979d8687baf0d17c061c99ccd1f7b4506833ab64cdd748cd838b58f4-merged.mount: Deactivated successfully.
Dec 06 10:16:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7deecf5c59b0f04bb3525d11262bd98a6e888a03a84db6c0d01f340a61932760-userdata-shm.mount: Deactivated successfully.
Dec 06 10:16:14 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d7bcb9995\x2dc8be\x2d445e\x2d890a\x2dc8635f090fa6.mount: Deactivated successfully.
Dec 06 10:16:15 np0005548789.localdomain sshd[315467]: Received disconnect from 64.227.102.57 port 53732:11: Bye Bye [preauth]
Dec 06 10:16:15 np0005548789.localdomain sshd[315467]: Disconnected from authenticating user root 64.227.102.57 port 53732 [preauth]
Dec 06 10:16:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:16:15 np0005548789.localdomain systemd[1]: tmp-crun.yTXPy2.mount: Deactivated successfully.
Dec 06 10:16:15 np0005548789.localdomain podman[315470]: 2025-12-06 10:16:15.113046883 +0000 UTC m=+0.085361440 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:16:15 np0005548789.localdomain podman[315470]: 2025-12-06 10:16:15.121269963 +0000 UTC m=+0.093584460 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:16:15 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:16:15 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:16:15 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:16:15 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:16:15 np0005548789.localdomain podman[315510]: 2025-12-06 10:16:15.232222388 +0000 UTC m=+0.054412081 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:16:15 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:15Z|00175|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:16:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:15.683 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:15 np0005548789.localdomain systemd[1]: tmp-crun.ElSw4c.mount: Deactivated successfully.
Dec 06 10:16:16 np0005548789.localdomain ceph-mon[298582]: pgmap v160: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:16 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:16.405 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:15Z, description=, device_id=71af98f7-3b87-4f22-8e42-c4c7e7586541, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc857f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc853a0>], id=5955baae-5bb8-453d-bf95-d281294502a6, ip_allocation=immediate, mac_address=fa:16:3e:08:b3:c4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=975, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:16:16Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:16:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:16:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:16:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:16:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:16:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:16:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:16:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:16:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:16:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:16:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:16:16 np0005548789.localdomain systemd[1]: tmp-crun.qpIOlk.mount: Deactivated successfully.
Dec 06 10:16:16 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:16:16 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:16:16 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:16:16 np0005548789.localdomain podman[315547]: 2025-12-06 10:16:16.648589098 +0000 UTC m=+0.069706615 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:16 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:16.835 263652 INFO neutron.agent.dhcp.agent [None req-99a7d35a-8aa7-4e0f-b7b0-fa1b60028996 - - - - - -] DHCP configuration for ports {'5955baae-5bb8-453d-bf95-d281294502a6'} is completed
Dec 06 10:16:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:17.154 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:17.590 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:17Z, description=, device_id=1a41ced9-29be-4992-bdce-4aa27040262d, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcb0fd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcb01f0>], id=7c050743-5ffe-4017-9560-2b6d5888c4c3, ip_allocation=immediate, mac_address=fa:16:3e:07:97:ff, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=977, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:16:17Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:16:17 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 4 addresses
Dec 06 10:16:17 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:16:17 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:16:17 np0005548789.localdomain podman[315584]: 2025-12-06 10:16:17.840692726 +0000 UTC m=+0.059262048 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:16:18 np0005548789.localdomain ceph-mon[298582]: pgmap v161: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:18.054 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:18 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:18.092 263652 INFO neutron.agent.dhcp.agent [None req-f2c6871d-afc5-4cfc-8439-964d4c458bc6 - - - - - -] DHCP configuration for ports {'7c050743-5ffe-4017-9560-2b6d5888c4c3'} is completed
Dec 06 10:16:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:18 np0005548789.localdomain sshd[311097]: fatal: Timeout before authentication for 45.78.222.162 port 48698
Dec 06 10:16:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:18.817 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:19.161 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:18Z, description=, device_id=71af98f7-3b87-4f22-8e42-c4c7e7586541, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe3845533d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd0adc0>], id=015e1950-2195-40d8-a2a7-064ffc59ec35, ip_allocation=immediate, mac_address=fa:16:3e:c1:27:af, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:10Z, description=, dns_domain=, id=a1e70fff-f7c1-4a44-8853-ff024a9f780b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestFqdnHostnames-1975212823-network, port_security_enabled=True, project_id=7435808e897043e08b27fd5dcaabc003, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25925, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=944, status=ACTIVE, subnets=['0915ac38-6dfd-47e4-bf76-8ab2ffd38d09'], tags=[], tenant_id=7435808e897043e08b27fd5dcaabc003, updated_at=2025-12-06T10:16:11Z, vlan_transparent=None, network_id=a1e70fff-f7c1-4a44-8853-ff024a9f780b, port_security_enabled=False, project_id=7435808e897043e08b27fd5dcaabc003, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=978, status=DOWN, tags=[], tenant_id=7435808e897043e08b27fd5dcaabc003, updated_at=2025-12-06T10:16:18Z on network a1e70fff-f7c1-4a44-8853-ff024a9f780b
Dec 06 10:16:19 np0005548789.localdomain podman[315620]: 2025-12-06 10:16:19.374992734 +0000 UTC m=+0.066340033 container kill ab9a2c7f430a211bca8b7e65add9f1fb30eab1d81430fd08f590978abe7e2bd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a1e70fff-f7c1-4a44-8853-ff024a9f780b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:19 np0005548789.localdomain dnsmasq[315430]: read /var/lib/neutron/dhcp/a1e70fff-f7c1-4a44-8853-ff024a9f780b/addn_hosts - 1 addresses
Dec 06 10:16:19 np0005548789.localdomain dnsmasq-dhcp[315430]: read /var/lib/neutron/dhcp/a1e70fff-f7c1-4a44-8853-ff024a9f780b/host
Dec 06 10:16:19 np0005548789.localdomain dnsmasq-dhcp[315430]: read /var/lib/neutron/dhcp/a1e70fff-f7c1-4a44-8853-ff024a9f780b/opts
Dec 06 10:16:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:16:19 np0005548789.localdomain systemd[1]: tmp-crun.FfYdLt.mount: Deactivated successfully.
Dec 06 10:16:19 np0005548789.localdomain podman[315635]: 2025-12-06 10:16:19.4948731 +0000 UTC m=+0.089978870 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller)
Dec 06 10:16:19 np0005548789.localdomain podman[315635]: 2025-12-06 10:16:19.562360507 +0000 UTC m=+0.157466207 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:16:19 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:16:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:19.660 263652 INFO neutron.agent.dhcp.agent [None req-a72bd817-8070-4b28-b3e6-908280b0e2c4 - - - - - -] DHCP configuration for ports {'015e1950-2195-40d8-a2a7-064ffc59ec35'} is completed
Dec 06 10:16:20 np0005548789.localdomain ceph-mon[298582]: pgmap v162: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:20 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:20.299 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:18Z, description=, device_id=71af98f7-3b87-4f22-8e42-c4c7e7586541, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fe1b130>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fe1ba60>], id=015e1950-2195-40d8-a2a7-064ffc59ec35, ip_allocation=immediate, mac_address=fa:16:3e:c1:27:af, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:10Z, description=, dns_domain=, id=a1e70fff-f7c1-4a44-8853-ff024a9f780b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestFqdnHostnames-1975212823-network, port_security_enabled=True, project_id=7435808e897043e08b27fd5dcaabc003, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25925, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=944, status=ACTIVE, subnets=['0915ac38-6dfd-47e4-bf76-8ab2ffd38d09'], tags=[], tenant_id=7435808e897043e08b27fd5dcaabc003, updated_at=2025-12-06T10:16:11Z, vlan_transparent=None, network_id=a1e70fff-f7c1-4a44-8853-ff024a9f780b, port_security_enabled=False, project_id=7435808e897043e08b27fd5dcaabc003, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=978, status=DOWN, tags=[], tenant_id=7435808e897043e08b27fd5dcaabc003, updated_at=2025-12-06T10:16:18Z on network a1e70fff-f7c1-4a44-8853-ff024a9f780b
Dec 06 10:16:20 np0005548789.localdomain dnsmasq[315430]: read /var/lib/neutron/dhcp/a1e70fff-f7c1-4a44-8853-ff024a9f780b/addn_hosts - 1 addresses
Dec 06 10:16:20 np0005548789.localdomain dnsmasq-dhcp[315430]: read /var/lib/neutron/dhcp/a1e70fff-f7c1-4a44-8853-ff024a9f780b/host
Dec 06 10:16:20 np0005548789.localdomain dnsmasq-dhcp[315430]: read /var/lib/neutron/dhcp/a1e70fff-f7c1-4a44-8853-ff024a9f780b/opts
Dec 06 10:16:20 np0005548789.localdomain podman[315686]: 2025-12-06 10:16:20.531290746 +0000 UTC m=+0.063977632 container kill ab9a2c7f430a211bca8b7e65add9f1fb30eab1d81430fd08f590978abe7e2bd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a1e70fff-f7c1-4a44-8853-ff024a9f780b, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:16:20 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:20.823 263652 INFO neutron.agent.dhcp.agent [None req-99081cd3-a8ee-4ee3-af5f-6287fd90038f - - - - - -] DHCP configuration for ports {'015e1950-2195-40d8-a2a7-064ffc59ec35'} is completed
Dec 06 10:16:21 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:21.329 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:16:22 np0005548789.localdomain ceph-mon[298582]: pgmap v163: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:22.200 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:22 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:16:22.875 2 INFO neutron.agent.securitygroups_rpc [None req-08283fcf-8c3f-4ce1-8201-1776fe09eb71 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:16:23 np0005548789.localdomain sshd[315708]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:16:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:23.821 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:16:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:16:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:16:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157928 "" "Go-http-client/1.1"
Dec 06 10:16:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:16:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19739 "" "Go-http-client/1.1"
Dec 06 10:16:24 np0005548789.localdomain dnsmasq[315430]: read /var/lib/neutron/dhcp/a1e70fff-f7c1-4a44-8853-ff024a9f780b/addn_hosts - 0 addresses
Dec 06 10:16:24 np0005548789.localdomain podman[315726]: 2025-12-06 10:16:24.070194316 +0000 UTC m=+0.044576463 container kill ab9a2c7f430a211bca8b7e65add9f1fb30eab1d81430fd08f590978abe7e2bd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a1e70fff-f7c1-4a44-8853-ff024a9f780b, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:16:24 np0005548789.localdomain dnsmasq-dhcp[315430]: read /var/lib/neutron/dhcp/a1e70fff-f7c1-4a44-8853-ff024a9f780b/host
Dec 06 10:16:24 np0005548789.localdomain dnsmasq-dhcp[315430]: read /var/lib/neutron/dhcp/a1e70fff-f7c1-4a44-8853-ff024a9f780b/opts
Dec 06 10:16:24 np0005548789.localdomain ceph-mon[298582]: pgmap v164: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:24.252 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:24 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:24Z|00176|binding|INFO|Releasing lport da02d3d2-692f-455e-be00-1cf20526dba9 from this chassis (sb_readonly=0)
Dec 06 10:16:24 np0005548789.localdomain kernel: device tapda02d3d2-69 left promiscuous mode
Dec 06 10:16:24 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:24Z|00177|binding|INFO|Setting lport da02d3d2-692f-455e-be00-1cf20526dba9 down in Southbound
Dec 06 10:16:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:24.279 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:24 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:24.282 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-a1e70fff-f7c1-4a44-8853-ff024a9f780b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1e70fff-f7c1-4a44-8853-ff024a9f780b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7435808e897043e08b27fd5dcaabc003', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ccc8a57e-8463-42c0-9469-317af07ded18, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=da02d3d2-692f-455e-be00-1cf20526dba9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:24 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:24.284 160509 INFO neutron.agent.ovn.metadata.agent [-] Port da02d3d2-692f-455e-be00-1cf20526dba9 in datapath a1e70fff-f7c1-4a44-8853-ff024a9f780b unbound from our chassis
Dec 06 10:16:24 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:24.286 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1e70fff-f7c1-4a44-8853-ff024a9f780b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:16:24 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:24.287 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5100c729-bcfb-4dd2-b089-33e9173f1b57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:16:25 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:16:25.247 2 INFO neutron.agent.securitygroups_rpc [None req-2f0fe649-a0ce-475a-a444-c6db3fc27153 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:16:25 np0005548789.localdomain sshd[315708]: Connection reset by authenticating user root 45.135.232.92 port 29520 [preauth]
Dec 06 10:16:25 np0005548789.localdomain sshd[315749]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:16:26 np0005548789.localdomain ceph-mon[298582]: pgmap v165: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:27 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:16:27 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:16:27 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:16:27 np0005548789.localdomain podman[315768]: 2025-12-06 10:16:27.103575465 +0000 UTC m=+0.059014251 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:16:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:27.254 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:27 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:27Z|00178|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:16:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:27.281 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:27 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:16:27 np0005548789.localdomain podman[315806]: 2025-12-06 10:16:27.492563723 +0000 UTC m=+0.056130603 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:16:27 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:16:27 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:16:27 np0005548789.localdomain sshd[315749]: Connection reset by authenticating user root 45.135.232.92 port 61798 [preauth]
Dec 06 10:16:28 np0005548789.localdomain ceph-mon[298582]: pgmap v166: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:28 np0005548789.localdomain dnsmasq[315430]: exiting on receipt of SIGTERM
Dec 06 10:16:28 np0005548789.localdomain systemd[1]: tmp-crun.9cPnXm.mount: Deactivated successfully.
Dec 06 10:16:28 np0005548789.localdomain podman[315844]: 2025-12-06 10:16:28.055117817 +0000 UTC m=+0.059262039 container kill ab9a2c7f430a211bca8b7e65add9f1fb30eab1d81430fd08f590978abe7e2bd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a1e70fff-f7c1-4a44-8853-ff024a9f780b, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:16:28 np0005548789.localdomain systemd[1]: libpod-ab9a2c7f430a211bca8b7e65add9f1fb30eab1d81430fd08f590978abe7e2bd2.scope: Deactivated successfully.
Dec 06 10:16:28 np0005548789.localdomain sshd[315859]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:16:28 np0005548789.localdomain podman[315858]: 2025-12-06 10:16:28.113960511 +0000 UTC m=+0.038036815 container died ab9a2c7f430a211bca8b7e65add9f1fb30eab1d81430fd08f590978abe7e2bd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a1e70fff-f7c1-4a44-8853-ff024a9f780b, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:16:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ab9a2c7f430a211bca8b7e65add9f1fb30eab1d81430fd08f590978abe7e2bd2-userdata-shm.mount: Deactivated successfully.
Dec 06 10:16:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-10ea93285ee6bf33fabd8991e6c84ec56f594aa69c471e85c2d0afdfeabfc6e8-merged.mount: Deactivated successfully.
Dec 06 10:16:28 np0005548789.localdomain podman[315858]: 2025-12-06 10:16:28.159355368 +0000 UTC m=+0.083431682 container remove ab9a2c7f430a211bca8b7e65add9f1fb30eab1d81430fd08f590978abe7e2bd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a1e70fff-f7c1-4a44-8853-ff024a9f780b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:16:28 np0005548789.localdomain systemd[1]: libpod-conmon-ab9a2c7f430a211bca8b7e65add9f1fb30eab1d81430fd08f590978abe7e2bd2.scope: Deactivated successfully.
Dec 06 10:16:28 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2da1e70fff\x2df7c1\x2d4a44\x2d8853\x2dff024a9f780b.mount: Deactivated successfully.
Dec 06 10:16:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:28.191 263652 INFO neutron.agent.dhcp.agent [None req-dd00c1b1-d1b6-4fb0-ba4e-35de2f788451 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:28.261 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:28.275 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:28.739 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:28.823 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:29 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:16:29.310 2 INFO neutron.agent.securitygroups_rpc [None req-64ece17b-51fa-4f7d-ac9f-f7ae51f6ef1a 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:16:29 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:29.331 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:29 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:16:29.839 2 INFO neutron.agent.securitygroups_rpc [None req-e4d175d7-f151-45a2-bfa9-dd114b2ac98c 13b250438f8e49ee9d0d9f0fe4791c05 a22ced63e346459ab637424ae7833af7 - - default default] Security group member updated ['55c805cd-9bbe-4434-83af-206ee080e6b9']
Dec 06 10:16:29 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:29.851 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:30 np0005548789.localdomain ceph-mon[298582]: pgmap v167: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:30 np0005548789.localdomain sshd[315859]: Connection reset by authenticating user root 45.135.232.92 port 61814 [preauth]
Dec 06 10:16:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:16:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:16:30 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:30.640 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:30 np0005548789.localdomain systemd[1]: tmp-crun.IjYzlV.mount: Deactivated successfully.
Dec 06 10:16:30 np0005548789.localdomain podman[315887]: 2025-12-06 10:16:30.664239185 +0000 UTC m=+0.100390836 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:16:30 np0005548789.localdomain sshd[315915]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:16:30 np0005548789.localdomain podman[315888]: 2025-12-06 10:16:30.705108155 +0000 UTC m=+0.138392939 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:16:30 np0005548789.localdomain podman[315888]: 2025-12-06 10:16:30.711056655 +0000 UTC m=+0.144341409 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:16:30 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:16:30 np0005548789.localdomain podman[315887]: 2025-12-06 10:16:30.796393413 +0000 UTC m=+0.232545094 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:30 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:16:30 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:30.883 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:31.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:31.207 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:16:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:31.208 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:16:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:31.208 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:16:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:31.208 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:16:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:31.209 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:16:31 np0005548789.localdomain systemd[1]: tmp-crun.Mwh73r.mount: Deactivated successfully.
Dec 06 10:16:31 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:16:31 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2582979284' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:31.678 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:16:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:31.891 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:16:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:31.892 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:16:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:32.119 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:16:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:32.121 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11302MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:16:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:32.122 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:16:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:32.122 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:16:32 np0005548789.localdomain ceph-mon[298582]: pgmap v168: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:32 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2582979284' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:32.290 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:16:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:32.291 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:16:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:32.291 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:16:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:32.295 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:32.348 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:16:32 np0005548789.localdomain sshd[315915]: Invalid user user1 from 45.135.232.92 port 61832
Dec 06 10:16:32 np0005548789.localdomain dnsmasq-dhcp[314636]: DHCPRELEASE(tape1277966-bb) 192.168.122.174 fa:16:3e:71:8d:2e
Dec 06 10:16:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:16:32 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4066488691' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:32.818 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:16:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:32.826 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:16:32 np0005548789.localdomain sshd[315915]: Connection reset by invalid user user1 45.135.232.92 port 61832 [preauth]
Dec 06 10:16:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:32.847 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:16:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:32.888 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:16:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:32.888 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.766s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:16:32 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:16:32 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:16:32 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:16:32 np0005548789.localdomain podman[315991]: 2025-12-06 10:16:32.907356603 +0000 UTC m=+0.052390621 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:16:32 np0005548789.localdomain systemd[1]: tmp-crun.kClgm7.mount: Deactivated successfully.
Dec 06 10:16:33 np0005548789.localdomain sshd[316007]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:16:33 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/4066488691' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:33 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:33.207 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:33.827 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:34 np0005548789.localdomain ceph-mon[298582]: pgmap v169: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:34 np0005548789.localdomain sshd[316007]: Connection reset by authenticating user adm 45.135.232.92 port 61834 [preauth]
Dec 06 10:16:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:34.930 263652 INFO neutron.agent.linux.ip_lib [None req-a8c65535-c5ff-4d4b-91d3-bf259fd36a37 - - - - - -] Device tap9fc3daab-2b cannot be used as it has no MAC address
Dec 06 10:16:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:34.957 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:34 np0005548789.localdomain kernel: device tap9fc3daab-2b entered promiscuous mode
Dec 06 10:16:34 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016194.9686] manager: (tap9fc3daab-2b): new Generic device (/org/freedesktop/NetworkManager/Devices/32)
Dec 06 10:16:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:34.972 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:34 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:34Z|00179|binding|INFO|Claiming lport 9fc3daab-2b42-430e-915a-f1ee9d25ffbe for this chassis.
Dec 06 10:16:34 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:34Z|00180|binding|INFO|9fc3daab-2b42-430e-915a-f1ee9d25ffbe: Claiming unknown
Dec 06 10:16:34 np0005548789.localdomain systemd-udevd[316023]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:16:34 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:34.991 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-b7a42283-4c55-4c11-8e24-f6394c9a461a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b7a42283-4c55-4c11-8e24-f6394c9a461a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '550d07fdc38d491ba10875a25f95fdea', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dc90d7d0-806b-4760-9447-b6831c3346a6, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=9fc3daab-2b42-430e-915a-f1ee9d25ffbe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:34 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:34.992 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 9fc3daab-2b42-430e-915a-f1ee9d25ffbe in datapath b7a42283-4c55-4c11-8e24-f6394c9a461a bound to our chassis
Dec 06 10:16:34 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:34.994 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b7a42283-4c55-4c11-8e24-f6394c9a461a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:16:34 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:34.995 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[84d63ba4-738f-46b0-a32e-dc93bf900f82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:16:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap9fc3daab-2b: No such device
Dec 06 10:16:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap9fc3daab-2b: No such device
Dec 06 10:16:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:35Z|00181|binding|INFO|Setting lport 9fc3daab-2b42-430e-915a-f1ee9d25ffbe ovn-installed in OVS
Dec 06 10:16:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:35Z|00182|binding|INFO|Setting lport 9fc3daab-2b42-430e-915a-f1ee9d25ffbe up in Southbound
Dec 06 10:16:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap9fc3daab-2b: No such device
Dec 06 10:16:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:35.022 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap9fc3daab-2b: No such device
Dec 06 10:16:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap9fc3daab-2b: No such device
Dec 06 10:16:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap9fc3daab-2b: No such device
Dec 06 10:16:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap9fc3daab-2b: No such device
Dec 06 10:16:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap9fc3daab-2b: No such device
Dec 06 10:16:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:35.063 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:35.098 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:35 np0005548789.localdomain podman[316094]: 
Dec 06 10:16:36 np0005548789.localdomain podman[316094]: 2025-12-06 10:16:36.007495755 +0000 UTC m=+0.080204133 container create f5b9c00fd0fe99e78a6bb373d0c252ad67c3ab760edeaee288dfd88518b3f399 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a42283-4c55-4c11-8e24-f6394c9a461a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:16:36 np0005548789.localdomain systemd[1]: Started libpod-conmon-f5b9c00fd0fe99e78a6bb373d0c252ad67c3ab760edeaee288dfd88518b3f399.scope.
Dec 06 10:16:36 np0005548789.localdomain systemd[1]: tmp-crun.bHz8WB.mount: Deactivated successfully.
Dec 06 10:16:36 np0005548789.localdomain podman[316094]: 2025-12-06 10:16:35.965307606 +0000 UTC m=+0.038016014 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:16:36 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:16:36 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93508883cd7c788e7b11fc41a7148ccecde4258bfe0c638def17c492548919fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:16:36 np0005548789.localdomain podman[316094]: 2025-12-06 10:16:36.088001077 +0000 UTC m=+0.160709455 container init f5b9c00fd0fe99e78a6bb373d0c252ad67c3ab760edeaee288dfd88518b3f399 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a42283-4c55-4c11-8e24-f6394c9a461a, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:16:36 np0005548789.localdomain podman[316094]: 2025-12-06 10:16:36.098043112 +0000 UTC m=+0.170751490 container start f5b9c00fd0fe99e78a6bb373d0c252ad67c3ab760edeaee288dfd88518b3f399 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a42283-4c55-4c11-8e24-f6394c9a461a, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:16:36 np0005548789.localdomain dnsmasq[316112]: started, version 2.85 cachesize 150
Dec 06 10:16:36 np0005548789.localdomain dnsmasq[316112]: DNS service limited to local subnets
Dec 06 10:16:36 np0005548789.localdomain dnsmasq[316112]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:16:36 np0005548789.localdomain dnsmasq[316112]: warning: no upstream servers configured
Dec 06 10:16:36 np0005548789.localdomain dnsmasq-dhcp[316112]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:16:36 np0005548789.localdomain dnsmasq[316112]: read /var/lib/neutron/dhcp/b7a42283-4c55-4c11-8e24-f6394c9a461a/addn_hosts - 0 addresses
Dec 06 10:16:36 np0005548789.localdomain dnsmasq-dhcp[316112]: read /var/lib/neutron/dhcp/b7a42283-4c55-4c11-8e24-f6394c9a461a/host
Dec 06 10:16:36 np0005548789.localdomain dnsmasq-dhcp[316112]: read /var/lib/neutron/dhcp/b7a42283-4c55-4c11-8e24-f6394c9a461a/opts
Dec 06 10:16:36 np0005548789.localdomain ceph-mon[298582]: pgmap v170: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:36 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:36.483 263652 INFO neutron.agent.dhcp.agent [None req-a4abf64e-7bd6-4c9d-ba32-2827ff25b811 - - - - - -] DHCP configuration for ports {'1cba0605-9994-45f3-b711-845fbf180ceb'} is completed
Dec 06 10:16:36 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:36.592 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:36.884 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:36.885 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:36.885 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:16:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:36.885 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:16:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:36.951 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:16:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:36.952 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:16:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:36.953 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:16:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:36.953 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:16:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:16:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:16:37 np0005548789.localdomain podman[316113]: 2025-12-06 10:16:37.162339093 +0000 UTC m=+0.111796442 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Dec 06 10:16:37 np0005548789.localdomain podman[316113]: 2025-12-06 10:16:37.179487433 +0000 UTC m=+0.128944792 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_id=edpm, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:16:37 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:16:37 np0005548789.localdomain podman[316131]: 2025-12-06 10:16:37.255742556 +0000 UTC m=+0.084205235 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec 06 10:16:37 np0005548789.localdomain podman[316131]: 2025-12-06 10:16:37.272140984 +0000 UTC m=+0.100603663 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:16:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:37.278 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:37Z, description=, device_id=84276e13-7738-4c28-b592-6465bc338221, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcb09d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcb0550>], id=0cf1dea2-757c-46ff-a408-efd0a5e6423c, ip_allocation=immediate, mac_address=fa:16:3e:90:dc:c6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1116, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:16:37Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:16:37 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:16:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:37.296 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:37 np0005548789.localdomain podman[316168]: 2025-12-06 10:16:37.466829538 +0000 UTC m=+0.049471440 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:16:37 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:16:37 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:16:37 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:16:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:37.520 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:16:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:37.541 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:16:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:37.542 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:16:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:37.542 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:37.542 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:37.752 263652 INFO neutron.agent.dhcp.agent [None req-33c52649-a362-4103-8ab9-88a0c093f703 - - - - - -] DHCP configuration for ports {'0cf1dea2-757c-46ff-a408-efd0a5e6423c'} is completed
Dec 06 10:16:38 np0005548789.localdomain ceph-mon[298582]: pgmap v171: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:38.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:38.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:38.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:38.185 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:16:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:38.563 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:38.828 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1196607636' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:16:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1196607636' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:16:40 np0005548789.localdomain ceph-mon[298582]: pgmap v172: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:40.186 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:40 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:40.652 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:40Z, description=, device_id=84276e13-7738-4c28-b592-6465bc338221, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc8ecd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc8e730>], id=fcfaf919-b8a6-49c6-94f9-46161c17ee32, ip_allocation=immediate, mac_address=fa:16:3e:39:c5:87, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:32Z, description=, dns_domain=, id=b7a42283-4c55-4c11-8e24-f6394c9a461a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-300055235-network, port_security_enabled=True, project_id=550d07fdc38d491ba10875a25f95fdea, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7958, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1078, status=ACTIVE, subnets=['57051124-797e-44d0-8e7b-649184ccc4f4'], tags=[], tenant_id=550d07fdc38d491ba10875a25f95fdea, updated_at=2025-12-06T10:16:33Z, vlan_transparent=None, network_id=b7a42283-4c55-4c11-8e24-f6394c9a461a, port_security_enabled=False, project_id=550d07fdc38d491ba10875a25f95fdea, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1136, status=DOWN, tags=[], tenant_id=550d07fdc38d491ba10875a25f95fdea, updated_at=2025-12-06T10:16:40Z on network b7a42283-4c55-4c11-8e24-f6394c9a461a
Dec 06 10:16:40 np0005548789.localdomain dnsmasq[316112]: read /var/lib/neutron/dhcp/b7a42283-4c55-4c11-8e24-f6394c9a461a/addn_hosts - 1 addresses
Dec 06 10:16:40 np0005548789.localdomain dnsmasq-dhcp[316112]: read /var/lib/neutron/dhcp/b7a42283-4c55-4c11-8e24-f6394c9a461a/host
Dec 06 10:16:40 np0005548789.localdomain dnsmasq-dhcp[316112]: read /var/lib/neutron/dhcp/b7a42283-4c55-4c11-8e24-f6394c9a461a/opts
Dec 06 10:16:40 np0005548789.localdomain podman[316206]: 2025-12-06 10:16:40.898394283 +0000 UTC m=+0.066579830 container kill f5b9c00fd0fe99e78a6bb373d0c252ad67c3ab760edeaee288dfd88518b3f399 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a42283-4c55-4c11-8e24-f6394c9a461a, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:16:41 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2751703348' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:41.242 263652 INFO neutron.agent.dhcp.agent [None req-2c27286b-39fc-4b71-ad5e-df52125c6453 - - - - - -] DHCP configuration for ports {'fcfaf919-b8a6-49c6-94f9-46161c17ee32'} is completed
Dec 06 10:16:41 np0005548789.localdomain sshd[316227]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:16:42 np0005548789.localdomain ceph-mon[298582]: pgmap v173: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1359574343' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2821502390' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:42 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:42.337 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:40Z, description=, device_id=84276e13-7738-4c28-b592-6465bc338221, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc33430>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc33910>], id=fcfaf919-b8a6-49c6-94f9-46161c17ee32, ip_allocation=immediate, mac_address=fa:16:3e:39:c5:87, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:32Z, description=, dns_domain=, id=b7a42283-4c55-4c11-8e24-f6394c9a461a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-300055235-network, port_security_enabled=True, project_id=550d07fdc38d491ba10875a25f95fdea, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7958, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1078, status=ACTIVE, subnets=['57051124-797e-44d0-8e7b-649184ccc4f4'], tags=[], tenant_id=550d07fdc38d491ba10875a25f95fdea, updated_at=2025-12-06T10:16:33Z, vlan_transparent=None, network_id=b7a42283-4c55-4c11-8e24-f6394c9a461a, port_security_enabled=False, project_id=550d07fdc38d491ba10875a25f95fdea, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1136, status=DOWN, tags=[], tenant_id=550d07fdc38d491ba10875a25f95fdea, updated_at=2025-12-06T10:16:40Z on network b7a42283-4c55-4c11-8e24-f6394c9a461a
Dec 06 10:16:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:42.340 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:42 np0005548789.localdomain dnsmasq[316112]: read /var/lib/neutron/dhcp/b7a42283-4c55-4c11-8e24-f6394c9a461a/addn_hosts - 1 addresses
Dec 06 10:16:42 np0005548789.localdomain dnsmasq-dhcp[316112]: read /var/lib/neutron/dhcp/b7a42283-4c55-4c11-8e24-f6394c9a461a/host
Dec 06 10:16:42 np0005548789.localdomain dnsmasq-dhcp[316112]: read /var/lib/neutron/dhcp/b7a42283-4c55-4c11-8e24-f6394c9a461a/opts
Dec 06 10:16:42 np0005548789.localdomain podman[316245]: 2025-12-06 10:16:42.535680205 +0000 UTC m=+0.063223229 container kill f5b9c00fd0fe99e78a6bb373d0c252ad67c3ab760edeaee288dfd88518b3f399 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a42283-4c55-4c11-8e24-f6394c9a461a, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:16:42 np0005548789.localdomain podman[316260]: 2025-12-06 10:16:42.65617716 +0000 UTC m=+0.090042212 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 10:16:42 np0005548789.localdomain podman[316260]: 2025-12-06 10:16:42.694245014 +0000 UTC m=+0.128110076 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
Dec 06 10:16:42 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:16:42 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:42.817 263652 INFO neutron.agent.dhcp.agent [None req-7b166144-93f5-46af-8092-4b08555fc5d7 - - - - - -] DHCP configuration for ports {'fcfaf919-b8a6-49c6-94f9-46161c17ee32'} is completed
Dec 06 10:16:42 np0005548789.localdomain sshd[316227]: Received disconnect from 118.219.234.233 port 58960:11: Bye Bye [preauth]
Dec 06 10:16:42 np0005548789.localdomain sshd[316227]: Disconnected from authenticating user root 118.219.234.233 port 58960 [preauth]
Dec 06 10:16:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2914913536' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:16:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:43.830 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:44 np0005548789.localdomain ceph-mon[298582]: pgmap v174: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:16:45 np0005548789.localdomain podman[316286]: 2025-12-06 10:16:45.918260173 +0000 UTC m=+0.081169203 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:16:45 np0005548789.localdomain podman[316286]: 2025-12-06 10:16:45.927887666 +0000 UTC m=+0.090796686 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:16:45 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:16:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:46.178 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:46 np0005548789.localdomain ceph-mon[298582]: pgmap v175: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:16:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:16:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:16:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:16:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:16:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:16:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:16:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:16:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:16:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:16:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:16:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:16:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:47.306 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:16:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:47.306 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:16:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:47.307 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:16:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:47.385 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:48 np0005548789.localdomain ceph-mon[298582]: pgmap v176: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:48.833 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:48 np0005548789.localdomain dnsmasq[316112]: read /var/lib/neutron/dhcp/b7a42283-4c55-4c11-8e24-f6394c9a461a/addn_hosts - 0 addresses
Dec 06 10:16:48 np0005548789.localdomain dnsmasq-dhcp[316112]: read /var/lib/neutron/dhcp/b7a42283-4c55-4c11-8e24-f6394c9a461a/host
Dec 06 10:16:48 np0005548789.localdomain podman[316329]: 2025-12-06 10:16:48.896823267 +0000 UTC m=+0.067515568 container kill f5b9c00fd0fe99e78a6bb373d0c252ad67c3ab760edeaee288dfd88518b3f399 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a42283-4c55-4c11-8e24-f6394c9a461a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:48 np0005548789.localdomain dnsmasq-dhcp[316112]: read /var/lib/neutron/dhcp/b7a42283-4c55-4c11-8e24-f6394c9a461a/opts
Dec 06 10:16:48 np0005548789.localdomain systemd[1]: tmp-crun.5H2ZGE.mount: Deactivated successfully.
Dec 06 10:16:49 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:49Z|00183|binding|INFO|Releasing lport 9fc3daab-2b42-430e-915a-f1ee9d25ffbe from this chassis (sb_readonly=0)
Dec 06 10:16:49 np0005548789.localdomain kernel: device tap9fc3daab-2b left promiscuous mode
Dec 06 10:16:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:49.279 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:49 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:49Z|00184|binding|INFO|Setting lport 9fc3daab-2b42-430e-915a-f1ee9d25ffbe down in Southbound
Dec 06 10:16:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:49.292 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-b7a42283-4c55-4c11-8e24-f6394c9a461a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b7a42283-4c55-4c11-8e24-f6394c9a461a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '550d07fdc38d491ba10875a25f95fdea', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dc90d7d0-806b-4760-9447-b6831c3346a6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=9fc3daab-2b42-430e-915a-f1ee9d25ffbe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:49.294 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 9fc3daab-2b42-430e-915a-f1ee9d25ffbe in datapath b7a42283-4c55-4c11-8e24-f6394c9a461a unbound from our chassis
Dec 06 10:16:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:49.297 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b7a42283-4c55-4c11-8e24-f6394c9a461a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:16:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:49.298 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[705bc57b-0970-42ce-b803-7dfa180f526c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:16:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:49.299 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:16:49 np0005548789.localdomain podman[316351]: 2025-12-06 10:16:49.928395997 +0000 UTC m=+0.087348000 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:16:49 np0005548789.localdomain podman[316351]: 2025-12-06 10:16:49.996259935 +0000 UTC m=+0.155211948 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:50 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:16:50 np0005548789.localdomain ceph-mon[298582]: pgmap v177: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:52 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e116 e116: 6 total, 6 up, 6 in
Dec 06 10:16:52 np0005548789.localdomain ceph-mon[298582]: pgmap v178: 177 pgs: 177 active+clean; 145 MiB data, 773 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:52.388 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:52 np0005548789.localdomain podman[316392]: 2025-12-06 10:16:52.493996307 +0000 UTC m=+0.069565101 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:52 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:16:52 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:16:52 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:16:52 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:52Z|00185|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:16:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:52.702 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:53 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:16:53.165 2 INFO neutron.agent.securitygroups_rpc [None req-26ae0ef5-9433-41c4-a064-a0d5d3110043 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:16:53 np0005548789.localdomain ceph-mon[298582]: osdmap e116: 6 total, 6 up, 6 in
Dec 06 10:16:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e117 e117: 6 total, 6 up, 6 in
Dec 06 10:16:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:53 np0005548789.localdomain dnsmasq[316112]: exiting on receipt of SIGTERM
Dec 06 10:16:53 np0005548789.localdomain podman[316429]: 2025-12-06 10:16:53.528347281 +0000 UTC m=+0.058888268 container kill f5b9c00fd0fe99e78a6bb373d0c252ad67c3ab760edeaee288dfd88518b3f399 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a42283-4c55-4c11-8e24-f6394c9a461a, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:16:53 np0005548789.localdomain systemd[1]: libpod-f5b9c00fd0fe99e78a6bb373d0c252ad67c3ab760edeaee288dfd88518b3f399.scope: Deactivated successfully.
Dec 06 10:16:53 np0005548789.localdomain podman[316444]: 2025-12-06 10:16:53.588067212 +0000 UTC m=+0.040318144 container died f5b9c00fd0fe99e78a6bb373d0c252ad67c3ab760edeaee288dfd88518b3f399 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a42283-4c55-4c11-8e24-f6394c9a461a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:16:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f5b9c00fd0fe99e78a6bb373d0c252ad67c3ab760edeaee288dfd88518b3f399-userdata-shm.mount: Deactivated successfully.
Dec 06 10:16:53 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-93508883cd7c788e7b11fc41a7148ccecde4258bfe0c638def17c492548919fa-merged.mount: Deactivated successfully.
Dec 06 10:16:53 np0005548789.localdomain podman[316444]: 2025-12-06 10:16:53.634837651 +0000 UTC m=+0.087088493 container remove f5b9c00fd0fe99e78a6bb373d0c252ad67c3ab760edeaee288dfd88518b3f399 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b7a42283-4c55-4c11-8e24-f6394c9a461a, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:16:53 np0005548789.localdomain systemd[1]: libpod-conmon-f5b9c00fd0fe99e78a6bb373d0c252ad67c3ab760edeaee288dfd88518b3f399.scope: Deactivated successfully.
Dec 06 10:16:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:53.835 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:16:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:16:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:16:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:16:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:16:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19259 "" "Go-http-client/1.1"
Dec 06 10:16:54 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e118 e118: 6 total, 6 up, 6 in
Dec 06 10:16:54 np0005548789.localdomain ceph-mon[298582]: pgmap v180: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 4.3 KiB/s wr, 35 op/s
Dec 06 10:16:54 np0005548789.localdomain ceph-mon[298582]: osdmap e117: 6 total, 6 up, 6 in
Dec 06 10:16:54 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:54.712 263652 INFO neutron.agent.linux.ip_lib [None req-b37a7ebe-4db7-4b9d-b1da-15af0596f8e7 - - - - - -] Device tap949a183f-bf cannot be used as it has no MAC address
Dec 06 10:16:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:54.778 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:54 np0005548789.localdomain kernel: device tap949a183f-bf entered promiscuous mode
Dec 06 10:16:54 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016214.7917] manager: (tap949a183f-bf): new Generic device (/org/freedesktop/NetworkManager/Devices/33)
Dec 06 10:16:54 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:54Z|00186|binding|INFO|Claiming lport 949a183f-bfda-4354-9310-98929388f22d for this chassis.
Dec 06 10:16:54 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:54Z|00187|binding|INFO|949a183f-bfda-4354-9310-98929388f22d: Claiming unknown
Dec 06 10:16:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:54.794 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:54 np0005548789.localdomain systemd-udevd[316478]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:16:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap949a183f-bf: No such device
Dec 06 10:16:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:54.828 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:54 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:54Z|00188|binding|INFO|Setting lport 949a183f-bfda-4354-9310-98929388f22d ovn-installed in OVS
Dec 06 10:16:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:54.831 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap949a183f-bf: No such device
Dec 06 10:16:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap949a183f-bf: No such device
Dec 06 10:16:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap949a183f-bf: No such device
Dec 06 10:16:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap949a183f-bf: No such device
Dec 06 10:16:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap949a183f-bf: No such device
Dec 06 10:16:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap949a183f-bf: No such device
Dec 06 10:16:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap949a183f-bf: No such device
Dec 06 10:16:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:54.866 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:54.897 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:55 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2db7a42283\x2d4c55\x2d4c11\x2d8e24\x2df6394c9a461a.mount: Deactivated successfully.
Dec 06 10:16:55 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:55Z|00189|binding|INFO|Setting lport 949a183f-bfda-4354-9310-98929388f22d up in Southbound
Dec 06 10:16:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:55.055 263652 INFO neutron.agent.dhcp.agent [None req-a4220a31-8e2e-4ffd-8f77-aa2b0e7c33ae - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:55.056 263652 INFO neutron.agent.dhcp.agent [None req-a4220a31-8e2e-4ffd-8f77-aa2b0e7c33ae - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:55.056 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-8fb7fee7-47f3-496e-84a0-2200c47dea55', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8fb7fee7-47f3-496e-84a0-2200c47dea55', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '290c121e7a5344fea2a32f4e64e74fb4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d6e7c85-e1e8-4901-8e98-f2cdf448ee9d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=949a183f-bfda-4354-9310-98929388f22d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:55.058 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 949a183f-bfda-4354-9310-98929388f22d in datapath 8fb7fee7-47f3-496e-84a0-2200c47dea55 bound to our chassis
Dec 06 10:16:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:55.059 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 781a5cd9-f731-448d-91bd-ddefbb48ec27 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:16:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:55.059 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8fb7fee7-47f3-496e-84a0-2200c47dea55, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:16:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:16:55.060 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[3899be47-97e7-4b47-a480-cc478683ecb3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:16:55 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:16:55.063 2 INFO neutron.agent.securitygroups_rpc [None req-6596b1da-4291-462f-a9bc-899ad3053051 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:16:55 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:16:55Z|00190|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:16:55 np0005548789.localdomain sudo[316512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:16:55 np0005548789.localdomain sudo[316512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:16:55 np0005548789.localdomain sudo[316512]: pam_unix(sudo:session): session closed for user root
Dec 06 10:16:55 np0005548789.localdomain sudo[316535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:16:55 np0005548789.localdomain sudo[316535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:16:55 np0005548789.localdomain ceph-mon[298582]: osdmap e118: 6 total, 6 up, 6 in
Dec 06 10:16:55 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e119 e119: 6 total, 6 up, 6 in
Dec 06 10:16:55 np0005548789.localdomain podman[316603]: 
Dec 06 10:16:55 np0005548789.localdomain podman[316603]: 2025-12-06 10:16:55.837266083 +0000 UTC m=+0.089264458 container create 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:16:55 np0005548789.localdomain systemd[1]: Started libpod-conmon-31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936.scope.
Dec 06 10:16:55 np0005548789.localdomain systemd[1]: tmp-crun.4BDBs9.mount: Deactivated successfully.
Dec 06 10:16:55 np0005548789.localdomain sudo[316535]: pam_unix(sudo:session): session closed for user root
Dec 06 10:16:55 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:16:55 np0005548789.localdomain podman[316603]: 2025-12-06 10:16:55.799107686 +0000 UTC m=+0.051106091 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:16:55 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e623c434dbbedf87878fe7b9c9b2c78a861ad29951b1d46175e6c1c0d161e920/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:16:55 np0005548789.localdomain podman[316603]: 2025-12-06 10:16:55.910104433 +0000 UTC m=+0.162102828 container init 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:16:55 np0005548789.localdomain podman[316603]: 2025-12-06 10:16:55.919839297 +0000 UTC m=+0.171837692 container start 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:16:55 np0005548789.localdomain dnsmasq[316635]: started, version 2.85 cachesize 150
Dec 06 10:16:55 np0005548789.localdomain dnsmasq[316635]: DNS service limited to local subnets
Dec 06 10:16:55 np0005548789.localdomain dnsmasq[316635]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:16:55 np0005548789.localdomain dnsmasq[316635]: warning: no upstream servers configured
Dec 06 10:16:55 np0005548789.localdomain dnsmasq-dhcp[316635]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:16:55 np0005548789.localdomain dnsmasq[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/addn_hosts - 0 addresses
Dec 06 10:16:55 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/host
Dec 06 10:16:55 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/opts
Dec 06 10:16:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:55.979 263652 INFO neutron.agent.dhcp.agent [None req-589c5935-71cc-458d-b186-8ee13fc0a445 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:52Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fdb24c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd3a790>], id=a70a96bc-9485-4b3a-ad28-6f527d77539d, ip_allocation=immediate, mac_address=fa:16:3e:1e:0c:70, name=tempest-AllowedAddressPairTestJSON-810561683, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:50Z, description=, dns_domain=, id=8fb7fee7-47f3-496e-84a0-2200c47dea55, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-162135958, port_security_enabled=True, project_id=290c121e7a5344fea2a32f4e64e74fb4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55801, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1197, status=ACTIVE, subnets=['c51e2e1d-7019-4112-9f0b-cec61466e763'], tags=[], tenant_id=290c121e7a5344fea2a32f4e64e74fb4, updated_at=2025-12-06T10:16:51Z, vlan_transparent=None, network_id=8fb7fee7-47f3-496e-84a0-2200c47dea55, port_security_enabled=True, project_id=290c121e7a5344fea2a32f4e64e74fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3'], standard_attr_id=1220, status=DOWN, tags=[], tenant_id=290c121e7a5344fea2a32f4e64e74fb4, updated_at=2025-12-06T10:16:52Z on network 8fb7fee7-47f3-496e-84a0-2200c47dea55
Dec 06 10:16:56 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:56.146 263652 INFO neutron.agent.dhcp.agent [None req-68f3686e-892d-4d1d-bf9e-1e34c1151c74 - - - - - -] DHCP configuration for ports {'ecf433c2-2ffc-446b-b4a2-27567af57062'} is completed
Dec 06 10:16:56 np0005548789.localdomain sudo[316645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:16:56 np0005548789.localdomain sudo[316645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:16:56 np0005548789.localdomain sudo[316645]: pam_unix(sudo:session): session closed for user root
Dec 06 10:16:56 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:56.291 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:16:56 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e120 e120: 6 total, 6 up, 6 in
Dec 06 10:16:56 np0005548789.localdomain dnsmasq[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/addn_hosts - 1 addresses
Dec 06 10:16:56 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/host
Dec 06 10:16:56 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/opts
Dec 06 10:16:56 np0005548789.localdomain podman[316668]: 2025-12-06 10:16:56.301231797 +0000 UTC m=+0.062023783 container kill 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:16:56 np0005548789.localdomain ceph-mon[298582]: pgmap v183: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 7.2 KiB/s wr, 59 op/s
Dec 06 10:16:56 np0005548789.localdomain ceph-mon[298582]: osdmap e119: 6 total, 6 up, 6 in
Dec 06 10:16:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:16:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:16:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:16:56 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:16:56 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:56.443 263652 INFO neutron.agent.dhcp.agent [None req-d4098ffc-31f7-4403-b8a5-1fdda8dfb471 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc269d0>], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:54Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc26b20>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc26a00>], id=74b11fa3-cd87-45bf-856b-6e392660b0b7, ip_allocation=immediate, mac_address=fa:16:3e:23:43:84, name=tempest-AllowedAddressPairTestJSON-1448303374, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:50Z, description=, dns_domain=, id=8fb7fee7-47f3-496e-84a0-2200c47dea55, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-162135958, port_security_enabled=True, project_id=290c121e7a5344fea2a32f4e64e74fb4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55801, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1197, status=ACTIVE, subnets=['c51e2e1d-7019-4112-9f0b-cec61466e763'], tags=[], tenant_id=290c121e7a5344fea2a32f4e64e74fb4, updated_at=2025-12-06T10:16:51Z, vlan_transparent=None, network_id=8fb7fee7-47f3-496e-84a0-2200c47dea55, port_security_enabled=True, project_id=290c121e7a5344fea2a32f4e64e74fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3'], standard_attr_id=1223, status=DOWN, tags=[], tenant_id=290c121e7a5344fea2a32f4e64e74fb4, updated_at=2025-12-06T10:16:54Z on network 8fb7fee7-47f3-496e-84a0-2200c47dea55
Dec 06 10:16:56 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:16:56.532 2 INFO neutron.agent.securitygroups_rpc [None req-c7b28b51-59d5-4f0a-ad7d-a932cd0ad09d 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:16:56 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:56.565 263652 INFO neutron.agent.dhcp.agent [None req-3a0f99e8-5145-49e3-8652-6b9d2dd414c1 - - - - - -] DHCP configuration for ports {'a70a96bc-9485-4b3a-ad28-6f527d77539d'} is completed
Dec 06 10:16:56 np0005548789.localdomain podman[316707]: 2025-12-06 10:16:56.6716062 +0000 UTC m=+0.047574294 container kill 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:16:56 np0005548789.localdomain dnsmasq[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/addn_hosts - 2 addresses
Dec 06 10:16:56 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/host
Dec 06 10:16:56 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/opts
Dec 06 10:16:56 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:56.892 263652 INFO neutron.agent.dhcp.agent [None req-b88dea56-c5e2-4d82-aff8-e854ae30bd99 - - - - - -] DHCP configuration for ports {'74b11fa3-cd87-45bf-856b-6e392660b0b7'} is completed
Dec 06 10:16:57 np0005548789.localdomain dnsmasq[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/addn_hosts - 1 addresses
Dec 06 10:16:57 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/host
Dec 06 10:16:57 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/opts
Dec 06 10:16:57 np0005548789.localdomain podman[316746]: 2025-12-06 10:16:57.070831689 +0000 UTC m=+0.067004693 container kill 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:16:57 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:16:57.215 2 INFO neutron.agent.securitygroups_rpc [None req-dca415e6-2c01-4081-a144-3151bae67c51 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:16:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:57.259 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:56Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe3845a9b50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbf4c10>], id=488fdf8a-7f71-4571-b31d-c641a9b76ebd, ip_allocation=immediate, mac_address=fa:16:3e:d4:d4:d9, name=tempest-AllowedAddressPairTestJSON-1094459018, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:50Z, description=, dns_domain=, id=8fb7fee7-47f3-496e-84a0-2200c47dea55, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-162135958, port_security_enabled=True, project_id=290c121e7a5344fea2a32f4e64e74fb4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55801, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1197, status=ACTIVE, subnets=['c51e2e1d-7019-4112-9f0b-cec61466e763'], tags=[], tenant_id=290c121e7a5344fea2a32f4e64e74fb4, updated_at=2025-12-06T10:16:51Z, vlan_transparent=None, network_id=8fb7fee7-47f3-496e-84a0-2200c47dea55, port_security_enabled=True, project_id=290c121e7a5344fea2a32f4e64e74fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3'], standard_attr_id=1227, status=DOWN, tags=[], tenant_id=290c121e7a5344fea2a32f4e64e74fb4, updated_at=2025-12-06T10:16:57Z on network 8fb7fee7-47f3-496e-84a0-2200c47dea55
Dec 06 10:16:57 np0005548789.localdomain ceph-mon[298582]: osdmap e120: 6 total, 6 up, 6 in
Dec 06 10:16:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:16:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:57.439 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:57 np0005548789.localdomain dnsmasq[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/addn_hosts - 2 addresses
Dec 06 10:16:57 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/host
Dec 06 10:16:57 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/opts
Dec 06 10:16:57 np0005548789.localdomain podman[316784]: 2025-12-06 10:16:57.519070805 +0000 UTC m=+0.063779435 container kill 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:57.749 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:57Z, description=, device_id=f3eed2e0-6009-48cb-b29a-fc71e49972a4, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcc2c10>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcc2310>], id=4b7d8e10-916e-45ff-a07c-b4f9174b6f3d, ip_allocation=immediate, mac_address=fa:16:3e:ef:32:22, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1233, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:16:57Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:16:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:57.755 263652 INFO neutron.agent.dhcp.agent [None req-766521d5-53da-40c0-99ee-ad5c2892c84b - - - - - -] DHCP configuration for ports {'488fdf8a-7f71-4571-b31d-c641a9b76ebd'} is completed
Dec 06 10:16:57 np0005548789.localdomain systemd[1]: tmp-crun.qGMtZj.mount: Deactivated successfully.
Dec 06 10:16:57 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:16:57 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:16:57 np0005548789.localdomain podman[316824]: 2025-12-06 10:16:57.991185325 +0000 UTC m=+0.066672633 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:16:57 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:16:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e121 e121: 6 total, 6 up, 6 in
Dec 06 10:16:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:16:58 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:58.310 263652 INFO neutron.agent.dhcp.agent [None req-1a9a464f-1034-422c-b4ec-9f0fdda03eea - - - - - -] DHCP configuration for ports {'4b7d8e10-916e-45ff-a07c-b4f9174b6f3d'} is completed
Dec 06 10:16:58 np0005548789.localdomain ceph-mon[298582]: pgmap v186: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:16:58 np0005548789.localdomain ceph-mon[298582]: osdmap e121: 6 total, 6 up, 6 in
Dec 06 10:16:58 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:16:58.381 2 INFO neutron.agent.securitygroups_rpc [None req-97e21b62-44c1-4fc5-958c-bcc0268c52d3 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:16:58 np0005548789.localdomain dnsmasq[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/addn_hosts - 1 addresses
Dec 06 10:16:58 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/host
Dec 06 10:16:58 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/opts
Dec 06 10:16:58 np0005548789.localdomain podman[316860]: 2025-12-06 10:16:58.619343308 +0000 UTC m=+0.065745695 container kill 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:16:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:16:58.838 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:16:59 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:16:59.203 2 INFO neutron.agent.securitygroups_rpc [None req-a68c2924-ed4d-4682-a596-626401a139a3 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:16:59 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:59.232 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:16:58Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb369a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb36820>], id=d1855335-d570-45f4-bc5e-1b4a2f0ee869, ip_allocation=immediate, mac_address=fa:16:3e:43:c9:d2, name=tempest-AllowedAddressPairTestJSON-1276619202, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:50Z, description=, dns_domain=, id=8fb7fee7-47f3-496e-84a0-2200c47dea55, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-162135958, port_security_enabled=True, project_id=290c121e7a5344fea2a32f4e64e74fb4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55801, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1197, status=ACTIVE, subnets=['c51e2e1d-7019-4112-9f0b-cec61466e763'], tags=[], tenant_id=290c121e7a5344fea2a32f4e64e74fb4, updated_at=2025-12-06T10:16:51Z, vlan_transparent=None, network_id=8fb7fee7-47f3-496e-84a0-2200c47dea55, port_security_enabled=True, project_id=290c121e7a5344fea2a32f4e64e74fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3'], standard_attr_id=1244, status=DOWN, tags=[], tenant_id=290c121e7a5344fea2a32f4e64e74fb4, updated_at=2025-12-06T10:16:59Z on network 8fb7fee7-47f3-496e-84a0-2200c47dea55
Dec 06 10:16:59 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e122 e122: 6 total, 6 up, 6 in
Dec 06 10:16:59 np0005548789.localdomain dnsmasq[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/addn_hosts - 2 addresses
Dec 06 10:16:59 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/host
Dec 06 10:16:59 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/opts
Dec 06 10:16:59 np0005548789.localdomain podman[316899]: 2025-12-06 10:16:59.457843881 +0000 UTC m=+0.066000972 container kill 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:16:59 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:16:59.658 263652 INFO neutron.agent.dhcp.agent [None req-bd56f61b-821a-47a5-b328-cbbe45697b34 - - - - - -] DHCP configuration for ports {'d1855335-d570-45f4-bc5e-1b4a2f0ee869'} is completed
Dec 06 10:17:00 np0005548789.localdomain ceph-mon[298582]: pgmap v188: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 109 KiB/s rd, 15 KiB/s wr, 153 op/s
Dec 06 10:17:00 np0005548789.localdomain ceph-mon[298582]: osdmap e122: 6 total, 6 up, 6 in
Dec 06 10:17:00 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:00.351 2 INFO neutron.agent.securitygroups_rpc [None req-1e316261-d213-40cc-b644-592e2d6242e7 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:17:00 np0005548789.localdomain dnsmasq[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/addn_hosts - 1 addresses
Dec 06 10:17:00 np0005548789.localdomain podman[316937]: 2025-12-06 10:17:00.62028722 +0000 UTC m=+0.052126911 container kill 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:17:00 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/host
Dec 06 10:17:00 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/opts
Dec 06 10:17:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:17:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:17:00 np0005548789.localdomain podman[316958]: 2025-12-06 10:17:00.914058131 +0000 UTC m=+0.074119649 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:17:00 np0005548789.localdomain podman[316958]: 2025-12-06 10:17:00.928349274 +0000 UTC m=+0.088410762 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:17:00 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:17:00 np0005548789.localdomain podman[316957]: 2025-12-06 10:17:00.974886796 +0000 UTC m=+0.137212413 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec 06 10:17:01 np0005548789.localdomain podman[316957]: 2025-12-06 10:17:01.003900476 +0000 UTC m=+0.166226053 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:17:01 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:17:01 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:01.030 2 INFO neutron.agent.securitygroups_rpc [None req-af1607ac-cf21-43e9-9dc2-41d6c40546b7 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:17:01 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:01.062 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:00Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc06490>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc06be0>], id=63b78d82-3c66-4f23-abda-cb052c5a7880, ip_allocation=immediate, mac_address=fa:16:3e:7d:59:5d, name=tempest-AllowedAddressPairTestJSON-1183397262, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:50Z, description=, dns_domain=, id=8fb7fee7-47f3-496e-84a0-2200c47dea55, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-162135958, port_security_enabled=True, project_id=290c121e7a5344fea2a32f4e64e74fb4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55801, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1197, status=ACTIVE, subnets=['c51e2e1d-7019-4112-9f0b-cec61466e763'], tags=[], tenant_id=290c121e7a5344fea2a32f4e64e74fb4, updated_at=2025-12-06T10:16:51Z, vlan_transparent=None, network_id=8fb7fee7-47f3-496e-84a0-2200c47dea55, port_security_enabled=True, project_id=290c121e7a5344fea2a32f4e64e74fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3'], standard_attr_id=1246, status=DOWN, tags=[], tenant_id=290c121e7a5344fea2a32f4e64e74fb4, updated_at=2025-12-06T10:17:00Z on network 8fb7fee7-47f3-496e-84a0-2200c47dea55
Dec 06 10:17:01 np0005548789.localdomain podman[317013]: 2025-12-06 10:17:01.265596244 +0000 UTC m=+0.035081535 container kill 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:17:01 np0005548789.localdomain dnsmasq[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/addn_hosts - 2 addresses
Dec 06 10:17:01 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/host
Dec 06 10:17:01 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/opts
Dec 06 10:17:02 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e123 e123: 6 total, 6 up, 6 in
Dec 06 10:17:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:02.479 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:02 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:02.936 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:00Z, description=, device_id=0df74357-660e-4fa2-9159-46e39f559540, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc782e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc78520>], id=1b9f45f8-f0b0-429e-9cca-a3658963a54f, ip_allocation=immediate, mac_address=fa:16:3e:50:8d:54, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1247, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:17:00Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:17:03 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:03.026 263652 INFO neutron.agent.dhcp.agent [None req-ed2dbc9c-136b-4261-a697-cdf6831354c8 - - - - - -] DHCP configuration for ports {'63b78d82-3c66-4f23-abda-cb052c5a7880'} is completed
Dec 06 10:17:03 np0005548789.localdomain ceph-mon[298582]: pgmap v190: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 91 KiB/s rd, 12 KiB/s wr, 128 op/s
Dec 06 10:17:03 np0005548789.localdomain ceph-mon[298582]: osdmap e123: 6 total, 6 up, 6 in
Dec 06 10:17:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:17:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Cumulative writes: 2522 writes, 23K keys, 2522 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.08 MB/s
                                                           Cumulative WAL: 2522 writes, 2522 syncs, 1.00 writes per sync, written: 0.05 GB, 0.08 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2522 writes, 23K keys, 2522 commit groups, 1.0 writes per commit group, ingest: 46.29 MB, 0.08 MB/s
                                                           Interval WAL: 2522 writes, 2522 syncs, 1.00 writes per sync, written: 0.05 GB, 0.08 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    169.0      0.19              0.06         9    0.021       0      0       0.0       0.0
                                                             L6      1/0   18.39 MB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   4.4    174.5    159.1      0.88              0.38         8    0.111     96K   3989       0.0       0.0
                                                            Sum      1/0   18.39 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   5.4    143.7    160.8      1.07              0.44        17    0.063     96K   3989       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   5.4    144.1    161.2      1.07              0.44        16    0.067     96K   3989       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   0.0    174.5    159.1      0.88              0.38         8    0.111     96K   3989       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    171.2      0.19              0.06         8    0.023       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.031, interval 0.031
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.17 GB write, 0.29 MB/s write, 0.15 GB read, 0.26 MB/s read, 1.1 seconds
                                                           Interval compaction: 0.17 GB write, 0.29 MB/s write, 0.15 GB read, 0.26 MB/s read, 1.1 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x55b608f29350#2 capacity: 308.00 MB usage: 15.11 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000113 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(762,14.45 MB,4.69297%) FilterBlock(17,291.61 KB,0.0924593%) IndexBlock(17,382.36 KB,0.121233%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Dec 06 10:17:03 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:17:03 np0005548789.localdomain podman[317050]: 2025-12-06 10:17:03.191284913 +0000 UTC m=+0.062899709 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:17:03 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:03 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:03 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:03.471 263652 INFO neutron.agent.dhcp.agent [None req-691db5f4-e8cc-4ab0-a2e4-0a707959f49e - - - - - -] DHCP configuration for ports {'1b9f45f8-f0b0-429e-9cca-a3658963a54f'} is completed
Dec 06 10:17:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:03.841 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:04 np0005548789.localdomain ceph-mon[298582]: pgmap v192: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 141 KiB/s rd, 18 KiB/s wr, 198 op/s
Dec 06 10:17:04 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e124 e124: 6 total, 6 up, 6 in
Dec 06 10:17:04 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:04.261 2 INFO neutron.agent.securitygroups_rpc [None req-585d973c-6716-4802-939d-774d36a541bf 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:17:04 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:04.410 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:03Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc2f0d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc2ff40>], id=620b3ac0-01e9-4c71-8ddf-fdc64ae29c4e, ip_allocation=immediate, mac_address=fa:16:3e:3a:76:0f, name=tempest-AllowedAddressPairTestJSON-2140749850, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:16:50Z, description=, dns_domain=, id=8fb7fee7-47f3-496e-84a0-2200c47dea55, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-162135958, port_security_enabled=True, project_id=290c121e7a5344fea2a32f4e64e74fb4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55801, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1197, status=ACTIVE, subnets=['c51e2e1d-7019-4112-9f0b-cec61466e763'], tags=[], tenant_id=290c121e7a5344fea2a32f4e64e74fb4, updated_at=2025-12-06T10:16:51Z, vlan_transparent=None, network_id=8fb7fee7-47f3-496e-84a0-2200c47dea55, port_security_enabled=True, project_id=290c121e7a5344fea2a32f4e64e74fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3'], standard_attr_id=1249, status=DOWN, tags=[], tenant_id=290c121e7a5344fea2a32f4e64e74fb4, updated_at=2025-12-06T10:17:03Z on network 8fb7fee7-47f3-496e-84a0-2200c47dea55
Dec 06 10:17:04 np0005548789.localdomain dnsmasq[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/addn_hosts - 3 addresses
Dec 06 10:17:04 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/host
Dec 06 10:17:04 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/opts
Dec 06 10:17:04 np0005548789.localdomain podman[317089]: 2025-12-06 10:17:04.620642518 +0000 UTC m=+0.060584208 container kill 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:17:04 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:04.994 263652 INFO neutron.agent.dhcp.agent [None req-f7b443e5-dde0-4f2b-a20b-093fe62f1547 - - - - - -] DHCP configuration for ports {'620b3ac0-01e9-4c71-8ddf-fdc64ae29c4e'} is completed
Dec 06 10:17:05 np0005548789.localdomain ceph-mon[298582]: osdmap e124: 6 total, 6 up, 6 in
Dec 06 10:17:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:05.331 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:06 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e125 e125: 6 total, 6 up, 6 in
Dec 06 10:17:06 np0005548789.localdomain ceph-mon[298582]: pgmap v194: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 6.3 KiB/s wr, 74 op/s
Dec 06 10:17:06 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:06.955 2 INFO neutron.agent.securitygroups_rpc [None req-0061dfd5-bb12-495d-9673-65d6ef0bbdf6 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:17:06 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:06.960 2 INFO neutron.agent.securitygroups_rpc [None req-113b5acf-416d-4ae9-b224-76f1b565f762 7dcd2b11aeb4499894c7ac7c29cb6997 d6a02136413f4ad3ac51d2c4ffdad3d4 - - default default] Security group member updated ['58296f43-3702-412f-8387-07510507ed41']
Dec 06 10:17:07 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e126 e126: 6 total, 6 up, 6 in
Dec 06 10:17:07 np0005548789.localdomain dnsmasq[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/addn_hosts - 2 addresses
Dec 06 10:17:07 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/host
Dec 06 10:17:07 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/opts
Dec 06 10:17:07 np0005548789.localdomain podman[317128]: 2025-12-06 10:17:07.185707981 +0000 UTC m=+0.045188222 container kill 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:17:07 np0005548789.localdomain ceph-mon[298582]: osdmap e125: 6 total, 6 up, 6 in
Dec 06 10:17:07 np0005548789.localdomain ceph-mon[298582]: osdmap e126: 6 total, 6 up, 6 in
Dec 06 10:17:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:07.483 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:17:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.917 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.927 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '399a6c3c-4bfc-462b-aae9-35cbb645f894', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:17:07.919015', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'b863a9d4-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.168387908, 'message_signature': 'f2111306e1257bfd985fc463689344c44d8b1179b564792c18f10b80f1be0957'}]}, 'timestamp': '2025-12-06 10:17:07.928298', '_unique_id': '48b1f1bbd37e42c19c9cdf5033786768'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.931 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.932 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:17:07 np0005548789.localdomain podman[317151]: 2025-12-06 10:17:07.937119592 +0000 UTC m=+0.089827266 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.951 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.951 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '752bc1e0-382e-457a-a222-b0dead1e437e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:17:07.932683', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b867488c-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.182092584, 'message_signature': 'd33a4c449948e38f0debd6a1b8139ff90890a0e806e2597f03c2479a7df0bc83'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:17:07.932683', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b8675e58-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.182092584, 'message_signature': '7847419181d271e6485d5a081bbb934e8d42be6b0446511dd1c0e9a0ad987edb'}]}, 'timestamp': '2025-12-06 10:17:07.952412', '_unique_id': '4ed5ac9b46f9446b87d49c1d417f3314'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.953 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.955 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.955 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.955 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '679ffbb2-867d-4e07-a0a4-ab9ace7a4e23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:17:07.955464', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'b867e7d8-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.168387908, 'message_signature': 'c52588fa6eff569eba2ecedb1dcbcedd508f2b8e816c160fd95ed92117fb87db'}]}, 'timestamp': '2025-12-06 10:17:07.956036', '_unique_id': 'a36b939187464f53b69bffe5547f54e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.957 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.958 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.958 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'badc2661-f82d-4ded-89ca-36798cf75c1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:17:07.958510', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'b8685eca-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.168387908, 'message_signature': '2667be29f62006f16ef9812165093074a37d00d293e88e7173b5e3110670fc4f'}]}, 'timestamp': '2025-12-06 10:17:07.959143', '_unique_id': '9cf0564cdbe745a5b2e8af45c212b7b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.960 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.961 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.961 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bea14d3-a872-4b2e-9be3-fc9c3acaae77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:17:07.961521', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'b868d5d0-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.168387908, 'message_signature': '8b3953280b7d197f25d9b9c11ba14200f1b7d381f3f81382c86d181bd718600d'}]}, 'timestamp': '2025-12-06 10:17:07.962049', '_unique_id': '595cc331881b4dbfae6cf9595747b2cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.963 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:07.964 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:17:07 np0005548789.localdomain podman[317150]: 2025-12-06 10:17:07.991466851 +0000 UTC m=+0.147842296 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_id=edpm, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Dec 06 10:17:08 np0005548789.localdomain podman[317151]: 2025-12-06 10:17:08.00035062 +0000 UTC m=+0.153058274 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.001 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.002 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b687cdc1-675c-4534-a2ac-1b30773db161', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:17:07.964442', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b86eead8-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.213841076, 'message_signature': '1e3eee8ac1b2fc62ec18f3ede434fd22e1b1e52fbfd9b904eef0e1b4d67eba2f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:17:07.964442', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b86f04aa-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.213841076, 'message_signature': '4b6996662ee2b927e9831c21b3aba7f75a656ac1a311f89d7d7165889c6b6c66'}]}, 'timestamp': '2025-12-06 10:17:08.002643', '_unique_id': 'edb1af72fbc44923800b4d15893bb603'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.006 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.006 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.007 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4efed2de-41d7-41fc-8279-d8147f880f74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:17:08.006446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b86fb350-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.182092584, 'message_signature': '45a05f9b6b448f89b45dbe74baef80864e6abf3e4b38f5f53672a80790baaacc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:17:08.006446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b86fc840-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.182092584, 'message_signature': '8e5f70e76e79b217c85da8711b36c0851d87bc0462ea1fd45ddb2d04b858a8ee'}]}, 'timestamp': '2025-12-06 10:17:08.007544', '_unique_id': '8fb82c3a411047f98ac9944548ecec6b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.009 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.009 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain podman[317150]: 2025-12-06 10:17:08.01224131 +0000 UTC m=+0.168616785 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6585df6f-e727-40eb-9177-6fab047affbe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:17:08.009923', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'b8703744-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.168387908, 'message_signature': '73ae9d3838e28cc4fe2aeeda6295d78227c5bfd98c1d30ac507bda9858154a2c'}]}, 'timestamp': '2025-12-06 10:17:08.010446', '_unique_id': 'ddbd988f246c45af8201f966767cac9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.012 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40938c5d-8545-4f6f-af1a-abdf4a61a548', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:17:08.012664', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'b870a2c4-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.168387908, 'message_signature': '517915c279a58cc59a66e566f8db149b3abd184c98b4f8797f445bb961c95cb1'}]}, 'timestamp': '2025-12-06 10:17:08.013174', '_unique_id': 'cdf3492e83b24fb5bbb1e43daf063e89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.015 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.015 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.015 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdc30977-586a-4a27-8d1e-597d7b222ff1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:17:08.015489', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b8710f84-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.182092584, 'message_signature': 'a53046bfbb22905a3257d25290fb3f97df0e4bb905ca4e16b992b6209be57d15'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:17:08.015489', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b87120c8-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.182092584, 'message_signature': 'e624872abf6348c1b5c0193d8647126b13e957b0c8f1229a5006c8424463133e'}]}, 'timestamp': '2025-12-06 10:17:08.016644', '_unique_id': 'c29c14eb66624af8a17a96867e344af4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.018 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.018 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.019 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2190a60d-af47-4316-90bb-e85727e16afa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:17:08.018846', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b87193d2-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.213841076, 'message_signature': '21475f39b2cac19b8e44a651b6ba45160b38f4907e58e4f415db3a983d1980f7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:17:08.018846', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b871a412-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.213841076, 'message_signature': 'f318dd026494c5dedd2e3d533d9786bc433faab2926ea77d5ccd02ce7ccac063'}]}, 'timestamp': '2025-12-06 10:17:08.019730', '_unique_id': '55e8084b46034cff8c143ab39c10530f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.021 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:17:08 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.038 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a397c47-eab9-43ca-8399-de10cb347571', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:17:08.021938', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'b8749b86-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.287440688, 'message_signature': 'b4e947e2007eb3b51856f9c43dc00340ea69c9de081c826d92190d1a5b4c0297'}]}, 'timestamp': '2025-12-06 10:17:08.039342', '_unique_id': '8bf255170ea94b1fb42fef3c7b37d3d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.042 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.042 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.043 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16643b7c-0864-46f8-b545-c28fa48d8b4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:17:08.042886', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b8753fa0-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.213841076, 'message_signature': '466a7da3f6bae4e4527184d39b822ed3cfa3620cb5d4faacca29161c43e9f587'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:17:08.042886', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b8755166-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.213841076, 'message_signature': 'f913331cd93feae86d1fe49f7f54e424e17f447bf4d5ed940ee74074155ac137'}]}, 'timestamp': '2025-12-06 10:17:08.043839', '_unique_id': '60f509ae6e314587b27f5d32e45db9db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.046 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.046 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03c1b836-4850-4456-8e9f-982a70a03bc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:17:08.046162', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'b875be12-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.168387908, 'message_signature': 'a66d5697e6b8dcd010cc8c9bf07a18361fed191081d61e30f5c6fc1c1a98b03e'}]}, 'timestamp': '2025-12-06 10:17:08.046621', '_unique_id': '97a53fd34b0c4178af8dc29a8859a713'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.047 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.048 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.048 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.048 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c236c281-338b-470c-8e6a-b9ba0c4b52c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:17:08.048890', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'b87628b6-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.168387908, 'message_signature': '5e6e54e98b8d606a5de94da48cf662826c0e49ebc6e364cc2a4c8e4da0a56af4'}]}, 'timestamp': '2025-12-06 10:17:08.049346', '_unique_id': 'ded72ee178e94f9cb0ba4d0306b9d769'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.050 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.051 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.051 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.052 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6d8a3a1-7255-4689-88b8-f7b16bc76f5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:17:08.051517', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b8768eb4-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.213841076, 'message_signature': 'ea6edc5d50c6eb6ba1cbe6895f59b31b8d30a53d8b1adbf0611f233379ee2461'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:17:08.051517', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b876a340-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.213841076, 'message_signature': 'f2d9322dceb7a8bf1a83ab4d60ac05254bb9d43c0d70e5886fb41c919bfb102a'}]}, 'timestamp': '2025-12-06 10:17:08.052464', '_unique_id': '3f8047fbbe13494ab129aa0000e37ab9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.053 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.054 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.054 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.055 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40d6387d-c714-411d-9c1b-29b79d2e82eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:17:08.054644', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b8770c86-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.213841076, 'message_signature': 'efbd9214f5ca818911e8b67e2e05cb3158be306c79081374f910c7d4332a8e4d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:17:08.054644', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b87723c4-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.213841076, 'message_signature': '32f7098e80c0899a8ca35b8afd62bb13095e8f5e9382f15d534c3617df3dc9a1'}]}, 'timestamp': '2025-12-06 10:17:08.055873', '_unique_id': 'd274687dc31d44aea43e238e3d006230'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.057 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.058 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.058 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be81a68e-2d89-460f-82e7-ef49a4e00f5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:17:08.058819', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'b877af74-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.168387908, 'message_signature': 'e2f92c992bddb1bb72f75a3008e03d055dfdd96d8a318ee3ed5bc87b8d977003'}]}, 'timestamp': '2025-12-06 10:17:08.059356', '_unique_id': 'fd77dfa8c4574fb7bacd766da08006e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.060 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.061 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.061 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.061 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f25947dc-9b03-43f7-88be-4b21092eccd1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:17:08.061600', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'b87819e6-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.168387908, 'message_signature': 'a91a0350b8ba036a896185982a66953d0ab3a93c78105e0f21d1f502e3c6065a'}]}, 'timestamp': '2025-12-06 10:17:08.062081', '_unique_id': '0ac109510a174eabbf3837d056a4fc61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.063 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.064 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.064 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.064 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 16800000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcfa7939-6ba2-4e50-bdbd-8becb0f0757e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16800000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:17:08.064505', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'b8788a02-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.287440688, 'message_signature': 'e7fcf639eeab7ba0729f7b3af067b8bd929c11cbdae12d2c22c80e609dc4d2a8'}]}, 'timestamp': '2025-12-06 10:17:08.064967', '_unique_id': '558a01ed3e524e948cf674851acc29bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.065 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.067 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.067 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.067 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5dc6c6e-6c44-4da0-9701-302bb2f2c2ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:17:08.067343', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b878f762-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.213841076, 'message_signature': 'c2e4f964e039e2896e1a2fe7723b102cfb3d4cae073d864ef5c49be197d75c0a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:17:08.067343', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b8790b1c-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12646.213841076, 'message_signature': 'be4ecaf8e5bb1e0fd153d497279b9933fa984eeb8485d963a91df8981e3afc8b'}]}, 'timestamp': '2025-12-06 10:17:08.068161', '_unique_id': '1a01c615fd84412991695b117ed93f4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:17:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:17:08.068 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:17:08 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:08.211 2 INFO neutron.agent.securitygroups_rpc [None req-7b711491-888b-4783-949a-3ad1e34a6987 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:17:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e127 e127: 6 total, 6 up, 6 in
Dec 06 10:17:08 np0005548789.localdomain ceph-mon[298582]: pgmap v197: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 7.5 KiB/s wr, 88 op/s
Dec 06 10:17:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:08 np0005548789.localdomain dnsmasq[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/addn_hosts - 1 addresses
Dec 06 10:17:08 np0005548789.localdomain podman[317207]: 2025-12-06 10:17:08.561383127 +0000 UTC m=+0.071137499 container kill 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 10:17:08 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/host
Dec 06 10:17:08 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/opts
Dec 06 10:17:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:08.843 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:08 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:08.859 2 INFO neutron.agent.securitygroups_rpc [None req-5531eff7-1536-47cb-87b6-fc07778b8cfc 183487bfea4148c8bd274489b01ac583 290c121e7a5344fea2a32f4e64e74fb4 - - default default] Security group member updated ['7248d87f-aba7-4d7e-b680-1fdbc4f1cdd3']
Dec 06 10:17:09 np0005548789.localdomain dnsmasq[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/addn_hosts - 0 addresses
Dec 06 10:17:09 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/host
Dec 06 10:17:09 np0005548789.localdomain dnsmasq-dhcp[316635]: read /var/lib/neutron/dhcp/8fb7fee7-47f3-496e-84a0-2200c47dea55/opts
Dec 06 10:17:09 np0005548789.localdomain podman[317246]: 2025-12-06 10:17:09.085901916 +0000 UTC m=+0.060968891 container kill 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:17:09 np0005548789.localdomain ceph-mon[298582]: osdmap e127: 6 total, 6 up, 6 in
Dec 06 10:17:09 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e128 e128: 6 total, 6 up, 6 in
Dec 06 10:17:10 np0005548789.localdomain dnsmasq[316635]: exiting on receipt of SIGTERM
Dec 06 10:17:10 np0005548789.localdomain podman[317283]: 2025-12-06 10:17:10.048030839 +0000 UTC m=+0.059788135 container kill 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:17:10 np0005548789.localdomain systemd[1]: libpod-31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936.scope: Deactivated successfully.
Dec 06 10:17:10 np0005548789.localdomain podman[317296]: 2025-12-06 10:17:10.118387663 +0000 UTC m=+0.053103051 container died 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:17:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:10 np0005548789.localdomain podman[317296]: 2025-12-06 10:17:10.159004555 +0000 UTC m=+0.093719883 container cleanup 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:17:10 np0005548789.localdomain systemd[1]: libpod-conmon-31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936.scope: Deactivated successfully.
Dec 06 10:17:10 np0005548789.localdomain podman[317297]: 2025-12-06 10:17:10.203732721 +0000 UTC m=+0.132462178 container remove 31508468c3e2cd2720accdb917799d9a6e9d1e3826de33ddb026a829aff69936 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8fb7fee7-47f3-496e-84a0-2200c47dea55, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:17:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:10.215 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:10Z|00191|binding|INFO|Releasing lport 949a183f-bfda-4354-9310-98929388f22d from this chassis (sb_readonly=0)
Dec 06 10:17:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:10Z|00192|binding|INFO|Setting lport 949a183f-bfda-4354-9310-98929388f22d down in Southbound
Dec 06 10:17:10 np0005548789.localdomain kernel: device tap949a183f-bf left promiscuous mode
Dec 06 10:17:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:10.227 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-8fb7fee7-47f3-496e-84a0-2200c47dea55', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8fb7fee7-47f3-496e-84a0-2200c47dea55', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '290c121e7a5344fea2a32f4e64e74fb4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d6e7c85-e1e8-4901-8e98-f2cdf448ee9d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=949a183f-bfda-4354-9310-98929388f22d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:10.230 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 949a183f-bfda-4354-9310-98929388f22d in datapath 8fb7fee7-47f3-496e-84a0-2200c47dea55 unbound from our chassis
Dec 06 10:17:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:10.232 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8fb7fee7-47f3-496e-84a0-2200c47dea55, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:17:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:10.233 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[e1bae5e5-3c00-4a46-8624-06382b6e8637]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:10.236 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:10 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e129 e129: 6 total, 6 up, 6 in
Dec 06 10:17:10 np0005548789.localdomain ceph-mon[298582]: pgmap v199: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 188 KiB/s rd, 16 KiB/s wr, 256 op/s
Dec 06 10:17:10 np0005548789.localdomain ceph-mon[298582]: osdmap e128: 6 total, 6 up, 6 in
Dec 06 10:17:10 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:10.340 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:10.414 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:10 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:10.747 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e623c434dbbedf87878fe7b9c9b2c78a861ad29951b1d46175e6c1c0d161e920-merged.mount: Deactivated successfully.
Dec 06 10:17:11 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d8fb7fee7\x2d47f3\x2d496e\x2d84a0\x2d2200c47dea55.mount: Deactivated successfully.
Dec 06 10:17:11 np0005548789.localdomain ceph-mon[298582]: osdmap e129: 6 total, 6 up, 6 in
Dec 06 10:17:11 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:11Z|00193|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:17:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:11.412 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:11.613 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:11.613 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:11.616 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:17:12 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e130 e130: 6 total, 6 up, 6 in
Dec 06 10:17:12 np0005548789.localdomain ceph-mon[298582]: pgmap v202: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 234 KiB/s rd, 20 KiB/s wr, 319 op/s
Dec 06 10:17:12 np0005548789.localdomain ceph-mon[298582]: osdmap e130: 6 total, 6 up, 6 in
Dec 06 10:17:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:12.534 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:12 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:12.610 2 INFO neutron.agent.securitygroups_rpc [req-0f5bcb52-14d6-4090-84d5-2a6fc264a912 req-f6b94b35-a5a9-45fc-80c3-03af12f9ebaa b4f9b4e4cabd4b079cb8c31c22004b7a 37dcf5204733427ebb8bdbe574dca584 - - default default] Security group rule updated ['1e2df8fe-9d93-4483-a509-0caee18c220e']
Dec 06 10:17:12 np0005548789.localdomain sshd[317325]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:17:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:17:12 np0005548789.localdomain podman[317326]: 2025-12-06 10:17:12.905361205 +0000 UTC m=+0.070289163 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2)
Dec 06 10:17:12 np0005548789.localdomain podman[317326]: 2025-12-06 10:17:12.912974836 +0000 UTC m=+0.077902804 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:17:12 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:17:12 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:12.963 2 INFO neutron.agent.securitygroups_rpc [None req-5b35cf9b-2c10-4acb-804d-e7f71d7bfae3 7dcd2b11aeb4499894c7ac7c29cb6997 d6a02136413f4ad3ac51d2c4ffdad3d4 - - default default] Security group member updated ['58296f43-3702-412f-8387-07510507ed41']
Dec 06 10:17:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:13.847 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:13 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:13.898 2 INFO neutron.agent.securitygroups_rpc [req-01e69ff7-4c57-4a62-a8e2-72eac205e556 req-eb6ec33a-4a21-4246-bab7-a4fceda1903a b4f9b4e4cabd4b079cb8c31c22004b7a 37dcf5204733427ebb8bdbe574dca584 - - default default] Security group rule updated ['73772eb3-7feb-4994-9518-58f9e6d5a8ed']
Dec 06 10:17:14 np0005548789.localdomain ceph-mon[298582]: pgmap v204: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 137 KiB/s rd, 10 KiB/s wr, 188 op/s
Dec 06 10:17:14 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:14.676 2 INFO neutron.agent.securitygroups_rpc [None req-591e6c3a-21e9-4cb6-8654-ee5dfe5ee17d f89e0038548e41fa9a8202b7a7e9ade1 49bb78ce003e4bec87707ab7af03ae7e - - default default] Security group rule updated ['7d9717d3-d014-450e-9e8d-c62143b51d32']
Dec 06 10:17:15 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:15.529 2 INFO neutron.agent.securitygroups_rpc [req-802ef8ad-2f30-424a-8810-ccf196e89ec8 req-2c9ca4ac-9b05-42f0-9546-b86c6383ded6 b4f9b4e4cabd4b079cb8c31c22004b7a 37dcf5204733427ebb8bdbe574dca584 - - default default] Security group rule updated ['80cd7ff3-0b8b-4d61-9358-b2f28d5f4668']
Dec 06 10:17:15 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:17:15 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:15 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:15 np0005548789.localdomain podman[317362]: 2025-12-06 10:17:15.898432218 +0000 UTC m=+0.062550358 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:16 np0005548789.localdomain ceph-mon[298582]: pgmap v205: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 111 KiB/s rd, 8.3 KiB/s wr, 153 op/s
Dec 06 10:17:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:17:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:17:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:17:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:17:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:17:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:17:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:17:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:17:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:17:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:17:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:17:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:17:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:17:16 np0005548789.localdomain podman[317384]: 2025-12-06 10:17:16.924116228 +0000 UTC m=+0.085654998 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:17:16 np0005548789.localdomain podman[317384]: 2025-12-06 10:17:16.934083181 +0000 UTC m=+0.095621941 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:17:16 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:17:17 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e131 e131: 6 total, 6 up, 6 in
Dec 06 10:17:17 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:17.224 2 INFO neutron.agent.securitygroups_rpc [req-d0c022c7-5c29-48d9-b4af-ef083b33fa00 req-5f51dca4-f136-4f7f-a521-ca766171afcb b4f9b4e4cabd4b079cb8c31c22004b7a 37dcf5204733427ebb8bdbe574dca584 - - default default] Security group rule updated ['48d24f9a-1de0-4ca7-bff4-bdd00474b49e']
Dec 06 10:17:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:17.571 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:17 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:17.657 2 INFO neutron.agent.securitygroups_rpc [None req-2df808f7-3669-4bdd-a1f6-a6327b63c196 406e5cc53df749808a8770da68d7033d 64b9b91747c648148f6dd23ce81ceb80 - - default default] Security group member updated ['592d46d3-9a35-48d4-b6ca-ba1068626b4d']
Dec 06 10:17:18 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:18.077 263652 INFO neutron.agent.linux.ip_lib [None req-5affb593-f8b2-4f20-8c0e-96fe8e5951d1 - - - - - -] Device tap809f0ef4-0c cannot be used as it has no MAC address
Dec 06 10:17:18 np0005548789.localdomain ceph-mon[298582]: osdmap e131: 6 total, 6 up, 6 in
Dec 06 10:17:18 np0005548789.localdomain ceph-mon[298582]: pgmap v207: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 98 KiB/s rd, 7.3 KiB/s wr, 135 op/s
Dec 06 10:17:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:18.109 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:18 np0005548789.localdomain kernel: device tap809f0ef4-0c entered promiscuous mode
Dec 06 10:17:18 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016238.1198] manager: (tap809f0ef4-0c): new Generic device (/org/freedesktop/NetworkManager/Devices/34)
Dec 06 10:17:18 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:18Z|00194|binding|INFO|Claiming lport 809f0ef4-0cca-474a-984b-630935d33748 for this chassis.
Dec 06 10:17:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:18.120 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:18 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:18Z|00195|binding|INFO|809f0ef4-0cca-474a-984b-630935d33748: Claiming unknown
Dec 06 10:17:18 np0005548789.localdomain systemd-udevd[317417]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:17:18 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:18.133 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-e709cdf3-3894-4310-9fed-c1671aabae61', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e709cdf3-3894-4310-9fed-c1671aabae61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '64b9b91747c648148f6dd23ce81ceb80', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd31e5c5-52c8-4ae1-8e71-d675fcdc4430, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=809f0ef4-0cca-474a-984b-630935d33748) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:18 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:18.135 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 809f0ef4-0cca-474a-984b-630935d33748 in datapath e709cdf3-3894-4310-9fed-c1671aabae61 bound to our chassis
Dec 06 10:17:18 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:18.137 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e709cdf3-3894-4310-9fed-c1671aabae61 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:18 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:18.137 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[0fd631c9-91e2-4cd3-b453-5ce2dfe70f81]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:18 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap809f0ef4-0c: No such device
Dec 06 10:17:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:18.154 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:18 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap809f0ef4-0c: No such device
Dec 06 10:17:18 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:18Z|00196|binding|INFO|Setting lport 809f0ef4-0cca-474a-984b-630935d33748 ovn-installed in OVS
Dec 06 10:17:18 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap809f0ef4-0c: No such device
Dec 06 10:17:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:18.163 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:18 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:18Z|00197|binding|INFO|Setting lport 809f0ef4-0cca-474a-984b-630935d33748 up in Southbound
Dec 06 10:17:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:18.165 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:18 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap809f0ef4-0c: No such device
Dec 06 10:17:18 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap809f0ef4-0c: No such device
Dec 06 10:17:18 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap809f0ef4-0c: No such device
Dec 06 10:17:18 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap809f0ef4-0c: No such device
Dec 06 10:17:18 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap809f0ef4-0c: No such device
Dec 06 10:17:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:18.192 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:18.219 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:18.848 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:18 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:18.887 2 INFO neutron.agent.securitygroups_rpc [None req-a4d21316-b177-48b7-92ec-319ed42d1b0b 406e5cc53df749808a8770da68d7033d 64b9b91747c648148f6dd23ce81ceb80 - - default default] Security group member updated ['592d46d3-9a35-48d4-b6ca-ba1068626b4d']
Dec 06 10:17:19 np0005548789.localdomain podman[317489]: 
Dec 06 10:17:19 np0005548789.localdomain podman[317489]: 2025-12-06 10:17:19.071575915 +0000 UTC m=+0.088478575 container create bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e709cdf3-3894-4310-9fed-c1671aabae61, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:19 np0005548789.localdomain systemd[1]: Started libpod-conmon-bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629.scope.
Dec 06 10:17:19 np0005548789.localdomain systemd[1]: tmp-crun.d4ESPW.mount: Deactivated successfully.
Dec 06 10:17:19 np0005548789.localdomain podman[317489]: 2025-12-06 10:17:19.02856727 +0000 UTC m=+0.045469960 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:17:19 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:17:19 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f971ae176a97e0304dfc9b45d1512a47d408b01268faabd5b4da043741f7c056/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:19 np0005548789.localdomain podman[317489]: 2025-12-06 10:17:19.151681714 +0000 UTC m=+0.168584374 container init bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e709cdf3-3894-4310-9fed-c1671aabae61, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:17:19 np0005548789.localdomain podman[317489]: 2025-12-06 10:17:19.160350587 +0000 UTC m=+0.177253257 container start bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e709cdf3-3894-4310-9fed-c1671aabae61, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:17:19 np0005548789.localdomain dnsmasq[317507]: started, version 2.85 cachesize 150
Dec 06 10:17:19 np0005548789.localdomain dnsmasq[317507]: DNS service limited to local subnets
Dec 06 10:17:19 np0005548789.localdomain dnsmasq[317507]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:17:19 np0005548789.localdomain dnsmasq[317507]: warning: no upstream servers configured
Dec 06 10:17:19 np0005548789.localdomain dnsmasq-dhcp[317507]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:17:19 np0005548789.localdomain dnsmasq[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/addn_hosts - 0 addresses
Dec 06 10:17:19 np0005548789.localdomain dnsmasq-dhcp[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/host
Dec 06 10:17:19 np0005548789.localdomain dnsmasq-dhcp[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/opts
Dec 06 10:17:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:19.220 263652 INFO neutron.agent.dhcp.agent [None req-5affb593-f8b2-4f20-8c0e-96fe8e5951d1 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:17Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbecb20>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbec0d0>], id=b93a6690-afbd-469d-bbc9-a5932ffd807d, ip_allocation=immediate, mac_address=fa:16:3e:b3:34:ce, name=tempest-ExtraDHCPOptionsIpV6TestJSON-1911397592, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:15Z, description=, dns_domain=, id=e709cdf3-3894-4310-9fed-c1671aabae61, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-140163012, port_security_enabled=True, project_id=64b9b91747c648148f6dd23ce81ceb80, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54995, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1327, status=ACTIVE, subnets=['bb46347f-24c7-43e9-9180-2da434974c29'], tags=[], tenant_id=64b9b91747c648148f6dd23ce81ceb80, updated_at=2025-12-06T10:17:16Z, vlan_transparent=None, network_id=e709cdf3-3894-4310-9fed-c1671aabae61, port_security_enabled=True, project_id=64b9b91747c648148f6dd23ce81ceb80, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['592d46d3-9a35-48d4-b6ca-ba1068626b4d'], standard_attr_id=1350, status=DOWN, tags=[], tenant_id=64b9b91747c648148f6dd23ce81ceb80, updated_at=2025-12-06T10:17:17Z on network e709cdf3-3894-4310-9fed-c1671aabae61
Dec 06 10:17:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:19.317 263652 INFO neutron.agent.dhcp.agent [None req-0099a7fb-e025-473b-ad90-745e6e5c3e62 - - - - - -] DHCP configuration for ports {'b171ac9f-0dc2-448f-bf80-30471cfeee04'} is completed
Dec 06 10:17:19 np0005548789.localdomain dnsmasq[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/addn_hosts - 1 addresses
Dec 06 10:17:19 np0005548789.localdomain dnsmasq-dhcp[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/host
Dec 06 10:17:19 np0005548789.localdomain dnsmasq-dhcp[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/opts
Dec 06 10:17:19 np0005548789.localdomain podman[317524]: 2025-12-06 10:17:19.415120814 +0000 UTC m=+0.058336059 container kill bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e709cdf3-3894-4310-9fed-c1671aabae61, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:17:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:19.573 263652 INFO neutron.agent.dhcp.agent [None req-5affb593-f8b2-4f20-8c0e-96fe8e5951d1 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:18Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc063a0>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc33f10>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc06070>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc06700>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc33250>], id=105decea-a722-4708-998a-413f0ec23ccd, ip_allocation=immediate, mac_address=fa:16:3e:bf:1d:ef, name=tempest-ExtraDHCPOptionsIpV6TestJSON-611288529, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:15Z, description=, dns_domain=, id=e709cdf3-3894-4310-9fed-c1671aabae61, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-140163012, port_security_enabled=True, project_id=64b9b91747c648148f6dd23ce81ceb80, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54995, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1327, status=ACTIVE, subnets=['bb46347f-24c7-43e9-9180-2da434974c29'], tags=[], tenant_id=64b9b91747c648148f6dd23ce81ceb80, updated_at=2025-12-06T10:17:16Z, vlan_transparent=None, network_id=e709cdf3-3894-4310-9fed-c1671aabae61, port_security_enabled=True, project_id=64b9b91747c648148f6dd23ce81ceb80, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['592d46d3-9a35-48d4-b6ca-ba1068626b4d'], standard_attr_id=1356, status=DOWN, tags=[], tenant_id=64b9b91747c648148f6dd23ce81ceb80, updated_at=2025-12-06T10:17:18Z on network e709cdf3-3894-4310-9fed-c1671aabae61
Dec 06 10:17:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:19.594 263652 INFO neutron.agent.linux.dhcp [None req-5affb593-f8b2-4f20-8c0e-96fe8e5951d1 - - - - - -] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions
Dec 06 10:17:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:19.595 263652 INFO neutron.agent.linux.dhcp [None req-5affb593-f8b2-4f20-8c0e-96fe8e5951d1 - - - - - -] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions
Dec 06 10:17:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:19.595 263652 INFO neutron.agent.linux.dhcp [None req-5affb593-f8b2-4f20-8c0e-96fe8e5951d1 - - - - - -] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions
Dec 06 10:17:19 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:19.632 2 INFO neutron.agent.securitygroups_rpc [None req-de533a6a-08ae-42c2-b158-11c15e64ecbf 406e5cc53df749808a8770da68d7033d 64b9b91747c648148f6dd23ce81ceb80 - - default default] Security group member updated ['592d46d3-9a35-48d4-b6ca-ba1068626b4d']
Dec 06 10:17:19 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:19.699 2 INFO neutron.agent.securitygroups_rpc [req-5de42e7e-0662-4156-9401-22106a567059 req-ed4ee54f-d494-4326-9cfe-66d7201bb9f8 b4f9b4e4cabd4b079cb8c31c22004b7a 37dcf5204733427ebb8bdbe574dca584 - - default default] Security group rule updated ['9b6ed766-684e-4de1-9195-49dc13639cf2']
Dec 06 10:17:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:19.751 263652 INFO neutron.agent.dhcp.agent [None req-403a028f-089d-4a9b-b31d-59c08b8c8fc5 - - - - - -] DHCP configuration for ports {'b93a6690-afbd-469d-bbc9-a5932ffd807d'} is completed
Dec 06 10:17:19 np0005548789.localdomain dnsmasq[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/addn_hosts - 2 addresses
Dec 06 10:17:19 np0005548789.localdomain dnsmasq-dhcp[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/host
Dec 06 10:17:19 np0005548789.localdomain dnsmasq-dhcp[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/opts
Dec 06 10:17:19 np0005548789.localdomain podman[317562]: 2025-12-06 10:17:19.779090964 +0000 UTC m=+0.065736844 container kill bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e709cdf3-3894-4310-9fed-c1671aabae61, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:17:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:19.972 263652 INFO neutron.agent.dhcp.agent [None req-244ee3ae-dbbf-490b-86f7-19454eb179cb - - - - - -] DHCP configuration for ports {'105decea-a722-4708-998a-413f0ec23ccd'} is completed
Dec 06 10:17:19 np0005548789.localdomain sshd[317589]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:17:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:17:20 np0005548789.localdomain dnsmasq[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/addn_hosts - 1 addresses
Dec 06 10:17:20 np0005548789.localdomain dnsmasq-dhcp[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/host
Dec 06 10:17:20 np0005548789.localdomain dnsmasq-dhcp[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/opts
Dec 06 10:17:20 np0005548789.localdomain podman[317604]: 2025-12-06 10:17:20.144378545 +0000 UTC m=+0.071305305 container kill bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e709cdf3-3894-4310-9fed-c1671aabae61, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:17:20 np0005548789.localdomain ceph-mon[298582]: pgmap v208: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 83 KiB/s rd, 6.2 KiB/s wr, 114 op/s
Dec 06 10:17:20 np0005548789.localdomain podman[317610]: 2025-12-06 10:17:20.197012741 +0000 UTC m=+0.104335986 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller)
Dec 06 10:17:20 np0005548789.localdomain podman[317610]: 2025-12-06 10:17:20.22437187 +0000 UTC m=+0.131695115 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 10:17:20 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:17:20 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:20.355 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:17Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc4ab20>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd770d0>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd77310>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd77a90>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd77130>], id=b93a6690-afbd-469d-bbc9-a5932ffd807d, ip_allocation=immediate, mac_address=fa:16:3e:b3:34:ce, name=tempest-new-port-name-1100173450, network_id=e709cdf3-3894-4310-9fed-c1671aabae61, port_security_enabled=True, project_id=64b9b91747c648148f6dd23ce81ceb80, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['592d46d3-9a35-48d4-b6ca-ba1068626b4d'], standard_attr_id=1350, status=DOWN, tags=[], tenant_id=64b9b91747c648148f6dd23ce81ceb80, updated_at=2025-12-06T10:17:19Z on network e709cdf3-3894-4310-9fed-c1671aabae61
Dec 06 10:17:20 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:20.370 263652 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions
Dec 06 10:17:20 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:20.371 263652 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions
Dec 06 10:17:20 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:20.371 263652 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions
Dec 06 10:17:20 np0005548789.localdomain dnsmasq[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/addn_hosts - 1 addresses
Dec 06 10:17:20 np0005548789.localdomain dnsmasq-dhcp[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/host
Dec 06 10:17:20 np0005548789.localdomain podman[317669]: 2025-12-06 10:17:20.529171386 +0000 UTC m=+0.058950939 container kill bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e709cdf3-3894-4310-9fed-c1671aabae61, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:17:20 np0005548789.localdomain dnsmasq-dhcp[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/opts
Dec 06 10:17:20 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:20.530 2 INFO neutron.agent.securitygroups_rpc [req-0deed905-7e01-4df3-9b96-c6dd2bc740af req-aa574686-cd75-40ee-9098-a2781b4cfdf3 b4f9b4e4cabd4b079cb8c31c22004b7a 37dcf5204733427ebb8bdbe574dca584 - - default default] Security group rule updated ['9b6ed766-684e-4de1-9195-49dc13639cf2']
Dec 06 10:17:20 np0005548789.localdomain sshd[317589]: Received disconnect from 64.227.102.57 port 47086:11: Bye Bye [preauth]
Dec 06 10:17:20 np0005548789.localdomain sshd[317589]: Disconnected from authenticating user root 64.227.102.57 port 47086 [preauth]
Dec 06 10:17:20 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:20.618 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:17:20 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:20.833 263652 INFO neutron.agent.dhcp.agent [None req-ae631429-7ca7-4709-b5e7-03d9ac315beb - - - - - -] DHCP configuration for ports {'b93a6690-afbd-469d-bbc9-a5932ffd807d'} is completed
Dec 06 10:17:21 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:21.274 2 INFO neutron.agent.securitygroups_rpc [None req-a16f2b30-088a-4292-a104-7f6939a88353 406e5cc53df749808a8770da68d7033d 64b9b91747c648148f6dd23ce81ceb80 - - default default] Security group member updated ['592d46d3-9a35-48d4-b6ca-ba1068626b4d']
Dec 06 10:17:21 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:21.331 2 INFO neutron.agent.securitygroups_rpc [req-eb1d2fcf-8073-401d-9c1d-cc925d78bfca req-148b7f3e-4a54-4ceb-8b18-a63fa0926a26 b4f9b4e4cabd4b079cb8c31c22004b7a 37dcf5204733427ebb8bdbe574dca584 - - default default] Security group rule updated ['9b6ed766-684e-4de1-9195-49dc13639cf2']
Dec 06 10:17:21 np0005548789.localdomain dnsmasq[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/addn_hosts - 0 addresses
Dec 06 10:17:21 np0005548789.localdomain dnsmasq-dhcp[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/host
Dec 06 10:17:21 np0005548789.localdomain dnsmasq-dhcp[317507]: read /var/lib/neutron/dhcp/e709cdf3-3894-4310-9fed-c1671aabae61/opts
Dec 06 10:17:21 np0005548789.localdomain podman[317707]: 2025-12-06 10:17:21.488830284 +0000 UTC m=+0.065375454 container kill bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e709cdf3-3894-4310-9fed-c1671aabae61, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.095509) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016242095590, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 2578, "num_deletes": 264, "total_data_size": 3568876, "memory_usage": 3624720, "flush_reason": "Manual Compaction"}
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016242108432, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 2275089, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21900, "largest_seqno": 24472, "table_properties": {"data_size": 2265916, "index_size": 5678, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 20684, "raw_average_key_size": 21, "raw_value_size": 2246918, "raw_average_value_size": 2321, "num_data_blocks": 247, "num_entries": 968, "num_filter_entries": 968, "num_deletions": 264, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016079, "oldest_key_time": 1765016079, "file_creation_time": 1765016242, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 12967 microseconds, and 6082 cpu microseconds.
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.108483) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 2275089 bytes OK
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.108506) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.110155) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.110176) EVENT_LOG_v1 {"time_micros": 1765016242110170, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.110199) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 3557478, prev total WAL file size 3557478, number of live WAL files 2.
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.111176) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. '7061786F73003132303439' seq:0, type:0; will stop at (end)
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(2221KB)], [36(18MB)]
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016242111228, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 21558254, "oldest_snapshot_seqno": -1}
Dec 06 10:17:22 np0005548789.localdomain dnsmasq[317507]: exiting on receipt of SIGTERM
Dec 06 10:17:22 np0005548789.localdomain podman[317743]: 2025-12-06 10:17:22.151020069 +0000 UTC m=+0.062532267 container kill bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e709cdf3-3894-4310-9fed-c1671aabae61, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:17:22 np0005548789.localdomain systemd[1]: libpod-bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629.scope: Deactivated successfully.
Dec 06 10:17:22 np0005548789.localdomain podman[317757]: 2025-12-06 10:17:22.205593905 +0000 UTC m=+0.046164072 container died bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e709cdf3-3894-4310-9fed-c1671aabae61, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 12583 keys, 17569151 bytes, temperature: kUnknown
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016242206254, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 17569151, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17497684, "index_size": 38918, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31493, "raw_key_size": 336683, "raw_average_key_size": 26, "raw_value_size": 17283747, "raw_average_value_size": 1373, "num_data_blocks": 1480, "num_entries": 12583, "num_filter_entries": 12583, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765016242, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.206681) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 17569151 bytes
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.209528) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 226.5 rd, 184.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 18.4 +0.0 blob) out(16.8 +0.0 blob), read-write-amplify(17.2) write-amplify(7.7) OK, records in: 13121, records dropped: 538 output_compression: NoCompression
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.209570) EVENT_LOG_v1 {"time_micros": 1765016242209551, "job": 20, "event": "compaction_finished", "compaction_time_micros": 95186, "compaction_time_cpu_micros": 46164, "output_level": 6, "num_output_files": 1, "total_output_size": 17569151, "num_input_records": 13121, "num_output_records": 12583, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016242210163, "job": 20, "event": "table_file_deletion", "file_number": 38}
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016242213743, "job": 20, "event": "table_file_deletion", "file_number": 36}
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.111050) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.213844) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.213850) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.213852) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.213854) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:17:22.213856) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:22 np0005548789.localdomain ceph-mon[298582]: pgmap v209: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 5.5 KiB/s wr, 101 op/s
Dec 06 10:17:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:22 np0005548789.localdomain podman[317757]: 2025-12-06 10:17:22.243431802 +0000 UTC m=+0.084001889 container cleanup bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e709cdf3-3894-4310-9fed-c1671aabae61, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:17:22 np0005548789.localdomain systemd[1]: libpod-conmon-bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629.scope: Deactivated successfully.
Dec 06 10:17:22 np0005548789.localdomain podman[317764]: 2025-12-06 10:17:22.313273331 +0000 UTC m=+0.139347288 container remove bdd630025009e2e44fbf04be47c5d7ffaef46590b0c0efcadf90ee243c00b629 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e709cdf3-3894-4310-9fed-c1671aabae61, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:17:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:22.359 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:22 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:22Z|00198|binding|INFO|Releasing lport 809f0ef4-0cca-474a-984b-630935d33748 from this chassis (sb_readonly=0)
Dec 06 10:17:22 np0005548789.localdomain kernel: device tap809f0ef4-0c left promiscuous mode
Dec 06 10:17:22 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:22Z|00199|binding|INFO|Setting lport 809f0ef4-0cca-474a-984b-630935d33748 down in Southbound
Dec 06 10:17:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:22.380 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:22 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f971ae176a97e0304dfc9b45d1512a47d408b01268faabd5b4da043741f7c056-merged.mount: Deactivated successfully.
Dec 06 10:17:22 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:22.491 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-e709cdf3-3894-4310-9fed-c1671aabae61', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e709cdf3-3894-4310-9fed-c1671aabae61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '64b9b91747c648148f6dd23ce81ceb80', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd31e5c5-52c8-4ae1-8e71-d675fcdc4430, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=809f0ef4-0cca-474a-984b-630935d33748) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:22 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:22.493 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 809f0ef4-0cca-474a-984b-630935d33748 in datapath e709cdf3-3894-4310-9fed-c1671aabae61 unbound from our chassis
Dec 06 10:17:22 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:22.494 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e709cdf3-3894-4310-9fed-c1671aabae61 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:22 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:22.495 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[a69111f9-d8c9-4ae8-9c7f-07f4ed97ac54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:22.573 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:22 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:22.706 263652 INFO neutron.agent.dhcp.agent [None req-747ed302-3a3d-4852-ad67-7e34b8bd675e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:22 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:22.707 263652 INFO neutron.agent.dhcp.agent [None req-747ed302-3a3d-4852-ad67-7e34b8bd675e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:22 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2de709cdf3\x2d3894\x2d4310\x2d9fed\x2dc1671aabae61.mount: Deactivated successfully.
Dec 06 10:17:23 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:23.002 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:23 np0005548789.localdomain sshd[317325]: error: kex_exchange_identification: read: Connection timed out
Dec 06 10:17:23 np0005548789.localdomain sshd[317325]: banner exchange: Connection from 123.160.164.187 port 42188: Connection timed out
Dec 06 10:17:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:23Z|00200|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:17:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:23.305 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:23.850 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:17:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:17:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:17:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:17:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:17:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19266 "" "Go-http-client/1.1"
Dec 06 10:17:24 np0005548789.localdomain ceph-mon[298582]: pgmap v210: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:26 np0005548789.localdomain ceph-mon[298582]: pgmap v211: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:26 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:26Z|00201|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:17:26 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:17:26 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:26 np0005548789.localdomain podman[317807]: 2025-12-06 10:17:26.440833666 +0000 UTC m=+0.070120148 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:17:26 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:26.447 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:27.576 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:28 np0005548789.localdomain ceph-mon[298582]: pgmap v212: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:28.862 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:29 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:29.499 2 INFO neutron.agent.securitygroups_rpc [None req-b4a3dd75-3886-433c-a68a-5b82ba491223 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:29 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:29.934 263652 INFO neutron.agent.linux.ip_lib [None req-5a0982c0-0ffd-4ef1-87ce-b364c336c465 - - - - - -] Device tap1154309d-20 cannot be used as it has no MAC address
Dec 06 10:17:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:29.991 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:29 np0005548789.localdomain kernel: device tap1154309d-20 entered promiscuous mode
Dec 06 10:17:30 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016250.0001] manager: (tap1154309d-20): new Generic device (/org/freedesktop/NetworkManager/Devices/35)
Dec 06 10:17:30 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:30Z|00202|binding|INFO|Claiming lport 1154309d-2092-44e6-a8a3-8b5f18384543 for this chassis.
Dec 06 10:17:30 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:30Z|00203|binding|INFO|1154309d-2092-44e6-a8a3-8b5f18384543: Claiming unknown
Dec 06 10:17:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:30.001 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:30 np0005548789.localdomain systemd-udevd[317838]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:17:30 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:30.015 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=1154309d-2092-44e6-a8a3-8b5f18384543) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:30 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:30.018 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 1154309d-2092-44e6-a8a3-8b5f18384543 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:17:30 np0005548789.localdomain systemd-journald[47810]: Data hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.0 (53723 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Dec 06 10:17:30 np0005548789.localdomain systemd-journald[47810]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 10:17:30 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 10:17:30 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:30.020 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:30 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:30.024 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[a6c46ac7-5337-4d33-904d-adf6d24c9cef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:30 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1154309d-20: No such device
Dec 06 10:17:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:30.032 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:30 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1154309d-20: No such device
Dec 06 10:17:30 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:30Z|00204|binding|INFO|Setting lport 1154309d-2092-44e6-a8a3-8b5f18384543 ovn-installed in OVS
Dec 06 10:17:30 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:30Z|00205|binding|INFO|Setting lport 1154309d-2092-44e6-a8a3-8b5f18384543 up in Southbound
Dec 06 10:17:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:30.036 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:30 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1154309d-20: No such device
Dec 06 10:17:30 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1154309d-20: No such device
Dec 06 10:17:30 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1154309d-20: No such device
Dec 06 10:17:30 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1154309d-20: No such device
Dec 06 10:17:30 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1154309d-20: No such device
Dec 06 10:17:30 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1154309d-20: No such device
Dec 06 10:17:30 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:30.064 2 INFO neutron.agent.securitygroups_rpc [None req-77263169-ab43-473e-a592-07200b19e18c a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:30.076 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:30.107 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:30 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 10:17:30 np0005548789.localdomain ceph-mon[298582]: pgmap v213: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:30 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:30.255 2 INFO neutron.agent.securitygroups_rpc [None req-b0fdf288-4ef8-4212-8aee-98bfee473c24 8eeb1ce8ea6f4981a55c23fbea57f4cb f9595f0635f14c2196533c0f5ee5dc3b - - default default] Security group member updated ['cab1d39e-aba5-4938-880e-87b80fed90d0']
Dec 06 10:17:30 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:30.321 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:29Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbdc3a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc5f970>], id=7ea19ce8-d67f-4375-adea-622ee0a8cf03, ip_allocation=immediate, mac_address=fa:16:3e:38:89:d8, name=tempest-RoutersAdminNegativeIpV6Test-251011938, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=True, project_id=f9595f0635f14c2196533c0f5ee5dc3b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['cab1d39e-aba5-4938-880e-87b80fed90d0'], standard_attr_id=1437, status=DOWN, tags=[], tenant_id=f9595f0635f14c2196533c0f5ee5dc3b, updated_at=2025-12-06T10:17:29Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:17:30 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:17:30 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:30 np0005548789.localdomain podman[317902]: 2025-12-06 10:17:30.555148231 +0000 UTC m=+0.055501425 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:17:30 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:30 np0005548789.localdomain podman[317947]: 
Dec 06 10:17:30 np0005548789.localdomain podman[317947]: 2025-12-06 10:17:30.908427596 +0000 UTC m=+0.090424833 container create fcd5fcaf1d18edd42f83cf2c3f5954ccc0712db0917ab07da489c141bedf3a44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:17:30 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:30.913 263652 INFO neutron.agent.dhcp.agent [None req-fe2f1bd1-05bf-479e-85ed-4391ea10d1cc - - - - - -] DHCP configuration for ports {'7ea19ce8-d67f-4375-adea-622ee0a8cf03'} is completed
Dec 06 10:17:30 np0005548789.localdomain systemd[1]: Started libpod-conmon-fcd5fcaf1d18edd42f83cf2c3f5954ccc0712db0917ab07da489c141bedf3a44.scope.
Dec 06 10:17:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:17:30 np0005548789.localdomain podman[317947]: 2025-12-06 10:17:30.8633832 +0000 UTC m=+0.045380477 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:17:30 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:17:30 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/319bd8eac1224c0d5ec3129b67444d61663b4bfc2a5a1b52547071c047a6e30a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:30 np0005548789.localdomain podman[317947]: 2025-12-06 10:17:30.998424056 +0000 UTC m=+0.180421293 container init fcd5fcaf1d18edd42f83cf2c3f5954ccc0712db0917ab07da489c141bedf3a44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 06 10:17:31 np0005548789.localdomain podman[317947]: 2025-12-06 10:17:31.01174498 +0000 UTC m=+0.193742217 container start fcd5fcaf1d18edd42f83cf2c3f5954ccc0712db0917ab07da489c141bedf3a44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 06 10:17:31 np0005548789.localdomain dnsmasq[317974]: started, version 2.85 cachesize 150
Dec 06 10:17:31 np0005548789.localdomain dnsmasq[317974]: DNS service limited to local subnets
Dec 06 10:17:31 np0005548789.localdomain dnsmasq[317974]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:17:31 np0005548789.localdomain dnsmasq[317974]: warning: no upstream servers configured
Dec 06 10:17:31 np0005548789.localdomain dnsmasq-dhcp[317974]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:17:31 np0005548789.localdomain dnsmasq[317974]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:17:31 np0005548789.localdomain dnsmasq-dhcp[317974]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:31 np0005548789.localdomain dnsmasq-dhcp[317974]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:17:31 np0005548789.localdomain podman[317963]: 2025-12-06 10:17:31.086679913 +0000 UTC m=+0.114853064 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:17:31 np0005548789.localdomain podman[317963]: 2025-12-06 10:17:31.097077698 +0000 UTC m=+0.125250839 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:17:31 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:17:31 np0005548789.localdomain podman[317982]: 2025-12-06 10:17:31.170680461 +0000 UTC m=+0.074720218 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 10:17:31 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:31.182 263652 INFO neutron.agent.dhcp.agent [None req-26ffc36e-4a76-46e7-8151-c8233f3843fd - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:17:31 np0005548789.localdomain podman[317982]: 2025-12-06 10:17:31.207209739 +0000 UTC m=+0.111249526 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:17:31 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:17:31 np0005548789.localdomain dnsmasq[317974]: exiting on receipt of SIGTERM
Dec 06 10:17:31 np0005548789.localdomain podman[318021]: 2025-12-06 10:17:31.345648078 +0000 UTC m=+0.049712969 container kill fcd5fcaf1d18edd42f83cf2c3f5954ccc0712db0917ab07da489c141bedf3a44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:17:31 np0005548789.localdomain systemd[1]: libpod-fcd5fcaf1d18edd42f83cf2c3f5954ccc0712db0917ab07da489c141bedf3a44.scope: Deactivated successfully.
Dec 06 10:17:31 np0005548789.localdomain podman[318035]: 2025-12-06 10:17:31.415337842 +0000 UTC m=+0.054245746 container died fcd5fcaf1d18edd42f83cf2c3f5954ccc0712db0917ab07da489c141bedf3a44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:17:31 np0005548789.localdomain podman[318035]: 2025-12-06 10:17:31.447023562 +0000 UTC m=+0.085931426 container cleanup fcd5fcaf1d18edd42f83cf2c3f5954ccc0712db0917ab07da489c141bedf3a44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:17:31 np0005548789.localdomain systemd[1]: libpod-conmon-fcd5fcaf1d18edd42f83cf2c3f5954ccc0712db0917ab07da489c141bedf3a44.scope: Deactivated successfully.
Dec 06 10:17:31 np0005548789.localdomain podman[318036]: 2025-12-06 10:17:31.497804884 +0000 UTC m=+0.129344365 container remove fcd5fcaf1d18edd42f83cf2c3f5954ccc0712db0917ab07da489c141bedf3a44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:17:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:31.510 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:31 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:31Z|00206|binding|INFO|Releasing lport 1154309d-2092-44e6-a8a3-8b5f18384543 from this chassis (sb_readonly=0)
Dec 06 10:17:31 np0005548789.localdomain kernel: device tap1154309d-20 left promiscuous mode
Dec 06 10:17:31 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:31Z|00207|binding|INFO|Setting lport 1154309d-2092-44e6-a8a3-8b5f18384543 down in Southbound
Dec 06 10:17:31 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:31.518 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=1154309d-2092-44e6-a8a3-8b5f18384543) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:31 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:31.519 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 1154309d-2092-44e6-a8a3-8b5f18384543 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:17:31 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:31.519 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:31 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:31.520 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[996f7547-2ddc-447a-bd0a-108384b6081d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:31.534 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:31 np0005548789.localdomain sshd[318063]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:17:31 np0005548789.localdomain systemd[1]: tmp-crun.CpQLbr.mount: Deactivated successfully.
Dec 06 10:17:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-319bd8eac1224c0d5ec3129b67444d61663b4bfc2a5a1b52547071c047a6e30a-merged.mount: Deactivated successfully.
Dec 06 10:17:31 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fcd5fcaf1d18edd42f83cf2c3f5954ccc0712db0917ab07da489c141bedf3a44-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:32 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:17:32 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:32.191 2 INFO neutron.agent.securitygroups_rpc [None req-a2daea0b-127d-4cb1-8d58-679cf0ec3092 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:32 np0005548789.localdomain ceph-mon[298582]: pgmap v214: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:32.577 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:32 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:32.682 2 INFO neutron.agent.securitygroups_rpc [None req-a950d9cb-4b90-43c7-9619-4f314921acec 8eeb1ce8ea6f4981a55c23fbea57f4cb f9595f0635f14c2196533c0f5ee5dc3b - - default default] Security group member updated ['cab1d39e-aba5-4938-880e-87b80fed90d0']
Dec 06 10:17:32 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:32.826 263652 INFO neutron.agent.linux.ip_lib [None req-eb7b53fc-c777-40a1-97d1-51b2015d260d - - - - - -] Device tap097310b2-f2 cannot be used as it has no MAC address
Dec 06 10:17:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:32.852 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:32 np0005548789.localdomain kernel: device tap097310b2-f2 entered promiscuous mode
Dec 06 10:17:32 np0005548789.localdomain systemd-udevd[317841]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:17:32 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016252.8625] manager: (tap097310b2-f2): new Generic device (/org/freedesktop/NetworkManager/Devices/36)
Dec 06 10:17:32 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:32Z|00208|binding|INFO|Claiming lport 097310b2-f25c-43e0-9a4c-c7a1efaf80e5 for this chassis.
Dec 06 10:17:32 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:32Z|00209|binding|INFO|097310b2-f25c-43e0-9a4c-c7a1efaf80e5: Claiming unknown
Dec 06 10:17:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:32.866 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:32 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:32Z|00210|binding|INFO|Setting lport 097310b2-f25c-43e0-9a4c-c7a1efaf80e5 up in Southbound
Dec 06 10:17:32 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:32.870 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=097310b2-f25c-43e0-9a4c-c7a1efaf80e5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:32 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:32Z|00211|binding|INFO|Setting lport 097310b2-f25c-43e0-9a4c-c7a1efaf80e5 ovn-installed in OVS
Dec 06 10:17:32 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:32.872 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 097310b2-f25c-43e0-9a4c-c7a1efaf80e5 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:17:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:32.871 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:32.872 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:32 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:32.874 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:32 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:32.875 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[e99d277b-ee7e-4ac4-9c6e-4dd99f200c8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:32.877 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:32.897 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:32.928 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:32.952 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:32 np0005548789.localdomain podman[318093]: 2025-12-06 10:17:32.954739175 +0000 UTC m=+0.058274679 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:17:32 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:17:32 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:32 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:33 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:33.078 2 INFO neutron.agent.securitygroups_rpc [None req-675c08cc-007c-4dc9-986b-f4514913c9a2 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:33 np0005548789.localdomain sshd[318063]: Received disconnect from 14.194.101.210 port 56166:11: Bye Bye [preauth]
Dec 06 10:17:33 np0005548789.localdomain sshd[318063]: Disconnected from authenticating user root 14.194.101.210 port 56166 [preauth]
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.204 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.205 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.206 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.206 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.207 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:17:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:17:33 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1151500263' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.599 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.392s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:17:33 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:33.675 2 INFO neutron.agent.securitygroups_rpc [None req-2bf571b9-2f59-4b7c-8546-bb481f9be7b1 3ea76362796945abb0389f60eab07566 23fdd860878442e1b8fc77e4ae3ef271 - - default default] Security group member updated ['dd9785c1-eb5d-4293-ac78-0fc1ce108f20']
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.689 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.690 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:17:33 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:33.733 263652 INFO neutron.agent.linux.ip_lib [None req-0f9ee326-3f39-460e-b859-ae70c9c792d7 - - - - - -] Device tapfd998f59-dd cannot be used as it has no MAC address
Dec 06 10:17:33 np0005548789.localdomain podman[318193]: 
Dec 06 10:17:33 np0005548789.localdomain podman[318193]: 2025-12-06 10:17:33.769798207 +0000 UTC m=+0.079634957 container create abdb41519be4cbaad7fa091b9fa225772f0e04a0030c5b35bf7043b4fcf8ed1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.797 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:33 np0005548789.localdomain kernel: device tapfd998f59-dd entered promiscuous mode
Dec 06 10:17:33 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:33Z|00212|binding|INFO|Claiming lport fd998f59-ddde-4bfa-95a4-6f61b1679474 for this chassis.
Dec 06 10:17:33 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:33Z|00213|binding|INFO|fd998f59-ddde-4bfa-95a4-6f61b1679474: Claiming unknown
Dec 06 10:17:33 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016253.8093] manager: (tapfd998f59-dd): new Generic device (/org/freedesktop/NetworkManager/Devices/37)
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.810 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:33.820 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-fb8c7162-302b-4277-a437-7090f604bfc2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb8c7162-302b-4277-a437-7090f604bfc2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '23fdd860878442e1b8fc77e4ae3ef271', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23107f01-722b-406d-a1a5-a58a3fd6433e, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=fd998f59-ddde-4bfa-95a4-6f61b1679474) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:33.824 160509 INFO neutron.agent.ovn.metadata.agent [-] Port fd998f59-ddde-4bfa-95a4-6f61b1679474 in datapath fb8c7162-302b-4277-a437-7090f604bfc2 bound to our chassis
Dec 06 10:17:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:33.826 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 84c07f9d-e9b9-4723-baa3-f24a875f62ef IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:17:33 np0005548789.localdomain podman[318193]: 2025-12-06 10:17:33.727937407 +0000 UTC m=+0.037774157 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:17:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:33.827 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fb8c7162-302b-4277-a437-7090f604bfc2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:17:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:33.828 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[72664c25-416a-411e-8388-4e3de35d4cb6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:33 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:33Z|00214|binding|INFO|Setting lport fd998f59-ddde-4bfa-95a4-6f61b1679474 ovn-installed in OVS
Dec 06 10:17:33 np0005548789.localdomain systemd[1]: Started libpod-conmon-abdb41519be4cbaad7fa091b9fa225772f0e04a0030c5b35bf7043b4fcf8ed1d.scope.
Dec 06 10:17:33 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:33Z|00215|binding|INFO|Setting lport fd998f59-ddde-4bfa-95a4-6f61b1679474 up in Southbound
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.853 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.864 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:33 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.889 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:33 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8258fd64577ef554a8e4f45c1379598573b236be0e821bee3caa7d20e35b8f8d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:33 np0005548789.localdomain podman[318193]: 2025-12-06 10:17:33.902165151 +0000 UTC m=+0.212001921 container init abdb41519be4cbaad7fa091b9fa225772f0e04a0030c5b35bf7043b4fcf8ed1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:17:33 np0005548789.localdomain podman[318193]: 2025-12-06 10:17:33.909616017 +0000 UTC m=+0.219452787 container start abdb41519be4cbaad7fa091b9fa225772f0e04a0030c5b35bf7043b4fcf8ed1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:17:33 np0005548789.localdomain dnsmasq[318222]: started, version 2.85 cachesize 150
Dec 06 10:17:33 np0005548789.localdomain dnsmasq[318222]: DNS service limited to local subnets
Dec 06 10:17:33 np0005548789.localdomain dnsmasq[318222]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:17:33 np0005548789.localdomain dnsmasq[318222]: warning: no upstream servers configured
Dec 06 10:17:33 np0005548789.localdomain dnsmasq[318222]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 1 addresses
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.917 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.981 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.982 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11215MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.983 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:17:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:33.983 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:17:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:34.040 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:17:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:34.040 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:17:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:34.040 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:17:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:34.092 263652 INFO neutron.agent.dhcp.agent [None req-f71282a2-6faf-4109-9a6a-39296595016c - - - - - -] DHCP configuration for ports {'5bada2e5-c44e-42db-929a-1fcf2ed4098d', '687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:17:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:34.100 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:17:34 np0005548789.localdomain ceph-mon[298582]: pgmap v215: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:34 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1151500263' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:34 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:34.248 2 INFO neutron.agent.securitygroups_rpc [None req-f3a7982c-6432-4aaa-a51f-6f45752d4aa1 440e57a58b9f4b64af7435927930ce6a 37eea2b31d9543b793c928d777810de4 - - default default] Security group member updated ['5bf6ab1c-c80a-456c-9ce8-d446d055d129']
Dec 06 10:17:34 np0005548789.localdomain podman[318254]: 2025-12-06 10:17:34.299493253 +0000 UTC m=+0.081950647 container kill abdb41519be4cbaad7fa091b9fa225772f0e04a0030c5b35bf7043b4fcf8ed1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:17:34 np0005548789.localdomain systemd[1]: tmp-crun.LL7E8X.mount: Deactivated successfully.
Dec 06 10:17:34 np0005548789.localdomain dnsmasq[318222]: exiting on receipt of SIGTERM
Dec 06 10:17:34 np0005548789.localdomain systemd[1]: libpod-abdb41519be4cbaad7fa091b9fa225772f0e04a0030c5b35bf7043b4fcf8ed1d.scope: Deactivated successfully.
Dec 06 10:17:34 np0005548789.localdomain podman[318296]: 2025-12-06 10:17:34.408643434 +0000 UTC m=+0.081668419 container died abdb41519be4cbaad7fa091b9fa225772f0e04a0030c5b35bf7043b4fcf8ed1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:17:34 np0005548789.localdomain podman[318296]: 2025-12-06 10:17:34.448961976 +0000 UTC m=+0.121986931 container remove abdb41519be4cbaad7fa091b9fa225772f0e04a0030c5b35bf7043b4fcf8ed1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:17:34 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:34Z|00216|binding|INFO|Releasing lport 097310b2-f25c-43e0-9a4c-c7a1efaf80e5 from this chassis (sb_readonly=0)
Dec 06 10:17:34 np0005548789.localdomain kernel: device tap097310b2-f2 left promiscuous mode
Dec 06 10:17:34 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:34Z|00217|binding|INFO|Setting lport 097310b2-f25c-43e0-9a4c-c7a1efaf80e5 down in Southbound
Dec 06 10:17:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:34.459 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:34 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:34.469 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=097310b2-f25c-43e0-9a4c-c7a1efaf80e5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:34 np0005548789.localdomain sshd[318323]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:17:34 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:34.473 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 097310b2-f25c-43e0-9a4c-c7a1efaf80e5 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:17:34 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:34.474 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:34 np0005548789.localdomain systemd[1]: libpod-conmon-abdb41519be4cbaad7fa091b9fa225772f0e04a0030c5b35bf7043b4fcf8ed1d.scope: Deactivated successfully.
Dec 06 10:17:34 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:34.475 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[dad0dc18-ab92-4cef-8e44-993fe5fc3211]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:34.478 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:34.586 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:17:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:34.593 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:17:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:34.617 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:17:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:34.620 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:17:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:34.621 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:17:34 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:34.797 2 INFO neutron.agent.securitygroups_rpc [None req-b1db9883-f5c1-471b-9a07-cebf6b7ffba6 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:34 np0005548789.localdomain podman[318349]: 
Dec 06 10:17:34 np0005548789.localdomain podman[318349]: 2025-12-06 10:17:34.851019511 +0000 UTC m=+0.094169727 container create c4c8ff188d2142423ad18d025ac974ba3c2dd4c9e1be9e72713cefc79062138c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb8c7162-302b-4277-a437-7090f604bfc2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:34 np0005548789.localdomain systemd[1]: Started libpod-conmon-c4c8ff188d2142423ad18d025ac974ba3c2dd4c9e1be9e72713cefc79062138c.scope.
Dec 06 10:17:34 np0005548789.localdomain podman[318349]: 2025-12-06 10:17:34.806889703 +0000 UTC m=+0.050039959 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:17:34 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:17:34 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e8ffbe4ba68729562762de2b34e92b94090dcca76e856e5cd87d912c117f07d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:34 np0005548789.localdomain podman[318349]: 2025-12-06 10:17:34.936683609 +0000 UTC m=+0.179833835 container init c4c8ff188d2142423ad18d025ac974ba3c2dd4c9e1be9e72713cefc79062138c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb8c7162-302b-4277-a437-7090f604bfc2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:17:34 np0005548789.localdomain podman[318349]: 2025-12-06 10:17:34.945243019 +0000 UTC m=+0.188393235 container start c4c8ff188d2142423ad18d025ac974ba3c2dd4c9e1be9e72713cefc79062138c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb8c7162-302b-4277-a437-7090f604bfc2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:17:34 np0005548789.localdomain dnsmasq[318368]: started, version 2.85 cachesize 150
Dec 06 10:17:34 np0005548789.localdomain dnsmasq[318368]: DNS service limited to local subnets
Dec 06 10:17:34 np0005548789.localdomain dnsmasq[318368]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:17:34 np0005548789.localdomain dnsmasq[318368]: warning: no upstream servers configured
Dec 06 10:17:34 np0005548789.localdomain dnsmasq-dhcp[318368]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:17:34 np0005548789.localdomain dnsmasq[318368]: read /var/lib/neutron/dhcp/fb8c7162-302b-4277-a437-7090f604bfc2/addn_hosts - 0 addresses
Dec 06 10:17:34 np0005548789.localdomain dnsmasq-dhcp[318368]: read /var/lib/neutron/dhcp/fb8c7162-302b-4277-a437-7090f604bfc2/host
Dec 06 10:17:34 np0005548789.localdomain dnsmasq-dhcp[318368]: read /var/lib/neutron/dhcp/fb8c7162-302b-4277-a437-7090f604bfc2/opts
Dec 06 10:17:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-8258fd64577ef554a8e4f45c1379598573b236be0e821bee3caa7d20e35b8f8d-merged.mount: Deactivated successfully.
Dec 06 10:17:34 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-abdb41519be4cbaad7fa091b9fa225772f0e04a0030c5b35bf7043b4fcf8ed1d-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:34 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:17:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:35.076 263652 INFO neutron.agent.dhcp.agent [None req-b61c0392-90c1-45fa-b424-14f5a8b16f60 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:33Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb030a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb03c70>], id=b850c0d9-77c9-4dd5-9ad4-2e5a440a1ba5, ip_allocation=immediate, mac_address=fa:16:3e:65:f6:03, name=tempest-TagsExtTest-884351016, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:29Z, description=, dns_domain=, id=fb8c7162-302b-4277-a437-7090f604bfc2, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TagsExtTest-test-network-640583948, port_security_enabled=True, project_id=23fdd860878442e1b8fc77e4ae3ef271, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45649, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1436, status=ACTIVE, subnets=['1af85c38-1fa4-4964-bb73-fcbd8bb9b651'], tags=[], tenant_id=23fdd860878442e1b8fc77e4ae3ef271, updated_at=2025-12-06T10:17:31Z, vlan_transparent=None, network_id=fb8c7162-302b-4277-a437-7090f604bfc2, port_security_enabled=True, project_id=23fdd860878442e1b8fc77e4ae3ef271, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['dd9785c1-eb5d-4293-ac78-0fc1ce108f20'], standard_attr_id=1457, status=DOWN, tags=[], tenant_id=23fdd860878442e1b8fc77e4ae3ef271, updated_at=2025-12-06T10:17:33Z on network fb8c7162-302b-4277-a437-7090f604bfc2
Dec 06 10:17:35 np0005548789.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 06 10:17:35 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:35.197 2 INFO neutron.agent.securitygroups_rpc [None req-5b990af0-9142-4008-b949-8f1c6c9fa9d7 440e57a58b9f4b64af7435927930ce6a 37eea2b31d9543b793c928d777810de4 - - default default] Security group member updated ['5bf6ab1c-c80a-456c-9ce8-d446d055d129']
Dec 06 10:17:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:35.232 263652 INFO neutron.agent.dhcp.agent [None req-8b3f73e4-3f94-4385-9ba9-30d1f570a429 - - - - - -] DHCP configuration for ports {'132f8120-ff81-4b64-9ef9-4b612c95da6c'} is completed
Dec 06 10:17:35 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2930003635' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:35 np0005548789.localdomain dnsmasq[318368]: read /var/lib/neutron/dhcp/fb8c7162-302b-4277-a437-7090f604bfc2/addn_hosts - 1 addresses
Dec 06 10:17:35 np0005548789.localdomain dnsmasq-dhcp[318368]: read /var/lib/neutron/dhcp/fb8c7162-302b-4277-a437-7090f604bfc2/host
Dec 06 10:17:35 np0005548789.localdomain dnsmasq-dhcp[318368]: read /var/lib/neutron/dhcp/fb8c7162-302b-4277-a437-7090f604bfc2/opts
Dec 06 10:17:35 np0005548789.localdomain podman[318387]: 2025-12-06 10:17:35.297657439 +0000 UTC m=+0.070495739 container kill c4c8ff188d2142423ad18d025ac974ba3c2dd4c9e1be9e72713cefc79062138c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb8c7162-302b-4277-a437-7090f604bfc2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:17:35 np0005548789.localdomain systemd[1]: tmp-crun.e9PE2o.mount: Deactivated successfully.
Dec 06 10:17:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:35.505 263652 INFO neutron.agent.dhcp.agent [None req-6ab2c6ed-7134-46a9-9aa7-7ca216e2a1e7 - - - - - -] DHCP configuration for ports {'b850c0d9-77c9-4dd5-9ad4-2e5a440a1ba5'} is completed
Dec 06 10:17:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:35.610 263652 INFO neutron.agent.linux.ip_lib [None req-a4ad59cc-3b69-4ecb-9f10-469052654f2c - - - - - -] Device tap5e36b702-7f cannot be used as it has no MAC address
Dec 06 10:17:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:35.639 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:35 np0005548789.localdomain kernel: device tap5e36b702-7f entered promiscuous mode
Dec 06 10:17:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:35Z|00218|binding|INFO|Claiming lport 5e36b702-7f25-4b55-969a-7996ee55fcd1 for this chassis.
Dec 06 10:17:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:35Z|00219|binding|INFO|5e36b702-7f25-4b55-969a-7996ee55fcd1: Claiming unknown
Dec 06 10:17:35 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016255.6503] manager: (tap5e36b702-7f): new Generic device (/org/freedesktop/NetworkManager/Devices/38)
Dec 06 10:17:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:35.649 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:35 np0005548789.localdomain sshd[318323]: Received disconnect from 154.113.10.34 port 50740:11: Bye Bye [preauth]
Dec 06 10:17:35 np0005548789.localdomain sshd[318323]: Disconnected from authenticating user root 154.113.10.34 port 50740 [preauth]
Dec 06 10:17:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:35.660 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=5e36b702-7f25-4b55-969a-7996ee55fcd1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:35Z|00220|binding|INFO|Setting lport 5e36b702-7f25-4b55-969a-7996ee55fcd1 ovn-installed in OVS
Dec 06 10:17:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:35Z|00221|binding|INFO|Setting lport 5e36b702-7f25-4b55-969a-7996ee55fcd1 up in Southbound
Dec 06 10:17:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:35.663 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 5e36b702-7f25-4b55-969a-7996ee55fcd1 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:17:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:35.664 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:35.664 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:35.665 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[7ce52bab-fc70-4e13-beef-8c8714005f58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:35.676 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap5e36b702-7f: No such device
Dec 06 10:17:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap5e36b702-7f: No such device
Dec 06 10:17:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap5e36b702-7f: No such device
Dec 06 10:17:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap5e36b702-7f: No such device
Dec 06 10:17:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap5e36b702-7f: No such device
Dec 06 10:17:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap5e36b702-7f: No such device
Dec 06 10:17:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap5e36b702-7f: No such device
Dec 06 10:17:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap5e36b702-7f: No such device
Dec 06 10:17:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:35.712 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:35.741 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:35 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:35.856 2 INFO neutron.agent.securitygroups_rpc [None req-a5058513-5128-4405-b292-62b6045d3f2a a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e132 e132: 6 total, 6 up, 6 in
Dec 06 10:17:36 np0005548789.localdomain ceph-mon[298582]: pgmap v216: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:36 np0005548789.localdomain podman[318487]: 
Dec 06 10:17:36 np0005548789.localdomain podman[318487]: 2025-12-06 10:17:36.596244336 +0000 UTC m=+0.093521607 container create 0fba49061381186b00c01e0d848b7dc36e7749d4eef5b11601e32c45c06e59d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:17:36 np0005548789.localdomain systemd[1]: Started libpod-conmon-0fba49061381186b00c01e0d848b7dc36e7749d4eef5b11601e32c45c06e59d3.scope.
Dec 06 10:17:36 np0005548789.localdomain podman[318487]: 2025-12-06 10:17:36.55053222 +0000 UTC m=+0.047809491 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:17:36 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:17:36 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cea834eea21e8604cb1b513c17b5bce8499315738b89efdffa08927875e6fc9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:36 np0005548789.localdomain podman[318487]: 2025-12-06 10:17:36.680861664 +0000 UTC m=+0.178138905 container init 0fba49061381186b00c01e0d848b7dc36e7749d4eef5b11601e32c45c06e59d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:17:36 np0005548789.localdomain podman[318487]: 2025-12-06 10:17:36.692136836 +0000 UTC m=+0.189414067 container start 0fba49061381186b00c01e0d848b7dc36e7749d4eef5b11601e32c45c06e59d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 06 10:17:36 np0005548789.localdomain dnsmasq[318505]: started, version 2.85 cachesize 150
Dec 06 10:17:36 np0005548789.localdomain dnsmasq[318505]: DNS service limited to local subnets
Dec 06 10:17:36 np0005548789.localdomain dnsmasq[318505]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:17:36 np0005548789.localdomain dnsmasq[318505]: warning: no upstream servers configured
Dec 06 10:17:36 np0005548789.localdomain dnsmasq-dhcp[318505]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:17:36 np0005548789.localdomain dnsmasq[318505]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 1 addresses
Dec 06 10:17:36 np0005548789.localdomain dnsmasq-dhcp[318505]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:36 np0005548789.localdomain dnsmasq-dhcp[318505]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:37.069 263652 INFO neutron.agent.dhcp.agent [None req-8e9a43fc-a131-4331-93cb-4e1a04745d26 - - - - - -] DHCP configuration for ports {'4e06a687-1f49-4292-acf2-929e0eb84acf', '687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:17:37 np0005548789.localdomain dnsmasq[318505]: exiting on receipt of SIGTERM
Dec 06 10:17:37 np0005548789.localdomain podman[318523]: 2025-12-06 10:17:37.255437832 +0000 UTC m=+0.059367302 container kill 0fba49061381186b00c01e0d848b7dc36e7749d4eef5b11601e32c45c06e59d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:17:37 np0005548789.localdomain systemd[1]: libpod-0fba49061381186b00c01e0d848b7dc36e7749d4eef5b11601e32c45c06e59d3.scope: Deactivated successfully.
Dec 06 10:17:37 np0005548789.localdomain ceph-mon[298582]: osdmap e132: 6 total, 6 up, 6 in
Dec 06 10:17:37 np0005548789.localdomain podman[318536]: 2025-12-06 10:17:37.333103628 +0000 UTC m=+0.066182829 container died 0fba49061381186b00c01e0d848b7dc36e7749d4eef5b11601e32c45c06e59d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:17:37 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e133 e133: 6 total, 6 up, 6 in
Dec 06 10:17:37 np0005548789.localdomain podman[318536]: 2025-12-06 10:17:37.412181316 +0000 UTC m=+0.145260457 container cleanup 0fba49061381186b00c01e0d848b7dc36e7749d4eef5b11601e32c45c06e59d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 06 10:17:37 np0005548789.localdomain systemd[1]: libpod-conmon-0fba49061381186b00c01e0d848b7dc36e7749d4eef5b11601e32c45c06e59d3.scope: Deactivated successfully.
Dec 06 10:17:37 np0005548789.localdomain podman[318538]: 2025-12-06 10:17:37.455943693 +0000 UTC m=+0.179115694 container remove 0fba49061381186b00c01e0d848b7dc36e7749d4eef5b11601e32c45c06e59d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:17:37 np0005548789.localdomain kernel: device tap5e36b702-7f left promiscuous mode
Dec 06 10:17:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:37.469 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:37 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:37Z|00222|binding|INFO|Releasing lport 5e36b702-7f25-4b55-969a-7996ee55fcd1 from this chassis (sb_readonly=0)
Dec 06 10:17:37 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:37Z|00223|binding|INFO|Setting lport 5e36b702-7f25-4b55-969a-7996ee55fcd1 down in Southbound
Dec 06 10:17:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:37.484 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:37.499 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=5e36b702-7f25-4b55-969a-7996ee55fcd1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:37.502 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 5e36b702-7f25-4b55-969a-7996ee55fcd1 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:17:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:37.503 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:37.505 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[7d00bfb2-3e2a-495e-9143-4b952b821f28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:37.579 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3cea834eea21e8604cb1b513c17b5bce8499315738b89efdffa08927875e6fc9-merged.mount: Deactivated successfully.
Dec 06 10:17:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0fba49061381186b00c01e0d848b7dc36e7749d4eef5b11601e32c45c06e59d3-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:37.617 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:37.618 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:37.618 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:17:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:37.619 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:17:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:37.731 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:17:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:37.732 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:17:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:37.732 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:17:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:37.733 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:17:37 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:17:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:38 np0005548789.localdomain ceph-mon[298582]: pgmap v218: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail
Dec 06 10:17:38 np0005548789.localdomain ceph-mon[298582]: osdmap e133: 6 total, 6 up, 6 in
Dec 06 10:17:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:38.424 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:17:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:38.451 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:17:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:38.451 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:17:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:38.452 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:38.453 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:38 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:38.465 2 INFO neutron.agent.securitygroups_rpc [None req-ab57ea17-3add-445e-9d4b-332ca72ce0af a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:17:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:17:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:38.866 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:38 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:38.871 263652 INFO neutron.agent.linux.ip_lib [None req-9ab44985-bdfd-4f48-8db7-e6b4ad6177c4 - - - - - -] Device tap69eb5d2a-05 cannot be used as it has no MAC address
Dec 06 10:17:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:38.895 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:38 np0005548789.localdomain kernel: device tap69eb5d2a-05 entered promiscuous mode
Dec 06 10:17:38 np0005548789.localdomain systemd[1]: tmp-crun.6ZJTe0.mount: Deactivated successfully.
Dec 06 10:17:38 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016258.9110] manager: (tap69eb5d2a-05): new Generic device (/org/freedesktop/NetworkManager/Devices/39)
Dec 06 10:17:38 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:38Z|00224|binding|INFO|Claiming lport 69eb5d2a-055c-47ec-aa6f-2e93d626f115 for this chassis.
Dec 06 10:17:38 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:38Z|00225|binding|INFO|69eb5d2a-055c-47ec-aa6f-2e93d626f115: Claiming unknown
Dec 06 10:17:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:38.915 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:38 np0005548789.localdomain systemd-udevd[318604]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:17:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:38.924 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=69eb5d2a-055c-47ec-aa6f-2e93d626f115) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:38.926 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 69eb5d2a-055c-47ec-aa6f-2e93d626f115 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:17:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:38.927 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:38 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:38Z|00226|binding|INFO|Setting lport 69eb5d2a-055c-47ec-aa6f-2e93d626f115 up in Southbound
Dec 06 10:17:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:38.929 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[8a14d97b-404e-44eb-a9b6-2aba2b7c718e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:38 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:38Z|00227|binding|INFO|Setting lport 69eb5d2a-055c-47ec-aa6f-2e93d626f115 ovn-installed in OVS
Dec 06 10:17:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:38.932 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:38.936 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap69eb5d2a-05: No such device
Dec 06 10:17:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap69eb5d2a-05: No such device
Dec 06 10:17:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:38.950 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap69eb5d2a-05: No such device
Dec 06 10:17:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap69eb5d2a-05: No such device
Dec 06 10:17:38 np0005548789.localdomain podman[318568]: 2025-12-06 10:17:38.969160692 +0000 UTC m=+0.162528171 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:17:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap69eb5d2a-05: No such device
Dec 06 10:17:38 np0005548789.localdomain podman[318567]: 2025-12-06 10:17:38.91700276 +0000 UTC m=+0.111097742 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 06 10:17:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap69eb5d2a-05: No such device
Dec 06 10:17:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap69eb5d2a-05: No such device
Dec 06 10:17:38 np0005548789.localdomain podman[318568]: 2025-12-06 10:17:38.985190678 +0000 UTC m=+0.178558127 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 10:17:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap69eb5d2a-05: No such device
Dec 06 10:17:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:38.995 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:38 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:17:39 np0005548789.localdomain podman[318567]: 2025-12-06 10:17:39.006433472 +0000 UTC m=+0.200528504 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, name=ubi9-minimal, release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=)
Dec 06 10:17:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:39.023 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:39 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:17:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e134 e134: 6 total, 6 up, 6 in
Dec 06 10:17:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:39.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:39.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:39.183 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:17:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/80922449' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:17:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/80922449' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:17:39 np0005548789.localdomain ceph-mon[298582]: osdmap e134: 6 total, 6 up, 6 in
Dec 06 10:17:39 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:39.758 2 INFO neutron.agent.securitygroups_rpc [None req-24de80cf-8a07-42c3-8966-675d0403c3d2 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:39 np0005548789.localdomain systemd[1]: tmp-crun.dQ9sJB.mount: Deactivated successfully.
Dec 06 10:17:39 np0005548789.localdomain podman[318685]: 
Dec 06 10:17:39 np0005548789.localdomain podman[318685]: 2025-12-06 10:17:39.864913951 +0000 UTC m=+0.090520136 container create b32096f403d644e2f4c19ffec842afc03e7ccdcca601ee3d5fd53645a6657ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:17:39 np0005548789.localdomain systemd[1]: Started libpod-conmon-b32096f403d644e2f4c19ffec842afc03e7ccdcca601ee3d5fd53645a6657ddc.scope.
Dec 06 10:17:39 np0005548789.localdomain podman[318685]: 2025-12-06 10:17:39.821189235 +0000 UTC m=+0.046795420 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:17:39 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:17:39 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d161c57bfce869ae5bb0e8067b7e7d5f20f25f550e88e581d171fd0c5e663098/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:39 np0005548789.localdomain podman[318685]: 2025-12-06 10:17:39.93972891 +0000 UTC m=+0.165335055 container init b32096f403d644e2f4c19ffec842afc03e7ccdcca601ee3d5fd53645a6657ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:17:39 np0005548789.localdomain podman[318685]: 2025-12-06 10:17:39.953088456 +0000 UTC m=+0.178694611 container start b32096f403d644e2f4c19ffec842afc03e7ccdcca601ee3d5fd53645a6657ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:39 np0005548789.localdomain dnsmasq[318703]: started, version 2.85 cachesize 150
Dec 06 10:17:39 np0005548789.localdomain dnsmasq[318703]: DNS service limited to local subnets
Dec 06 10:17:39 np0005548789.localdomain dnsmasq[318703]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:17:39 np0005548789.localdomain dnsmasq[318703]: warning: no upstream servers configured
Dec 06 10:17:39 np0005548789.localdomain dnsmasq-dhcp[318703]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:17:39 np0005548789.localdomain dnsmasq[318703]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:17:39 np0005548789.localdomain dnsmasq-dhcp[318703]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:39 np0005548789.localdomain dnsmasq-dhcp[318703]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:40 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:40.012 263652 INFO neutron.agent.dhcp.agent [None req-9ab44985-bdfd-4f48-8db7-e6b4ad6177c4 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:29Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb36f70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb366d0>], id=afe4ba38-14bd-4006-b873-2ed564ce569c, ip_allocation=immediate, mac_address=fa:16:3e:8f:b9:5b, name=tempest-NetworksTestDHCPv6-469175711, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['79a21d6b-39ef-4420-bf12-860cde44033d'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:29Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=1420, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:29Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:17:40 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:40.153 263652 INFO neutron.agent.dhcp.agent [None req-3b069e9b-727e-42e6-8e76-cb6c2a44ee98 - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:17:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:40.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:40 np0005548789.localdomain dnsmasq[318703]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 1 addresses
Dec 06 10:17:40 np0005548789.localdomain dnsmasq-dhcp[318703]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:40 np0005548789.localdomain dnsmasq-dhcp[318703]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:40 np0005548789.localdomain podman[318721]: 2025-12-06 10:17:40.188003601 +0000 UTC m=+0.050652137 container kill b32096f403d644e2f4c19ffec842afc03e7ccdcca601ee3d5fd53645a6657ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:17:40 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:40.308 263652 INFO neutron.agent.dhcp.agent [None req-9ab44985-bdfd-4f48-8db7-e6b4ad6177c4 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:31Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fad6b20>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fad6cd0>], id=5bada2e5-c44e-42db-929a-1fcf2ed4098d, ip_allocation=immediate, mac_address=fa:16:3e:b8:48:6b, name=tempest-NetworksTestDHCPv6-543457619, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=4, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['300b3e12-98b7-455f-9860-7b8899b81779'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:31Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=1449, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:31Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:17:40 np0005548789.localdomain ceph-mon[298582]: pgmap v220: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 767 B/s wr, 1 op/s
Dec 06 10:17:40 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:40.369 263652 INFO neutron.agent.dhcp.agent [None req-2f61132c-5ca9-4775-bd00-7d23901b53f2 - - - - - -] DHCP configuration for ports {'afe4ba38-14bd-4006-b873-2ed564ce569c'} is completed
Dec 06 10:17:40 np0005548789.localdomain dnsmasq[318703]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 2 addresses
Dec 06 10:17:40 np0005548789.localdomain dnsmasq-dhcp[318703]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:40 np0005548789.localdomain podman[318758]: 2025-12-06 10:17:40.472114248 +0000 UTC m=+0.061227768 container kill b32096f403d644e2f4c19ffec842afc03e7ccdcca601ee3d5fd53645a6657ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:17:40 np0005548789.localdomain dnsmasq-dhcp[318703]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:40 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:40.712 263652 INFO neutron.agent.dhcp.agent [None req-9a58f7a2-9cf5-46d9-b572-061b6162e310 - - - - - -] DHCP configuration for ports {'5bada2e5-c44e-42db-929a-1fcf2ed4098d'} is completed
Dec 06 10:17:40 np0005548789.localdomain dnsmasq[318703]: exiting on receipt of SIGTERM
Dec 06 10:17:40 np0005548789.localdomain podman[318795]: 2025-12-06 10:17:40.903261256 +0000 UTC m=+0.066635652 container kill b32096f403d644e2f4c19ffec842afc03e7ccdcca601ee3d5fd53645a6657ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:17:40 np0005548789.localdomain systemd[1]: libpod-b32096f403d644e2f4c19ffec842afc03e7ccdcca601ee3d5fd53645a6657ddc.scope: Deactivated successfully.
Dec 06 10:17:40 np0005548789.localdomain podman[318809]: 2025-12-06 10:17:40.983224571 +0000 UTC m=+0.064043173 container died b32096f403d644e2f4c19ffec842afc03e7ccdcca601ee3d5fd53645a6657ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:17:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b32096f403d644e2f4c19ffec842afc03e7ccdcca601ee3d5fd53645a6657ddc-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:41 np0005548789.localdomain podman[318809]: 2025-12-06 10:17:41.020273475 +0000 UTC m=+0.101092037 container cleanup b32096f403d644e2f4c19ffec842afc03e7ccdcca601ee3d5fd53645a6657ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:17:41 np0005548789.localdomain systemd[1]: libpod-conmon-b32096f403d644e2f4c19ffec842afc03e7ccdcca601ee3d5fd53645a6657ddc.scope: Deactivated successfully.
Dec 06 10:17:41 np0005548789.localdomain podman[318811]: 2025-12-06 10:17:41.062854837 +0000 UTC m=+0.136673227 container remove b32096f403d644e2f4c19ffec842afc03e7ccdcca601ee3d5fd53645a6657ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:17:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:41.076 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:41 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:41Z|00228|binding|INFO|Releasing lport 69eb5d2a-055c-47ec-aa6f-2e93d626f115 from this chassis (sb_readonly=0)
Dec 06 10:17:41 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:41Z|00229|binding|INFO|Setting lport 69eb5d2a-055c-47ec-aa6f-2e93d626f115 down in Southbound
Dec 06 10:17:41 np0005548789.localdomain kernel: device tap69eb5d2a-05 left promiscuous mode
Dec 06 10:17:41 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:41.088 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=69eb5d2a-055c-47ec-aa6f-2e93d626f115) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:41 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:41.090 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 69eb5d2a-055c-47ec-aa6f-2e93d626f115 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:17:41 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:41.092 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:41 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:41.093 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[74f8c446-81ae-45ef-87cc-68b3a5e0dfb7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:41.096 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:41 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:41.101 2 INFO neutron.agent.securitygroups_rpc [None req-4acfb63b-6c96-4af3-b5fa-66e73a2e25c0 cf2cadf875da4c9b86fb2902b9ee90bb 2b975a1e6b7941c09260aeb20365b968 - - default default] Security group member updated ['f9be6b32-ff8a-467f-8358-ff505a55042e']
Dec 06 10:17:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:41.158 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:40Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb03a00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb03c10>], id=268db8ec-d894-4956-8c8c-14070df1373b, ip_allocation=immediate, mac_address=fa:16:3e:f5:df:b2, name=tempest-RoutersAdminNegativeTest-1492414301, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=True, project_id=2b975a1e6b7941c09260aeb20365b968, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f9be6b32-ff8a-467f-8358-ff505a55042e'], standard_attr_id=1509, status=DOWN, tags=[], tenant_id=2b975a1e6b7941c09260aeb20365b968, updated_at=2025-12-06T10:17:40Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:17:41 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e135 e135: 6 total, 6 up, 6 in
Dec 06 10:17:41 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:17:41 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:41 np0005548789.localdomain podman[318858]: 2025-12-06 10:17:41.388417781 +0000 UTC m=+0.064024333 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:17:41 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:41.758 263652 INFO neutron.agent.dhcp.agent [None req-2a9236fd-e688-46e4-aa65-bd10dd2e6ae5 - - - - - -] DHCP configuration for ports {'268db8ec-d894-4956-8c8c-14070df1373b'} is completed
Dec 06 10:17:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d161c57bfce869ae5bb0e8067b7e7d5f20f25f550e88e581d171fd0c5e663098-merged.mount: Deactivated successfully.
Dec 06 10:17:41 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:17:42 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:42.011 2 INFO neutron.agent.securitygroups_rpc [None req-f035cee5-5c71-4777-a408-c824903df12b 3ea76362796945abb0389f60eab07566 23fdd860878442e1b8fc77e4ae3ef271 - - default default] Security group member updated ['dd9785c1-eb5d-4293-ac78-0fc1ce108f20']
Dec 06 10:17:42 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e136 e136: 6 total, 6 up, 6 in
Dec 06 10:17:42 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:42.180 2 INFO neutron.agent.securitygroups_rpc [None req-2d1fe085-81b9-49e2-b303-f7feeabc4137 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:42.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:42 np0005548789.localdomain dnsmasq[318368]: read /var/lib/neutron/dhcp/fb8c7162-302b-4277-a437-7090f604bfc2/addn_hosts - 0 addresses
Dec 06 10:17:42 np0005548789.localdomain dnsmasq-dhcp[318368]: read /var/lib/neutron/dhcp/fb8c7162-302b-4277-a437-7090f604bfc2/host
Dec 06 10:17:42 np0005548789.localdomain dnsmasq-dhcp[318368]: read /var/lib/neutron/dhcp/fb8c7162-302b-4277-a437-7090f604bfc2/opts
Dec 06 10:17:42 np0005548789.localdomain podman[318895]: 2025-12-06 10:17:42.386485674 +0000 UTC m=+0.086484294 container kill c4c8ff188d2142423ad18d025ac974ba3c2dd4c9e1be9e72713cefc79062138c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb8c7162-302b-4277-a437-7090f604bfc2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:17:42 np0005548789.localdomain ceph-mon[298582]: pgmap v222: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 1023 B/s wr, 1 op/s
Dec 06 10:17:42 np0005548789.localdomain ceph-mon[298582]: osdmap e135: 6 total, 6 up, 6 in
Dec 06 10:17:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/619121492' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:42 np0005548789.localdomain ceph-mon[298582]: osdmap e136: 6 total, 6 up, 6 in
Dec 06 10:17:42 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:42.455 263652 INFO neutron.agent.linux.ip_lib [None req-33bbd667-c762-4367-9967-0b80fcaf35ed - - - - - -] Device tapf557e6c6-d3 cannot be used as it has no MAC address
Dec 06 10:17:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:42.503 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:42 np0005548789.localdomain kernel: device tapf557e6c6-d3 entered promiscuous mode
Dec 06 10:17:42 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016262.5136] manager: (tapf557e6c6-d3): new Generic device (/org/freedesktop/NetworkManager/Devices/40)
Dec 06 10:17:42 np0005548789.localdomain systemd-udevd[318923]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:17:42 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:42Z|00230|binding|INFO|Claiming lport f557e6c6-d34f-468a-a9fd-a253f0fb196d for this chassis.
Dec 06 10:17:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:42.515 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:42 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:42Z|00231|binding|INFO|f557e6c6-d34f-468a-a9fd-a253f0fb196d: Claiming unknown
Dec 06 10:17:42 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:42Z|00232|binding|INFO|Setting lport f557e6c6-d34f-468a-a9fd-a253f0fb196d ovn-installed in OVS
Dec 06 10:17:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:42.525 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:42.535 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:42 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:42.568 2 INFO neutron.agent.securitygroups_rpc [None req-463c5a9c-1342-4628-be66-c954070435e6 cf2cadf875da4c9b86fb2902b9ee90bb 2b975a1e6b7941c09260aeb20365b968 - - default default] Security group member updated ['f9be6b32-ff8a-467f-8358-ff505a55042e']
Dec 06 10:17:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:42.570 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:42.583 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:42.617 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:42 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:42Z|00233|binding|INFO|Setting lport f557e6c6-d34f-468a-a9fd-a253f0fb196d up in Southbound
Dec 06 10:17:42 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:42.638 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=f557e6c6-d34f-468a-a9fd-a253f0fb196d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:42 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:42.641 160509 INFO neutron.agent.ovn.metadata.agent [-] Port f557e6c6-d34f-468a-a9fd-a253f0fb196d in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:17:42 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:42.642 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:42 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:42.643 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[4e6794a5-8325-41c6-9c93-00d0346c609a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:42.653 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:42 np0005548789.localdomain podman[318960]: 2025-12-06 10:17:42.949864752 +0000 UTC m=+0.088343880 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:17:42 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:17:42 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:42 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:17:43 np0005548789.localdomain podman[318979]: 2025-12-06 10:17:43.087132375 +0000 UTC m=+0.102454108 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd)
Dec 06 10:17:43 np0005548789.localdomain podman[318979]: 2025-12-06 10:17:43.09553084 +0000 UTC m=+0.110852603 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:17:43 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:17:43 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:43.230 2 INFO neutron.agent.securitygroups_rpc [None req-74f6711f-47e9-487d-bd32-5a2f1bba6efe a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:17:43 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1439659764' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:17:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:17:43 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1439659764' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:17:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1627444727' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1439659764' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:17:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1439659764' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:17:43 np0005548789.localdomain dnsmasq[318368]: exiting on receipt of SIGTERM
Dec 06 10:17:43 np0005548789.localdomain podman[319032]: 2025-12-06 10:17:43.419852917 +0000 UTC m=+0.061823456 container kill c4c8ff188d2142423ad18d025ac974ba3c2dd4c9e1be9e72713cefc79062138c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb8c7162-302b-4277-a437-7090f604bfc2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:43 np0005548789.localdomain systemd[1]: libpod-c4c8ff188d2142423ad18d025ac974ba3c2dd4c9e1be9e72713cefc79062138c.scope: Deactivated successfully.
Dec 06 10:17:43 np0005548789.localdomain podman[319044]: 2025-12-06 10:17:43.481625581 +0000 UTC m=+0.048560564 container died c4c8ff188d2142423ad18d025ac974ba3c2dd4c9e1be9e72713cefc79062138c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb8c7162-302b-4277-a437-7090f604bfc2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 10:17:43 np0005548789.localdomain podman[319044]: 2025-12-06 10:17:43.507436005 +0000 UTC m=+0.074370968 container cleanup c4c8ff188d2142423ad18d025ac974ba3c2dd4c9e1be9e72713cefc79062138c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb8c7162-302b-4277-a437-7090f604bfc2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:17:43 np0005548789.localdomain systemd[1]: libpod-conmon-c4c8ff188d2142423ad18d025ac974ba3c2dd4c9e1be9e72713cefc79062138c.scope: Deactivated successfully.
Dec 06 10:17:43 np0005548789.localdomain podman[319046]: 2025-12-06 10:17:43.527993848 +0000 UTC m=+0.083451213 container remove c4c8ff188d2142423ad18d025ac974ba3c2dd4c9e1be9e72713cefc79062138c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fb8c7162-302b-4277-a437-7090f604bfc2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:17:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:43.574 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:43 np0005548789.localdomain kernel: device tapfd998f59-dd left promiscuous mode
Dec 06 10:17:43 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:43Z|00234|binding|INFO|Releasing lport fd998f59-ddde-4bfa-95a4-6f61b1679474 from this chassis (sb_readonly=0)
Dec 06 10:17:43 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:43Z|00235|binding|INFO|Setting lport fd998f59-ddde-4bfa-95a4-6f61b1679474 down in Southbound
Dec 06 10:17:43 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:43.587 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-fb8c7162-302b-4277-a437-7090f604bfc2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb8c7162-302b-4277-a437-7090f604bfc2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '23fdd860878442e1b8fc77e4ae3ef271', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23107f01-722b-406d-a1a5-a58a3fd6433e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=fd998f59-ddde-4bfa-95a4-6f61b1679474) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:43 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:43.590 160509 INFO neutron.agent.ovn.metadata.agent [-] Port fd998f59-ddde-4bfa-95a4-6f61b1679474 in datapath fb8c7162-302b-4277-a437-7090f604bfc2 unbound from our chassis
Dec 06 10:17:43 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:43.593 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fb8c7162-302b-4277-a437-7090f604bfc2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:17:43 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:43.593 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[2eb5eeea-1a40-4be9-8bc6-3ab9b66c28dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:43.597 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:43.599 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:43 np0005548789.localdomain podman[319097]: 
Dec 06 10:17:43 np0005548789.localdomain podman[319097]: 2025-12-06 10:17:43.679992498 +0000 UTC m=+0.068620602 container create 934aed5f8aed9367cf7dbffb3f3484b6785027ae3e6c11154898bf341f681568 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:17:43 np0005548789.localdomain systemd[1]: Started libpod-conmon-934aed5f8aed9367cf7dbffb3f3484b6785027ae3e6c11154898bf341f681568.scope.
Dec 06 10:17:43 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:17:43 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5875013fd97a751f2abbcbe86d70812a3b1c1a55826b088cbe7bf4bd86e9f25/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:43 np0005548789.localdomain podman[319097]: 2025-12-06 10:17:43.740971918 +0000 UTC m=+0.129600022 container init 934aed5f8aed9367cf7dbffb3f3484b6785027ae3e6c11154898bf341f681568 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:43 np0005548789.localdomain podman[319097]: 2025-12-06 10:17:43.647018518 +0000 UTC m=+0.035646642 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:17:43 np0005548789.localdomain podman[319097]: 2025-12-06 10:17:43.750375643 +0000 UTC m=+0.139003747 container start 934aed5f8aed9367cf7dbffb3f3484b6785027ae3e6c11154898bf341f681568 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:17:43 np0005548789.localdomain dnsmasq[319116]: started, version 2.85 cachesize 150
Dec 06 10:17:43 np0005548789.localdomain dnsmasq[319116]: DNS service limited to local subnets
Dec 06 10:17:43 np0005548789.localdomain dnsmasq[319116]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:17:43 np0005548789.localdomain dnsmasq[319116]: warning: no upstream servers configured
Dec 06 10:17:43 np0005548789.localdomain dnsmasq[319116]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:17:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:43.869 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:43 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:43.886 263652 INFO neutron.agent.dhcp.agent [None req-22992f25-fb05-4247-84cc-2880863fe345 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:43 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:43.888 263652 INFO neutron.agent.dhcp.agent [None req-22992f25-fb05-4247-84cc-2880863fe345 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:43 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:43.915 263652 INFO neutron.agent.dhcp.agent [None req-e5d2d496-bab8-4581-9a7e-de9a58c8a0dc - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:17:43 np0005548789.localdomain systemd[1]: tmp-crun.wW3soN.mount: Deactivated successfully.
Dec 06 10:17:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4e8ffbe4ba68729562762de2b34e92b94090dcca76e856e5cd87d912c117f07d-merged.mount: Deactivated successfully.
Dec 06 10:17:43 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c4c8ff188d2142423ad18d025ac974ba3c2dd4c9e1be9e72713cefc79062138c-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:43 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2dfb8c7162\x2d302b\x2d4277\x2da437\x2d7090f604bfc2.mount: Deactivated successfully.
Dec 06 10:17:44 np0005548789.localdomain dnsmasq[319116]: exiting on receipt of SIGTERM
Dec 06 10:17:44 np0005548789.localdomain systemd[1]: libpod-934aed5f8aed9367cf7dbffb3f3484b6785027ae3e6c11154898bf341f681568.scope: Deactivated successfully.
Dec 06 10:17:44 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:44.090 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:44 np0005548789.localdomain podman[319134]: 2025-12-06 10:17:44.090713636 +0000 UTC m=+0.067861909 container kill 934aed5f8aed9367cf7dbffb3f3484b6785027ae3e6c11154898bf341f681568 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:17:44 np0005548789.localdomain podman[319149]: 2025-12-06 10:17:44.140380462 +0000 UTC m=+0.043505390 container died 934aed5f8aed9367cf7dbffb3f3484b6785027ae3e6c11154898bf341f681568 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:17:44 np0005548789.localdomain podman[319149]: 2025-12-06 10:17:44.1871338 +0000 UTC m=+0.090258698 container cleanup 934aed5f8aed9367cf7dbffb3f3484b6785027ae3e6c11154898bf341f681568 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:17:44 np0005548789.localdomain systemd[1]: libpod-conmon-934aed5f8aed9367cf7dbffb3f3484b6785027ae3e6c11154898bf341f681568.scope: Deactivated successfully.
Dec 06 10:17:44 np0005548789.localdomain podman[319156]: 2025-12-06 10:17:44.259228797 +0000 UTC m=+0.142369239 container remove 934aed5f8aed9367cf7dbffb3f3484b6785027ae3e6c11154898bf341f681568 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:17:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:44Z|00236|binding|INFO|Releasing lport f557e6c6-d34f-468a-a9fd-a253f0fb196d from this chassis (sb_readonly=0)
Dec 06 10:17:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:44.273 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:44Z|00237|binding|INFO|Setting lport f557e6c6-d34f-468a-a9fd-a253f0fb196d down in Southbound
Dec 06 10:17:44 np0005548789.localdomain kernel: device tapf557e6c6-d3 left promiscuous mode
Dec 06 10:17:44 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:44.283 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=f557e6c6-d34f-468a-a9fd-a253f0fb196d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:44 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:44.285 160509 INFO neutron.agent.ovn.metadata.agent [-] Port f557e6c6-d34f-468a-a9fd-a253f0fb196d in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:17:44 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:44.286 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:44 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:44.287 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[a4e0b169-cdb9-4192-92e6-23993b3c3344]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:44.313 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:44 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:44Z|00238|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:17:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:44.435 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:44 np0005548789.localdomain ceph-mon[298582]: pgmap v225: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 45 KiB/s rd, 3.3 KiB/s wr, 59 op/s
Dec 06 10:17:44 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3506080910' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:44 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/4032121207' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:17:44 np0005548789.localdomain systemd[1]: tmp-crun.EeSq4G.mount: Deactivated successfully.
Dec 06 10:17:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-b5875013fd97a751f2abbcbe86d70812a3b1c1a55826b088cbe7bf4bd86e9f25-merged.mount: Deactivated successfully.
Dec 06 10:17:44 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-934aed5f8aed9367cf7dbffb3f3484b6785027ae3e6c11154898bf341f681568-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:44 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:17:45 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:45.567 2 INFO neutron.agent.securitygroups_rpc [None req-a9308ef0-170e-430a-9f5f-6439b979faf7 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:45 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:45.791 263652 INFO neutron.agent.linux.ip_lib [None req-14a0282a-9c8b-4b61-9054-18eb98946d63 - - - - - -] Device tap42f6d111-d5 cannot be used as it has no MAC address
Dec 06 10:17:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:45.822 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:45 np0005548789.localdomain kernel: device tap42f6d111-d5 entered promiscuous mode
Dec 06 10:17:45 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:45Z|00239|binding|INFO|Claiming lport 42f6d111-d5b7-4fe3-8ff2-29ff3ea79b8d for this chassis.
Dec 06 10:17:45 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016265.8326] manager: (tap42f6d111-d5): new Generic device (/org/freedesktop/NetworkManager/Devices/41)
Dec 06 10:17:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:45.833 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:45 np0005548789.localdomain systemd-udevd[319187]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:17:45 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:45Z|00240|binding|INFO|42f6d111-d5b7-4fe3-8ff2-29ff3ea79b8d: Claiming unknown
Dec 06 10:17:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:45.877 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:45 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:45Z|00241|binding|INFO|Setting lport 42f6d111-d5b7-4fe3-8ff2-29ff3ea79b8d ovn-installed in OVS
Dec 06 10:17:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:45.883 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:45 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:45.893 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:45Z, description=, device_id=e1d0435f-41a7-4a3a-9168-d8b2d102536f, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc06f10>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc06190>], id=2d5e1755-43eb-417b-85bd-7bf4bf92c7f1, ip_allocation=immediate, mac_address=fa:16:3e:e7:6e:9c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1541, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:17:45Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:17:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:45.930 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:45 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:45Z|00242|binding|INFO|Setting lport 42f6d111-d5b7-4fe3-8ff2-29ff3ea79b8d up in Southbound
Dec 06 10:17:45 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:45.934 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=42f6d111-d5b7-4fe3-8ff2-29ff3ea79b8d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:45 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:45.936 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 42f6d111-d5b7-4fe3-8ff2-29ff3ea79b8d in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:17:45 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:45.937 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:45 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:45.938 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[a127cf2c-f315-4f4e-a5a7-463cd3cbaa45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:45.966 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:46 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:17:46 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:46 np0005548789.localdomain podman[319216]: 2025-12-06 10:17:46.199624924 +0000 UTC m=+0.070233122 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:46 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:46 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:46.409 263652 INFO neutron.agent.dhcp.agent [None req-2484307c-bf12-4411-bd5e-aaac47468baf - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:45Z, description=, device_id=ef42a7b7-856f-4d93-83fd-eafb16254770, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc661f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd6f8e0>], id=77624924-0bc3-409e-b98a-f90d3ca2c4ea, ip_allocation=immediate, mac_address=fa:16:3e:f5:bc:1b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1542, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:17:45Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:17:46 np0005548789.localdomain ceph-mon[298582]: pgmap v226: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 2.2 KiB/s wr, 55 op/s
Dec 06 10:17:46 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:46.499 263652 INFO neutron.agent.dhcp.agent [None req-174298ce-5597-453c-800d-f95742b6876d - - - - - -] DHCP configuration for ports {'2d5e1755-43eb-417b-85bd-7bf4bf92c7f1'} is completed
Dec 06 10:17:46 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:46.573 2 INFO neutron.agent.securitygroups_rpc [None req-38541453-b414-4a96-8c97-455c5ffb96a0 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:46 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:17:46 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:46 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:46 np0005548789.localdomain podman[319271]: 2025-12-06 10:17:46.594446348 +0000 UTC m=+0.052485622 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:17:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:17:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:17:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:17:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:17:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:17:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:17:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:17:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:17:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:17:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:17:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:17:46 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:46.866 263652 INFO neutron.agent.dhcp.agent [None req-2659614a-559b-44b0-9a11-954524623b26 - - - - - -] DHCP configuration for ports {'77624924-0bc3-409e-b98a-f90d3ca2c4ea'} is completed
Dec 06 10:17:46 np0005548789.localdomain podman[319315]: 
Dec 06 10:17:46 np0005548789.localdomain podman[319315]: 2025-12-06 10:17:46.952248721 +0000 UTC m=+0.079224774 container create d8b7e2ea67f9833a438449d4214ac95d50cd723c09fe787e3726ac96d044b217 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:17:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:17:46 np0005548789.localdomain systemd[1]: Started libpod-conmon-d8b7e2ea67f9833a438449d4214ac95d50cd723c09fe787e3726ac96d044b217.scope.
Dec 06 10:17:47 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:17:47 np0005548789.localdomain podman[319315]: 2025-12-06 10:17:46.910812294 +0000 UTC m=+0.037788337 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:17:47 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1970bf154652cc87df1e1e2e4619407439f842cdcb4c780ba34358f189fc220/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:47 np0005548789.localdomain podman[319315]: 2025-12-06 10:17:47.027093262 +0000 UTC m=+0.154069285 container init d8b7e2ea67f9833a438449d4214ac95d50cd723c09fe787e3726ac96d044b217 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 10:17:47 np0005548789.localdomain dnsmasq[319346]: started, version 2.85 cachesize 150
Dec 06 10:17:47 np0005548789.localdomain dnsmasq[319346]: DNS service limited to local subnets
Dec 06 10:17:47 np0005548789.localdomain dnsmasq[319346]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:17:47 np0005548789.localdomain dnsmasq[319346]: warning: no upstream servers configured
Dec 06 10:17:47 np0005548789.localdomain dnsmasq-dhcp[319346]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:17:47 np0005548789.localdomain dnsmasq[319346]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:17:47 np0005548789.localdomain dnsmasq-dhcp[319346]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:47 np0005548789.localdomain dnsmasq-dhcp[319346]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:47 np0005548789.localdomain podman[319329]: 2025-12-06 10:17:47.074082327 +0000 UTC m=+0.077656727 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:17:47 np0005548789.localdomain podman[319315]: 2025-12-06 10:17:47.087846675 +0000 UTC m=+0.214822728 container start d8b7e2ea67f9833a438449d4214ac95d50cd723c09fe787e3726ac96d044b217 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:17:47 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e137 e137: 6 total, 6 up, 6 in
Dec 06 10:17:47 np0005548789.localdomain podman[319329]: 2025-12-06 10:17:47.111654856 +0000 UTC m=+0.115229266 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:17:47 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:17:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:47.133 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:47.307 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:17:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:47.307 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:17:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:47.308 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:17:47 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:47.313 263652 INFO neutron.agent.dhcp.agent [None req-b2fc75f0-40f8-46bb-a8d6-d7912a1e5e9d - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:17:47 np0005548789.localdomain dnsmasq[319346]: exiting on receipt of SIGTERM
Dec 06 10:17:47 np0005548789.localdomain podman[319374]: 2025-12-06 10:17:47.476808062 +0000 UTC m=+0.066351793 container kill d8b7e2ea67f9833a438449d4214ac95d50cd723c09fe787e3726ac96d044b217 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:17:47 np0005548789.localdomain systemd[1]: libpod-d8b7e2ea67f9833a438449d4214ac95d50cd723c09fe787e3726ac96d044b217.scope: Deactivated successfully.
Dec 06 10:17:47 np0005548789.localdomain podman[319386]: 2025-12-06 10:17:47.553345864 +0000 UTC m=+0.062317691 container died d8b7e2ea67f9833a438449d4214ac95d50cd723c09fe787e3726ac96d044b217 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:17:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:47.629 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:47 np0005548789.localdomain podman[319386]: 2025-12-06 10:17:47.634151135 +0000 UTC m=+0.143122922 container cleanup d8b7e2ea67f9833a438449d4214ac95d50cd723c09fe787e3726ac96d044b217 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:17:47 np0005548789.localdomain systemd[1]: libpod-conmon-d8b7e2ea67f9833a438449d4214ac95d50cd723c09fe787e3726ac96d044b217.scope: Deactivated successfully.
Dec 06 10:17:47 np0005548789.localdomain podman[319388]: 2025-12-06 10:17:47.660944427 +0000 UTC m=+0.159132418 container remove d8b7e2ea67f9833a438449d4214ac95d50cd723c09fe787e3726ac96d044b217 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:17:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:47.676 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:47Z|00243|binding|INFO|Releasing lport 42f6d111-d5b7-4fe3-8ff2-29ff3ea79b8d from this chassis (sb_readonly=0)
Dec 06 10:17:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:47Z|00244|binding|INFO|Setting lport 42f6d111-d5b7-4fe3-8ff2-29ff3ea79b8d down in Southbound
Dec 06 10:17:47 np0005548789.localdomain kernel: device tap42f6d111-d5 left promiscuous mode
Dec 06 10:17:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:47.690 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=42f6d111-d5b7-4fe3-8ff2-29ff3ea79b8d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:47.692 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 42f6d111-d5b7-4fe3-8ff2-29ff3ea79b8d in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:17:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:47.694 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:47.695 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[95b162a2-d794-48ec-b2ab-ae499d5d37b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:47.699 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:48 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:48.020 2 INFO neutron.agent.securitygroups_rpc [None req-77939ad8-3a8c-44db-b1d8-896917e1a291 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:48 np0005548789.localdomain ceph-mon[298582]: osdmap e137: 6 total, 6 up, 6 in
Dec 06 10:17:48 np0005548789.localdomain ceph-mon[298582]: pgmap v228: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 2.2 KiB/s wr, 55 op/s
Dec 06 10:17:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f1970bf154652cc87df1e1e2e4619407439f842cdcb4c780ba34358f189fc220-merged.mount: Deactivated successfully.
Dec 06 10:17:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d8b7e2ea67f9833a438449d4214ac95d50cd723c09fe787e3726ac96d044b217-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:48 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:17:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:48.896 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:49 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:49.096 263652 INFO neutron.agent.linux.ip_lib [None req-7eb03ef0-f215-4e63-a52e-9c8abe015ca5 - - - - - -] Device tap15e8fc8e-25 cannot be used as it has no MAC address
Dec 06 10:17:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:49.127 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:49 np0005548789.localdomain kernel: device tap15e8fc8e-25 entered promiscuous mode
Dec 06 10:17:49 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016269.1363] manager: (tap15e8fc8e-25): new Generic device (/org/freedesktop/NetworkManager/Devices/42)
Dec 06 10:17:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:49.135 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:49 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:49Z|00245|binding|INFO|Claiming lport 15e8fc8e-2569-4456-89e1-7a3d1684c267 for this chassis.
Dec 06 10:17:49 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:49Z|00246|binding|INFO|15e8fc8e-2569-4456-89e1-7a3d1684c267: Claiming unknown
Dec 06 10:17:49 np0005548789.localdomain systemd-udevd[319426]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:17:49 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:49Z|00247|binding|INFO|Setting lport 15e8fc8e-2569-4456-89e1-7a3d1684c267 ovn-installed in OVS
Dec 06 10:17:49 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:49Z|00248|binding|INFO|Setting lport 15e8fc8e-2569-4456-89e1-7a3d1684c267 up in Southbound
Dec 06 10:17:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:49.149 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:49.148 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=15e8fc8e-2569-4456-89e1-7a3d1684c267) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:49.150 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 15e8fc8e-2569-4456-89e1-7a3d1684c267 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:17:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:49.152 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:49.153 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[eda05078-7701-415b-a4bf-fff626310cb4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:49.158 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:49.177 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:49.229 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:49.267 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:49 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:17:49.735 2 INFO neutron.agent.securitygroups_rpc [None req-028fe2d3-a2af-4154-9a69-d7d602ad3ddf a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:17:49 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:17:49 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:49 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:49 np0005548789.localdomain podman[319477]: 2025-12-06 10:17:49.912728637 +0000 UTC m=+0.062065433 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:17:50 np0005548789.localdomain ceph-mon[298582]: pgmap v229: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 1.7 KiB/s wr, 45 op/s
Dec 06 10:17:50 np0005548789.localdomain podman[319519]: 
Dec 06 10:17:50 np0005548789.localdomain podman[319519]: 2025-12-06 10:17:50.271712096 +0000 UTC m=+0.073704617 container create b73c6d50186bb868d1093a23b8c34e1ae426a8ee07e6598ead72c83b7962a732 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 06 10:17:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:17:50 np0005548789.localdomain systemd[1]: Started libpod-conmon-b73c6d50186bb868d1093a23b8c34e1ae426a8ee07e6598ead72c83b7962a732.scope.
Dec 06 10:17:50 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:17:50 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c18ac33546be1aae3397ead48d8dd1e3f5abac98a4c4c00eeed7d61e8509d7e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:50 np0005548789.localdomain podman[319519]: 2025-12-06 10:17:50.233466795 +0000 UTC m=+0.035459376 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:17:50 np0005548789.localdomain podman[319519]: 2025-12-06 10:17:50.337640186 +0000 UTC m=+0.139632707 container init b73c6d50186bb868d1093a23b8c34e1ae426a8ee07e6598ead72c83b7962a732 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:17:50 np0005548789.localdomain podman[319519]: 2025-12-06 10:17:50.346864035 +0000 UTC m=+0.148856546 container start b73c6d50186bb868d1093a23b8c34e1ae426a8ee07e6598ead72c83b7962a732 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:17:50 np0005548789.localdomain dnsmasq[319546]: started, version 2.85 cachesize 150
Dec 06 10:17:50 np0005548789.localdomain dnsmasq[319546]: DNS service limited to local subnets
Dec 06 10:17:50 np0005548789.localdomain dnsmasq[319546]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:17:50 np0005548789.localdomain dnsmasq[319546]: warning: no upstream servers configured
Dec 06 10:17:50 np0005548789.localdomain dnsmasq-dhcp[319546]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:17:50 np0005548789.localdomain dnsmasq[319546]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 1 addresses
Dec 06 10:17:50 np0005548789.localdomain dnsmasq-dhcp[319546]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:50 np0005548789.localdomain dnsmasq-dhcp[319546]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:50 np0005548789.localdomain podman[319533]: 2025-12-06 10:17:50.408468284 +0000 UTC m=+0.092993142 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Dec 06 10:17:50 np0005548789.localdomain podman[319533]: 2025-12-06 10:17:50.453282573 +0000 UTC m=+0.137807441 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller)
Dec 06 10:17:50 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:17:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:50.523 263652 INFO neutron.agent.dhcp.agent [None req-c9dc30e2-8a6e-4d5f-bb3c-8a9245821114 - - - - - -] DHCP configuration for ports {'031e9ed7-2f9d-4794-b149-fed50ddb5365', '687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:17:50 np0005548789.localdomain dnsmasq[319546]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:17:50 np0005548789.localdomain dnsmasq-dhcp[319546]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:50 np0005548789.localdomain dnsmasq-dhcp[319546]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:50 np0005548789.localdomain podman[319582]: 2025-12-06 10:17:50.716838337 +0000 UTC m=+0.068526439 container kill b73c6d50186bb868d1093a23b8c34e1ae426a8ee07e6598ead72c83b7962a732 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:17:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:50.918 263652 INFO neutron.agent.dhcp.agent [None req-c3044ec0-d405-49dc-90fc-0d54811d5574 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:34Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fad14f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fad1280>], id=4e06a687-1f49-4292-acf2-929e0eb84acf, ip_allocation=immediate, mac_address=fa:16:3e:4d:58:46, name=tempest-NetworksTestDHCPv6-462185524, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=6, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['bc7d1843-cf65-45d5-94a7-f389cac666c9'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:34Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=1467, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:34Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:17:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:50.997 263652 INFO neutron.agent.dhcp.agent [None req-837fa81a-61f0-4ca8-a9a5-c3cf98bd60d8 - - - - - -] DHCP configuration for ports {'15e8fc8e-2569-4456-89e1-7a3d1684c267', '687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:17:51 np0005548789.localdomain podman[319621]: 2025-12-06 10:17:51.126494402 +0000 UTC m=+0.066658363 container kill b73c6d50186bb868d1093a23b8c34e1ae426a8ee07e6598ead72c83b7962a732 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:17:51 np0005548789.localdomain dnsmasq[319546]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 1 addresses
Dec 06 10:17:51 np0005548789.localdomain dnsmasq-dhcp[319546]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:51 np0005548789.localdomain dnsmasq-dhcp[319546]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:51 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:51.293 263652 INFO neutron.agent.dhcp.agent [None req-c3044ec0-d405-49dc-90fc-0d54811d5574 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:37Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb42490>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb42f10>], id=9a6a53c7-1fed-40b6-9731-a522fa01a8e9, ip_allocation=immediate, mac_address=fa:16:3e:95:da:c2, name=tempest-NetworksTestDHCPv6-1157473240, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=8, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['ce5e9894-b404-44e0-bb7b-8eb8d1458ed9'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:37Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=1492, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:37Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:17:51 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:51.333 263652 INFO neutron.agent.dhcp.agent [None req-bdca00a0-e797-46d6-8906-19bbae78f545 - - - - - -] DHCP configuration for ports {'4e06a687-1f49-4292-acf2-929e0eb84acf'} is completed
Dec 06 10:17:51 np0005548789.localdomain dnsmasq[319546]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 2 addresses
Dec 06 10:17:51 np0005548789.localdomain podman[319658]: 2025-12-06 10:17:51.481304225 +0000 UTC m=+0.064300872 container kill b73c6d50186bb868d1093a23b8c34e1ae426a8ee07e6598ead72c83b7962a732 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:51 np0005548789.localdomain dnsmasq-dhcp[319546]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:51 np0005548789.localdomain dnsmasq-dhcp[319546]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:51 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:51.561 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:50Z, description=, device_id=5a9249e3-4953-4808-89a5-568f69ae8159, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcce1f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fccea00>], id=4238df07-d7ae-46fc-981f-02a73e40206c, ip_allocation=immediate, mac_address=fa:16:3e:d5:08:58, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1571, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:17:51Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:17:51 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:51.750 263652 INFO neutron.agent.dhcp.agent [None req-b3c01a27-6b0d-43c8-b6cb-b41f907a0986 - - - - - -] DHCP configuration for ports {'9a6a53c7-1fed-40b6-9731-a522fa01a8e9'} is completed
Dec 06 10:17:51 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:17:51 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:51 np0005548789.localdomain podman[319694]: 2025-12-06 10:17:51.831718503 +0000 UTC m=+0.073542861 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:17:51 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:51 np0005548789.localdomain dnsmasq[319546]: exiting on receipt of SIGTERM
Dec 06 10:17:51 np0005548789.localdomain systemd[1]: tmp-crun.qjrrlZ.mount: Deactivated successfully.
Dec 06 10:17:51 np0005548789.localdomain podman[319726]: 2025-12-06 10:17:51.973399701 +0000 UTC m=+0.067094956 container kill b73c6d50186bb868d1093a23b8c34e1ae426a8ee07e6598ead72c83b7962a732 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:51 np0005548789.localdomain systemd[1]: libpod-b73c6d50186bb868d1093a23b8c34e1ae426a8ee07e6598ead72c83b7962a732.scope: Deactivated successfully.
Dec 06 10:17:52 np0005548789.localdomain podman[319743]: 2025-12-06 10:17:52.057385608 +0000 UTC m=+0.060149526 container died b73c6d50186bb868d1093a23b8c34e1ae426a8ee07e6598ead72c83b7962a732 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 10:17:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b73c6d50186bb868d1093a23b8c34e1ae426a8ee07e6598ead72c83b7962a732-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-9c18ac33546be1aae3397ead48d8dd1e3f5abac98a4c4c00eeed7d61e8509d7e-merged.mount: Deactivated successfully.
Dec 06 10:17:52 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:52.093 263652 INFO neutron.agent.dhcp.agent [None req-b6568ef1-e238-4481-95e1-b1b0d0170d6f - - - - - -] DHCP configuration for ports {'4238df07-d7ae-46fc-981f-02a73e40206c'} is completed
Dec 06 10:17:52 np0005548789.localdomain podman[319743]: 2025-12-06 10:17:52.157533155 +0000 UTC m=+0.160297043 container remove b73c6d50186bb868d1093a23b8c34e1ae426a8ee07e6598ead72c83b7962a732 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:17:52 np0005548789.localdomain systemd[1]: libpod-conmon-b73c6d50186bb868d1093a23b8c34e1ae426a8ee07e6598ead72c83b7962a732.scope: Deactivated successfully.
Dec 06 10:17:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:52.169 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:52 np0005548789.localdomain kernel: device tap15e8fc8e-25 left promiscuous mode
Dec 06 10:17:52 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:52Z|00249|binding|INFO|Releasing lport 15e8fc8e-2569-4456-89e1-7a3d1684c267 from this chassis (sb_readonly=0)
Dec 06 10:17:52 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:52Z|00250|binding|INFO|Setting lport 15e8fc8e-2569-4456-89e1-7a3d1684c267 down in Southbound
Dec 06 10:17:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:52.186 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=15e8fc8e-2569-4456-89e1-7a3d1684c267) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:52.187 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 15e8fc8e-2569-4456-89e1-7a3d1684c267 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:17:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:52.188 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:52.189 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[94aedc94-6ee6-4288-b9a3-d3de407122d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:52.193 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:52 np0005548789.localdomain ceph-mon[298582]: pgmap v230: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 1.4 KiB/s wr, 38 op/s
Dec 06 10:17:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:52.669 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:52 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:17:53 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:53.118 263652 INFO neutron.agent.linux.ip_lib [None req-8275c5f7-8320-4bbc-b1d7-7694a270cf2f - - - - - -] Device tape3c62197-6b cannot be used as it has no MAC address
Dec 06 10:17:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:53.143 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:53 np0005548789.localdomain kernel: device tape3c62197-6b entered promiscuous mode
Dec 06 10:17:53 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016273.1507] manager: (tape3c62197-6b): new Generic device (/org/freedesktop/NetworkManager/Devices/43)
Dec 06 10:17:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:53.151 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:53 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:53Z|00251|binding|INFO|Claiming lport e3c62197-6b1b-4fe2-b169-9cfa6917af0a for this chassis.
Dec 06 10:17:53 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:53Z|00252|binding|INFO|e3c62197-6b1b-4fe2-b169-9cfa6917af0a: Claiming unknown
Dec 06 10:17:53 np0005548789.localdomain systemd-udevd[319781]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:17:53 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:53Z|00253|binding|INFO|Setting lport e3c62197-6b1b-4fe2-b169-9cfa6917af0a up in Southbound
Dec 06 10:17:53 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:53Z|00254|binding|INFO|Setting lport e3c62197-6b1b-4fe2-b169-9cfa6917af0a ovn-installed in OVS
Dec 06 10:17:53 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:53.165 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=e3c62197-6b1b-4fe2-b169-9cfa6917af0a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:53.165 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:53.166 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:53 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:53.170 160509 INFO neutron.agent.ovn.metadata.agent [-] Port e3c62197-6b1b-4fe2-b169-9cfa6917af0a in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:17:53 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:53.171 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:53 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:53.172 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[f46539ba-aa3d-40b4-bc23-9b17c4803d71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:53.183 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:53.190 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e138 e138: 6 total, 6 up, 6 in
Dec 06 10:17:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:53.224 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:53.255 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:53.751 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:53.898 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:17:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:17:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:17:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:17:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:17:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19263 "" "Go-http-client/1.1"
Dec 06 10:17:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:53.979 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:54.091 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:54 np0005548789.localdomain podman[319836]: 
Dec 06 10:17:54 np0005548789.localdomain podman[319836]: 2025-12-06 10:17:54.210260428 +0000 UTC m=+0.119171245 container create fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 06 10:17:54 np0005548789.localdomain ceph-mon[298582]: pgmap v231: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 409 B/s wr, 2 op/s
Dec 06 10:17:54 np0005548789.localdomain ceph-mon[298582]: osdmap e138: 6 total, 6 up, 6 in
Dec 06 10:17:54 np0005548789.localdomain podman[319836]: 2025-12-06 10:17:54.144887196 +0000 UTC m=+0.053798053 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:17:54 np0005548789.localdomain systemd[1]: Started libpod-conmon-fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98.scope.
Dec 06 10:17:54 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:17:54 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46c21cbe908f45bcdc1b00e9cb02aecc14e9e3226ea975523b944b16c6663789/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:54 np0005548789.localdomain podman[319836]: 2025-12-06 10:17:54.288204102 +0000 UTC m=+0.197114929 container init fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:17:54 np0005548789.localdomain podman[319836]: 2025-12-06 10:17:54.296461333 +0000 UTC m=+0.205372150 container start fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:54 np0005548789.localdomain dnsmasq[319854]: started, version 2.85 cachesize 150
Dec 06 10:17:54 np0005548789.localdomain dnsmasq[319854]: DNS service limited to local subnets
Dec 06 10:17:54 np0005548789.localdomain dnsmasq[319854]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:17:54 np0005548789.localdomain dnsmasq[319854]: warning: no upstream servers configured
Dec 06 10:17:54 np0005548789.localdomain dnsmasq-dhcp[319854]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:17:54 np0005548789.localdomain dnsmasq[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:17:54 np0005548789.localdomain dnsmasq-dhcp[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:54 np0005548789.localdomain dnsmasq-dhcp[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:54 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:54.365 263652 INFO neutron.agent.dhcp.agent [None req-8275c5f7-8320-4bbc-b1d7-7694a270cf2f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:41Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe3845a9d30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fe1ba30>], id=67b3547b-9d27-4643-bddb-ba71d121551d, ip_allocation=immediate, mac_address=fa:16:3e:f3:31:93, name=tempest-NetworksTestDHCPv6-1485293953, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=10, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['6b9e94cd-e549-4ef1-a60c-bc98bdbbee8c'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:40Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=1520, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:41Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:17:54 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:54.479 263652 INFO neutron.agent.dhcp.agent [None req-e70cc606-5920-4c25-8fb7-1e5b3b5bc41a - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:17:54 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:54.509 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:54Z, description=, device_id=4756bfdd-20ae-4420-baa8-2e1807a793b3, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37faaf5b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37faaf880>], id=39dfb9d0-ac72-4ae3-ac11-cd04485d1755, ip_allocation=immediate, mac_address=fa:16:3e:27:36:ee, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1597, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:17:54Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:17:54 np0005548789.localdomain dnsmasq[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 1 addresses
Dec 06 10:17:54 np0005548789.localdomain podman[319871]: 2025-12-06 10:17:54.582338284 +0000 UTC m=+0.083667179 container kill fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:17:54 np0005548789.localdomain dnsmasq-dhcp[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:54 np0005548789.localdomain dnsmasq-dhcp[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:54 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 4 addresses
Dec 06 10:17:54 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:54 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:54 np0005548789.localdomain podman[319904]: 2025-12-06 10:17:54.746919396 +0000 UTC m=+0.068901241 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:17:54 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:54.792 263652 INFO neutron.agent.dhcp.agent [None req-8275c5f7-8320-4bbc-b1d7-7694a270cf2f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:44Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb06a30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb06790>], id=dc85b11d-4c03-4e32-ae66-316d37e2ed0c, ip_allocation=immediate, mac_address=fa:16:3e:d4:b8:56, name=tempest-NetworksTestDHCPv6-58844838, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=12, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['773a18d1-5b62-4bc9-af5e-7fc433180497'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:44Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=1538, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:44Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:17:54 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:54.896 263652 INFO neutron.agent.dhcp.agent [None req-772f3f79-0e41-40f0-8b75-e46d6b3a7d1b - - - - - -] DHCP configuration for ports {'67b3547b-9d27-4643-bddb-ba71d121551d'} is completed
Dec 06 10:17:54 np0005548789.localdomain dnsmasq[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 2 addresses
Dec 06 10:17:54 np0005548789.localdomain dnsmasq-dhcp[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:54 np0005548789.localdomain dnsmasq-dhcp[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:54 np0005548789.localdomain podman[319944]: 2025-12-06 10:17:54.990399271 +0000 UTC m=+0.064039553 container kill fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:17:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:55.030 263652 INFO neutron.agent.dhcp.agent [None req-00f6a456-e68b-467f-a5e8-6f89918f7ec4 - - - - - -] DHCP configuration for ports {'39dfb9d0-ac72-4ae3-ac11-cd04485d1755'} is completed
Dec 06 10:17:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:55.088 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:55.133 263652 INFO neutron.agent.dhcp.agent [None req-8275c5f7-8320-4bbc-b1d7-7694a270cf2f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:47Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fad67c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fad6460>], id=031e9ed7-2f9d-4794-b149-fed50ddb5365, ip_allocation=immediate, mac_address=fa:16:3e:a1:a1:ca, name=tempest-NetworksTestDHCPv6-1428419815, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=14, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['f4c32db1-eb59-48e0-aec0-c4465a7e322c'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:47Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=1554, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:47Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:17:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:55.242 263652 INFO neutron.agent.dhcp.agent [None req-36de5a5a-04cc-4adc-aefd-1294e927fe2f - - - - - -] DHCP configuration for ports {'dc85b11d-4c03-4e32-ae66-316d37e2ed0c'} is completed
Dec 06 10:17:55 np0005548789.localdomain dnsmasq[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 3 addresses
Dec 06 10:17:55 np0005548789.localdomain dnsmasq-dhcp[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:55 np0005548789.localdomain dnsmasq-dhcp[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:55 np0005548789.localdomain podman[319981]: 2025-12-06 10:17:55.359557358 +0000 UTC m=+0.089111563 container kill fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:17:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:55.556 263652 INFO neutron.agent.dhcp.agent [None req-8275c5f7-8320-4bbc-b1d7-7694a270cf2f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:52Z, description=, device_id=984ba1bf-ed49-495e-9318-1b56761910e8, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd71340>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd71460>], id=846d7e3e-31ba-499c-b8e2-0158928f1018, ip_allocation=immediate, mac_address=fa:16:3e:3f:1a:25, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['f5202084-e1f1-45f3-9585-2947d7b89bec'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:51Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=False, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1590, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:52Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:17:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:55.660 263652 INFO neutron.agent.dhcp.agent [None req-998f1619-48b0-4d21-9c7b-7258357dc7e7 - - - - - -] DHCP configuration for ports {'031e9ed7-2f9d-4794-b149-fed50ddb5365'} is completed
Dec 06 10:17:55 np0005548789.localdomain dnsmasq[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 4 addresses
Dec 06 10:17:55 np0005548789.localdomain dnsmasq-dhcp[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:55 np0005548789.localdomain dnsmasq-dhcp[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:55 np0005548789.localdomain podman[320020]: 2025-12-06 10:17:55.762888302 +0000 UTC m=+0.061602780 container kill fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:17:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:55.922 263652 INFO neutron.agent.dhcp.agent [None req-8275c5f7-8320-4bbc-b1d7-7694a270cf2f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:52Z, description=, device_id=984ba1bf-ed49-495e-9318-1b56761910e8, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc5b9d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc5b490>], id=846d7e3e-31ba-499c-b8e2-0158928f1018, ip_allocation=immediate, mac_address=fa:16:3e:3f:1a:25, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['f5202084-e1f1-45f3-9585-2947d7b89bec'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:51Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=False, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1590, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:52Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:17:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:55.986 263652 INFO neutron.agent.dhcp.agent [None req-973dab4b-7548-4deb-ac0e-1081184bbe7a - - - - - -] DHCP configuration for ports {'846d7e3e-31ba-499c-b8e2-0158928f1018'} is completed
Dec 06 10:17:56 np0005548789.localdomain dnsmasq[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 4 addresses
Dec 06 10:17:56 np0005548789.localdomain dnsmasq-dhcp[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:17:56 np0005548789.localdomain dnsmasq-dhcp[319854]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:17:56 np0005548789.localdomain podman[320059]: 2025-12-06 10:17:56.12615263 +0000 UTC m=+0.069614062 container kill fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:17:56 np0005548789.localdomain ceph-mon[298582]: pgmap v233: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 1.5 KiB/s rd, 511 B/s wr, 3 op/s
Dec 06 10:17:56 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:56.314 263652 INFO neutron.agent.dhcp.agent [None req-e2ce761d-886a-4efd-963a-2f20b12878a1 - - - - - -] DHCP configuration for ports {'846d7e3e-31ba-499c-b8e2-0158928f1018'} is completed
Dec 06 10:17:56 np0005548789.localdomain sudo[320104]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:17:56 np0005548789.localdomain systemd[1]: tmp-crun.hOqZ7c.mount: Deactivated successfully.
Dec 06 10:17:56 np0005548789.localdomain sudo[320104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:17:56 np0005548789.localdomain podman[320098]: 2025-12-06 10:17:56.40505601 +0000 UTC m=+0.074508811 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 06 10:17:56 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:17:56 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:56 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:56 np0005548789.localdomain sudo[320104]: pam_unix(sudo:session): session closed for user root
Dec 06 10:17:56 np0005548789.localdomain sudo[320140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:17:56 np0005548789.localdomain sudo[320140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:17:56 np0005548789.localdomain dnsmasq[319854]: exiting on receipt of SIGTERM
Dec 06 10:17:56 np0005548789.localdomain podman[320162]: 2025-12-06 10:17:56.495702089 +0000 UTC m=+0.044110059 container kill fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:17:56 np0005548789.localdomain systemd[1]: libpod-fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98.scope: Deactivated successfully.
Dec 06 10:17:56 np0005548789.localdomain podman[320180]: 2025-12-06 10:17:56.539200388 +0000 UTC m=+0.031578168 container died fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:17:56 np0005548789.localdomain podman[320180]: 2025-12-06 10:17:56.568222899 +0000 UTC m=+0.060600669 container cleanup fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:17:56 np0005548789.localdomain systemd[1]: libpod-conmon-fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98.scope: Deactivated successfully.
Dec 06 10:17:56 np0005548789.localdomain podman[320182]: 2025-12-06 10:17:56.644970357 +0000 UTC m=+0.127771856 container remove fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:17:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:56.653 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:56Z|00255|binding|INFO|Releasing lport e3c62197-6b1b-4fe2-b169-9cfa6917af0a from this chassis (sb_readonly=0)
Dec 06 10:17:56 np0005548789.localdomain kernel: device tape3c62197-6b left promiscuous mode
Dec 06 10:17:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:56Z|00256|binding|INFO|Setting lport e3c62197-6b1b-4fe2-b169-9cfa6917af0a down in Southbound
Dec 06 10:17:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:56Z|00257|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:17:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:56.669 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=e3c62197-6b1b-4fe2-b169-9cfa6917af0a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:56.670 160509 INFO neutron.agent.ovn.metadata.agent [-] Port e3c62197-6b1b-4fe2-b169-9cfa6917af0a in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:17:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:56.670 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:56.683 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[e0d24796-20bd-48e9-a798-6fec88d542ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:56.684 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:56.695 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:56.698 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:56 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:17:56 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1671277929' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:17:56 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:17:56 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1671277929' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:17:57 np0005548789.localdomain sudo[320140]: pam_unix(sudo:session): session closed for user root
Dec 06 10:17:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-46c21cbe908f45bcdc1b00e9cb02aecc14e9e3226ea975523b944b16c6663789-merged.mount: Deactivated successfully.
Dec 06 10:17:57 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fc20fcf21869e5d1157562ec291824a00698e06684a29f052ec6f97cdf9bab98-userdata-shm.mount: Deactivated successfully.
Dec 06 10:17:57 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.213 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.216 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.216 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.217 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.217 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.217 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.218 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.218 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.218 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.218 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.219 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.219 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.219 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.219 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.220 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.220 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:57.220 263652 INFO neutron.agent.dhcp.agent [None req-7f2f6ac0-106b-4c57-a85b-ca2375d7a5ed - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:17:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1671277929' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:17:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1671277929' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:17:57 np0005548789.localdomain sudo[320246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:17:57 np0005548789.localdomain sudo[320246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:17:57 np0005548789.localdomain sudo[320246]: pam_unix(sudo:session): session closed for user root
Dec 06 10:17:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:57.703 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:57 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:57Z|00258|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:17:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:57.841 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:57 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:17:57 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:57 np0005548789.localdomain podman[320279]: 2025-12-06 10:17:57.924224009 +0000 UTC m=+0.054304219 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:17:57 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:58 np0005548789.localdomain ceph-mon[298582]: pgmap v234: 177 pgs: 177 active+clean; 145 MiB data, 803 MiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 409 B/s wr, 2 op/s
Dec 06 10:17:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:17:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:17:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:17:58 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:17:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:17:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e139 e139: 6 total, 6 up, 6 in
Dec 06 10:17:58 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:58.452 263652 INFO neutron.agent.linux.ip_lib [None req-2fb783c0-7cf0-45de-874b-b32c19755a44 - - - - - -] Device tapa88d84d5-c8 cannot be used as it has no MAC address
Dec 06 10:17:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:58.479 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:58 np0005548789.localdomain kernel: device tapa88d84d5-c8 entered promiscuous mode
Dec 06 10:17:58 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016278.4896] manager: (tapa88d84d5-c8): new Generic device (/org/freedesktop/NetworkManager/Devices/44)
Dec 06 10:17:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:58.489 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:58 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:58Z|00259|binding|INFO|Claiming lport a88d84d5-c856-402e-975d-7a0db34028a3 for this chassis.
Dec 06 10:17:58 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:58Z|00260|binding|INFO|a88d84d5-c856-402e-975d-7a0db34028a3: Claiming unknown
Dec 06 10:17:58 np0005548789.localdomain systemd-udevd[320310]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:17:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:58.504 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=a88d84d5-c856-402e-975d-7a0db34028a3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:17:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:58.506 160509 INFO neutron.agent.ovn.metadata.agent [-] Port a88d84d5-c856-402e-975d-7a0db34028a3 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:17:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:58.507 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:17:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:17:58.508 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf731fb-de35-4086-96fc-b89e9928af0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:17:58 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:58Z|00261|binding|INFO|Setting lport a88d84d5-c856-402e-975d-7a0db34028a3 ovn-installed in OVS
Dec 06 10:17:58 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:17:58Z|00262|binding|INFO|Setting lport a88d84d5-c856-402e-975d-7a0db34028a3 up in Southbound
Dec 06 10:17:58 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapa88d84d5-c8: No such device
Dec 06 10:17:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:58.534 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:58 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapa88d84d5-c8: No such device
Dec 06 10:17:58 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapa88d84d5-c8: No such device
Dec 06 10:17:58 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapa88d84d5-c8: No such device
Dec 06 10:17:58 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapa88d84d5-c8: No such device
Dec 06 10:17:58 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapa88d84d5-c8: No such device
Dec 06 10:17:58 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapa88d84d5-c8: No such device
Dec 06 10:17:58 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapa88d84d5-c8: No such device
Dec 06 10:17:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:58.582 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:58.612 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:58 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:58.615 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:58Z, description=, device_id=f9d47455-4f4d-4051-9259-2dd1238f7b5a, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc780d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc78190>], id=5653c267-2244-44fb-bd63-9f60854c0945, ip_allocation=immediate, mac_address=fa:16:3e:a5:e7:e5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1631, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:17:58Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:17:58 np0005548789.localdomain podman[320359]: 2025-12-06 10:17:58.830418494 +0000 UTC m=+0.049894114 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:17:58 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:17:58 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:58 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:17:58.900 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:17:59 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:59.316 263652 INFO neutron.agent.dhcp.agent [None req-730aaf8d-f0b4-43a0-8b8a-87c5e3888346 - - - - - -] DHCP configuration for ports {'5653c267-2244-44fb-bd63-9f60854c0945'} is completed
Dec 06 10:17:59 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:59.343 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:58Z, description=, device_id=6d77d769-2432-46ea-81cb-7c9efbed3186, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbdf2e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbdfca0>], id=50973850-abb3-4347-9d48-675f44e4821a, ip_allocation=immediate, mac_address=fa:16:3e:24:dc:28, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1634, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:17:59Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:17:59 np0005548789.localdomain ceph-mon[298582]: osdmap e139: 6 total, 6 up, 6 in
Dec 06 10:17:59 np0005548789.localdomain podman[320433]: 
Dec 06 10:17:59 np0005548789.localdomain podman[320433]: 2025-12-06 10:17:59.582565789 +0000 UTC m=+0.058486655 container create 57b346441c3abd133ae032fa76a3084c226ca485c6132fd718ec2e896677be9a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:59 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 4 addresses
Dec 06 10:17:59 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:17:59 np0005548789.localdomain podman[320447]: 2025-12-06 10:17:59.622736827 +0000 UTC m=+0.061868708 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:17:59 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:17:59 np0005548789.localdomain systemd[1]: Started libpod-conmon-57b346441c3abd133ae032fa76a3084c226ca485c6132fd718ec2e896677be9a.scope.
Dec 06 10:17:59 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:17:59 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/271f045096e962713e97fa807cab1b588a56da9a1b0e67641acbd6559ec28410/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:59 np0005548789.localdomain podman[320433]: 2025-12-06 10:17:59.551313001 +0000 UTC m=+0.027233897 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:17:59 np0005548789.localdomain podman[320433]: 2025-12-06 10:17:59.651505129 +0000 UTC m=+0.127426035 container init 57b346441c3abd133ae032fa76a3084c226ca485c6132fd718ec2e896677be9a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 10:17:59 np0005548789.localdomain podman[320433]: 2025-12-06 10:17:59.667214056 +0000 UTC m=+0.143134952 container start 57b346441c3abd133ae032fa76a3084c226ca485c6132fd718ec2e896677be9a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:17:59 np0005548789.localdomain dnsmasq[320470]: started, version 2.85 cachesize 150
Dec 06 10:17:59 np0005548789.localdomain dnsmasq[320470]: DNS service limited to local subnets
Dec 06 10:17:59 np0005548789.localdomain dnsmasq[320470]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:17:59 np0005548789.localdomain dnsmasq[320470]: warning: no upstream servers configured
Dec 06 10:17:59 np0005548789.localdomain dnsmasq[320470]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:17:59 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:59.723 263652 INFO neutron.agent.dhcp.agent [None req-2fb783c0-7cf0-45de-874b-b32c19755a44 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:58Z, description=, device_id=65c1a743-e3fe-40a2-b51b-1d247b2883ed, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37faaf4c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37faaf970>], id=c3106dac-02c2-4639-a185-038f65c0f50b, ip_allocation=immediate, mac_address=fa:16:3e:ab:06:e2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['10c3ac68-1998-4b91-9b6f-10a0e5a37ad1'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:57Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=False, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1630, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:58Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:17:59 np0005548789.localdomain dnsmasq[320470]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 1 addresses
Dec 06 10:17:59 np0005548789.localdomain podman[320492]: 2025-12-06 10:17:59.912917079 +0000 UTC m=+0.061792895 container kill 57b346441c3abd133ae032fa76a3084c226ca485c6132fd718ec2e896677be9a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:17:59 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:17:59.955 263652 INFO neutron.agent.dhcp.agent [None req-ef6dfc4f-d6b1-49e0-a129-db86020f3578 - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91', '50973850-abb3-4347-9d48-675f44e4821a'} is completed
Dec 06 10:18:00 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:00.063 263652 INFO neutron.agent.dhcp.agent [None req-2fb783c0-7cf0-45de-874b-b32c19755a44 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:17:58Z, description=, device_id=65c1a743-e3fe-40a2-b51b-1d247b2883ed, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb0d0d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb0df70>], id=c3106dac-02c2-4639-a185-038f65c0f50b, ip_allocation=immediate, mac_address=fa:16:3e:ab:06:e2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['10c3ac68-1998-4b91-9b6f-10a0e5a37ad1'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:57Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=False, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1630, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:17:58Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:18:00 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:18:00 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1856326074' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:00 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:18:00 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1856326074' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:00.158 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:00 np0005548789.localdomain dnsmasq[320470]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 1 addresses
Dec 06 10:18:00 np0005548789.localdomain podman[320532]: 2025-12-06 10:18:00.273573277 +0000 UTC m=+0.063405843 container kill 57b346441c3abd133ae032fa76a3084c226ca485c6132fd718ec2e896677be9a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 10:18:00 np0005548789.localdomain ceph-mon[298582]: pgmap v236: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 2.7 KiB/s wr, 70 op/s
Dec 06 10:18:00 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1856326074' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:00 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1856326074' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:00 np0005548789.localdomain systemd[1]: tmp-crun.0xyTwN.mount: Deactivated successfully.
Dec 06 10:18:00 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:00.614 263652 INFO neutron.agent.dhcp.agent [None req-dc1e3993-e255-4e3c-ae44-d123ce14b3be - - - - - -] DHCP configuration for ports {'c3106dac-02c2-4639-a185-038f65c0f50b'} is completed
Dec 06 10:18:00 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:00.784 263652 INFO neutron.agent.dhcp.agent [None req-123f9a4d-72c8-4ebf-a4cf-57cc7b111bea - - - - - -] DHCP configuration for ports {'c3106dac-02c2-4639-a185-038f65c0f50b'} is completed
Dec 06 10:18:00 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:18:00 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2389509646' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:00 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:18:00 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2389509646' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:01.148 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:01 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2389509646' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:01 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2389509646' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:18:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:18:01 np0005548789.localdomain podman[320555]: 2025-12-06 10:18:01.922846963 +0000 UTC m=+0.074396937 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:18:01 np0005548789.localdomain podman[320555]: 2025-12-06 10:18:01.931196736 +0000 UTC m=+0.082746730 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:18:01 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:18:01 np0005548789.localdomain podman[320554]: 2025-12-06 10:18:01.981786021 +0000 UTC m=+0.140666948 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 10:18:01 np0005548789.localdomain podman[320554]: 2025-12-06 10:18:01.98804649 +0000 UTC m=+0.146927367 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:02 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:18:02 np0005548789.localdomain dnsmasq[320470]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:02 np0005548789.localdomain podman[320612]: 2025-12-06 10:18:02.403943806 +0000 UTC m=+0.071772648 container kill 57b346441c3abd133ae032fa76a3084c226ca485c6132fd718ec2e896677be9a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:02 np0005548789.localdomain ceph-mon[298582]: pgmap v237: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 2.2 KiB/s wr, 68 op/s
Dec 06 10:18:02 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2062782985' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:02 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2062782985' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:18:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:02.681 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:02 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:02Z|00263|binding|INFO|Releasing lport a88d84d5-c856-402e-975d-7a0db34028a3 from this chassis (sb_readonly=0)
Dec 06 10:18:02 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:02Z|00264|binding|INFO|Setting lport a88d84d5-c856-402e-975d-7a0db34028a3 down in Southbound
Dec 06 10:18:02 np0005548789.localdomain kernel: device tapa88d84d5-c8 left promiscuous mode
Dec 06 10:18:02 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:02.692 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=a88d84d5-c856-402e-975d-7a0db34028a3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:02 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:02.694 160509 INFO neutron.agent.ovn.metadata.agent [-] Port a88d84d5-c856-402e-975d-7a0db34028a3 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:18:02 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:02.696 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 43883dce-1590-48c4-987c-a21b63b82a1c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:18:02 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:02.697 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[cdabcf84-34ab-48a4-ba08-4ba22e66dc41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:02.702 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:02.705 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:02 np0005548789.localdomain systemd[1]: tmp-crun.qfS4S5.mount: Deactivated successfully.
Dec 06 10:18:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:03 np0005548789.localdomain podman[320651]: 2025-12-06 10:18:03.313923677 +0000 UTC m=+0.061042152 container kill 57b346441c3abd133ae032fa76a3084c226ca485c6132fd718ec2e896677be9a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:18:03 np0005548789.localdomain dnsmasq[320470]: exiting on receipt of SIGTERM
Dec 06 10:18:03 np0005548789.localdomain systemd[1]: libpod-57b346441c3abd133ae032fa76a3084c226ca485c6132fd718ec2e896677be9a.scope: Deactivated successfully.
Dec 06 10:18:03 np0005548789.localdomain podman[320664]: 2025-12-06 10:18:03.391610533 +0000 UTC m=+0.059138535 container died 57b346441c3abd133ae032fa76a3084c226ca485c6132fd718ec2e896677be9a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:18:03 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:03Z|00265|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:18:03 np0005548789.localdomain podman[320664]: 2025-12-06 10:18:03.428919335 +0000 UTC m=+0.096447297 container cleanup 57b346441c3abd133ae032fa76a3084c226ca485c6132fd718ec2e896677be9a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:03 np0005548789.localdomain systemd[1]: libpod-conmon-57b346441c3abd133ae032fa76a3084c226ca485c6132fd718ec2e896677be9a.scope: Deactivated successfully.
Dec 06 10:18:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:03.453 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:03 np0005548789.localdomain podman[320665]: 2025-12-06 10:18:03.527097123 +0000 UTC m=+0.190744016 container remove 57b346441c3abd133ae032fa76a3084c226ca485c6132fd718ec2e896677be9a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:18:03 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:03.824 263652 INFO neutron.agent.dhcp.agent [None req-af7bf473-a0e6-4493-960a-d20f0832d732 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:03.936 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-271f045096e962713e97fa807cab1b588a56da9a1b0e67641acbd6559ec28410-merged.mount: Deactivated successfully.
Dec 06 10:18:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-57b346441c3abd133ae032fa76a3084c226ca485c6132fd718ec2e896677be9a-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:03 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:18:04 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:04.287 2 INFO neutron.agent.securitygroups_rpc [None req-a84ff9e7-4dda-4f24-9c52-73179c1374d1 4c6008178bdc445aa99fb1b726f87b45 a2aaeadee6f14b78a73f8886be99b671 - - default default] Security group member updated ['13551db8-e8e0-43b0-89a9-b0d8423e74c9']
Dec 06 10:18:04 np0005548789.localdomain ceph-mon[298582]: pgmap v238: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 69 KiB/s rd, 3.3 KiB/s wr, 91 op/s
Dec 06 10:18:04 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:04.735 263652 INFO neutron.agent.linux.ip_lib [None req-c817246b-9af4-4a9c-bfad-8300a0140231 - - - - - -] Device tap795909b0-e9 cannot be used as it has no MAC address
Dec 06 10:18:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:04.763 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:04 np0005548789.localdomain kernel: device tap795909b0-e9 entered promiscuous mode
Dec 06 10:18:04 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:04Z|00266|binding|INFO|Claiming lport 795909b0-e9c1-4d84-850f-e878bfa3090c for this chassis.
Dec 06 10:18:04 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:04Z|00267|binding|INFO|795909b0-e9c1-4d84-850f-e878bfa3090c: Claiming unknown
Dec 06 10:18:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:04.773 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:04 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016284.7746] manager: (tap795909b0-e9): new Generic device (/org/freedesktop/NetworkManager/Devices/45)
Dec 06 10:18:04 np0005548789.localdomain systemd-udevd[320702]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:04.790 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:04.807 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fea3:f6bb/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-aacf8ef2-726e-4b97-b5f2-032a84aa6e97', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aacf8ef2-726e-4b97-b5f2-032a84aa6e97', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2aaeadee6f14b78a73f8886be99b671', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51d66027-c066-482f-93f1-6217163f6b22, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=795909b0-e9c1-4d84-850f-e878bfa3090c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:04.808 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 795909b0-e9c1-4d84-850f-e878bfa3090c in datapath aacf8ef2-726e-4b97-b5f2-032a84aa6e97 bound to our chassis
Dec 06 10:18:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:04.810 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port ab0a787e-d042-495b-a77f-e65096e28c65 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:04.810 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aacf8ef2-726e-4b97-b5f2-032a84aa6e97, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:04 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap795909b0-e9: No such device
Dec 06 10:18:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:04.811 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5e956e4e-65d0-4f77-b058-f53cbd2b86d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:04 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:04Z|00268|binding|INFO|Setting lport 795909b0-e9c1-4d84-850f-e878bfa3090c ovn-installed in OVS
Dec 06 10:18:04 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:04Z|00269|binding|INFO|Setting lport 795909b0-e9c1-4d84-850f-e878bfa3090c up in Southbound
Dec 06 10:18:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:04.817 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:04 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap795909b0-e9: No such device
Dec 06 10:18:04 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap795909b0-e9: No such device
Dec 06 10:18:04 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap795909b0-e9: No such device
Dec 06 10:18:04 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap795909b0-e9: No such device
Dec 06 10:18:04 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap795909b0-e9: No such device
Dec 06 10:18:04 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap795909b0-e9: No such device
Dec 06 10:18:04 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap795909b0-e9: No such device
Dec 06 10:18:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:04.875 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:04.901 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:05 np0005548789.localdomain systemd[1]: tmp-crun.lziCyZ.mount: Deactivated successfully.
Dec 06 10:18:05 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:18:05 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:05 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:05 np0005548789.localdomain podman[320749]: 2025-12-06 10:18:05.071785835 +0000 UTC m=+0.072792709 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:18:05 np0005548789.localdomain podman[320812]: 
Dec 06 10:18:05 np0005548789.localdomain podman[320812]: 2025-12-06 10:18:05.912986991 +0000 UTC m=+0.093823667 container create ee0cfafc2cf0bb1386540f40a224a2f3bfd7d908989bd2f85b43129d15f457fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aacf8ef2-726e-4b97-b5f2-032a84aa6e97, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:18:05 np0005548789.localdomain systemd[1]: Started libpod-conmon-ee0cfafc2cf0bb1386540f40a224a2f3bfd7d908989bd2f85b43129d15f457fd.scope.
Dec 06 10:18:05 np0005548789.localdomain podman[320812]: 2025-12-06 10:18:05.869568273 +0000 UTC m=+0.050404949 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:05 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:05 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d313493fde52dfd539ad4b8ab9379c14b9f143306ec1d068c13748552a2ec796/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:05 np0005548789.localdomain podman[320812]: 2025-12-06 10:18:05.996480173 +0000 UTC m=+0.177316849 container init ee0cfafc2cf0bb1386540f40a224a2f3bfd7d908989bd2f85b43129d15f457fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aacf8ef2-726e-4b97-b5f2-032a84aa6e97, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:18:06 np0005548789.localdomain podman[320812]: 2025-12-06 10:18:06.006210438 +0000 UTC m=+0.187047114 container start ee0cfafc2cf0bb1386540f40a224a2f3bfd7d908989bd2f85b43129d15f457fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aacf8ef2-726e-4b97-b5f2-032a84aa6e97, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:18:06 np0005548789.localdomain dnsmasq[320831]: started, version 2.85 cachesize 150
Dec 06 10:18:06 np0005548789.localdomain dnsmasq[320831]: DNS service limited to local subnets
Dec 06 10:18:06 np0005548789.localdomain dnsmasq[320831]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:06 np0005548789.localdomain dnsmasq[320831]: warning: no upstream servers configured
Dec 06 10:18:06 np0005548789.localdomain dnsmasq[320831]: read /var/lib/neutron/dhcp/aacf8ef2-726e-4b97-b5f2-032a84aa6e97/addn_hosts - 0 addresses
Dec 06 10:18:06 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:06.201 263652 INFO neutron.agent.dhcp.agent [None req-978a36a2-91c1-42bb-86d7-97571eab2243 - - - - - -] DHCP configuration for ports {'ddf6de7e-486c-44c8-8a7f-842446c78589'} is completed
Dec 06 10:18:06 np0005548789.localdomain dnsmasq[320831]: exiting on receipt of SIGTERM
Dec 06 10:18:06 np0005548789.localdomain podman[320847]: 2025-12-06 10:18:06.331185065 +0000 UTC m=+0.066691274 container kill ee0cfafc2cf0bb1386540f40a224a2f3bfd7d908989bd2f85b43129d15f457fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aacf8ef2-726e-4b97-b5f2-032a84aa6e97, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:18:06 np0005548789.localdomain systemd[1]: libpod-ee0cfafc2cf0bb1386540f40a224a2f3bfd7d908989bd2f85b43129d15f457fd.scope: Deactivated successfully.
Dec 06 10:18:06 np0005548789.localdomain podman[320860]: 2025-12-06 10:18:06.414882944 +0000 UTC m=+0.061848417 container died ee0cfafc2cf0bb1386540f40a224a2f3bfd7d908989bd2f85b43129d15f457fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aacf8ef2-726e-4b97-b5f2-032a84aa6e97, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:06 np0005548789.localdomain podman[320860]: 2025-12-06 10:18:06.452608188 +0000 UTC m=+0.099573621 container cleanup ee0cfafc2cf0bb1386540f40a224a2f3bfd7d908989bd2f85b43129d15f457fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aacf8ef2-726e-4b97-b5f2-032a84aa6e97, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:06 np0005548789.localdomain systemd[1]: libpod-conmon-ee0cfafc2cf0bb1386540f40a224a2f3bfd7d908989bd2f85b43129d15f457fd.scope: Deactivated successfully.
Dec 06 10:18:06 np0005548789.localdomain podman[320861]: 2025-12-06 10:18:06.49190774 +0000 UTC m=+0.136553293 container remove ee0cfafc2cf0bb1386540f40a224a2f3bfd7d908989bd2f85b43129d15f457fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aacf8ef2-726e-4b97-b5f2-032a84aa6e97, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:18:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:06.506 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:06 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:06Z|00270|binding|INFO|Releasing lport 795909b0-e9c1-4d84-850f-e878bfa3090c from this chassis (sb_readonly=0)
Dec 06 10:18:06 np0005548789.localdomain kernel: device tap795909b0-e9 left promiscuous mode
Dec 06 10:18:06 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:06Z|00271|binding|INFO|Setting lport 795909b0-e9c1-4d84-850f-e878bfa3090c down in Southbound
Dec 06 10:18:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:06.518 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fea3:f6bb/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-aacf8ef2-726e-4b97-b5f2-032a84aa6e97', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aacf8ef2-726e-4b97-b5f2-032a84aa6e97', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2aaeadee6f14b78a73f8886be99b671', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51d66027-c066-482f-93f1-6217163f6b22, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=795909b0-e9c1-4d84-850f-e878bfa3090c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:06.520 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 795909b0-e9c1-4d84-850f-e878bfa3090c in datapath aacf8ef2-726e-4b97-b5f2-032a84aa6e97 unbound from our chassis
Dec 06 10:18:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:06.523 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aacf8ef2-726e-4b97-b5f2-032a84aa6e97, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:06.524 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[f0b0e7f2-858a-4fbf-93f8-9a498978e07c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:06.527 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:06 np0005548789.localdomain ceph-mon[298582]: pgmap v239: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 68 KiB/s rd, 3.3 KiB/s wr, 90 op/s
Dec 06 10:18:06 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:06.848 263652 INFO neutron.agent.dhcp.agent [None req-85939946-bc92-414a-acbc-b30bab3c85ba - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d313493fde52dfd539ad4b8ab9379c14b9f143306ec1d068c13748552a2ec796-merged.mount: Deactivated successfully.
Dec 06 10:18:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee0cfafc2cf0bb1386540f40a224a2f3bfd7d908989bd2f85b43129d15f457fd-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:07 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2daacf8ef2\x2d726e\x2d4b97\x2db5f2\x2d032a84aa6e97.mount: Deactivated successfully.
Dec 06 10:18:07 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e140 e140: 6 total, 6 up, 6 in
Dec 06 10:18:07 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:07.608 263652 INFO neutron.agent.linux.ip_lib [None req-1d7f5d7d-a7d8-4cb5-9f7d-4b35e873013c - - - - - -] Device tape6df781f-3c cannot be used as it has no MAC address
Dec 06 10:18:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:07.672 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:07 np0005548789.localdomain kernel: device tape6df781f-3c entered promiscuous mode
Dec 06 10:18:07 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:07Z|00272|binding|INFO|Claiming lport e6df781f-3c99-4041-b79d-84bfb7ba881e for this chassis.
Dec 06 10:18:07 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:07Z|00273|binding|INFO|e6df781f-3c99-4041-b79d-84bfb7ba881e: Claiming unknown
Dec 06 10:18:07 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016287.6804] manager: (tape6df781f-3c): new Generic device (/org/freedesktop/NetworkManager/Devices/46)
Dec 06 10:18:07 np0005548789.localdomain systemd-udevd[320705]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:07.682 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:07 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:07.700 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe84:1b9/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=e6df781f-3c99-4041-b79d-84bfb7ba881e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:07 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:07.703 160509 INFO neutron.agent.ovn.metadata.agent [-] Port e6df781f-3c99-4041-b79d-84bfb7ba881e in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:18:07 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:07.706 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 1059294c-cfcd-41d2-879b-e9dc313613f9 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:07 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:07.706 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:07 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:07.707 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5e2cc02a-4c0f-4f60-81e0-594a9820dc3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:07.713 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:07 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape6df781f-3c: No such device
Dec 06 10:18:07 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:07Z|00274|binding|INFO|Setting lport e6df781f-3c99-4041-b79d-84bfb7ba881e ovn-installed in OVS
Dec 06 10:18:07 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:07Z|00275|binding|INFO|Setting lport e6df781f-3c99-4041-b79d-84bfb7ba881e up in Southbound
Dec 06 10:18:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:07.720 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:07.721 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:07 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape6df781f-3c: No such device
Dec 06 10:18:07 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape6df781f-3c: No such device
Dec 06 10:18:07 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape6df781f-3c: No such device
Dec 06 10:18:07 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape6df781f-3c: No such device
Dec 06 10:18:07 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape6df781f-3c: No such device
Dec 06 10:18:07 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape6df781f-3c: No such device
Dec 06 10:18:07 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape6df781f-3c: No such device
Dec 06 10:18:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:07.770 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:07.806 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:07 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:07.977 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:07Z, description=, device_id=1217a843-f657-48ae-9649-70dee34aefa0, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fabe0d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fabef40>], id=b37b54ab-181e-4c3f-84f8-dee38b4d66be, ip_allocation=immediate, mac_address=fa:16:3e:d1:0b:4f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1680, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:18:07Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:18:08 np0005548789.localdomain ceph-mon[298582]: pgmap v240: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 68 KiB/s rd, 3.3 KiB/s wr, 90 op/s
Dec 06 10:18:08 np0005548789.localdomain ceph-mon[298582]: osdmap e140: 6 total, 6 up, 6 in
Dec 06 10:18:08 np0005548789.localdomain podman[320952]: 2025-12-06 10:18:08.208707804 +0000 UTC m=+0.065605941 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:18:08 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 4 addresses
Dec 06 10:18:08 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:08 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:08 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:08.600 263652 INFO neutron.agent.dhcp.agent [None req-1f545d09-ee9f-4b4e-9006-a512ebe7de32 - - - - - -] DHCP configuration for ports {'b37b54ab-181e-4c3f-84f8-dee38b4d66be'} is completed
Dec 06 10:18:08 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:08.686 2 INFO neutron.agent.securitygroups_rpc [None req-2cd445e7-be6d-4272-b78a-eedc8c1ca774 4c6008178bdc445aa99fb1b726f87b45 a2aaeadee6f14b78a73f8886be99b671 - - default default] Security group member updated ['13551db8-e8e0-43b0-89a9-b0d8423e74c9']
Dec 06 10:18:08 np0005548789.localdomain podman[321007]: 
Dec 06 10:18:08 np0005548789.localdomain podman[321007]: 2025-12-06 10:18:08.807365412 +0000 UTC m=+0.099043346 container create 07ecf95c17dd1c0fe167c3b563f02c09bc22bd14b2d45508afdb9207cd0a77c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:08 np0005548789.localdomain systemd[1]: Started libpod-conmon-07ecf95c17dd1c0fe167c3b563f02c09bc22bd14b2d45508afdb9207cd0a77c0.scope.
Dec 06 10:18:08 np0005548789.localdomain podman[321007]: 2025-12-06 10:18:08.759244793 +0000 UTC m=+0.050922757 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:08 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:08 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d10b16333c71c329bbfb79d2b80e4350b7ea74cfc2d42251c0bd5d1c1cd279f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:08 np0005548789.localdomain podman[321007]: 2025-12-06 10:18:08.885944896 +0000 UTC m=+0.177622820 container init 07ecf95c17dd1c0fe167c3b563f02c09bc22bd14b2d45508afdb9207cd0a77c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:08 np0005548789.localdomain podman[321007]: 2025-12-06 10:18:08.895474024 +0000 UTC m=+0.187151948 container start 07ecf95c17dd1c0fe167c3b563f02c09bc22bd14b2d45508afdb9207cd0a77c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:08 np0005548789.localdomain dnsmasq[321026]: started, version 2.85 cachesize 150
Dec 06 10:18:08 np0005548789.localdomain dnsmasq[321026]: DNS service limited to local subnets
Dec 06 10:18:08 np0005548789.localdomain dnsmasq[321026]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:08 np0005548789.localdomain dnsmasq[321026]: warning: no upstream servers configured
Dec 06 10:18:08 np0005548789.localdomain dnsmasq[321026]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:08.976 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:09.012 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:09 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:09.075 263652 INFO neutron.agent.dhcp.agent [None req-0ae67413-d3cc-4ad8-bc4e-3a0105829b79 - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:18:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:18:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:18:09 np0005548789.localdomain podman[321032]: 2025-12-06 10:18:09.192340329 +0000 UTC m=+0.091150955 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:18:09 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:09.225 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 2001:db8::f816:3eff:fe3f:fa29'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:09 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:09.226 160509 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:09 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:09.228 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 1059294c-cfcd-41d2-879b-e9dc313613f9 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:09 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:09.228 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:09 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:09.229 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[e555dd53-c23e-4e3f-a9ab-a32a218e03a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:09 np0005548789.localdomain podman[321030]: 2025-12-06 10:18:09.250327208 +0000 UTC m=+0.151790985 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:18:09 np0005548789.localdomain podman[321032]: 2025-12-06 10:18:09.313648448 +0000 UTC m=+0.212459064 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 10:18:09 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:18:09 np0005548789.localdomain dnsmasq[321026]: exiting on receipt of SIGTERM
Dec 06 10:18:09 np0005548789.localdomain podman[321083]: 2025-12-06 10:18:09.365023337 +0000 UTC m=+0.063651582 container kill 07ecf95c17dd1c0fe167c3b563f02c09bc22bd14b2d45508afdb9207cd0a77c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 10:18:09 np0005548789.localdomain systemd[1]: libpod-07ecf95c17dd1c0fe167c3b563f02c09bc22bd14b2d45508afdb9207cd0a77c0.scope: Deactivated successfully.
Dec 06 10:18:09 np0005548789.localdomain podman[321030]: 2025-12-06 10:18:09.366858352 +0000 UTC m=+0.268322169 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:18:09 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:18:09 np0005548789.localdomain podman[321095]: 2025-12-06 10:18:09.437048951 +0000 UTC m=+0.060277469 container died 07ecf95c17dd1c0fe167c3b563f02c09bc22bd14b2d45508afdb9207cd0a77c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:18:09 np0005548789.localdomain podman[321095]: 2025-12-06 10:18:09.521556085 +0000 UTC m=+0.144784653 container cleanup 07ecf95c17dd1c0fe167c3b563f02c09bc22bd14b2d45508afdb9207cd0a77c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:18:09 np0005548789.localdomain systemd[1]: libpod-conmon-07ecf95c17dd1c0fe167c3b563f02c09bc22bd14b2d45508afdb9207cd0a77c0.scope: Deactivated successfully.
Dec 06 10:18:09 np0005548789.localdomain podman[321102]: 2025-12-06 10:18:09.55174266 +0000 UTC m=+0.162034175 container remove 07ecf95c17dd1c0fe167c3b563f02c09bc22bd14b2d45508afdb9207cd0a77c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:10 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:10.061 2 INFO neutron.agent.securitygroups_rpc [None req-36813505-8d2e-42b4-bcdd-400a4500589a a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:10 np0005548789.localdomain ceph-mon[298582]: pgmap v242: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 1.9 KiB/s wr, 36 op/s
Dec 06 10:18:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d10b16333c71c329bbfb79d2b80e4350b7ea74cfc2d42251c0bd5d1c1cd279f2-merged.mount: Deactivated successfully.
Dec 06 10:18:10 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-07ecf95c17dd1c0fe167c3b563f02c09bc22bd14b2d45508afdb9207cd0a77c0-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:10 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:10.939 2 INFO neutron.agent.securitygroups_rpc [None req-809d6155-5d31-4aee-97b1-907b0d1ee5ee a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:11 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:18:11 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:11 np0005548789.localdomain podman[321142]: 2025-12-06 10:18:11.018255812 +0000 UTC m=+0.072260343 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 10:18:11 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:11 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e141 e141: 6 total, 6 up, 6 in
Dec 06 10:18:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:11.928 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:11.929 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:11.931 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:18:12 np0005548789.localdomain ceph-mon[298582]: pgmap v243: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 1.9 KiB/s wr, 36 op/s
Dec 06 10:18:12 np0005548789.localdomain ceph-mon[298582]: osdmap e141: 6 total, 6 up, 6 in
Dec 06 10:18:12 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e142 e142: 6 total, 6 up, 6 in
Dec 06 10:18:12 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:18:12 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2958126727' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:12 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:18:12 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2958126727' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:12 np0005548789.localdomain podman[321214]: 
Dec 06 10:18:12 np0005548789.localdomain podman[321214]: 2025-12-06 10:18:12.282792807 +0000 UTC m=+0.104837571 container create 3c95f8107be33efc6b62126a494e6c2cd6457b38c83e6f5347b359ffdaf46e6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 06 10:18:12 np0005548789.localdomain podman[321214]: 2025-12-06 10:18:12.231901093 +0000 UTC m=+0.053945857 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:12 np0005548789.localdomain systemd[1]: Started libpod-conmon-3c95f8107be33efc6b62126a494e6c2cd6457b38c83e6f5347b359ffdaf46e6a.scope.
Dec 06 10:18:12 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:12 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96c1ac57ed0addf4968a8a3f8a0b5b5b6a27c910ab64d54d14bea32ff62ad6b9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:12 np0005548789.localdomain podman[321214]: 2025-12-06 10:18:12.363949508 +0000 UTC m=+0.185994242 container init 3c95f8107be33efc6b62126a494e6c2cd6457b38c83e6f5347b359ffdaf46e6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:18:12 np0005548789.localdomain podman[321214]: 2025-12-06 10:18:12.372816197 +0000 UTC m=+0.194860931 container start 3c95f8107be33efc6b62126a494e6c2cd6457b38c83e6f5347b359ffdaf46e6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:18:12 np0005548789.localdomain dnsmasq[321232]: started, version 2.85 cachesize 150
Dec 06 10:18:12 np0005548789.localdomain dnsmasq[321232]: DNS service limited to local subnets
Dec 06 10:18:12 np0005548789.localdomain dnsmasq[321232]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:12 np0005548789.localdomain dnsmasq[321232]: warning: no upstream servers configured
Dec 06 10:18:12 np0005548789.localdomain dnsmasq-dhcp[321232]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:18:12 np0005548789.localdomain dnsmasq[321232]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:12 np0005548789.localdomain dnsmasq-dhcp[321232]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:12 np0005548789.localdomain dnsmasq-dhcp[321232]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:12 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:12.680 263652 INFO neutron.agent.dhcp.agent [None req-c568e7b2-f4d6-414c-9171-b170262f466d - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91', 'e6df781f-3c99-4041-b79d-84bfb7ba881e'} is completed
Dec 06 10:18:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:12.724 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:12 np0005548789.localdomain dnsmasq[321232]: exiting on receipt of SIGTERM
Dec 06 10:18:12 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:12.747 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:12Z, description=, device_id=0e94edaf-39e7-4c44-b823-1518a09d8708, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa99880>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa99790>], id=5fe0748d-ad57-4cf2-ab87-16200f623579, ip_allocation=immediate, mac_address=fa:16:3e:e9:d9:ee, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1702, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:18:12Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:18:12 np0005548789.localdomain podman[321249]: 2025-12-06 10:18:12.748116901 +0000 UTC m=+0.092886509 container kill 3c95f8107be33efc6b62126a494e6c2cd6457b38c83e6f5347b359ffdaf46e6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:18:12 np0005548789.localdomain systemd[1]: libpod-3c95f8107be33efc6b62126a494e6c2cd6457b38c83e6f5347b359ffdaf46e6a.scope: Deactivated successfully.
Dec 06 10:18:12 np0005548789.localdomain podman[321263]: 2025-12-06 10:18:12.828933082 +0000 UTC m=+0.064726054 container died 3c95f8107be33efc6b62126a494e6c2cd6457b38c83e6f5347b359ffdaf46e6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:12 np0005548789.localdomain podman[321263]: 2025-12-06 10:18:12.860168769 +0000 UTC m=+0.095961701 container cleanup 3c95f8107be33efc6b62126a494e6c2cd6457b38c83e6f5347b359ffdaf46e6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:12 np0005548789.localdomain systemd[1]: libpod-conmon-3c95f8107be33efc6b62126a494e6c2cd6457b38c83e6f5347b359ffdaf46e6a.scope: Deactivated successfully.
Dec 06 10:18:12 np0005548789.localdomain podman[321265]: 2025-12-06 10:18:12.913045073 +0000 UTC m=+0.141019338 container remove 3c95f8107be33efc6b62126a494e6c2cd6457b38c83e6f5347b359ffdaf46e6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 06 10:18:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:12.932 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:12 np0005548789.localdomain kernel: device tape6df781f-3c left promiscuous mode
Dec 06 10:18:12 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:12Z|00276|binding|INFO|Releasing lport e6df781f-3c99-4041-b79d-84bfb7ba881e from this chassis (sb_readonly=0)
Dec 06 10:18:12 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:12Z|00277|binding|INFO|Setting lport e6df781f-3c99-4041-b79d-84bfb7ba881e down in Southbound
Dec 06 10:18:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:12.934 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:18:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:12.947 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28 2001:db8::f816:3eff:fe84:1b9/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=e6df781f-3c99-4041-b79d-84bfb7ba881e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:12.949 160509 INFO neutron.agent.ovn.metadata.agent [-] Port e6df781f-3c99-4041-b79d-84bfb7ba881e in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:18:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:12.955 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:12.957 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[1090082a-9344-4079-901b-66a5a4bbd573]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:12.958 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:13 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 4 addresses
Dec 06 10:18:13 np0005548789.localdomain podman[321311]: 2025-12-06 10:18:13.051368509 +0000 UTC m=+0.063018903 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:18:13 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:13 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:13 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:13.171 263652 INFO neutron.agent.dhcp.agent [None req-b3df4455-9686-4af5-8abc-1b26a04e7e36 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:13 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:13.172 263652 INFO neutron.agent.dhcp.agent [None req-b3df4455-9686-4af5-8abc-1b26a04e7e36 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:13 np0005548789.localdomain ceph-mon[298582]: osdmap e142: 6 total, 6 up, 6 in
Dec 06 10:18:13 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2958126727' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:13 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2958126727' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-96c1ac57ed0addf4968a8a3f8a0b5b5b6a27c910ab64d54d14bea32ff62ad6b9-merged.mount: Deactivated successfully.
Dec 06 10:18:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3c95f8107be33efc6b62126a494e6c2cd6457b38c83e6f5347b359ffdaf46e6a-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:13 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:18:13 np0005548789.localdomain sshd[321333]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:18:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:18:13 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:13.324 263652 INFO neutron.agent.dhcp.agent [None req-1e452ab4-e004-4251-8405-42bff3df5140 - - - - - -] DHCP configuration for ports {'5fe0748d-ad57-4cf2-ab87-16200f623579'} is completed
Dec 06 10:18:13 np0005548789.localdomain podman[321335]: 2025-12-06 10:18:13.40087144 +0000 UTC m=+0.082592086 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:13 np0005548789.localdomain podman[321335]: 2025-12-06 10:18:13.415091472 +0000 UTC m=+0.096812168 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 10:18:13 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:18:13 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:13.756 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 2001:db8::f816:3eff:fe3f:fa29'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:13 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:13.757 160509 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:13 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:13.760 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:13 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:13.761 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[04a65137-6c24-40e2-969b-71eab511667b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:13.980 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:14 np0005548789.localdomain sshd[321333]: Received disconnect from 179.33.210.213 port 44260:11: Bye Bye [preauth]
Dec 06 10:18:14 np0005548789.localdomain sshd[321333]: Disconnected from authenticating user root 179.33.210.213 port 44260 [preauth]
Dec 06 10:18:14 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e143 e143: 6 total, 6 up, 6 in
Dec 06 10:18:14 np0005548789.localdomain ceph-mon[298582]: pgmap v246: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 3.2 KiB/s wr, 57 op/s
Dec 06 10:18:14 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:14.624 263652 INFO neutron.agent.linux.ip_lib [None req-b0be1b0b-5514-4abf-b284-a4ef25d427c9 - - - - - -] Device tap7bbbac24-f9 cannot be used as it has no MAC address
Dec 06 10:18:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:14.643 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:14 np0005548789.localdomain kernel: device tap7bbbac24-f9 entered promiscuous mode
Dec 06 10:18:14 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016294.6530] manager: (tap7bbbac24-f9): new Generic device (/org/freedesktop/NetworkManager/Devices/47)
Dec 06 10:18:14 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:14Z|00278|binding|INFO|Claiming lport 7bbbac24-f9f6-48a3-8929-680a1ce4ebea for this chassis.
Dec 06 10:18:14 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:14Z|00279|binding|INFO|7bbbac24-f9f6-48a3-8929-680a1ce4ebea: Claiming unknown
Dec 06 10:18:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:14.655 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:14 np0005548789.localdomain systemd-udevd[321365]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:14 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:14Z|00280|binding|INFO|Setting lport 7bbbac24-f9f6-48a3-8929-680a1ce4ebea up in Southbound
Dec 06 10:18:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:14.662 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fedb:c901/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=7bbbac24-f9f6-48a3-8929-680a1ce4ebea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:14 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:14Z|00281|binding|INFO|Setting lport 7bbbac24-f9f6-48a3-8929-680a1ce4ebea ovn-installed in OVS
Dec 06 10:18:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:14.664 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:14.666 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 7bbbac24-f9f6-48a3-8929-680a1ce4ebea in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:18:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:14.668 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:14.669 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 07d15fbe-03f7-4926-82da-8e475fa08c52 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:14.670 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:14.671 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[a6741d9f-3a55-42f4-9f73-958d19b5a1c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:14 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7bbbac24-f9: No such device
Dec 06 10:18:14 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7bbbac24-f9: No such device
Dec 06 10:18:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:14.693 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:14 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7bbbac24-f9: No such device
Dec 06 10:18:14 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7bbbac24-f9: No such device
Dec 06 10:18:14 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7bbbac24-f9: No such device
Dec 06 10:18:14 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7bbbac24-f9: No such device
Dec 06 10:18:14 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7bbbac24-f9: No such device
Dec 06 10:18:14 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7bbbac24-f9: No such device
Dec 06 10:18:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:14.732 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:14.761 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:14.987 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 2001:db8::f816:3eff:fe3f:fa29'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:14.989 160509 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:14.992 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 07d15fbe-03f7-4926-82da-8e475fa08c52 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:14.992 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:14.993 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[988f650d-c8a6-4115-9fd9-079325638be8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:15 np0005548789.localdomain ceph-mon[298582]: osdmap e143: 6 total, 6 up, 6 in
Dec 06 10:18:15 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4158350361' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:15 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4158350361' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:15 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e144 e144: 6 total, 6 up, 6 in
Dec 06 10:18:15 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:18:15 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:15 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:15 np0005548789.localdomain podman[321448]: 2025-12-06 10:18:15.542735196 +0000 UTC m=+0.104236842 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:18:15 np0005548789.localdomain podman[321464]: 
Dec 06 10:18:15 np0005548789.localdomain podman[321464]: 2025-12-06 10:18:15.630511768 +0000 UTC m=+0.111782691 container create 8b713c67d98072f47ed508c4de3d2c936900013066e24e6e09f965416b3728ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:15 np0005548789.localdomain podman[321464]: 2025-12-06 10:18:15.577848902 +0000 UTC m=+0.059119895 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:15 np0005548789.localdomain systemd[1]: Started libpod-conmon-8b713c67d98072f47ed508c4de3d2c936900013066e24e6e09f965416b3728ba.scope.
Dec 06 10:18:15 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:15 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5ec4c3a71df60d42de27b2a8fedf4f4c1b76c95442542d42d577d962b012c1d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:15 np0005548789.localdomain podman[321464]: 2025-12-06 10:18:15.737550306 +0000 UTC m=+0.218821229 container init 8b713c67d98072f47ed508c4de3d2c936900013066e24e6e09f965416b3728ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:15 np0005548789.localdomain podman[321464]: 2025-12-06 10:18:15.748228889 +0000 UTC m=+0.229499812 container start 8b713c67d98072f47ed508c4de3d2c936900013066e24e6e09f965416b3728ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:15 np0005548789.localdomain dnsmasq[321493]: started, version 2.85 cachesize 150
Dec 06 10:18:15 np0005548789.localdomain dnsmasq[321493]: DNS service limited to local subnets
Dec 06 10:18:15 np0005548789.localdomain dnsmasq[321493]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:15 np0005548789.localdomain dnsmasq[321493]: warning: no upstream servers configured
Dec 06 10:18:15 np0005548789.localdomain dnsmasq-dhcp[321493]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:18:15 np0005548789.localdomain dnsmasq[321493]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:15 np0005548789.localdomain dnsmasq-dhcp[321493]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:15 np0005548789.localdomain dnsmasq-dhcp[321493]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:15 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:15.930 263652 INFO neutron.agent.dhcp.agent [None req-05519599-9fb8-4aab-919f-2dfe40f1eb8c - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:18:16 np0005548789.localdomain dnsmasq[321493]: exiting on receipt of SIGTERM
Dec 06 10:18:16 np0005548789.localdomain podman[321511]: 2025-12-06 10:18:16.096894384 +0000 UTC m=+0.062604041 container kill 8b713c67d98072f47ed508c4de3d2c936900013066e24e6e09f965416b3728ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:18:16 np0005548789.localdomain systemd[1]: libpod-8b713c67d98072f47ed508c4de3d2c936900013066e24e6e09f965416b3728ba.scope: Deactivated successfully.
Dec 06 10:18:16 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:16.111 2 INFO neutron.agent.securitygroups_rpc [None req-6fa383fb-a4a1-4db9-8964-14f7246d83c2 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:16 np0005548789.localdomain podman[321525]: 2025-12-06 10:18:16.170299761 +0000 UTC m=+0.055633839 container died 8b713c67d98072f47ed508c4de3d2c936900013066e24e6e09f965416b3728ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:18:16 np0005548789.localdomain podman[321525]: 2025-12-06 10:18:16.258678352 +0000 UTC m=+0.144012390 container cleanup 8b713c67d98072f47ed508c4de3d2c936900013066e24e6e09f965416b3728ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:18:16 np0005548789.localdomain systemd[1]: libpod-conmon-8b713c67d98072f47ed508c4de3d2c936900013066e24e6e09f965416b3728ba.scope: Deactivated successfully.
Dec 06 10:18:16 np0005548789.localdomain podman[321527]: 2025-12-06 10:18:16.285586237 +0000 UTC m=+0.154485006 container remove 8b713c67d98072f47ed508c4de3d2c936900013066e24e6e09f965416b3728ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:18:16 np0005548789.localdomain ceph-mon[298582]: pgmap v248: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 2.5 KiB/s wr, 55 op/s
Dec 06 10:18:16 np0005548789.localdomain ceph-mon[298582]: osdmap e144: 6 total, 6 up, 6 in
Dec 06 10:18:16 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e145 e145: 6 total, 6 up, 6 in
Dec 06 10:18:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f5ec4c3a71df60d42de27b2a8fedf4f4c1b76c95442542d42d577d962b012c1d-merged.mount: Deactivated successfully.
Dec 06 10:18:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8b713c67d98072f47ed508c4de3d2c936900013066e24e6e09f965416b3728ba-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:18:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:18:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:18:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:18:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:18:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:18:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:18:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:18:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:18:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:18:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:18:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:18:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:17.010 263652 INFO neutron.agent.linux.ip_lib [None req-133b9a21-ee6e-4861-b739-4ff976c8ae20 - - - - - -] Device tap002e4ba5-7f cannot be used as it has no MAC address
Dec 06 10:18:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:17.045 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:17 np0005548789.localdomain kernel: device tap002e4ba5-7f entered promiscuous mode
Dec 06 10:18:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:17.053 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:16Z, description=, device_id=d43c2188-cd4b-4c96-8093-8bdb70fa0d41, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37faaa5b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb06760>], id=200c81a7-f7c4-4ce3-a4d6-6f1963f32326, ip_allocation=immediate, mac_address=fa:16:3e:ed:29:39, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1733, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:18:16Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:18:17 np0005548789.localdomain systemd-udevd[321367]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:17 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016297.0571] manager: (tap002e4ba5-7f): new Generic device (/org/freedesktop/NetworkManager/Devices/48)
Dec 06 10:18:17 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:17Z|00282|binding|INFO|Claiming lport 002e4ba5-7f51-4c5d-93ec-79ed9f1d16ea for this chassis.
Dec 06 10:18:17 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:17Z|00283|binding|INFO|002e4ba5-7f51-4c5d-93ec-79ed9f1d16ea: Claiming unknown
Dec 06 10:18:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:17.063 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:17 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:17.078 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-9e18bc76-c51e-4fe0-a47b-eaa50620189c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e18bc76-c51e-4fe0-a47b-eaa50620189c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2aaeadee6f14b78a73f8886be99b671', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed2c9190-6e13-4a60-9837-0f4d9edea65e, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=002e4ba5-7f51-4c5d-93ec-79ed9f1d16ea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:17 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:17.080 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 002e4ba5-7f51-4c5d-93ec-79ed9f1d16ea in datapath 9e18bc76-c51e-4fe0-a47b-eaa50620189c bound to our chassis
Dec 06 10:18:17 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:17.081 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9e18bc76-c51e-4fe0-a47b-eaa50620189c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:18:17 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:17.082 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[480b0e13-7e0a-4603-aaf1-7629d21838fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:17 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:17Z|00284|binding|INFO|Setting lport 002e4ba5-7f51-4c5d-93ec-79ed9f1d16ea ovn-installed in OVS
Dec 06 10:18:17 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:17Z|00285|binding|INFO|Setting lport 002e4ba5-7f51-4c5d-93ec-79ed9f1d16ea up in Southbound
Dec 06 10:18:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:17.117 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:17.168 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:17.203 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:17 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:17.332 2 INFO neutron.agent.securitygroups_rpc [None req-034cc1e4-4fb9-4793-8ac5-168cd3b3cb7e a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:17 np0005548789.localdomain systemd[1]: tmp-crun.wOtBh1.mount: Deactivated successfully.
Dec 06 10:18:17 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 4 addresses
Dec 06 10:18:17 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:17 np0005548789.localdomain podman[321597]: 2025-12-06 10:18:17.350421235 +0000 UTC m=+0.062355422 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:18:17 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:18:17 np0005548789.localdomain ceph-mon[298582]: osdmap e145: 6 total, 6 up, 6 in
Dec 06 10:18:17 np0005548789.localdomain podman[321617]: 2025-12-06 10:18:17.451894964 +0000 UTC m=+0.082591137 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:18:17 np0005548789.localdomain podman[321617]: 2025-12-06 10:18:17.490338899 +0000 UTC m=+0.121035032 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:18:17 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:18:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:17.700 263652 INFO neutron.agent.dhcp.agent [None req-985ad0b2-c243-4936-9ec1-4e6314977268 - - - - - -] DHCP configuration for ports {'200c81a7-f7c4-4ce3-a4d6-6f1963f32326'} is completed
Dec 06 10:18:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:17.745 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:18 np0005548789.localdomain podman[321709]: 
Dec 06 10:18:18 np0005548789.localdomain podman[321709]: 2025-12-06 10:18:18.366244107 +0000 UTC m=+0.097782137 container create 5512667bb0d7e6b1e3aa8412a868107815b9331685eff6b7cf20ec820030910c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e18bc76-c51e-4fe0-a47b-eaa50620189c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:18:18 np0005548789.localdomain ceph-mon[298582]: pgmap v251: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 3.1 KiB/s wr, 67 op/s
Dec 06 10:18:18 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3956098805' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:18 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3956098805' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:18 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1565843773' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:18 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1565843773' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:18 np0005548789.localdomain systemd[1]: Started libpod-conmon-5512667bb0d7e6b1e3aa8412a868107815b9331685eff6b7cf20ec820030910c.scope.
Dec 06 10:18:18 np0005548789.localdomain podman[321709]: 2025-12-06 10:18:18.32021226 +0000 UTC m=+0.051750330 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:18 np0005548789.localdomain systemd[1]: tmp-crun.hXXi6C.mount: Deactivated successfully.
Dec 06 10:18:18 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:18Z|00286|binding|INFO|Removing iface tap002e4ba5-7f ovn-installed in OVS
Dec 06 10:18:18 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:18Z|00287|binding|INFO|Removing lport 002e4ba5-7f51-4c5d-93ec-79ed9f1d16ea ovn-installed in OVS
Dec 06 10:18:18 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:18.429 160509 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port bb22d0e7-f4bb-48d5-bb17-8b3da91582dc with type ""
Dec 06 10:18:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:18.431 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:18 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:18.431 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-9e18bc76-c51e-4fe0-a47b-eaa50620189c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e18bc76-c51e-4fe0-a47b-eaa50620189c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a2aaeadee6f14b78a73f8886be99b671', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed2c9190-6e13-4a60-9837-0f4d9edea65e, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=002e4ba5-7f51-4c5d-93ec-79ed9f1d16ea) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:18 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:18.433 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 002e4ba5-7f51-4c5d-93ec-79ed9f1d16ea in datapath 9e18bc76-c51e-4fe0-a47b-eaa50620189c unbound from our chassis
Dec 06 10:18:18 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:18.435 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9e18bc76-c51e-4fe0-a47b-eaa50620189c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:18:18 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:18.436 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[b177746e-9478-4adf-8978-0c36143b882b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:18.439 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:18 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:18 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7dd30d0fe6598c25e15c1ef77eb62e6473b8372e3dcc8c8e6af67f90141ace93/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:18 np0005548789.localdomain podman[321709]: 2025-12-06 10:18:18.456378421 +0000 UTC m=+0.187916441 container init 5512667bb0d7e6b1e3aa8412a868107815b9331685eff6b7cf20ec820030910c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e18bc76-c51e-4fe0-a47b-eaa50620189c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:18 np0005548789.localdomain podman[321709]: 2025-12-06 10:18:18.465492197 +0000 UTC m=+0.197030217 container start 5512667bb0d7e6b1e3aa8412a868107815b9331685eff6b7cf20ec820030910c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e18bc76-c51e-4fe0-a47b-eaa50620189c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 10:18:18 np0005548789.localdomain dnsmasq[321744]: started, version 2.85 cachesize 150
Dec 06 10:18:18 np0005548789.localdomain dnsmasq[321744]: DNS service limited to local subnets
Dec 06 10:18:18 np0005548789.localdomain dnsmasq[321744]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:18 np0005548789.localdomain dnsmasq[321744]: warning: no upstream servers configured
Dec 06 10:18:18 np0005548789.localdomain dnsmasq-dhcp[321744]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:18:18 np0005548789.localdomain dnsmasq[321744]: read /var/lib/neutron/dhcp/9e18bc76-c51e-4fe0-a47b-eaa50620189c/addn_hosts - 0 addresses
Dec 06 10:18:18 np0005548789.localdomain dnsmasq-dhcp[321744]: read /var/lib/neutron/dhcp/9e18bc76-c51e-4fe0-a47b-eaa50620189c/host
Dec 06 10:18:18 np0005548789.localdomain dnsmasq-dhcp[321744]: read /var/lib/neutron/dhcp/9e18bc76-c51e-4fe0-a47b-eaa50620189c/opts
Dec 06 10:18:18 np0005548789.localdomain dnsmasq[321744]: exiting on receipt of SIGTERM
Dec 06 10:18:18 np0005548789.localdomain systemd[1]: libpod-5512667bb0d7e6b1e3aa8412a868107815b9331685eff6b7cf20ec820030910c.scope: Deactivated successfully.
Dec 06 10:18:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:18.582 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:18 np0005548789.localdomain kernel: device tap002e4ba5-7f left promiscuous mode
Dec 06 10:18:18 np0005548789.localdomain podman[321751]: 2025-12-06 10:18:18.584284471 +0000 UTC m=+0.086073112 container died 5512667bb0d7e6b1e3aa8412a868107815b9331685eff6b7cf20ec820030910c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e18bc76-c51e-4fe0-a47b-eaa50620189c, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:18:18 np0005548789.localdomain systemd[1]: tmp-crun.0MvI4F.mount: Deactivated successfully.
Dec 06 10:18:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:18.601 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:18 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:18.617 263652 INFO neutron.agent.dhcp.agent [None req-7af9cf4c-4fdd-471b-8d91-b1e3b3d5e68a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:18 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:18.618 263652 INFO neutron.agent.dhcp.agent [None req-7af9cf4c-4fdd-471b-8d91-b1e3b3d5e68a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:18 np0005548789.localdomain podman[321751]: 2025-12-06 10:18:18.624924503 +0000 UTC m=+0.126713154 container cleanup 5512667bb0d7e6b1e3aa8412a868107815b9331685eff6b7cf20ec820030910c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e18bc76-c51e-4fe0-a47b-eaa50620189c, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:18 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:18.628 263652 INFO neutron.agent.dhcp.agent [None req-5c14d042-7d42-49c3-84f6-0ec4bc9aebb2 - - - - - -] DHCP configuration for ports {'34a236c4-68d2-4892-b791-a5726cf64064'} is completed
Dec 06 10:18:18 np0005548789.localdomain podman[321765]: 2025-12-06 10:18:18.650969854 +0000 UTC m=+0.066343094 container cleanup 5512667bb0d7e6b1e3aa8412a868107815b9331685eff6b7cf20ec820030910c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e18bc76-c51e-4fe0-a47b-eaa50620189c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:18:18 np0005548789.localdomain systemd[1]: libpod-conmon-5512667bb0d7e6b1e3aa8412a868107815b9331685eff6b7cf20ec820030910c.scope: Deactivated successfully.
Dec 06 10:18:18 np0005548789.localdomain podman[321778]: 2025-12-06 10:18:18.704136016 +0000 UTC m=+0.065769796 container remove 5512667bb0d7e6b1e3aa8412a868107815b9331685eff6b7cf20ec820030910c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9e18bc76-c51e-4fe0-a47b-eaa50620189c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 10:18:18 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:18Z|00288|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:18:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:18.749 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:18 np0005548789.localdomain podman[321799]: 
Dec 06 10:18:18 np0005548789.localdomain podman[321799]: 2025-12-06 10:18:18.795213938 +0000 UTC m=+0.070278262 container create 38e9370e7a974e755d0fff89f0bd50a79088f2b6d8f8c5ccddffe487b048ce0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:18:18 np0005548789.localdomain systemd[1]: Started libpod-conmon-38e9370e7a974e755d0fff89f0bd50a79088f2b6d8f8c5ccddffe487b048ce0a.scope.
Dec 06 10:18:18 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:18 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6de4ae044bb1f768d5915fea78d8879f29c2071e4a65c276f65b84bb4ebc306/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:18 np0005548789.localdomain podman[321799]: 2025-12-06 10:18:18.761229828 +0000 UTC m=+0.036294132 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:18 np0005548789.localdomain podman[321799]: 2025-12-06 10:18:18.865912673 +0000 UTC m=+0.140976987 container init 38e9370e7a974e755d0fff89f0bd50a79088f2b6d8f8c5ccddffe487b048ce0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:18:18 np0005548789.localdomain podman[321799]: 2025-12-06 10:18:18.876280767 +0000 UTC m=+0.151345091 container start 38e9370e7a974e755d0fff89f0bd50a79088f2b6d8f8c5ccddffe487b048ce0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:18 np0005548789.localdomain dnsmasq[321817]: started, version 2.85 cachesize 150
Dec 06 10:18:18 np0005548789.localdomain dnsmasq[321817]: DNS service limited to local subnets
Dec 06 10:18:18 np0005548789.localdomain dnsmasq[321817]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:18 np0005548789.localdomain dnsmasq[321817]: warning: no upstream servers configured
Dec 06 10:18:18 np0005548789.localdomain dnsmasq-dhcp[321817]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:18:18 np0005548789.localdomain dnsmasq-dhcp[321817]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:18:18 np0005548789.localdomain dnsmasq[321817]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:18 np0005548789.localdomain dnsmasq-dhcp[321817]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:18 np0005548789.localdomain dnsmasq-dhcp[321817]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:19.026 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:19.191 263652 INFO neutron.agent.dhcp.agent [None req-ed662b1f-eabc-4e24-9924-1e4959744cf8 - - - - - -] DHCP configuration for ports {'7bbbac24-f9f6-48a3-8929-680a1ce4ebea', '687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:18:19 np0005548789.localdomain dnsmasq[321817]: exiting on receipt of SIGTERM
Dec 06 10:18:19 np0005548789.localdomain podman[321833]: 2025-12-06 10:18:19.300060771 +0000 UTC m=+0.074189761 container kill 38e9370e7a974e755d0fff89f0bd50a79088f2b6d8f8c5ccddffe487b048ce0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:18:19 np0005548789.localdomain systemd[1]: libpod-38e9370e7a974e755d0fff89f0bd50a79088f2b6d8f8c5ccddffe487b048ce0a.scope: Deactivated successfully.
Dec 06 10:18:19 np0005548789.localdomain podman[321848]: 2025-12-06 10:18:19.376668365 +0000 UTC m=+0.063651942 container died 38e9370e7a974e755d0fff89f0bd50a79088f2b6d8f8c5ccddffe487b048ce0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:18:19 np0005548789.localdomain podman[321848]: 2025-12-06 10:18:19.409625744 +0000 UTC m=+0.096609291 container cleanup 38e9370e7a974e755d0fff89f0bd50a79088f2b6d8f8c5ccddffe487b048ce0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:19 np0005548789.localdomain systemd[1]: libpod-conmon-38e9370e7a974e755d0fff89f0bd50a79088f2b6d8f8c5ccddffe487b048ce0a.scope: Deactivated successfully.
Dec 06 10:18:19 np0005548789.localdomain podman[321855]: 2025-12-06 10:18:19.464002854 +0000 UTC m=+0.131672975 container remove 38e9370e7a974e755d0fff89f0bd50a79088f2b6d8f8c5ccddffe487b048ce0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:18:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-7dd30d0fe6598c25e15c1ef77eb62e6473b8372e3dcc8c8e6af67f90141ace93-merged.mount: Deactivated successfully.
Dec 06 10:18:19 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5512667bb0d7e6b1e3aa8412a868107815b9331685eff6b7cf20ec820030910c-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:19 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d9e18bc76\x2dc51e\x2d4fe0\x2da47b\x2deaa50620189c.mount: Deactivated successfully.
Dec 06 10:18:19 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:19.718 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:19 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:19.720 160509 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:19 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:19.724 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 07d15fbe-03f7-4926-82da-8e475fa08c52 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:19 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:19.725 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:19 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:19.726 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[42696e4f-aff9-49c4-93e0-7ccf143c389b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:19 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:18:19 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:19 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:19 np0005548789.localdomain podman[321893]: 2025-12-06 10:18:19.872074621 +0000 UTC m=+0.066464347 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:18:20 np0005548789.localdomain ceph-mon[298582]: pgmap v252: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 127 KiB/s rd, 8.0 KiB/s wr, 172 op/s
Dec 06 10:18:20 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3253677710' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:20 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3253677710' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:20 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:20.752 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:20Z, description=, device_id=0424f01c-36e4-4cfa-bf1a-e61c35c7ef48, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fabc3d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa77130>], id=4ae4a0b2-e3d7-4f1a-b9de-013e26a569c6, ip_allocation=immediate, mac_address=fa:16:3e:2e:e2:eb, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1741, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:18:20Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:18:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:18:20 np0005548789.localdomain systemd[1]: tmp-crun.DuGCiy.mount: Deactivated successfully.
Dec 06 10:18:20 np0005548789.localdomain podman[321946]: 2025-12-06 10:18:20.950083779 +0000 UTC m=+0.100170530 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 10:18:21 np0005548789.localdomain podman[321946]: 2025-12-06 10:18:21.018323519 +0000 UTC m=+0.168410320 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:18:21 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:18:21 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 4 addresses
Dec 06 10:18:21 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:21 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:21 np0005548789.localdomain podman[321994]: 2025-12-06 10:18:21.069979965 +0000 UTC m=+0.070985833 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:18:21 np0005548789.localdomain podman[322007]: 
Dec 06 10:18:21 np0005548789.localdomain podman[322007]: 2025-12-06 10:18:21.156073547 +0000 UTC m=+0.126872750 container create dc4154f63878f1821bb554dd4e9f27396ecfb6b8486bc7cc4668a4fe1cdf5ced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:18:21 np0005548789.localdomain systemd[1]: Started libpod-conmon-dc4154f63878f1821bb554dd4e9f27396ecfb6b8486bc7cc4668a4fe1cdf5ced.scope.
Dec 06 10:18:21 np0005548789.localdomain podman[322007]: 2025-12-06 10:18:21.117062994 +0000 UTC m=+0.087862287 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:21 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:21 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0ed46f155af315b60d8e77e1ead0cf54960d92c9fb9020019f6aab4123a0dc9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:21 np0005548789.localdomain podman[322007]: 2025-12-06 10:18:21.231927078 +0000 UTC m=+0.202726351 container init dc4154f63878f1821bb554dd4e9f27396ecfb6b8486bc7cc4668a4fe1cdf5ced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:21 np0005548789.localdomain podman[322007]: 2025-12-06 10:18:21.241169258 +0000 UTC m=+0.211968481 container start dc4154f63878f1821bb554dd4e9f27396ecfb6b8486bc7cc4668a4fe1cdf5ced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:18:21 np0005548789.localdomain dnsmasq[322038]: started, version 2.85 cachesize 150
Dec 06 10:18:21 np0005548789.localdomain dnsmasq[322038]: DNS service limited to local subnets
Dec 06 10:18:21 np0005548789.localdomain dnsmasq[322038]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:21 np0005548789.localdomain dnsmasq[322038]: warning: no upstream servers configured
Dec 06 10:18:21 np0005548789.localdomain dnsmasq-dhcp[322038]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:18:21 np0005548789.localdomain dnsmasq[322038]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:21 np0005548789.localdomain dnsmasq-dhcp[322038]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:21 np0005548789.localdomain dnsmasq-dhcp[322038]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:21 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:21.304 263652 INFO neutron.agent.dhcp.agent [None req-2b814532-aeea-4377-8df9-a338ba3f8a08 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:15Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fccb5b0>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcaa0a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa0e2b0>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe384db1550>], id=0a97c207-b259-4dd6-97a0-5e53d9dcfae9, ip_allocation=immediate, mac_address=fa:16:3e:8a:27:09, name=tempest-NetworksTestDHCPv6-1066377207, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=27, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['e373677f-5620-4e92-a6c1-ef2cc27d6d54', 'ee1b67d1-08dd-4780-a445-d29a810260e7'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:18:14Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=1730, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:18:15Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:18:21 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e146 e146: 6 total, 6 up, 6 in
Dec 06 10:18:21 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4142291706' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:21 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4142291706' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:21 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:21.475 263652 INFO neutron.agent.dhcp.agent [None req-00188759-231a-44b0-8e6b-3e98cd0b4267 - - - - - -] DHCP configuration for ports {'4ae4a0b2-e3d7-4f1a-b9de-013e26a569c6', '7bbbac24-f9f6-48a3-8929-680a1ce4ebea', '687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:18:21 np0005548789.localdomain dnsmasq[322038]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 2 addresses
Dec 06 10:18:21 np0005548789.localdomain dnsmasq-dhcp[322038]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:21 np0005548789.localdomain dnsmasq-dhcp[322038]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:21 np0005548789.localdomain podman[322056]: 2025-12-06 10:18:21.508894719 +0000 UTC m=+0.040234542 container kill dc4154f63878f1821bb554dd4e9f27396ecfb6b8486bc7cc4668a4fe1cdf5ced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:21 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:21.716 263652 INFO neutron.agent.dhcp.agent [None req-a4c74881-40bf-46d0-9fb5-bf6e396e9ac3 - - - - - -] DHCP configuration for ports {'0a97c207-b259-4dd6-97a0-5e53d9dcfae9'} is completed
Dec 06 10:18:21 np0005548789.localdomain dnsmasq[322038]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:21 np0005548789.localdomain dnsmasq-dhcp[322038]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:21 np0005548789.localdomain dnsmasq-dhcp[322038]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:21 np0005548789.localdomain podman[322093]: 2025-12-06 10:18:21.898689022 +0000 UTC m=+0.063409075 container kill dc4154f63878f1821bb554dd4e9f27396ecfb6b8486bc7cc4668a4fe1cdf5ced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:18:22 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e147 e147: 6 total, 6 up, 6 in
Dec 06 10:18:22 np0005548789.localdomain ceph-mon[298582]: pgmap v253: 177 pgs: 177 active+clean; 145 MiB data, 804 MiB used, 41 GiB / 42 GiB avail; 111 KiB/s rd, 7.0 KiB/s wr, 151 op/s
Dec 06 10:18:22 np0005548789.localdomain ceph-mon[298582]: osdmap e146: 6 total, 6 up, 6 in
Dec 06 10:18:22 np0005548789.localdomain ceph-mon[298582]: osdmap e147: 6 total, 6 up, 6 in
Dec 06 10:18:22 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:22.476 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:22 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:22.478 160509 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:22 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:22.481 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 07d15fbe-03f7-4926-82da-8e475fa08c52 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:22 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:22.481 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:22 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:22.483 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[0c2dac71-60ee-4c1a-a8cc-c1cbd3057bc4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:22.788 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:23 np0005548789.localdomain dnsmasq[322038]: exiting on receipt of SIGTERM
Dec 06 10:18:23 np0005548789.localdomain podman[322131]: 2025-12-06 10:18:23.448748558 +0000 UTC m=+0.068456757 container kill dc4154f63878f1821bb554dd4e9f27396ecfb6b8486bc7cc4668a4fe1cdf5ced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:18:23 np0005548789.localdomain systemd[1]: libpod-dc4154f63878f1821bb554dd4e9f27396ecfb6b8486bc7cc4668a4fe1cdf5ced.scope: Deactivated successfully.
Dec 06 10:18:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e148 e148: 6 total, 6 up, 6 in
Dec 06 10:18:23 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/298180858' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:23 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/298180858' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:23 np0005548789.localdomain sshd[322163]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:18:23 np0005548789.localdomain podman[322144]: 2025-12-06 10:18:23.54277454 +0000 UTC m=+0.080915346 container died dc4154f63878f1821bb554dd4e9f27396ecfb6b8486bc7cc4668a4fe1cdf5ced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:23 np0005548789.localdomain podman[322144]: 2025-12-06 10:18:23.580532765 +0000 UTC m=+0.118673500 container cleanup dc4154f63878f1821bb554dd4e9f27396ecfb6b8486bc7cc4668a4fe1cdf5ced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:18:23 np0005548789.localdomain systemd[1]: libpod-conmon-dc4154f63878f1821bb554dd4e9f27396ecfb6b8486bc7cc4668a4fe1cdf5ced.scope: Deactivated successfully.
Dec 06 10:18:23 np0005548789.localdomain podman[322151]: 2025-12-06 10:18:23.609716221 +0000 UTC m=+0.134082458 container remove dc4154f63878f1821bb554dd4e9f27396ecfb6b8486bc7cc4668a4fe1cdf5ced (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:18:23 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:23.641 2 INFO neutron.agent.securitygroups_rpc [None req-b79a01a3-8e64-4889-8420-e298cffcfc58 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:23 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:23.727 2 INFO neutron.agent.securitygroups_rpc [None req-9f63fce7-8a34-4731-bfa7-9d45ada3f54e 4c6008178bdc445aa99fb1b726f87b45 a2aaeadee6f14b78a73f8886be99b671 - - default default] Security group member updated ['13551db8-e8e0-43b0-89a9-b0d8423e74c9']
Dec 06 10:18:23 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:18:23 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:23 np0005548789.localdomain podman[322203]: 2025-12-06 10:18:23.886918419 +0000 UTC m=+0.085186225 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:18:23 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:18:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:18:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:18:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:18:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:18:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19268 "" "Go-http-client/1.1"
Dec 06 10:18:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:24.028 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a0ed46f155af315b60d8e77e1ead0cf54960d92c9fb9020019f6aab4123a0dc9-merged.mount: Deactivated successfully.
Dec 06 10:18:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dc4154f63878f1821bb554dd4e9f27396ecfb6b8486bc7cc4668a4fe1cdf5ced-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e149 e149: 6 total, 6 up, 6 in
Dec 06 10:18:24 np0005548789.localdomain ceph-mon[298582]: pgmap v256: 177 pgs: 177 active+clean; 249 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 204 KiB/s rd, 15 MiB/s wr, 282 op/s
Dec 06 10:18:24 np0005548789.localdomain ceph-mon[298582]: osdmap e148: 6 total, 6 up, 6 in
Dec 06 10:18:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:18:24 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1581079548' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:24 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:18:24 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1581079548' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:24 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:24.635 2 INFO neutron.agent.securitygroups_rpc [None req-366a0057-fc3f-46e6-9a84-ba466e35126f a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:24 np0005548789.localdomain podman[322265]: 
Dec 06 10:18:24 np0005548789.localdomain podman[322265]: 2025-12-06 10:18:24.7158147 +0000 UTC m=+0.114271386 container create 96403699a1358f1a4db9d2adf9ab3c07936dd48c6b352f0d937390988fa3463c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:18:24 np0005548789.localdomain podman[322265]: 2025-12-06 10:18:24.658309766 +0000 UTC m=+0.056766482 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:24 np0005548789.localdomain systemd[1]: Started libpod-conmon-96403699a1358f1a4db9d2adf9ab3c07936dd48c6b352f0d937390988fa3463c.scope.
Dec 06 10:18:24 np0005548789.localdomain systemd[1]: tmp-crun.oJmBM7.mount: Deactivated successfully.
Dec 06 10:18:24 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:24 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/817b9bf6c12c6bcbb693e0f772f7a483fc0ca562d40fce2f6f132e4c3d88dc33/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:24 np0005548789.localdomain podman[322265]: 2025-12-06 10:18:24.822216537 +0000 UTC m=+0.220673183 container init 96403699a1358f1a4db9d2adf9ab3c07936dd48c6b352f0d937390988fa3463c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:18:24 np0005548789.localdomain podman[322265]: 2025-12-06 10:18:24.833151269 +0000 UTC m=+0.231607915 container start 96403699a1358f1a4db9d2adf9ab3c07936dd48c6b352f0d937390988fa3463c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:18:24 np0005548789.localdomain dnsmasq[322283]: started, version 2.85 cachesize 150
Dec 06 10:18:24 np0005548789.localdomain dnsmasq[322283]: DNS service limited to local subnets
Dec 06 10:18:24 np0005548789.localdomain dnsmasq[322283]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:24 np0005548789.localdomain dnsmasq[322283]: warning: no upstream servers configured
Dec 06 10:18:24 np0005548789.localdomain dnsmasq-dhcp[322283]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:18:24 np0005548789.localdomain dnsmasq[322283]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:24 np0005548789.localdomain dnsmasq-dhcp[322283]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:24 np0005548789.localdomain dnsmasq-dhcp[322283]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:24 np0005548789.localdomain sshd[322163]: Received disconnect from 118.219.234.233 port 60728:11: Bye Bye [preauth]
Dec 06 10:18:24 np0005548789.localdomain sshd[322163]: Disconnected from authenticating user root 118.219.234.233 port 60728 [preauth]
Dec 06 10:18:24 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:24.911 263652 INFO neutron.agent.dhcp.agent [None req-9d612b54-4b84-4f39-818a-dcaba74301cb - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:23Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fad6310>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fad6a60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fad6430>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fad6fa0>], id=b4fda620-9f24-4002-924f-2a076f0e31f0, ip_allocation=immediate, mac_address=fa:16:3e:9d:7e:fa, name=tempest-NetworksTestDHCPv6-1726515961, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=31, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['259cd01c-9daa-4d41-93b5-27fb2cf38ff6', 'cc5199f7-ffc7-4584-9436-5644a8e4cb87'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:18:20Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=1748, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:18:23Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:18:25 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:25.080 263652 INFO neutron.agent.dhcp.agent [None req-ffa0c331-58ad-49c7-ad5e-1f7cc23c0727 - - - - - -] DHCP configuration for ports {'7bbbac24-f9f6-48a3-8929-680a1ce4ebea', '687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:18:25 np0005548789.localdomain dnsmasq[322283]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 2 addresses
Dec 06 10:18:25 np0005548789.localdomain dnsmasq-dhcp[322283]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:25 np0005548789.localdomain podman[322302]: 2025-12-06 10:18:25.155019512 +0000 UTC m=+0.067905861 container kill 96403699a1358f1a4db9d2adf9ab3c07936dd48c6b352f0d937390988fa3463c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:18:25 np0005548789.localdomain dnsmasq-dhcp[322283]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:25 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:25.373 2 INFO neutron.agent.securitygroups_rpc [None req-9f7062a2-5eeb-4deb-87a1-858e2e900cdd 4c6008178bdc445aa99fb1b726f87b45 a2aaeadee6f14b78a73f8886be99b671 - - default default] Security group member updated ['13551db8-e8e0-43b0-89a9-b0d8423e74c9']
Dec 06 10:18:25 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:25.404 263652 INFO neutron.agent.dhcp.agent [None req-ce980b1a-31d9-441e-a6f1-debe43d8e15e - - - - - -] DHCP configuration for ports {'b4fda620-9f24-4002-924f-2a076f0e31f0'} is completed
Dec 06 10:18:25 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:25.420 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:25 np0005548789.localdomain ceph-mon[298582]: osdmap e149: 6 total, 6 up, 6 in
Dec 06 10:18:25 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1581079548' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:25 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1581079548' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:25 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e150 e150: 6 total, 6 up, 6 in
Dec 06 10:18:25 np0005548789.localdomain dnsmasq[322283]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:25 np0005548789.localdomain podman[322340]: 2025-12-06 10:18:25.651235543 +0000 UTC m=+0.073616444 container kill 96403699a1358f1a4db9d2adf9ab3c07936dd48c6b352f0d937390988fa3463c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 10:18:25 np0005548789.localdomain dnsmasq-dhcp[322283]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:25 np0005548789.localdomain dnsmasq-dhcp[322283]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:25 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:25.843 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:26 np0005548789.localdomain dnsmasq[322283]: exiting on receipt of SIGTERM
Dec 06 10:18:26 np0005548789.localdomain systemd[1]: libpod-96403699a1358f1a4db9d2adf9ab3c07936dd48c6b352f0d937390988fa3463c.scope: Deactivated successfully.
Dec 06 10:18:26 np0005548789.localdomain podman[322380]: 2025-12-06 10:18:26.142887996 +0000 UTC m=+0.076809721 container kill 96403699a1358f1a4db9d2adf9ab3c07936dd48c6b352f0d937390988fa3463c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:18:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:26.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:26.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:18:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:26.217 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:18:26 np0005548789.localdomain podman[322394]: 2025-12-06 10:18:26.223618065 +0000 UTC m=+0.053819964 container died 96403699a1358f1a4db9d2adf9ab3c07936dd48c6b352f0d937390988fa3463c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:18:26 np0005548789.localdomain podman[322394]: 2025-12-06 10:18:26.275972732 +0000 UTC m=+0.106174581 container remove 96403699a1358f1a4db9d2adf9ab3c07936dd48c6b352f0d937390988fa3463c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:18:26 np0005548789.localdomain systemd[1]: libpod-conmon-96403699a1358f1a4db9d2adf9ab3c07936dd48c6b352f0d937390988fa3463c.scope: Deactivated successfully.
Dec 06 10:18:26 np0005548789.localdomain kernel: device tap7bbbac24-f9 left promiscuous mode
Dec 06 10:18:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:26.335 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:26 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:26Z|00289|binding|INFO|Releasing lport 7bbbac24-f9f6-48a3-8929-680a1ce4ebea from this chassis (sb_readonly=0)
Dec 06 10:18:26 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:26Z|00290|binding|INFO|Setting lport 7bbbac24-f9f6-48a3-8929-680a1ce4ebea down in Southbound
Dec 06 10:18:26 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:26.346 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fedb:c901/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=7bbbac24-f9f6-48a3-8929-680a1ce4ebea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:26 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:26.348 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 7bbbac24-f9f6-48a3-8929-680a1ce4ebea in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:18:26 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:26.350 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:26 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:26.351 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[cc25167d-fe91-4f8d-9036-e773a1c01055]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:26.354 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:26.356 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-817b9bf6c12c6bcbb693e0f772f7a483fc0ca562d40fce2f6f132e4c3d88dc33-merged.mount: Deactivated successfully.
Dec 06 10:18:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96403699a1358f1a4db9d2adf9ab3c07936dd48c6b352f0d937390988fa3463c-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:26 np0005548789.localdomain ceph-mon[298582]: pgmap v259: 177 pgs: 177 active+clean; 249 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 155 KiB/s rd, 26 MiB/s wr, 220 op/s
Dec 06 10:18:26 np0005548789.localdomain ceph-mon[298582]: osdmap e150: 6 total, 6 up, 6 in
Dec 06 10:18:26 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e151 e151: 6 total, 6 up, 6 in
Dec 06 10:18:26 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:18:27 np0005548789.localdomain podman[322437]: 2025-12-06 10:18:27.059400485 +0000 UTC m=+0.065718184 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:18:27 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:18:27 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:27 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:27 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:27Z|00291|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:18:27 np0005548789.localdomain sshd[322451]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:18:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:27.111 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:27 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:27.167 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:27 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:27.169 160509 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:27 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:27.171 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:27 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:27.172 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[f7518ce9-d499-431a-af19-957509bff87b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e152 e152: 6 total, 6 up, 6 in
Dec 06 10:18:27 np0005548789.localdomain ceph-mon[298582]: osdmap e151: 6 total, 6 up, 6 in
Dec 06 10:18:27 np0005548789.localdomain sshd[322451]: Received disconnect from 64.227.102.57 port 53230:11: Bye Bye [preauth]
Dec 06 10:18:27 np0005548789.localdomain sshd[322451]: Disconnected from authenticating user root 64.227.102.57 port 53230 [preauth]
Dec 06 10:18:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:27.823 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:28.152 263652 INFO neutron.agent.linux.ip_lib [None req-f161c0a0-8298-408a-9c36-5da35e09fec6 - - - - - -] Device tapb3029020-e8 cannot be used as it has no MAC address
Dec 06 10:18:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:28.185 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:28 np0005548789.localdomain kernel: device tapb3029020-e8 entered promiscuous mode
Dec 06 10:18:28 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016308.1958] manager: (tapb3029020-e8): new Generic device (/org/freedesktop/NetworkManager/Devices/49)
Dec 06 10:18:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:28.196 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:28 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:28Z|00292|binding|INFO|Claiming lport b3029020-e85b-4668-b0f6-5a9b030e9618 for this chassis.
Dec 06 10:18:28 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:28Z|00293|binding|INFO|b3029020-e85b-4668-b0f6-5a9b030e9618: Claiming unknown
Dec 06 10:18:28 np0005548789.localdomain systemd-udevd[322471]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:28.210 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=b3029020-e85b-4668-b0f6-5a9b030e9618) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:28.212 160509 INFO neutron.agent.ovn.metadata.agent [-] Port b3029020-e85b-4668-b0f6-5a9b030e9618 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:18:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:28.214 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 11e42bea-b512-4655-8b5d-68b257966ab4 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:28.215 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:28.216 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[395b2e66-de7d-471a-a278-7873b15eba20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:28 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb3029020-e8: No such device
Dec 06 10:18:28 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:28Z|00294|binding|INFO|Setting lport b3029020-e85b-4668-b0f6-5a9b030e9618 ovn-installed in OVS
Dec 06 10:18:28 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:28Z|00295|binding|INFO|Setting lport b3029020-e85b-4668-b0f6-5a9b030e9618 up in Southbound
Dec 06 10:18:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:28.240 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:28 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb3029020-e8: No such device
Dec 06 10:18:28 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb3029020-e8: No such device
Dec 06 10:18:28 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb3029020-e8: No such device
Dec 06 10:18:28 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb3029020-e8: No such device
Dec 06 10:18:28 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb3029020-e8: No such device
Dec 06 10:18:28 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb3029020-e8: No such device
Dec 06 10:18:28 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb3029020-e8: No such device
Dec 06 10:18:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:28.281 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:28.318 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:28 np0005548789.localdomain ceph-mon[298582]: pgmap v262: 177 pgs: 177 active+clean; 249 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:18:28 np0005548789.localdomain ceph-mon[298582]: osdmap e152: 6 total, 6 up, 6 in
Dec 06 10:18:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:28.656 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:28.658 160509 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:28.661 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 11e42bea-b512-4655-8b5d-68b257966ab4 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:28.661 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:28.662 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[29291bd4-9862-4f55-9606-ec80e8723c09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:29.031 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:29 np0005548789.localdomain podman[322542]: 
Dec 06 10:18:29 np0005548789.localdomain podman[322542]: 2025-12-06 10:18:29.305301146 +0000 UTC m=+0.097263801 container create a3cc50ea41d601a7d39e4271e5834a77ca0be189e992511b75f8bb5186dca4c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:18:29 np0005548789.localdomain systemd[1]: Started libpod-conmon-a3cc50ea41d601a7d39e4271e5834a77ca0be189e992511b75f8bb5186dca4c8.scope.
Dec 06 10:18:29 np0005548789.localdomain podman[322542]: 2025-12-06 10:18:29.258832836 +0000 UTC m=+0.050795531 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:29 np0005548789.localdomain systemd[1]: tmp-crun.jfdhv0.mount: Deactivated successfully.
Dec 06 10:18:29 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:29 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78e52e6fef944f14e3e41e1d4fa967fb083ae195089e656baef3c3bf80ae5494/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:29 np0005548789.localdomain podman[322542]: 2025-12-06 10:18:29.38617632 +0000 UTC m=+0.178138975 container init a3cc50ea41d601a7d39e4271e5834a77ca0be189e992511b75f8bb5186dca4c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:18:29 np0005548789.localdomain podman[322542]: 2025-12-06 10:18:29.395134171 +0000 UTC m=+0.187096826 container start a3cc50ea41d601a7d39e4271e5834a77ca0be189e992511b75f8bb5186dca4c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:18:29 np0005548789.localdomain dnsmasq[322560]: started, version 2.85 cachesize 150
Dec 06 10:18:29 np0005548789.localdomain dnsmasq[322560]: DNS service limited to local subnets
Dec 06 10:18:29 np0005548789.localdomain dnsmasq[322560]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:29 np0005548789.localdomain dnsmasq[322560]: warning: no upstream servers configured
Dec 06 10:18:29 np0005548789.localdomain dnsmasq-dhcp[322560]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:18:29 np0005548789.localdomain dnsmasq[322560]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:29 np0005548789.localdomain dnsmasq-dhcp[322560]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:29 np0005548789.localdomain dnsmasq-dhcp[322560]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:29 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:29.605 263652 INFO neutron.agent.dhcp.agent [None req-36684ac6-7a56-44dd-916a-4c88129a7c55 - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:18:29 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3964149539' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:29 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3964149539' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:29 np0005548789.localdomain dnsmasq[322560]: exiting on receipt of SIGTERM
Dec 06 10:18:29 np0005548789.localdomain podman[322577]: 2025-12-06 10:18:29.715796467 +0000 UTC m=+0.054185945 container kill a3cc50ea41d601a7d39e4271e5834a77ca0be189e992511b75f8bb5186dca4c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:18:29 np0005548789.localdomain systemd[1]: libpod-a3cc50ea41d601a7d39e4271e5834a77ca0be189e992511b75f8bb5186dca4c8.scope: Deactivated successfully.
Dec 06 10:18:29 np0005548789.localdomain podman[322590]: 2025-12-06 10:18:29.775521419 +0000 UTC m=+0.045139960 container died a3cc50ea41d601a7d39e4271e5834a77ca0be189e992511b75f8bb5186dca4c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:29 np0005548789.localdomain podman[322590]: 2025-12-06 10:18:29.813719877 +0000 UTC m=+0.083338388 container cleanup a3cc50ea41d601a7d39e4271e5834a77ca0be189e992511b75f8bb5186dca4c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:29 np0005548789.localdomain systemd[1]: libpod-conmon-a3cc50ea41d601a7d39e4271e5834a77ca0be189e992511b75f8bb5186dca4c8.scope: Deactivated successfully.
Dec 06 10:18:29 np0005548789.localdomain podman[322592]: 2025-12-06 10:18:29.88635739 +0000 UTC m=+0.146742491 container remove a3cc50ea41d601a7d39e4271e5834a77ca0be189e992511b75f8bb5186dca4c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:18:29 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:29.905 2 INFO neutron.agent.securitygroups_rpc [None req-cc7e06ae-2215-4c85-8ca6-e56c13503fc8 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:30 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-78e52e6fef944f14e3e41e1d4fa967fb083ae195089e656baef3c3bf80ae5494-merged.mount: Deactivated successfully.
Dec 06 10:18:30 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a3cc50ea41d601a7d39e4271e5834a77ca0be189e992511b75f8bb5186dca4c8-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:30 np0005548789.localdomain ceph-mon[298582]: pgmap v264: 177 pgs: 177 active+clean; 145 MiB data, 840 MiB used, 41 GiB / 42 GiB avail; 161 KiB/s rd, 1.7 MiB/s wr, 226 op/s
Dec 06 10:18:30 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:30.759 2 INFO neutron.agent.securitygroups_rpc [None req-e8307117-28c2-4262-9c6e-dc24bf4a796c a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:30 np0005548789.localdomain podman[322668]: 
Dec 06 10:18:30 np0005548789.localdomain podman[322668]: 2025-12-06 10:18:30.957686286 +0000 UTC m=+0.101700137 container create 18164ad664db41d65c729673ff53a331d1f1fc059dcd4257511254b0405a2045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:18:30 np0005548789.localdomain systemd[1]: Started libpod-conmon-18164ad664db41d65c729673ff53a331d1f1fc059dcd4257511254b0405a2045.scope.
Dec 06 10:18:31 np0005548789.localdomain podman[322668]: 2025-12-06 10:18:30.908599387 +0000 UTC m=+0.052613298 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:31 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:31 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89e42998c88cb974d7c2b22853155a2a789a8b224fcbfc3b35a404588475f19e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:31 np0005548789.localdomain podman[322668]: 2025-12-06 10:18:31.028648538 +0000 UTC m=+0.172662369 container init 18164ad664db41d65c729673ff53a331d1f1fc059dcd4257511254b0405a2045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:18:31 np0005548789.localdomain podman[322668]: 2025-12-06 10:18:31.03891264 +0000 UTC m=+0.182926491 container start 18164ad664db41d65c729673ff53a331d1f1fc059dcd4257511254b0405a2045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:18:31 np0005548789.localdomain dnsmasq[322686]: started, version 2.85 cachesize 150
Dec 06 10:18:31 np0005548789.localdomain dnsmasq[322686]: DNS service limited to local subnets
Dec 06 10:18:31 np0005548789.localdomain dnsmasq[322686]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:31 np0005548789.localdomain dnsmasq[322686]: warning: no upstream servers configured
Dec 06 10:18:31 np0005548789.localdomain dnsmasq-dhcp[322686]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:18:31 np0005548789.localdomain dnsmasq-dhcp[322686]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:18:31 np0005548789.localdomain dnsmasq[322686]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:31 np0005548789.localdomain dnsmasq-dhcp[322686]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:31 np0005548789.localdomain dnsmasq-dhcp[322686]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:31 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:31Z|00296|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:18:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:31.202 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:31 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:18:31 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:31 np0005548789.localdomain podman[322704]: 2025-12-06 10:18:31.257325354 +0000 UTC m=+0.075063018 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 06 10:18:31 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:31 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:31.270 263652 INFO neutron.agent.dhcp.agent [None req-17010f5e-7f00-4e31-8484-75a4746e68e8 - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91', 'b3029020-e85b-4668-b0f6-5a9b030e9618'} is completed
Dec 06 10:18:31 np0005548789.localdomain dnsmasq[322686]: exiting on receipt of SIGTERM
Dec 06 10:18:31 np0005548789.localdomain systemd[1]: libpod-18164ad664db41d65c729673ff53a331d1f1fc059dcd4257511254b0405a2045.scope: Deactivated successfully.
Dec 06 10:18:31 np0005548789.localdomain podman[322739]: 2025-12-06 10:18:31.434153248 +0000 UTC m=+0.080212374 container kill 18164ad664db41d65c729673ff53a331d1f1fc059dcd4257511254b0405a2045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:31 np0005548789.localdomain podman[322757]: 2025-12-06 10:18:31.498671775 +0000 UTC m=+0.053461543 container died 18164ad664db41d65c729673ff53a331d1f1fc059dcd4257511254b0405a2045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:31 np0005548789.localdomain systemd[1]: tmp-crun.tFc2sd.mount: Deactivated successfully.
Dec 06 10:18:31 np0005548789.localdomain podman[322757]: 2025-12-06 10:18:31.538134072 +0000 UTC m=+0.092923810 container cleanup 18164ad664db41d65c729673ff53a331d1f1fc059dcd4257511254b0405a2045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:31 np0005548789.localdomain systemd[1]: libpod-conmon-18164ad664db41d65c729673ff53a331d1f1fc059dcd4257511254b0405a2045.scope: Deactivated successfully.
Dec 06 10:18:31 np0005548789.localdomain podman[322765]: 2025-12-06 10:18:31.563012726 +0000 UTC m=+0.106507841 container remove 18164ad664db41d65c729673ff53a331d1f1fc059dcd4257511254b0405a2045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:18:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:18:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e153 e153: 6 total, 6 up, 6 in
Dec 06 10:18:32 np0005548789.localdomain podman[322812]: 2025-12-06 10:18:32.194915262 +0000 UTC m=+0.093192897 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:32 np0005548789.localdomain podman[322812]: 2025-12-06 10:18:32.204086641 +0000 UTC m=+0.102364196 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec 06 10:18:32 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:18:32 np0005548789.localdomain podman[322813]: 2025-12-06 10:18:32.298784153 +0000 UTC m=+0.194658565 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:18:32 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-89e42998c88cb974d7c2b22853155a2a789a8b224fcbfc3b35a404588475f19e-merged.mount: Deactivated successfully.
Dec 06 10:18:32 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-18164ad664db41d65c729673ff53a331d1f1fc059dcd4257511254b0405a2045-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:32 np0005548789.localdomain podman[322813]: 2025-12-06 10:18:32.313322695 +0000 UTC m=+0.209197047 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:18:32 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:18:32 np0005548789.localdomain podman[322876]: 2025-12-06 10:18:32.444529414 +0000 UTC m=+0.076201743 container create 3035dfe2999c811d0b2491162142d5fb209c1a29b600736554ddc43f5a334c6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:32 np0005548789.localdomain systemd[1]: Started libpod-conmon-3035dfe2999c811d0b2491162142d5fb209c1a29b600736554ddc43f5a334c6f.scope.
Dec 06 10:18:32 np0005548789.localdomain podman[322876]: 2025-12-06 10:18:32.403396597 +0000 UTC m=+0.035068976 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:32 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:32 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9d73a2fdfb03ffbcd30162cfb7ffe49e8b3eded892e7281f4a620f84c192ffd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:32 np0005548789.localdomain podman[322876]: 2025-12-06 10:18:32.519398775 +0000 UTC m=+0.151071114 container init 3035dfe2999c811d0b2491162142d5fb209c1a29b600736554ddc43f5a334c6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:18:32 np0005548789.localdomain podman[322876]: 2025-12-06 10:18:32.528329075 +0000 UTC m=+0.160001414 container start 3035dfe2999c811d0b2491162142d5fb209c1a29b600736554ddc43f5a334c6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 06 10:18:32 np0005548789.localdomain dnsmasq[322895]: started, version 2.85 cachesize 150
Dec 06 10:18:32 np0005548789.localdomain dnsmasq[322895]: DNS service limited to local subnets
Dec 06 10:18:32 np0005548789.localdomain dnsmasq[322895]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:32 np0005548789.localdomain dnsmasq[322895]: warning: no upstream servers configured
Dec 06 10:18:32 np0005548789.localdomain dnsmasq-dhcp[322895]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:18:32 np0005548789.localdomain dnsmasq[322895]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:32 np0005548789.localdomain dnsmasq-dhcp[322895]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:32 np0005548789.localdomain dnsmasq-dhcp[322895]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:32 np0005548789.localdomain ceph-mon[298582]: pgmap v265: 177 pgs: 177 active+clean; 145 MiB data, 840 MiB used, 41 GiB / 42 GiB avail; 125 KiB/s rd, 1.3 MiB/s wr, 174 op/s
Dec 06 10:18:32 np0005548789.localdomain ceph-mon[298582]: osdmap e153: 6 total, 6 up, 6 in
Dec 06 10:18:32 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:32.773 263652 INFO neutron.agent.dhcp.agent [None req-b6712de3-87d8-4c48-96f2-e44ea4e39d7f - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91', 'b3029020-e85b-4668-b0f6-5a9b030e9618'} is completed
Dec 06 10:18:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:32.893 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:32 np0005548789.localdomain dnsmasq[322895]: exiting on receipt of SIGTERM
Dec 06 10:18:32 np0005548789.localdomain podman[322911]: 2025-12-06 10:18:32.917549401 +0000 UTC m=+0.124533988 container kill 3035dfe2999c811d0b2491162142d5fb209c1a29b600736554ddc43f5a334c6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:32 np0005548789.localdomain systemd[1]: libpod-3035dfe2999c811d0b2491162142d5fb209c1a29b600736554ddc43f5a334c6f.scope: Deactivated successfully.
Dec 06 10:18:32 np0005548789.localdomain podman[322924]: 2025-12-06 10:18:32.989433322 +0000 UTC m=+0.060331682 container died 3035dfe2999c811d0b2491162142d5fb209c1a29b600736554ddc43f5a334c6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:18:33 np0005548789.localdomain podman[322924]: 2025-12-06 10:18:33.025118894 +0000 UTC m=+0.096017214 container cleanup 3035dfe2999c811d0b2491162142d5fb209c1a29b600736554ddc43f5a334c6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:33 np0005548789.localdomain systemd[1]: libpod-conmon-3035dfe2999c811d0b2491162142d5fb209c1a29b600736554ddc43f5a334c6f.scope: Deactivated successfully.
Dec 06 10:18:33 np0005548789.localdomain podman[322926]: 2025-12-06 10:18:33.07872985 +0000 UTC m=+0.140447371 container remove 3035dfe2999c811d0b2491162142d5fb209c1a29b600736554ddc43f5a334c6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.089 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:33 np0005548789.localdomain kernel: device tapb3029020-e8 left promiscuous mode
Dec 06 10:18:33 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:33Z|00297|binding|INFO|Releasing lport b3029020-e85b-4668-b0f6-5a9b030e9618 from this chassis (sb_readonly=0)
Dec 06 10:18:33 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:33Z|00298|binding|INFO|Setting lport b3029020-e85b-4668-b0f6-5a9b030e9618 down in Southbound
Dec 06 10:18:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:33.098 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fed7:1d9d/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=b3029020-e85b-4668-b0f6-5a9b030e9618) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:33.099 160509 INFO neutron.agent.ovn.metadata.agent [-] Port b3029020-e85b-4668-b0f6-5a9b030e9618 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:18:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:33.101 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:33.102 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[11dffdbb-9fdc-4c0d-923d-88758d29ec30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.109 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.110 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.217 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.234 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.234 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.234 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.234 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.234 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:18:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:33 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-a9d73a2fdfb03ffbcd30162cfb7ffe49e8b3eded892e7281f4a620f84c192ffd-merged.mount: Deactivated successfully.
Dec 06 10:18:33 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3035dfe2999c811d0b2491162142d5fb209c1a29b600736554ddc43f5a334c6f-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:33 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:18:33 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:33.442 263652 INFO neutron.agent.dhcp.agent [None req-9586a7e6-b676-4fc8-b691-b754c5e5f7ec - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:33 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:33.443 263652 INFO neutron.agent.dhcp.agent [None req-9586a7e6-b676-4fc8-b691-b754c5e5f7ec - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:33 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:33.443 263652 INFO neutron.agent.dhcp.agent [None req-9586a7e6-b676-4fc8-b691-b754c5e5f7ec - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:33 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:33.525 2 INFO neutron.agent.securitygroups_rpc [None req-dd900d05-ceed-4a76-8792-94f73f7d9bdc b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:18:33 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2082246135' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.695 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.797 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.797 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:18:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:33.808 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 2001:db8::f816:3eff:fe3f:fa29'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 10.100.0.2 2001:db8::f816:3eff:fe3f:fa29'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:33.810 160509 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:18:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:33.812 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:33.813 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[6520fdae-ab35-4375-b812-0f5a3613639b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.996 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.997 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11250MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.998 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:18:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:33.998 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.070 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.275 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.276 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.276 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:18:34 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:34.362 2 INFO neutron.agent.securitygroups_rpc [None req-155fc4a8-22cb-4d06-82dd-8cfd5b79a9e9 8705da02a69e4c3281916dd7bc9ac6d1 851f2bb5c4164322946aa41fe266eb66 - - default default] Security group member updated ['6607cea2-9b0f-45af-9864-1af2923eb94b']
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.562 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing inventories for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:18:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:34.590 263652 INFO neutron.agent.linux.ip_lib [None req-0a7ab7d8-06a3-4d20-b52b-e59c9a3a54a4 - - - - - -] Device tap1e7eac16-04 cannot be used as it has no MAC address
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.615 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:34 np0005548789.localdomain kernel: device tap1e7eac16-04 entered promiscuous mode
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.626 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:34 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:34Z|00299|binding|INFO|Claiming lport 1e7eac16-0466-4938-b460-d43f8a6e5320 for this chassis.
Dec 06 10:18:34 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:34Z|00300|binding|INFO|1e7eac16-0466-4938-b460-d43f8a6e5320: Claiming unknown
Dec 06 10:18:34 np0005548789.localdomain systemd-udevd[322988]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:34 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016314.6300] manager: (tap1e7eac16-04): new Generic device (/org/freedesktop/NetworkManager/Devices/50)
Dec 06 10:18:34 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:34.636 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe5d:d6f0/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=1e7eac16-0466-4938-b460-d43f8a6e5320) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:34 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:34Z|00301|binding|INFO|Setting lport 1e7eac16-0466-4938-b460-d43f8a6e5320 ovn-installed in OVS
Dec 06 10:18:34 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:34Z|00302|binding|INFO|Setting lport 1e7eac16-0466-4938-b460-d43f8a6e5320 up in Southbound
Dec 06 10:18:34 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:34.639 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 1e7eac16-0466-4938-b460-d43f8a6e5320 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.639 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:34 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:34.641 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 8e499ad3-7646-4ccf-b8fd-bda079c71614 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:34 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:34.641 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:34 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:34.642 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[2160f273-e785-4c4b-90fc-15f7b03671cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:34 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1e7eac16-04: No such device
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.660 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:34 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1e7eac16-04: No such device
Dec 06 10:18:34 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1e7eac16-04: No such device
Dec 06 10:18:34 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1e7eac16-04: No such device
Dec 06 10:18:34 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1e7eac16-04: No such device
Dec 06 10:18:34 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1e7eac16-04: No such device
Dec 06 10:18:34 np0005548789.localdomain ceph-mon[298582]: pgmap v267: 177 pgs: 177 active+clean; 145 MiB data, 840 MiB used, 41 GiB / 42 GiB avail; 133 KiB/s rd, 1.2 MiB/s wr, 184 op/s
Dec 06 10:18:34 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2082246135' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:34 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1e7eac16-04: No such device
Dec 06 10:18:34 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1e7eac16-04: No such device
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.704 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.735 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Updating ProviderTree inventory for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.736 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Updating inventory in ProviderTree for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.739 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.752 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing aggregate associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.770 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing trait associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, traits: HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AMD_SVM,HW_CPU_X86_FMA3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:18:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:34.807 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:18:34 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:34.831 2 INFO neutron.agent.securitygroups_rpc [None req-d5c62043-5321-4cf5-baec-1c1605bc1cd9 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:35 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:35.046 2 INFO neutron.agent.securitygroups_rpc [None req-f9e16e57-d76f-4e49-8c54-adacc8516f8a b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:35 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:35.145 2 INFO neutron.agent.securitygroups_rpc [None req-fe503d20-8e49-4871-94e0-efabb011ed42 8705da02a69e4c3281916dd7bc9ac6d1 851f2bb5c4164322946aa41fe266eb66 - - default default] Security group member updated ['6607cea2-9b0f-45af-9864-1af2923eb94b']
Dec 06 10:18:35 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:18:35 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1644367828' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:35.272 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:18:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:35.278 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:18:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:35.297 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:18:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:35.299 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:18:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:35.300 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:18:35 np0005548789.localdomain sshd[323059]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:18:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:35.361 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:35 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:35.423 2 INFO neutron.agent.securitygroups_rpc [None req-f9e16e57-d76f-4e49-8c54-adacc8516f8a b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:35 np0005548789.localdomain podman[323083]: 
Dec 06 10:18:35 np0005548789.localdomain podman[323083]: 2025-12-06 10:18:35.602718257 +0000 UTC m=+0.098817409 container create 6a583563545dc7b7ec5c05c29bb95187c6f71b3276ad300d24bae1097cf7b6fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:18:35 np0005548789.localdomain systemd[1]: Started libpod-conmon-6a583563545dc7b7ec5c05c29bb95187c6f71b3276ad300d24bae1097cf7b6fc.scope.
Dec 06 10:18:35 np0005548789.localdomain podman[323083]: 2025-12-06 10:18:35.558153015 +0000 UTC m=+0.054252157 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:35 np0005548789.localdomain systemd[1]: tmp-crun.3QH2a9.mount: Deactivated successfully.
Dec 06 10:18:35 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:35 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:35.687 2 INFO neutron.agent.securitygroups_rpc [None req-4e477950-abaa-4886-9df8-9dd5bb5175a4 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:35 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fbc77840f75508555765143db8d6e2013f2107a72b75c59be1877c317e703b54/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:35 np0005548789.localdomain podman[323083]: 2025-12-06 10:18:35.701504753 +0000 UTC m=+0.197603895 container init 6a583563545dc7b7ec5c05c29bb95187c6f71b3276ad300d24bae1097cf7b6fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:35 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1644367828' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:35 np0005548789.localdomain podman[323083]: 2025-12-06 10:18:35.717325184 +0000 UTC m=+0.213424336 container start 6a583563545dc7b7ec5c05c29bb95187c6f71b3276ad300d24bae1097cf7b6fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:18:35 np0005548789.localdomain dnsmasq[323102]: started, version 2.85 cachesize 150
Dec 06 10:18:35 np0005548789.localdomain dnsmasq[323102]: DNS service limited to local subnets
Dec 06 10:18:35 np0005548789.localdomain dnsmasq[323102]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:35 np0005548789.localdomain dnsmasq[323102]: warning: no upstream servers configured
Dec 06 10:18:35 np0005548789.localdomain dnsmasq[323102]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:35.786 263652 INFO neutron.agent.dhcp.agent [None req-0a7ab7d8-06a3-4d20-b52b-e59c9a3a54a4 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:34Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa7ee50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa7e3a0>], id=317f9495-2ed2-45a2-afeb-683254f0b250, ip_allocation=immediate, mac_address=fa:16:3e:0b:59:61, name=tempest-NetworksTestDHCPv6-1599368332, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=38, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['d6c65525-ce8f-4af5-8cc0-7d2130f263c9'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:18:32Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=1833, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:18:34Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:18:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:35.851 263652 INFO neutron.agent.dhcp.agent [None req-dba23f51-a783-4249-adb5-3f64dfb318ca - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:18:35 np0005548789.localdomain dnsmasq[323102]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 1 addresses
Dec 06 10:18:35 np0005548789.localdomain podman[323121]: 2025-12-06 10:18:35.996964235 +0000 UTC m=+0.062545208 container kill 6a583563545dc7b7ec5c05c29bb95187c6f71b3276ad300d24bae1097cf7b6fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:36 np0005548789.localdomain sshd[323059]: Received disconnect from 80.94.93.119 port 17030:11:  [preauth]
Dec 06 10:18:36 np0005548789.localdomain sshd[323059]: Disconnected from authenticating user root 80.94.93.119 port 17030 [preauth]
Dec 06 10:18:36 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:36.116 2 INFO neutron.agent.securitygroups_rpc [None req-14f85306-cb54-46c0-a6f6-e09d3e175b2a b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:36 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:36.269 263652 INFO neutron.agent.dhcp.agent [None req-800a9886-b6d8-4c93-8c12-50ec2af162b9 - - - - - -] DHCP configuration for ports {'317f9495-2ed2-45a2-afeb-683254f0b250'} is completed
Dec 06 10:18:36 np0005548789.localdomain dnsmasq[323102]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:36 np0005548789.localdomain podman[323158]: 2025-12-06 10:18:36.338599267 +0000 UTC m=+0.062994821 container kill 6a583563545dc7b7ec5c05c29bb95187c6f71b3276ad300d24bae1097cf7b6fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:18:36 np0005548789.localdomain systemd[1]: tmp-crun.zFfWmF.mount: Deactivated successfully.
Dec 06 10:18:36 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:36.651 2 INFO neutron.agent.securitygroups_rpc [None req-af2f65bf-97b7-4bb0-b9fe-3c28224c3c96 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:36 np0005548789.localdomain ceph-mon[298582]: pgmap v268: 177 pgs: 177 active+clean; 145 MiB data, 840 MiB used, 41 GiB / 42 GiB avail; 109 KiB/s rd, 1.0 MiB/s wr, 151 op/s
Dec 06 10:18:36 np0005548789.localdomain dnsmasq[323102]: exiting on receipt of SIGTERM
Dec 06 10:18:36 np0005548789.localdomain systemd[1]: libpod-6a583563545dc7b7ec5c05c29bb95187c6f71b3276ad300d24bae1097cf7b6fc.scope: Deactivated successfully.
Dec 06 10:18:36 np0005548789.localdomain podman[323195]: 2025-12-06 10:18:36.852614519 +0000 UTC m=+0.067258991 container kill 6a583563545dc7b7ec5c05c29bb95187c6f71b3276ad300d24bae1097cf7b6fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:36 np0005548789.localdomain podman[323209]: 2025-12-06 10:18:36.931386798 +0000 UTC m=+0.066567641 container died 6a583563545dc7b7ec5c05c29bb95187c6f71b3276ad300d24bae1097cf7b6fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:18:36 np0005548789.localdomain podman[323209]: 2025-12-06 10:18:36.962369897 +0000 UTC m=+0.097550710 container cleanup 6a583563545dc7b7ec5c05c29bb95187c6f71b3276ad300d24bae1097cf7b6fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:18:36 np0005548789.localdomain systemd[1]: libpod-conmon-6a583563545dc7b7ec5c05c29bb95187c6f71b3276ad300d24bae1097cf7b6fc.scope: Deactivated successfully.
Dec 06 10:18:37 np0005548789.localdomain podman[323216]: 2025-12-06 10:18:37.025226805 +0000 UTC m=+0.146064283 container remove 6a583563545dc7b7ec5c05c29bb95187c6f71b3276ad300d24bae1097cf7b6fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:37.039 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:37 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:37Z|00303|binding|INFO|Releasing lport 1e7eac16-0466-4938-b460-d43f8a6e5320 from this chassis (sb_readonly=0)
Dec 06 10:18:37 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:37Z|00304|binding|INFO|Setting lport 1e7eac16-0466-4938-b460-d43f8a6e5320 down in Southbound
Dec 06 10:18:37 np0005548789.localdomain kernel: device tap1e7eac16-04 left promiscuous mode
Dec 06 10:18:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:37.048 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe5d:d6f0/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=1e7eac16-0466-4938-b460-d43f8a6e5320) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:37.050 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 1e7eac16-0466-4938-b460-d43f8a6e5320 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:18:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:37.052 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:37 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:37.053 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d90bc6-ba93-43c0-b7dc-b11d552f9c45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:37.060 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:37.364 263652 INFO neutron.agent.dhcp.agent [None req-9204d6d4-cd0f-4bea-8a38-f041253991b6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-fbc77840f75508555765143db8d6e2013f2107a72b75c59be1877c317e703b54-merged.mount: Deactivated successfully.
Dec 06 10:18:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6a583563545dc7b7ec5c05c29bb95187c6f71b3276ad300d24bae1097cf7b6fc-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:37 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:18:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:37.926 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:38 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:38.026 2 INFO neutron.agent.securitygroups_rpc [None req-1f216153-8df9-4f5f-9520-bf151df27051 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:38 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:38.095 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e154 e154: 6 total, 6 up, 6 in
Dec 06 10:18:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:38.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:38.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:38.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:38 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:38.246 263652 INFO neutron.agent.linux.ip_lib [None req-fe796f91-c16f-44c5-93a2-1c017a79f914 - - - - - -] Device tap2b709350-c2 cannot be used as it has no MAC address
Dec 06 10:18:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:38.275 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:38 np0005548789.localdomain kernel: device tap2b709350-c2 entered promiscuous mode
Dec 06 10:18:38 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:38Z|00305|binding|INFO|Claiming lport 2b709350-c211-4254-8281-b64bea0c6f41 for this chassis.
Dec 06 10:18:38 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:38Z|00306|binding|INFO|2b709350-c211-4254-8281-b64bea0c6f41: Claiming unknown
Dec 06 10:18:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:38.283 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:38 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016318.2840] manager: (tap2b709350-c2): new Generic device (/org/freedesktop/NetworkManager/Devices/51)
Dec 06 10:18:38 np0005548789.localdomain systemd-udevd[323251]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:38 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:38Z|00307|binding|INFO|Setting lport 2b709350-c211-4254-8281-b64bea0c6f41 ovn-installed in OVS
Dec 06 10:18:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:38.293 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:38 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:38Z|00308|binding|INFO|Setting lport 2b709350-c211-4254-8281-b64bea0c6f41 up in Southbound
Dec 06 10:18:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:38.299 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef5:2012/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=2b709350-c211-4254-8281-b64bea0c6f41) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:38.302 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:38.303 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 2b709350-c211-4254-8281-b64bea0c6f41 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:18:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:38.311 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 160bb911-7f10-4f65-8dc2-37031a049d2c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:38.312 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:38.313 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[ac010a82-b0c6-4e99-89ca-47c35a1025ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2b709350-c2: No such device
Dec 06 10:18:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2b709350-c2: No such device
Dec 06 10:18:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:38.322 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2b709350-c2: No such device
Dec 06 10:18:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2b709350-c2: No such device
Dec 06 10:18:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2b709350-c2: No such device
Dec 06 10:18:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2b709350-c2: No such device
Dec 06 10:18:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2b709350-c2: No such device
Dec 06 10:18:38 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap2b709350-c2: No such device
Dec 06 10:18:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:38.357 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:38.387 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:38 np0005548789.localdomain ceph-mon[298582]: pgmap v269: 177 pgs: 177 active+clean; 145 MiB data, 840 MiB used, 41 GiB / 42 GiB avail; 91 KiB/s rd, 863 KiB/s wr, 126 op/s
Dec 06 10:18:38 np0005548789.localdomain ceph-mon[298582]: osdmap e154: 6 total, 6 up, 6 in
Dec 06 10:18:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:18:38 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1177396961' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:18:38 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1177396961' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:39 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:39.029 2 INFO neutron.agent.securitygroups_rpc [None req-df5d0c63-3dbc-41ec-8a7e-d627e1beca42 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:39.106 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:39.202 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:39.202 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:18:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:39.203 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:18:39 np0005548789.localdomain podman[323322]: 
Dec 06 10:18:39 np0005548789.localdomain podman[323322]: 2025-12-06 10:18:39.21652849 +0000 UTC m=+0.092030003 container create 27452c8dc971f1931cc48233d860a5a4dae6b3dc752ed26dd6692132ab8623c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:18:39 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:39.254 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:39 np0005548789.localdomain systemd[1]: Started libpod-conmon-27452c8dc971f1931cc48233d860a5a4dae6b3dc752ed26dd6692132ab8623c4.scope.
Dec 06 10:18:39 np0005548789.localdomain podman[323322]: 2025-12-06 10:18:39.172992879 +0000 UTC m=+0.048494442 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:39 np0005548789.localdomain systemd[1]: tmp-crun.kYTSBP.mount: Deactivated successfully.
Dec 06 10:18:39 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:39 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b7531a1e7d0325b8ef534d223b499f4804e3bafec7ac5d3faa5dc377932e096/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:39 np0005548789.localdomain podman[323322]: 2025-12-06 10:18:39.301384203 +0000 UTC m=+0.176885716 container init 27452c8dc971f1931cc48233d860a5a4dae6b3dc752ed26dd6692132ab8623c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:18:39 np0005548789.localdomain podman[323322]: 2025-12-06 10:18:39.313416258 +0000 UTC m=+0.188917781 container start 27452c8dc971f1931cc48233d860a5a4dae6b3dc752ed26dd6692132ab8623c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:18:39 np0005548789.localdomain dnsmasq[323340]: started, version 2.85 cachesize 150
Dec 06 10:18:39 np0005548789.localdomain dnsmasq[323340]: DNS service limited to local subnets
Dec 06 10:18:39 np0005548789.localdomain dnsmasq[323340]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:39 np0005548789.localdomain dnsmasq[323340]: warning: no upstream servers configured
Dec 06 10:18:39 np0005548789.localdomain dnsmasq-dhcp[323340]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:18:39 np0005548789.localdomain dnsmasq[323340]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:39 np0005548789.localdomain dnsmasq-dhcp[323340]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:39 np0005548789.localdomain dnsmasq-dhcp[323340]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:39.326 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:18:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:39.326 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:18:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:39.327 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:18:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:39.327 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:18:39 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:39.378 263652 INFO neutron.agent.dhcp.agent [None req-fe796f91-c16f-44c5-93a2-1c017a79f914 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:37Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fae6d00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fae67c0>], id=f67fb8ff-e5d2-4a19-a60a-1bd7d5cb713a, ip_allocation=immediate, mac_address=fa:16:3e:a2:18:23, name=tempest-NetworksTestDHCPv6-1097551674, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=40, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['2817cd55-6d79-47cf-850d-829aa44b7048'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:18:36Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=1857, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:18:37Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:18:39 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:39.489 263652 INFO neutron.agent.dhcp.agent [None req-f8aae058-451a-4fb2-a7a9-6f6b2ff0cb84 - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:18:39 np0005548789.localdomain podman[323360]: 2025-12-06 10:18:39.588196183 +0000 UTC m=+0.062710173 container kill 27452c8dc971f1931cc48233d860a5a4dae6b3dc752ed26dd6692132ab8623c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:39 np0005548789.localdomain dnsmasq[323340]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 1 addresses
Dec 06 10:18:39 np0005548789.localdomain dnsmasq-dhcp[323340]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:39 np0005548789.localdomain dnsmasq-dhcp[323340]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1177396961' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1177396961' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:39 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:39.824 263652 INFO neutron.agent.dhcp.agent [None req-ebf14e78-44ca-4922-971d-95494b8ec3df - - - - - -] DHCP configuration for ports {'f67fb8ff-e5d2-4a19-a60a-1bd7d5cb713a'} is completed
Dec 06 10:18:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:18:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:18:39 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:39.881 2 INFO neutron.agent.securitygroups_rpc [None req-57bddb58-8e7b-4200-a14c-4d9431ae075f b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:39 np0005548789.localdomain podman[323396]: 2025-12-06 10:18:39.949563343 +0000 UTC m=+0.104027986 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:18:39 np0005548789.localdomain podman[323395]: 2025-12-06 10:18:39.915574823 +0000 UTC m=+0.077068318 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible)
Dec 06 10:18:39 np0005548789.localdomain podman[323396]: 2025-12-06 10:18:39.997138856 +0000 UTC m=+0.151603439 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 06 10:18:40 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:18:40 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:40.016 263652 INFO neutron.agent.linux.ip_lib [None req-69fc1829-97fc-48f8-9b00-6782cc4cdde0 - - - - - -] Device tapdfa03f50-39 cannot be used as it has no MAC address
Dec 06 10:18:40 np0005548789.localdomain dnsmasq[323340]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:40 np0005548789.localdomain dnsmasq-dhcp[323340]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:40 np0005548789.localdomain dnsmasq-dhcp[323340]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:40 np0005548789.localdomain podman[323421]: 2025-12-06 10:18:40.040573264 +0000 UTC m=+0.141639557 container kill 27452c8dc971f1931cc48233d860a5a4dae6b3dc752ed26dd6692132ab8623c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:18:40 np0005548789.localdomain podman[323395]: 2025-12-06 10:18:40.04737298 +0000 UTC m=+0.208866495 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:18:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:40.046 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:40 np0005548789.localdomain kernel: device tapdfa03f50-39 entered promiscuous mode
Dec 06 10:18:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:40Z|00309|binding|INFO|Claiming lport dfa03f50-3905-4292-9cae-c03579192e4f for this chassis.
Dec 06 10:18:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:40Z|00310|binding|INFO|dfa03f50-3905-4292-9cae-c03579192e4f: Claiming unknown
Dec 06 10:18:40 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016320.0558] manager: (tapdfa03f50-39): new Generic device (/org/freedesktop/NetworkManager/Devices/52)
Dec 06 10:18:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:40.056 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:40 np0005548789.localdomain systemd-udevd[323254]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:40 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:18:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:40.070 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-5d90c1d5-74b2-4b5c-9bf8-25a818641550', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d90c1d5-74b2-4b5c-9bf8-25a818641550', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2b1d664fab0f4b7f87439c153244cdc1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=554a12c4-a3a9-4583-a7ca-9f004018b224, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=dfa03f50-3905-4292-9cae-c03579192e4f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:40.072 160509 INFO neutron.agent.ovn.metadata.agent [-] Port dfa03f50-3905-4292-9cae-c03579192e4f in datapath 5d90c1d5-74b2-4b5c-9bf8-25a818641550 bound to our chassis
Dec 06 10:18:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:40.074 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5d90c1d5-74b2-4b5c-9bf8-25a818641550 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:18:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:40.075 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[2877f908-f092-4181-8a4e-eb699fd468ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:40Z|00311|binding|INFO|Setting lport dfa03f50-3905-4292-9cae-c03579192e4f ovn-installed in OVS
Dec 06 10:18:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:40Z|00312|binding|INFO|Setting lport dfa03f50-3905-4292-9cae-c03579192e4f up in Southbound
Dec 06 10:18:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:40.109 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:40.162 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:40.191 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:40 np0005548789.localdomain systemd[1]: tmp-crun.lKkAOH.mount: Deactivated successfully.
Dec 06 10:18:40 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:40.271 2 INFO neutron.agent.securitygroups_rpc [None req-dae126c9-280d-4a4d-ad9b-17df376d8729 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:40 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:40.351 2 INFO neutron.agent.securitygroups_rpc [None req-e4bf1752-f0a3-4484-84a7-e670337a989c b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:40 np0005548789.localdomain dnsmasq[323340]: exiting on receipt of SIGTERM
Dec 06 10:18:40 np0005548789.localdomain podman[323505]: 2025-12-06 10:18:40.571088525 +0000 UTC m=+0.075872022 container kill 27452c8dc971f1931cc48233d860a5a4dae6b3dc752ed26dd6692132ab8623c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 06 10:18:40 np0005548789.localdomain systemd[1]: libpod-27452c8dc971f1931cc48233d860a5a4dae6b3dc752ed26dd6692132ab8623c4.scope: Deactivated successfully.
Dec 06 10:18:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:40.635 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:18:40 np0005548789.localdomain podman[323522]: 2025-12-06 10:18:40.646724399 +0000 UTC m=+0.060525956 container died 27452c8dc971f1931cc48233d860a5a4dae6b3dc752ed26dd6692132ab8623c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:18:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:40.649 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:18:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:40.650 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:18:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:40.650 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:40.650 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:40.651 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:40.651 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:18:40 np0005548789.localdomain podman[323522]: 2025-12-06 10:18:40.685524357 +0000 UTC m=+0.099325914 container cleanup 27452c8dc971f1931cc48233d860a5a4dae6b3dc752ed26dd6692132ab8623c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:18:40 np0005548789.localdomain systemd[1]: libpod-conmon-27452c8dc971f1931cc48233d860a5a4dae6b3dc752ed26dd6692132ab8623c4.scope: Deactivated successfully.
Dec 06 10:18:40 np0005548789.localdomain podman[323524]: 2025-12-06 10:18:40.74000835 +0000 UTC m=+0.144425192 container remove 27452c8dc971f1931cc48233d860a5a4dae6b3dc752ed26dd6692132ab8623c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:18:40 np0005548789.localdomain ceph-mon[298582]: pgmap v271: 177 pgs: 177 active+clean; 145 MiB data, 840 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 895 B/s wr, 25 op/s
Dec 06 10:18:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:40.754 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:40Z|00313|binding|INFO|Releasing lport 2b709350-c211-4254-8281-b64bea0c6f41 from this chassis (sb_readonly=0)
Dec 06 10:18:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:40Z|00314|binding|INFO|Setting lport 2b709350-c211-4254-8281-b64bea0c6f41 down in Southbound
Dec 06 10:18:40 np0005548789.localdomain kernel: device tap2b709350-c2 left promiscuous mode
Dec 06 10:18:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:40.763 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef5:2012/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=2b709350-c211-4254-8281-b64bea0c6f41) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:40.765 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 2b709350-c211-4254-8281-b64bea0c6f41 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:18:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:40.767 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:40.768 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[3cfa9158-5d43-4ba7-bc04-feb4bbd06c17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:40 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e155 e155: 6 total, 6 up, 6 in
Dec 06 10:18:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:40.779 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:41.070 263652 INFO neutron.agent.dhcp.agent [None req-9245f6cc-a833-4e02-b7b5-df057860c471 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:41 np0005548789.localdomain podman[323582]: 
Dec 06 10:18:41 np0005548789.localdomain podman[323582]: 2025-12-06 10:18:41.164858815 +0000 UTC m=+0.068805407 container create b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:18:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:41.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:41 np0005548789.localdomain systemd[1]: Started libpod-conmon-b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0.scope.
Dec 06 10:18:41 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-8b7531a1e7d0325b8ef534d223b499f4804e3bafec7ac5d3faa5dc377932e096-merged.mount: Deactivated successfully.
Dec 06 10:18:41 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-27452c8dc971f1931cc48233d860a5a4dae6b3dc752ed26dd6692132ab8623c4-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:41 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:18:41 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e45d66b660b9cd6a8b001d4fbf1754f68fedf39a7ac1ddac57f6571d258512a4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:41 np0005548789.localdomain podman[323582]: 2025-12-06 10:18:41.129365939 +0000 UTC m=+0.033312581 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:41 np0005548789.localdomain podman[323582]: 2025-12-06 10:18:41.234985282 +0000 UTC m=+0.138931904 container init b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:41 np0005548789.localdomain podman[323582]: 2025-12-06 10:18:41.245155731 +0000 UTC m=+0.149102353 container start b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:18:41 np0005548789.localdomain dnsmasq[323601]: started, version 2.85 cachesize 150
Dec 06 10:18:41 np0005548789.localdomain dnsmasq[323601]: DNS service limited to local subnets
Dec 06 10:18:41 np0005548789.localdomain dnsmasq[323601]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:41 np0005548789.localdomain dnsmasq[323601]: warning: no upstream servers configured
Dec 06 10:18:41 np0005548789.localdomain dnsmasq-dhcp[323601]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:18:41 np0005548789.localdomain dnsmasq[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/addn_hosts - 0 addresses
Dec 06 10:18:41 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/host
Dec 06 10:18:41 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/opts
Dec 06 10:18:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:41.302 263652 INFO neutron.agent.dhcp.agent [None req-69fc1829-97fc-48f8-9b00-6782cc4cdde0 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:39Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb0ddc0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa77b50>], id=8dd2142c-4669-4128-a7be-8660c8e4419c, ip_allocation=immediate, mac_address=fa:16:3e:f2:97:89, name=tempest-AllowedAddressPairIpV6TestJSON-1440071982, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:37Z, description=, dns_domain=, id=5d90c1d5-74b2-4b5c-9bf8-25a818641550, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1904552762, port_security_enabled=True, project_id=2b1d664fab0f4b7f87439c153244cdc1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=27994, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1858, status=ACTIVE, subnets=['cff46a29-a5f7-45a0-9023-17533db086b9'], tags=[], tenant_id=2b1d664fab0f4b7f87439c153244cdc1, updated_at=2025-12-06T10:18:38Z, vlan_transparent=None, network_id=5d90c1d5-74b2-4b5c-9bf8-25a818641550, port_security_enabled=True, project_id=2b1d664fab0f4b7f87439c153244cdc1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0'], standard_attr_id=1871, status=DOWN, tags=[], tenant_id=2b1d664fab0f4b7f87439c153244cdc1, updated_at=2025-12-06T10:18:39Z on network 5d90c1d5-74b2-4b5c-9bf8-25a818641550
Dec 06 10:18:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:41.398 263652 INFO neutron.agent.dhcp.agent [None req-0b466d18-97be-4edf-aa1c-6b63c631ffde - - - - - -] DHCP configuration for ports {'b48170d1-76fd-43fa-87cb-3654efb179b3'} is completed
Dec 06 10:18:41 np0005548789.localdomain dnsmasq[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/addn_hosts - 1 addresses
Dec 06 10:18:41 np0005548789.localdomain podman[323620]: 2025-12-06 10:18:41.469880857 +0000 UTC m=+0.049266414 container kill b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:18:41 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/host
Dec 06 10:18:41 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/opts
Dec 06 10:18:41 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:41.533 2 INFO neutron.agent.securitygroups_rpc [None req-652b9b12-ec43-4a29-b268-053f1f58f2a3 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:41.661 263652 INFO neutron.agent.dhcp.agent [None req-69fc1829-97fc-48f8-9b00-6782cc4cdde0 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37faafc10>], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:41Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37faaf370>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37faaf880>], id=d4d6e3c8-5f61-4270-981b-d1cb751d801d, ip_allocation=immediate, mac_address=fa:16:3e:36:19:e3, name=tempest-AllowedAddressPairIpV6TestJSON-1440647520, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:37Z, description=, dns_domain=, id=5d90c1d5-74b2-4b5c-9bf8-25a818641550, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1904552762, port_security_enabled=True, project_id=2b1d664fab0f4b7f87439c153244cdc1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=27994, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1858, status=ACTIVE, subnets=['cff46a29-a5f7-45a0-9023-17533db086b9'], tags=[], tenant_id=2b1d664fab0f4b7f87439c153244cdc1, updated_at=2025-12-06T10:18:38Z, vlan_transparent=None, network_id=5d90c1d5-74b2-4b5c-9bf8-25a818641550, port_security_enabled=True, project_id=2b1d664fab0f4b7f87439c153244cdc1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0'], standard_attr_id=1881, status=DOWN, tags=[], tenant_id=2b1d664fab0f4b7f87439c153244cdc1, updated_at=2025-12-06T10:18:41Z on network 5d90c1d5-74b2-4b5c-9bf8-25a818641550
Dec 06 10:18:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:41.746 263652 INFO neutron.agent.dhcp.agent [None req-21430afc-6124-45c3-bb3a-71c4ca114888 - - - - - -] DHCP configuration for ports {'8dd2142c-4669-4128-a7be-8660c8e4419c'} is completed
Dec 06 10:18:41 np0005548789.localdomain ceph-mon[298582]: osdmap e155: 6 total, 6 up, 6 in
Dec 06 10:18:41 np0005548789.localdomain ceph-mon[298582]: pgmap v273: 177 pgs: 177 active+clean; 145 MiB data, 840 MiB used, 41 GiB / 42 GiB avail; 3.6 KiB/s rd, 511 B/s wr, 5 op/s
Dec 06 10:18:41 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e156 e156: 6 total, 6 up, 6 in
Dec 06 10:18:41 np0005548789.localdomain dnsmasq[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/addn_hosts - 2 addresses
Dec 06 10:18:41 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/host
Dec 06 10:18:41 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/opts
Dec 06 10:18:41 np0005548789.localdomain podman[323658]: 2025-12-06 10:18:41.892383723 +0000 UTC m=+0.076123920 container kill b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:42 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:42.023 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:41Z, description=, device_id=a193f2ef-cf4d-4f20-be3b-f48f023f218a, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fafb8e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fdc7c70>], id=c2313164-caaf-42ef-9d7f-0859c1ed319e, ip_allocation=immediate, mac_address=fa:16:3e:62:bd:78, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1888, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:18:41Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:18:42 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:42.209 263652 INFO neutron.agent.dhcp.agent [None req-bc5fd4a8-f087-4927-9652-e510a2ff81ad - - - - - -] DHCP configuration for ports {'d4d6e3c8-5f61-4270-981b-d1cb751d801d'} is completed
Dec 06 10:18:42 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:18:42 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:42 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:42 np0005548789.localdomain podman[323697]: 2025-12-06 10:18:42.259779726 +0000 UTC m=+0.068098016 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:42 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:42.322 2 INFO neutron.agent.securitygroups_rpc [None req-b75aa7a3-5af1-4cd3-b1b1-cfd423d7e2ab 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:42 np0005548789.localdomain podman[323737]: 2025-12-06 10:18:42.586690902 +0000 UTC m=+0.074038307 container kill b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:18:42 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:42.587 2 INFO neutron.agent.securitygroups_rpc [None req-f4287026-ab49-4456-bd2b-fbcf52c0630e a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:42 np0005548789.localdomain dnsmasq[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/addn_hosts - 1 addresses
Dec 06 10:18:42 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/host
Dec 06 10:18:42 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/opts
Dec 06 10:18:42 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:42.600 263652 INFO neutron.agent.dhcp.agent [None req-e9c9e6ea-11c3-461e-9133-d3211ce5e3e2 - - - - - -] DHCP configuration for ports {'c2313164-caaf-42ef-9d7f-0859c1ed319e'} is completed
Dec 06 10:18:42 np0005548789.localdomain ceph-mon[298582]: osdmap e156: 6 total, 6 up, 6 in
Dec 06 10:18:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1970788286' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1970788286' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:42.963 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:43 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:43.023 2 INFO neutron.agent.securitygroups_rpc [None req-55643603-9572-45b2-ae71-f445e3294506 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:43 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:43.073 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:42Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa25ac0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa25e50>], id=13eda6cd-ab50-4961-a56d-6f8d6f8094f4, ip_allocation=immediate, mac_address=fa:16:3e:c8:5e:55, name=tempest-AllowedAddressPairIpV6TestJSON-484857425, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:37Z, description=, dns_domain=, id=5d90c1d5-74b2-4b5c-9bf8-25a818641550, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1904552762, port_security_enabled=True, project_id=2b1d664fab0f4b7f87439c153244cdc1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=27994, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1858, status=ACTIVE, subnets=['cff46a29-a5f7-45a0-9023-17533db086b9'], tags=[], tenant_id=2b1d664fab0f4b7f87439c153244cdc1, updated_at=2025-12-06T10:18:38Z, vlan_transparent=None, network_id=5d90c1d5-74b2-4b5c-9bf8-25a818641550, port_security_enabled=True, project_id=2b1d664fab0f4b7f87439c153244cdc1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0'], standard_attr_id=1897, status=DOWN, tags=[], tenant_id=2b1d664fab0f4b7f87439c153244cdc1, updated_at=2025-12-06T10:18:42Z on network 5d90c1d5-74b2-4b5c-9bf8-25a818641550
Dec 06 10:18:43 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:43.113 263652 INFO neutron.agent.linux.ip_lib [None req-80824c13-7455-4cc5-9ad8-0a2dcaae04fd - - - - - -] Device tape81dd3d5-52 cannot be used as it has no MAC address
Dec 06 10:18:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:43.148 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:43 np0005548789.localdomain kernel: device tape81dd3d5-52 entered promiscuous mode
Dec 06 10:18:43 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016323.1592] manager: (tape81dd3d5-52): new Generic device (/org/freedesktop/NetworkManager/Devices/53)
Dec 06 10:18:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:43.163 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:43 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:43Z|00315|binding|INFO|Claiming lport e81dd3d5-5246-4ac5-a2a9-4f11cf2d2ff6 for this chassis.
Dec 06 10:18:43 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:43Z|00316|binding|INFO|e81dd3d5-5246-4ac5-a2a9-4f11cf2d2ff6: Claiming unknown
Dec 06 10:18:43 np0005548789.localdomain systemd-udevd[323783]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:43 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:43Z|00317|binding|INFO|Setting lport e81dd3d5-5246-4ac5-a2a9-4f11cf2d2ff6 up in Southbound
Dec 06 10:18:43 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:43Z|00318|binding|INFO|Setting lport e81dd3d5-5246-4ac5-a2a9-4f11cf2d2ff6 ovn-installed in OVS
Dec 06 10:18:43 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:43.177 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fee6:65a3/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=e81dd3d5-5246-4ac5-a2a9-4f11cf2d2ff6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:43 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:43.179 160509 INFO neutron.agent.ovn.metadata.agent [-] Port e81dd3d5-5246-4ac5-a2a9-4f11cf2d2ff6 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:18:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:43.181 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:43.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:43.184 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:43 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:43.185 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 94117367-0e38-4e81-8f6f-3ab8f4ef7f72 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:43 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:43.186 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:43 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:43.187 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[ff25a746-24bd-480f-b56c-f4826fcbc3d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:43.215 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:43.262 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:43.294 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:43 np0005548789.localdomain dnsmasq[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/addn_hosts - 2 addresses
Dec 06 10:18:43 np0005548789.localdomain podman[323795]: 2025-12-06 10:18:43.31888266 +0000 UTC m=+0.072978894 container kill b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:18:43 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/host
Dec 06 10:18:43 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/opts
Dec 06 10:18:43 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:43.707 263652 INFO neutron.agent.dhcp.agent [None req-8432ed46-288b-4870-95ba-d6a625fdb94d - - - - - -] DHCP configuration for ports {'13eda6cd-ab50-4961-a56d-6f8d6f8094f4'} is completed
Dec 06 10:18:43 np0005548789.localdomain ceph-mon[298582]: pgmap v275: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 83 KiB/s rd, 5.2 KiB/s wr, 111 op/s
Dec 06 10:18:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4278276739' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4278276739' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/409284351' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:18:43 np0005548789.localdomain podman[323842]: 2025-12-06 10:18:43.965306578 +0000 UTC m=+0.093389564 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:18:43 np0005548789.localdomain podman[323842]: 2025-12-06 10:18:43.980149338 +0000 UTC m=+0.108232344 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:43 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:18:44 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:44.059 2 INFO neutron.agent.securitygroups_rpc [None req-255af1be-4d8f-48ce-b409-88074fe3f28a a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:44.137 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:44 np0005548789.localdomain podman[323886]: 2025-12-06 10:18:44.323522993 +0000 UTC m=+0.102481170 container create de7254f971a2477af23af1e54a0d0065537b2127e25260365f151fd088e6b93b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:18:44 np0005548789.localdomain systemd[1]: Started libpod-conmon-de7254f971a2477af23af1e54a0d0065537b2127e25260365f151fd088e6b93b.scope.
Dec 06 10:18:44 np0005548789.localdomain podman[323886]: 2025-12-06 10:18:44.276506587 +0000 UTC m=+0.055464764 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:44 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:44 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6081e04002876701154dea738aa910ba591c2cf9e99dc2fa391f57536d1f8d4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:44 np0005548789.localdomain podman[323886]: 2025-12-06 10:18:44.416592846 +0000 UTC m=+0.195551063 container init de7254f971a2477af23af1e54a0d0065537b2127e25260365f151fd088e6b93b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:18:44 np0005548789.localdomain podman[323886]: 2025-12-06 10:18:44.433374205 +0000 UTC m=+0.212332382 container start de7254f971a2477af23af1e54a0d0065537b2127e25260365f151fd088e6b93b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:18:44 np0005548789.localdomain dnsmasq[323903]: started, version 2.85 cachesize 150
Dec 06 10:18:44 np0005548789.localdomain dnsmasq[323903]: DNS service limited to local subnets
Dec 06 10:18:44 np0005548789.localdomain dnsmasq[323903]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:44 np0005548789.localdomain dnsmasq[323903]: warning: no upstream servers configured
Dec 06 10:18:44 np0005548789.localdomain dnsmasq[323903]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:44 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:44.503 263652 INFO neutron.agent.dhcp.agent [None req-80824c13-7455-4cc5-9ad8-0a2dcaae04fd - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:42Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa995e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc55e20>], id=fc6df7b3-ec8e-4358-8cef-d1710c9514a1, ip_allocation=immediate, mac_address=fa:16:3e:6e:ef:c6, name=tempest-NetworksTestDHCPv6-461980463, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=42, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['4ba92dd9-8cae-472b-acec-9fa2369c51eb'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:18:41Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=1890, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:18:42Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:18:44 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:44.617 263652 INFO neutron.agent.dhcp.agent [None req-e8cfefeb-67b3-4d91-a17e-b985596e9ec1 - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:18:44 np0005548789.localdomain dnsmasq[323903]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 1 addresses
Dec 06 10:18:44 np0005548789.localdomain podman[323920]: 2025-12-06 10:18:44.72334556 +0000 UTC m=+0.065727935 container kill de7254f971a2477af23af1e54a0d0065537b2127e25260365f151fd088e6b93b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 10:18:44 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:44.760 2 INFO neutron.agent.securitygroups_rpc [None req-f9323177-d4e4-4dce-bd8f-2cc985b7b1dc 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:44 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2594254949' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:44 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3870109751' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:45 np0005548789.localdomain dnsmasq[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/addn_hosts - 1 addresses
Dec 06 10:18:45 np0005548789.localdomain podman[323959]: 2025-12-06 10:18:45.022508684 +0000 UTC m=+0.061151116 container kill b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:45 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/host
Dec 06 10:18:45 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/opts
Dec 06 10:18:45 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:45.062 263652 INFO neutron.agent.dhcp.agent [None req-d3d5c886-4c06-4e86-9ecb-f0a4eee6bfa8 - - - - - -] DHCP configuration for ports {'fc6df7b3-ec8e-4358-8cef-d1710c9514a1'} is completed
Dec 06 10:18:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:45.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:45.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:18:45 np0005548789.localdomain systemd[1]: tmp-crun.rmsrpE.mount: Deactivated successfully.
Dec 06 10:18:45 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:45.342 2 INFO neutron.agent.securitygroups_rpc [None req-ec83bf10-c909-4c2a-a1a4-0827521eece8 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:45 np0005548789.localdomain systemd[1]: tmp-crun.ubZWop.mount: Deactivated successfully.
Dec 06 10:18:45 np0005548789.localdomain dnsmasq[323903]: exiting on receipt of SIGTERM
Dec 06 10:18:45 np0005548789.localdomain systemd[1]: libpod-de7254f971a2477af23af1e54a0d0065537b2127e25260365f151fd088e6b93b.scope: Deactivated successfully.
Dec 06 10:18:45 np0005548789.localdomain podman[323997]: 2025-12-06 10:18:45.354536965 +0000 UTC m=+0.078802211 container kill de7254f971a2477af23af1e54a0d0065537b2127e25260365f151fd088e6b93b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:18:45 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:45.362 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:45Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc85b80>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc98c70>], id=cf17c12b-b261-4ebe-bdee-84820cc74501, ip_allocation=immediate, mac_address=fa:16:3e:50:03:6f, name=tempest-AllowedAddressPairIpV6TestJSON-192075535, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:37Z, description=, dns_domain=, id=5d90c1d5-74b2-4b5c-9bf8-25a818641550, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1904552762, port_security_enabled=True, project_id=2b1d664fab0f4b7f87439c153244cdc1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=27994, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1858, status=ACTIVE, subnets=['cff46a29-a5f7-45a0-9023-17533db086b9'], tags=[], tenant_id=2b1d664fab0f4b7f87439c153244cdc1, updated_at=2025-12-06T10:18:38Z, vlan_transparent=None, network_id=5d90c1d5-74b2-4b5c-9bf8-25a818641550, port_security_enabled=True, project_id=2b1d664fab0f4b7f87439c153244cdc1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0'], standard_attr_id=1913, status=DOWN, tags=[], tenant_id=2b1d664fab0f4b7f87439c153244cdc1, updated_at=2025-12-06T10:18:45Z on network 5d90c1d5-74b2-4b5c-9bf8-25a818641550
Dec 06 10:18:45 np0005548789.localdomain podman[324009]: 2025-12-06 10:18:45.424104366 +0000 UTC m=+0.054542236 container died de7254f971a2477af23af1e54a0d0065537b2127e25260365f151fd088e6b93b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:18:45 np0005548789.localdomain podman[324009]: 2025-12-06 10:18:45.562472462 +0000 UTC m=+0.192910292 container cleanup de7254f971a2477af23af1e54a0d0065537b2127e25260365f151fd088e6b93b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:45 np0005548789.localdomain systemd[1]: libpod-conmon-de7254f971a2477af23af1e54a0d0065537b2127e25260365f151fd088e6b93b.scope: Deactivated successfully.
Dec 06 10:18:45 np0005548789.localdomain podman[324011]: 2025-12-06 10:18:45.587076969 +0000 UTC m=+0.208415904 container remove de7254f971a2477af23af1e54a0d0065537b2127e25260365f151fd088e6b93b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:18:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:45.606 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:45 np0005548789.localdomain kernel: device tape81dd3d5-52 left promiscuous mode
Dec 06 10:18:45 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:45Z|00319|binding|INFO|Releasing lport e81dd3d5-5246-4ac5-a2a9-4f11cf2d2ff6 from this chassis (sb_readonly=0)
Dec 06 10:18:45 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:45Z|00320|binding|INFO|Setting lport e81dd3d5-5246-4ac5-a2a9-4f11cf2d2ff6 down in Southbound
Dec 06 10:18:45 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:45.619 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=e81dd3d5-5246-4ac5-a2a9-4f11cf2d2ff6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:45 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:45.621 160509 INFO neutron.agent.ovn.metadata.agent [-] Port e81dd3d5-5246-4ac5-a2a9-4f11cf2d2ff6 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:18:45 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:45.624 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:45 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:45.625 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[88537893-366a-4bd5-9d1b-0ef0a580c20b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:45.631 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:45 np0005548789.localdomain dnsmasq[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/addn_hosts - 2 addresses
Dec 06 10:18:45 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/host
Dec 06 10:18:45 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/opts
Dec 06 10:18:45 np0005548789.localdomain podman[324054]: 2025-12-06 10:18:45.656895456 +0000 UTC m=+0.122933830 container kill b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:45 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:18:45 np0005548789.localdomain podman[324084]: 2025-12-06 10:18:45.731038625 +0000 UTC m=+0.065567500 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:45 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:45 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:45 np0005548789.localdomain ceph-mon[298582]: pgmap v276: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 67 KiB/s rd, 3.9 KiB/s wr, 89 op/s
Dec 06 10:18:45 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2023953488' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:18:45 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:45.935 263652 INFO neutron.agent.dhcp.agent [None req-ce87d88a-68dd-48b6-823b-2e5c390614cb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:45 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:45.937 263652 INFO neutron.agent.dhcp.agent [None req-ce87d88a-68dd-48b6-823b-2e5c390614cb - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:46 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:46.017 263652 INFO neutron.agent.dhcp.agent [None req-33f79474-9ca9-4963-91c5-f8c6e4b145a6 - - - - - -] DHCP configuration for ports {'cf17c12b-b261-4ebe-bdee-84820cc74501'} is completed
Dec 06 10:18:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:46.203 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-d6081e04002876701154dea738aa910ba591c2cf9e99dc2fa391f57536d1f8d4-merged.mount: Deactivated successfully.
Dec 06 10:18:46 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-de7254f971a2477af23af1e54a0d0065537b2127e25260365f151fd088e6b93b-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:46 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:18:46 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:46.531 2 INFO neutron.agent.securitygroups_rpc [None req-28a20db1-a3bb-47de-ad39-5c494614c36d 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:18:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:18:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:18:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:18:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:18:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:18:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:18:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:18:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:18:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:18:46 np0005548789.localdomain dnsmasq[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/addn_hosts - 1 addresses
Dec 06 10:18:46 np0005548789.localdomain podman[324130]: 2025-12-06 10:18:46.825379148 +0000 UTC m=+0.067357734 container kill b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:46 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/host
Dec 06 10:18:46 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/opts
Dec 06 10:18:46 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:46.903 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:46Z, description=, device_id=29ab744a-d31c-4586-ae5e-41341709a166, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37faaa5e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37faaa640>], id=b23af29b-8104-471f-94ba-80e710a74404, ip_allocation=immediate, mac_address=fa:16:3e:24:1e:64, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1921, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:18:46Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:18:47 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:18:47 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:47 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:47 np0005548789.localdomain podman[324166]: 2025-12-06 10:18:47.163489564 +0000 UTC m=+0.076140622 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:47 np0005548789.localdomain systemd[1]: tmp-crun.PQejOh.mount: Deactivated successfully.
Dec 06 10:18:47 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e157 e157: 6 total, 6 up, 6 in
Dec 06 10:18:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:47.307 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:18:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:47.308 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:18:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:47.309 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:18:47 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:47.313 2 INFO neutron.agent.securitygroups_rpc [None req-e6ce568b-b382-4ede-9132-8960f2608a77 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:47 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:47.349 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:46Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9fe4c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa233a0>], id=14568f10-d8d5-4f60-9847-c06c6ffea82d, ip_allocation=immediate, mac_address=fa:16:3e:fc:46:f8, name=tempest-AllowedAddressPairIpV6TestJSON-1644300393, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:37Z, description=, dns_domain=, id=5d90c1d5-74b2-4b5c-9bf8-25a818641550, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1904552762, port_security_enabled=True, project_id=2b1d664fab0f4b7f87439c153244cdc1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=27994, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1858, status=ACTIVE, subnets=['cff46a29-a5f7-45a0-9023-17533db086b9'], tags=[], tenant_id=2b1d664fab0f4b7f87439c153244cdc1, updated_at=2025-12-06T10:18:38Z, vlan_transparent=None, network_id=5d90c1d5-74b2-4b5c-9bf8-25a818641550, port_security_enabled=True, project_id=2b1d664fab0f4b7f87439c153244cdc1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0'], standard_attr_id=1927, status=DOWN, tags=[], tenant_id=2b1d664fab0f4b7f87439c153244cdc1, updated_at=2025-12-06T10:18:46Z on network 5d90c1d5-74b2-4b5c-9bf8-25a818641550
Dec 06 10:18:47 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:47.499 263652 INFO neutron.agent.linux.ip_lib [None req-9de59c5c-4f9e-45fd-b701-32de4ffd1a2a - - - - - -] Device tapfca03880-79 cannot be used as it has no MAC address
Dec 06 10:18:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:47.537 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:47 np0005548789.localdomain kernel: device tapfca03880-79 entered promiscuous mode
Dec 06 10:18:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:47.545 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:47Z|00321|binding|INFO|Claiming lport fca03880-79ab-46f3-909b-a19baa4b2eea for this chassis.
Dec 06 10:18:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:47Z|00322|binding|INFO|fca03880-79ab-46f3-909b-a19baa4b2eea: Claiming unknown
Dec 06 10:18:47 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016327.5469] manager: (tapfca03880-79): new Generic device (/org/freedesktop/NetworkManager/Devices/54)
Dec 06 10:18:47 np0005548789.localdomain systemd-udevd[324224]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:47Z|00323|binding|INFO|Setting lport fca03880-79ab-46f3-909b-a19baa4b2eea ovn-installed in OVS
Dec 06 10:18:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:47.553 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:47Z|00324|binding|INFO|Setting lport fca03880-79ab-46f3-909b-a19baa4b2eea up in Southbound
Dec 06 10:18:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:47.556 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe35:2078/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=fca03880-79ab-46f3-909b-a19baa4b2eea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:47.558 160509 INFO neutron.agent.ovn.metadata.agent [-] Port fca03880-79ab-46f3-909b-a19baa4b2eea in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:18:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:47.564 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 05d71170-47d2-4bec-b410-aaa8f7634d28 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:47.567 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:47.565 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:47.568 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[173466d8-9b6a-4a7a-a511-e22ca27fc8de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:18:47 np0005548789.localdomain podman[324210]: 2025-12-06 10:18:47.580841122 +0000 UTC m=+0.076186312 container kill b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:18:47 np0005548789.localdomain dnsmasq[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/addn_hosts - 2 addresses
Dec 06 10:18:47 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/host
Dec 06 10:18:47 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/opts
Dec 06 10:18:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:47.589 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:47 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:47.595 263652 INFO neutron.agent.dhcp.agent [None req-aa15b631-6ac2-4c35-8ca0-3b26e595ee64 - - - - - -] DHCP configuration for ports {'b23af29b-8104-471f-94ba-80e710a74404'} is completed
Dec 06 10:18:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:47.637 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:47 np0005548789.localdomain podman[324228]: 2025-12-06 10:18:47.674996258 +0000 UTC m=+0.086574617 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:18:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:47.678 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:47 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:47.682 2 INFO neutron.agent.securitygroups_rpc [None req-7750bef0-0e9a-45d8-b031-72812daa7ba7 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:47 np0005548789.localdomain podman[324228]: 2025-12-06 10:18:47.714249378 +0000 UTC m=+0.125827697 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:18:47 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:18:47 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:47.954 263652 INFO neutron.agent.dhcp.agent [None req-6fc97f4b-a09e-4296-9bf1-3d8361758acd - - - - - -] DHCP configuration for ports {'14568f10-d8d5-4f60-9847-c06c6ffea82d'} is completed
Dec 06 10:18:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:47.995 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:48 np0005548789.localdomain ceph-mon[298582]: pgmap v277: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 58 KiB/s rd, 3.4 KiB/s wr, 77 op/s
Dec 06 10:18:48 np0005548789.localdomain ceph-mon[298582]: osdmap e157: 6 total, 6 up, 6 in
Dec 06 10:18:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:48 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:48.427 2 INFO neutron.agent.securitygroups_rpc [None req-ca0f0e3b-d5e2-4383-9ed8-27fd62359bae 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:48 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:48.515 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:47Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa232b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa237f0>], id=f99905b4-4fb8-4a99-9e15-56d731748b6a, ip_allocation=immediate, mac_address=fa:16:3e:99:fd:f6, name=tempest-AllowedAddressPairIpV6TestJSON-1427594491, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:37Z, description=, dns_domain=, id=5d90c1d5-74b2-4b5c-9bf8-25a818641550, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-1904552762, port_security_enabled=True, project_id=2b1d664fab0f4b7f87439c153244cdc1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=27994, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1858, status=ACTIVE, subnets=['cff46a29-a5f7-45a0-9023-17533db086b9'], tags=[], tenant_id=2b1d664fab0f4b7f87439c153244cdc1, updated_at=2025-12-06T10:18:38Z, vlan_transparent=None, network_id=5d90c1d5-74b2-4b5c-9bf8-25a818641550, port_security_enabled=True, project_id=2b1d664fab0f4b7f87439c153244cdc1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0'], standard_attr_id=1930, status=DOWN, tags=[], tenant_id=2b1d664fab0f4b7f87439c153244cdc1, updated_at=2025-12-06T10:18:47Z on network 5d90c1d5-74b2-4b5c-9bf8-25a818641550
Dec 06 10:18:48 np0005548789.localdomain podman[324311]: 2025-12-06 10:18:48.611901116 +0000 UTC m=+0.097960882 container create 10ef478f22f60a15bbd047555d12625c31619ccf5f16a21eb30af56364adeca9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:18:48 np0005548789.localdomain systemd[1]: Started libpod-conmon-10ef478f22f60a15bbd047555d12625c31619ccf5f16a21eb30af56364adeca9.scope.
Dec 06 10:18:48 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:48 np0005548789.localdomain podman[324311]: 2025-12-06 10:18:48.565946992 +0000 UTC m=+0.052006798 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:48 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f37c9dc86603901ba2225ec8af68b9f38ff4f01b21be5bbf89c6cec69efffdc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:48 np0005548789.localdomain podman[324311]: 2025-12-06 10:18:48.680581439 +0000 UTC m=+0.166641215 container init 10ef478f22f60a15bbd047555d12625c31619ccf5f16a21eb30af56364adeca9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:48 np0005548789.localdomain podman[324311]: 2025-12-06 10:18:48.689542961 +0000 UTC m=+0.175602737 container start 10ef478f22f60a15bbd047555d12625c31619ccf5f16a21eb30af56364adeca9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:18:48 np0005548789.localdomain dnsmasq[324353]: started, version 2.85 cachesize 150
Dec 06 10:18:48 np0005548789.localdomain dnsmasq[324353]: DNS service limited to local subnets
Dec 06 10:18:48 np0005548789.localdomain dnsmasq[324353]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:48 np0005548789.localdomain dnsmasq[324353]: warning: no upstream servers configured
Dec 06 10:18:48 np0005548789.localdomain dnsmasq-dhcp[324353]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:18:48 np0005548789.localdomain dnsmasq[324353]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:48 np0005548789.localdomain dnsmasq-dhcp[324353]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:48 np0005548789.localdomain dnsmasq-dhcp[324353]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:48 np0005548789.localdomain dnsmasq[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/addn_hosts - 3 addresses
Dec 06 10:18:48 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/host
Dec 06 10:18:48 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/opts
Dec 06 10:18:48 np0005548789.localdomain podman[324346]: 2025-12-06 10:18:48.750004005 +0000 UTC m=+0.073930944 container kill b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:18:48 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:48.755 263652 INFO neutron.agent.dhcp.agent [None req-9de59c5c-4f9e-45fd-b701-32de4ffd1a2a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:47Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb42a00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb42f10>], id=69f92907-09fa-434d-8c15-91951719363e, ip_allocation=immediate, mac_address=fa:16:3e:be:7d:f4, name=tempest-NetworksTestDHCPv6-554797201, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=44, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['18b1569e-7ab6-4e66-a6b6-7c0f09d7e1be'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:18:45Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=1929, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:18:47Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:18:48 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:48.810 2 INFO neutron.agent.securitygroups_rpc [None req-18dbc826-4a96-46e3-9a3f-9599a1372c95 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:48 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:48.816 263652 INFO neutron.agent.dhcp.agent [None req-f417bb3a-9f20-4ad4-9c9e-06b88ce4ba31 - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:18:48 np0005548789.localdomain dnsmasq[324353]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 1 addresses
Dec 06 10:18:48 np0005548789.localdomain dnsmasq-dhcp[324353]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:48 np0005548789.localdomain dnsmasq-dhcp[324353]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:48 np0005548789.localdomain podman[324385]: 2025-12-06 10:18:48.983448235 +0000 UTC m=+0.068473947 container kill 10ef478f22f60a15bbd047555d12625c31619ccf5f16a21eb30af56364adeca9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:18:49 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:49.075 263652 INFO neutron.agent.dhcp.agent [None req-058db5ee-ef5a-4464-87e3-4f6af86c6840 - - - - - -] DHCP configuration for ports {'f99905b4-4fb8-4a99-9e15-56d731748b6a'} is completed
Dec 06 10:18:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:49.137 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:49.139 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:49 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:18:49 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:49.296 263652 INFO neutron.agent.dhcp.agent [None req-4cfc3920-e4cf-4f6b-a719-8d75342b3d22 - - - - - -] DHCP configuration for ports {'69f92907-09fa-434d-8c15-91951719363e'} is completed
Dec 06 10:18:49 np0005548789.localdomain dnsmasq[324353]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:49 np0005548789.localdomain podman[324424]: 2025-12-06 10:18:49.407846999 +0000 UTC m=+0.064520048 container kill 10ef478f22f60a15bbd047555d12625c31619ccf5f16a21eb30af56364adeca9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:18:49 np0005548789.localdomain dnsmasq-dhcp[324353]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:49 np0005548789.localdomain dnsmasq-dhcp[324353]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:49 np0005548789.localdomain dnsmasq[324353]: exiting on receipt of SIGTERM
Dec 06 10:18:49 np0005548789.localdomain podman[324460]: 2025-12-06 10:18:49.887621191 +0000 UTC m=+0.064000793 container kill 10ef478f22f60a15bbd047555d12625c31619ccf5f16a21eb30af56364adeca9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:18:49 np0005548789.localdomain systemd[1]: libpod-10ef478f22f60a15bbd047555d12625c31619ccf5f16a21eb30af56364adeca9.scope: Deactivated successfully.
Dec 06 10:18:49 np0005548789.localdomain podman[324474]: 2025-12-06 10:18:49.956669495 +0000 UTC m=+0.056344771 container died 10ef478f22f60a15bbd047555d12625c31619ccf5f16a21eb30af56364adeca9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:18:50 np0005548789.localdomain podman[324474]: 2025-12-06 10:18:50.045287263 +0000 UTC m=+0.144962499 container cleanup 10ef478f22f60a15bbd047555d12625c31619ccf5f16a21eb30af56364adeca9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:18:50 np0005548789.localdomain systemd[1]: libpod-conmon-10ef478f22f60a15bbd047555d12625c31619ccf5f16a21eb30af56364adeca9.scope: Deactivated successfully.
Dec 06 10:18:50 np0005548789.localdomain podman[324481]: 2025-12-06 10:18:50.070375514 +0000 UTC m=+0.156774997 container remove 10ef478f22f60a15bbd047555d12625c31619ccf5f16a21eb30af56364adeca9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:18:50 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:50Z|00325|binding|INFO|Releasing lport fca03880-79ab-46f3-909b-a19baa4b2eea from this chassis (sb_readonly=0)
Dec 06 10:18:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:50.083 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:50 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:50Z|00326|binding|INFO|Setting lport fca03880-79ab-46f3-909b-a19baa4b2eea down in Southbound
Dec 06 10:18:50 np0005548789.localdomain kernel: device tapfca03880-79 left promiscuous mode
Dec 06 10:18:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:50.093 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=fca03880-79ab-46f3-909b-a19baa4b2eea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:50.095 160509 INFO neutron.agent.ovn.metadata.agent [-] Port fca03880-79ab-46f3-909b-a19baa4b2eea in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:18:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:50.098 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:50 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:50.099 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[53d3ed5f-5ebb-461c-94fc-5cd17ea26ba4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:50.108 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:50 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9cfe70fc-1bee-4bef-bb9d-910ccc1c896e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:18:50 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9cfe70fc-1bee-4bef-bb9d-910ccc1c896e", "format": "json"}]: dispatch
Dec 06 10:18:50 np0005548789.localdomain ceph-mon[298582]: pgmap v279: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 64 KiB/s rd, 4.2 KiB/s wr, 86 op/s
Dec 06 10:18:50 np0005548789.localdomain ceph-mon[298582]: mgrmap e47: np0005548790.kvkfyr(active, since 7m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:18:50 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/352148416' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:18:50 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/352148416' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:18:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:50.334 263652 INFO neutron.agent.dhcp.agent [None req-34d19e8a-49b1-4bde-bfa5-07750f222db2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-0f37c9dc86603901ba2225ec8af68b9f38ff4f01b21be5bbf89c6cec69efffdc-merged.mount: Deactivated successfully.
Dec 06 10:18:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10ef478f22f60a15bbd047555d12625c31619ccf5f16a21eb30af56364adeca9-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:50 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:18:50 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:50.837 2 INFO neutron.agent.securitygroups_rpc [None req-a73968c5-9b01-4d0f-920e-d4625a021612 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:51 np0005548789.localdomain dnsmasq[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/addn_hosts - 2 addresses
Dec 06 10:18:51 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/host
Dec 06 10:18:51 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/opts
Dec 06 10:18:51 np0005548789.localdomain podman[324518]: 2025-12-06 10:18:51.109160631 +0000 UTC m=+0.062823836 container kill b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:18:51 np0005548789.localdomain podman[324532]: 2025-12-06 10:18:51.219047464 +0000 UTC m=+0.081058800 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:51 np0005548789.localdomain podman[324532]: 2025-12-06 10:18:51.262631396 +0000 UTC m=+0.124642742 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec 06 10:18:51 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:18:51 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:51.361 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:50Z, description=, device_id=0c9cf27e-a1eb-4ff5-a93a-5a27f1ed1aa2, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9d9820>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9d9430>], id=1f968fe0-9266-48e7-b2ab-afd298c2d69e, ip_allocation=immediate, mac_address=fa:16:3e:50:84:2a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1938, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:18:50Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:18:51 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:51.392 2 INFO neutron.agent.securitygroups_rpc [None req-3e522469-6174-4bd8-8fa4-46c3281a8670 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:51 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:51.422 2 INFO neutron.agent.securitygroups_rpc [None req-c981ff7c-a5c4-4968-95e1-73b35c2abc32 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:51 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:18:51 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:51 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:51 np0005548789.localdomain podman[324596]: 2025-12-06 10:18:51.620883953 +0000 UTC m=+0.069917672 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:51 np0005548789.localdomain podman[324608]: 2025-12-06 10:18:51.657408341 +0000 UTC m=+0.065527309 container kill b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:18:51 np0005548789.localdomain dnsmasq[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/addn_hosts - 1 addresses
Dec 06 10:18:51 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/host
Dec 06 10:18:51 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/opts
Dec 06 10:18:51 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:51.784 2 INFO neutron.agent.securitygroups_rpc [None req-57ba5b5d-0488-433d-83ba-25f85616d546 9b1422e7ba894ba7b8e14df8e50e50d0 2b1d664fab0f4b7f87439c153244cdc1 - - default default] Security group member updated ['ec3ea7b0-8bb2-49ca-ac1e-7fc51bb6b4e0']
Dec 06 10:18:51 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:51.864 263652 INFO neutron.agent.linux.ip_lib [None req-92ab4545-1724-4aa7-8600-d5c37e6ae29d - - - - - -] Device tap7236378f-2c cannot be used as it has no MAC address
Dec 06 10:18:51 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:51.875 263652 INFO neutron.agent.dhcp.agent [None req-2b36f965-b4f9-4d9e-98b6-b6817e6edcbb - - - - - -] DHCP configuration for ports {'1f968fe0-9266-48e7-b2ab-afd298c2d69e'} is completed
Dec 06 10:18:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:51.894 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:51 np0005548789.localdomain kernel: device tap7236378f-2c entered promiscuous mode
Dec 06 10:18:51 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016331.9039] manager: (tap7236378f-2c): new Generic device (/org/freedesktop/NetworkManager/Devices/55)
Dec 06 10:18:51 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:51Z|00327|binding|INFO|Claiming lport 7236378f-2c6f-4e9d-a6e5-4e6b08ae62be for this chassis.
Dec 06 10:18:51 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:51Z|00328|binding|INFO|7236378f-2c6f-4e9d-a6e5-4e6b08ae62be: Claiming unknown
Dec 06 10:18:51 np0005548789.localdomain systemd-udevd[324649]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:51.909 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:51.917 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=7236378f-2c6f-4e9d-a6e5-4e6b08ae62be) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:51 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:51Z|00329|binding|INFO|Setting lport 7236378f-2c6f-4e9d-a6e5-4e6b08ae62be up in Southbound
Dec 06 10:18:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:51.919 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 7236378f-2c6f-4e9d-a6e5-4e6b08ae62be in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:18:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:51.921 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port cf9229a3-adf0-4ec8-9186-9b1891e2a252 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:51.921 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:51 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:51Z|00330|binding|INFO|Setting lport 7236378f-2c6f-4e9d-a6e5-4e6b08ae62be ovn-installed in OVS
Dec 06 10:18:51 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:51.922 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[bc8045b4-3528-4b8e-b960-48532f4ea7d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:51.922 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:51.925 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:51.943 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:51.992 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:52.024 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:52 np0005548789.localdomain podman[324676]: 2025-12-06 10:18:52.094792127 +0000 UTC m=+0.072690516 container kill b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:18:52 np0005548789.localdomain dnsmasq[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/addn_hosts - 0 addresses
Dec 06 10:18:52 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/host
Dec 06 10:18:52 np0005548789.localdomain dnsmasq-dhcp[323601]: read /var/lib/neutron/dhcp/5d90c1d5-74b2-4b5c-9bf8-25a818641550/opts
Dec 06 10:18:52 np0005548789.localdomain ceph-mon[298582]: pgmap v280: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 3.6 KiB/s wr, 73 op/s
Dec 06 10:18:52 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:52.374 2 INFO neutron.agent.securitygroups_rpc [None req-db878ba9-86fb-4d9b-8254-a77b4f9b264f a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:52 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:52Z|00331|binding|INFO|Removing iface tapdfa03f50-39 ovn-installed in OVS
Dec 06 10:18:52 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:52Z|00332|binding|INFO|Removing lport dfa03f50-3905-4292-9cae-c03579192e4f ovn-installed in OVS
Dec 06 10:18:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:52.432 160509 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 58ad6037-7d9f-4f1f-85d0-c2c8d7ef9544 with type ""
Dec 06 10:18:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:52.434 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:52.434 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-5d90c1d5-74b2-4b5c-9bf8-25a818641550', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d90c1d5-74b2-4b5c-9bf8-25a818641550', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2b1d664fab0f4b7f87439c153244cdc1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=554a12c4-a3a9-4583-a7ca-9f004018b224, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=dfa03f50-3905-4292-9cae-c03579192e4f) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:52.437 160509 INFO neutron.agent.ovn.metadata.agent [-] Port dfa03f50-3905-4292-9cae-c03579192e4f in datapath 5d90c1d5-74b2-4b5c-9bf8-25a818641550 unbound from our chassis
Dec 06 10:18:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:52.438 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5d90c1d5-74b2-4b5c-9bf8-25a818641550 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:18:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:52.439 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[dc3cc018-e45e-4520-980e-60ebcbfadceb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:52 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:52.443 2 INFO neutron.agent.securitygroups_rpc [None req-d89000d1-9304-4659-90ea-3fec18561423 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:52.443 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:52 np0005548789.localdomain dnsmasq[323601]: exiting on receipt of SIGTERM
Dec 06 10:18:52 np0005548789.localdomain podman[324732]: 2025-12-06 10:18:52.565984299 +0000 UTC m=+0.068570811 container kill b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:52 np0005548789.localdomain systemd[1]: libpod-b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0.scope: Deactivated successfully.
Dec 06 10:18:52 np0005548789.localdomain podman[324749]: 2025-12-06 10:18:52.643405318 +0000 UTC m=+0.064175058 container died b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:52 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e45d66b660b9cd6a8b001d4fbf1754f68fedf39a7ac1ddac57f6571d258512a4-merged.mount: Deactivated successfully.
Dec 06 10:18:52 np0005548789.localdomain podman[324749]: 2025-12-06 10:18:52.686504185 +0000 UTC m=+0.107273885 container cleanup b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:52 np0005548789.localdomain systemd[1]: libpod-conmon-b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0.scope: Deactivated successfully.
Dec 06 10:18:52 np0005548789.localdomain podman[324751]: 2025-12-06 10:18:52.72360729 +0000 UTC m=+0.135786420 container remove b5a7ec053a2387ad1aeced549328179848c9c55d34e92e143426da0a9e07bdb0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d90c1d5-74b2-4b5c-9bf8-25a818641550, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:18:52 np0005548789.localdomain kernel: device tapdfa03f50-39 left promiscuous mode
Dec 06 10:18:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:52.777 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:52.791 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:52 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:52Z|00333|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:18:52 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:52.812 263652 INFO neutron.agent.dhcp.agent [None req-ab2b6543-bbea-467f-9a8d-936f28379033 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:52 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:52.813 263652 INFO neutron.agent.dhcp.agent [None req-ab2b6543-bbea-467f-9a8d-936f28379033 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:52.841 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:52 np0005548789.localdomain podman[324797]: 
Dec 06 10:18:52 np0005548789.localdomain podman[324797]: 2025-12-06 10:18:52.943962094 +0000 UTC m=+0.088049101 container create f41ba5599c7eac6f7717b71ccd15b6985b89c6a43acec1218873c0dcb2997997 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:18:52 np0005548789.localdomain systemd[1]: Started libpod-conmon-f41ba5599c7eac6f7717b71ccd15b6985b89c6a43acec1218873c0dcb2997997.scope.
Dec 06 10:18:52 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:52.998 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:53 np0005548789.localdomain podman[324797]: 2025-12-06 10:18:52.900008451 +0000 UTC m=+0.044095508 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:53 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4343818f1678526de5c96f3f64894c1a06c47ad9faa4d29e817addbb8dbc3a52/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:53 np0005548789.localdomain podman[324797]: 2025-12-06 10:18:53.013895126 +0000 UTC m=+0.157982133 container init f41ba5599c7eac6f7717b71ccd15b6985b89c6a43acec1218873c0dcb2997997 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 06 10:18:53 np0005548789.localdomain podman[324797]: 2025-12-06 10:18:53.022254899 +0000 UTC m=+0.166341896 container start f41ba5599c7eac6f7717b71ccd15b6985b89c6a43acec1218873c0dcb2997997 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:18:53 np0005548789.localdomain dnsmasq[324815]: started, version 2.85 cachesize 150
Dec 06 10:18:53 np0005548789.localdomain dnsmasq[324815]: DNS service limited to local subnets
Dec 06 10:18:53 np0005548789.localdomain dnsmasq[324815]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:53 np0005548789.localdomain dnsmasq[324815]: warning: no upstream servers configured
Dec 06 10:18:53 np0005548789.localdomain dnsmasq-dhcp[324815]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:18:53 np0005548789.localdomain dnsmasq[324815]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:53 np0005548789.localdomain dnsmasq-dhcp[324815]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:53 np0005548789.localdomain dnsmasq-dhcp[324815]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:53 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:53.062 263652 INFO neutron.agent.dhcp.agent [None req-92ab4545-1724-4aa7-8600-d5c37e6ae29d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:51Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa25b50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa252b0>], id=1d094a5c-894f-4d7e-b6c1-2cb8c545fd07, ip_allocation=immediate, mac_address=fa:16:3e:c1:fc:7e, name=tempest-NetworksTestDHCPv6-264061737, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=46, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['b7683417-f5ad-4eab-8e1d-6d4e9c90ba29'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:18:50Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=1956, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:18:51Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:18:53 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:53.137 263652 INFO neutron.agent.dhcp.agent [None req-1c7dc463-69a7-4193-80ee-d2386385cfc3 - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:18:53 np0005548789.localdomain dnsmasq[324815]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 1 addresses
Dec 06 10:18:53 np0005548789.localdomain dnsmasq-dhcp[324815]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:53 np0005548789.localdomain dnsmasq-dhcp[324815]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:53 np0005548789.localdomain podman[324834]: 2025-12-06 10:18:53.273965154 +0000 UTC m=+0.066110067 container kill f41ba5599c7eac6f7717b71ccd15b6985b89c6a43acec1218873c0dcb2997997 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:53 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:53.561 263652 INFO neutron.agent.dhcp.agent [None req-39997d76-9c32-4ac3-814f-c2fb3c1592b0 - - - - - -] DHCP configuration for ports {'1d094a5c-894f-4d7e-b6c1-2cb8c545fd07'} is completed
Dec 06 10:18:53 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:53.632 2 INFO neutron.agent.securitygroups_rpc [None req-7967e84a-5cad-44b5-8ea9-0783854ccdc0 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:53 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d5d90c1d5\x2d74b2\x2d4b5c\x2d9bf8\x2d25a818641550.mount: Deactivated successfully.
Dec 06 10:18:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:18:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:18:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:18:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157921 "" "Go-http-client/1.1"
Dec 06 10:18:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:53.943 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:18:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19736 "" "Go-http-client/1.1"
Dec 06 10:18:54 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:54.133 2 INFO neutron.agent.securitygroups_rpc [None req-47656a87-41ce-4f30-afbc-1720c247f9e1 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:54.141 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:54 np0005548789.localdomain ceph-mon[298582]: pgmap v281: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 4.1 KiB/s wr, 23 op/s
Dec 06 10:18:54 np0005548789.localdomain dnsmasq[324815]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:54 np0005548789.localdomain dnsmasq-dhcp[324815]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:54 np0005548789.localdomain dnsmasq-dhcp[324815]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:18:54 np0005548789.localdomain podman[324871]: 2025-12-06 10:18:54.369666979 +0000 UTC m=+0.060649271 container kill f41ba5599c7eac6f7717b71ccd15b6985b89c6a43acec1218873c0dcb2997997 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:54 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:54.902 2 INFO neutron.agent.securitygroups_rpc [None req-fc107b59-6198-4194-ae82-133aadd3bf55 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:55 np0005548789.localdomain dnsmasq[324815]: exiting on receipt of SIGTERM
Dec 06 10:18:55 np0005548789.localdomain podman[324906]: 2025-12-06 10:18:55.171067825 +0000 UTC m=+0.061217618 container kill f41ba5599c7eac6f7717b71ccd15b6985b89c6a43acec1218873c0dcb2997997 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:18:55 np0005548789.localdomain systemd[1]: libpod-f41ba5599c7eac6f7717b71ccd15b6985b89c6a43acec1218873c0dcb2997997.scope: Deactivated successfully.
Dec 06 10:18:55 np0005548789.localdomain podman[324919]: 2025-12-06 10:18:55.211288505 +0000 UTC m=+0.032905678 container died f41ba5599c7eac6f7717b71ccd15b6985b89c6a43acec1218873c0dcb2997997 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:18:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f41ba5599c7eac6f7717b71ccd15b6985b89c6a43acec1218873c0dcb2997997-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:55 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4343818f1678526de5c96f3f64894c1a06c47ad9faa4d29e817addbb8dbc3a52-merged.mount: Deactivated successfully.
Dec 06 10:18:55 np0005548789.localdomain podman[324919]: 2025-12-06 10:18:55.242367628 +0000 UTC m=+0.063984751 container cleanup f41ba5599c7eac6f7717b71ccd15b6985b89c6a43acec1218873c0dcb2997997 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:18:55 np0005548789.localdomain systemd[1]: libpod-conmon-f41ba5599c7eac6f7717b71ccd15b6985b89c6a43acec1218873c0dcb2997997.scope: Deactivated successfully.
Dec 06 10:18:55 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9cfe70fc-1bee-4bef-bb9d-910ccc1c896e", "format": "json"}]: dispatch
Dec 06 10:18:55 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9cfe70fc-1bee-4bef-bb9d-910ccc1c896e", "force": true, "format": "json"}]: dispatch
Dec 06 10:18:55 np0005548789.localdomain podman[324926]: 2025-12-06 10:18:55.336076531 +0000 UTC m=+0.143041350 container remove f41ba5599c7eac6f7717b71ccd15b6985b89c6a43acec1218873c0dcb2997997 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:18:55 np0005548789.localdomain kernel: device tap7236378f-2c left promiscuous mode
Dec 06 10:18:55 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:55Z|00334|binding|INFO|Releasing lport 7236378f-2c6f-4e9d-a6e5-4e6b08ae62be from this chassis (sb_readonly=0)
Dec 06 10:18:55 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:55Z|00335|binding|INFO|Setting lport 7236378f-2c6f-4e9d-a6e5-4e6b08ae62be down in Southbound
Dec 06 10:18:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:55.351 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:55 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:55.353 2 INFO neutron.agent.securitygroups_rpc [None req-6d09d5a9-00b8-48ae-971d-442c4fe5ddd4 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:55.371 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:55.371 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=7236378f-2c6f-4e9d-a6e5-4e6b08ae62be) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:55.374 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 7236378f-2c6f-4e9d-a6e5-4e6b08ae62be in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:18:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:55.377 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:55.378 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[3d9ac1eb-4f31-41db-80eb-21ee01e4a4a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:55.616 263652 INFO neutron.agent.dhcp.agent [None req-f072d2b9-6577-4a38-86e7-fa140192b932 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:18:55 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:18:55 np0005548789.localdomain podman[324966]: 2025-12-06 10:18:55.629902543 +0000 UTC m=+0.066527529 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:55 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:55 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:55 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:55Z|00336|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:18:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:55.688 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:56.061 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:56.083 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Triggering sync for uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 06 10:18:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:56.085 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:18:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:56.085 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:18:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:56.113 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.028s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:18:56 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:56.151 2 INFO neutron.agent.securitygroups_rpc [None req-49caf4e8-1cf3-4ac4-8ed0-7c14787e9a49 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:18:56 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:18:56 np0005548789.localdomain ceph-mon[298582]: pgmap v282: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 4.1 KiB/s wr, 23 op/s
Dec 06 10:18:56 np0005548789.localdomain ceph-mon[298582]: mgrmap e48: np0005548790.kvkfyr(active, since 7m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:18:56 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:18:56 np0005548789.localdomain systemd[1]: tmp-crun.jcIxUK.mount: Deactivated successfully.
Dec 06 10:18:56 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:56 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:56 np0005548789.localdomain podman[325004]: 2025-12-06 10:18:56.523892268 +0000 UTC m=+0.052562855 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:56Z|00337|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:18:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:56.592 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:56 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:56.708 263652 INFO neutron.agent.linux.ip_lib [None req-3076b480-6a7a-4055-8fb1-ad825cedf939 - - - - - -] Device tapbb0ffdc5-89 cannot be used as it has no MAC address
Dec 06 10:18:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:56.733 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:56 np0005548789.localdomain kernel: device tapbb0ffdc5-89 entered promiscuous mode
Dec 06 10:18:56 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016336.7413] manager: (tapbb0ffdc5-89): new Generic device (/org/freedesktop/NetworkManager/Devices/56)
Dec 06 10:18:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:56Z|00338|binding|INFO|Claiming lport bb0ffdc5-896f-4862-9530-52cf0e1e0cda for this chassis.
Dec 06 10:18:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:56Z|00339|binding|INFO|bb0ffdc5-896f-4862-9530-52cf0e1e0cda: Claiming unknown
Dec 06 10:18:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:56.742 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:56 np0005548789.localdomain systemd-udevd[325037]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:18:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:56.762 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe61:486b/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=bb0ffdc5-896f-4862-9530-52cf0e1e0cda) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:56.766 160509 INFO neutron.agent.ovn.metadata.agent [-] Port bb0ffdc5-896f-4862-9530-52cf0e1e0cda in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:18:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:56.769 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 5c72d867-377b-401f-8969-dccc4aa84140 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:56.769 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:56 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:56.770 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[658e5e46-fc08-4e1e-b915-1910a57b4232]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:56Z|00340|binding|INFO|Setting lport bb0ffdc5-896f-4862-9530-52cf0e1e0cda ovn-installed in OVS
Dec 06 10:18:56 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:56Z|00341|binding|INFO|Setting lport bb0ffdc5-896f-4862-9530-52cf0e1e0cda up in Southbound
Dec 06 10:18:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:56.799 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:56.838 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:56.869 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:57.318 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:57Z, description=, device_id=01a5b100-c4e3-4e32-bdad-09ba7b504295, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9e88e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9e8ee0>], id=5e599779-403a-41e1-af63-c0b813d18ceb, ip_allocation=immediate, mac_address=fa:16:3e:e9:c7:0c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1979, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:18:57Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:18:57 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:57.373 2 INFO neutron.agent.securitygroups_rpc [None req-371fcf7f-6858-4f55-8627-7d44578a7f6c a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:57 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:18:57 np0005548789.localdomain podman[325087]: 2025-12-06 10:18:57.519136445 +0000 UTC m=+0.046855452 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:57 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:18:57 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:18:57 np0005548789.localdomain sudo[325136]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:18:57 np0005548789.localdomain sudo[325136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:18:57 np0005548789.localdomain sudo[325136]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:57 np0005548789.localdomain podman[325126]: 
Dec 06 10:18:57 np0005548789.localdomain podman[325126]: 2025-12-06 10:18:57.711938903 +0000 UTC m=+0.097102376 container create 859e6766fde586991ea9a9ea8b31b56366fcfc45fa8fbfd4233ca17be599de96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:18:57 np0005548789.localdomain systemd[1]: Started libpod-conmon-859e6766fde586991ea9a9ea8b31b56366fcfc45fa8fbfd4233ca17be599de96.scope.
Dec 06 10:18:57 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:57 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb4226fe08368ea2d11dcba02b5011cfc19a25160fc2fe36ef2d9719be1bb1a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:57 np0005548789.localdomain podman[325126]: 2025-12-06 10:18:57.664452363 +0000 UTC m=+0.049615886 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:57 np0005548789.localdomain sudo[325161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:18:57 np0005548789.localdomain podman[325126]: 2025-12-06 10:18:57.771420537 +0000 UTC m=+0.156584020 container init 859e6766fde586991ea9a9ea8b31b56366fcfc45fa8fbfd4233ca17be599de96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 06 10:18:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:57.773 263652 INFO neutron.agent.dhcp.agent [None req-e13c65ba-d601-48b5-984e-bae0d4ee9d75 - - - - - -] DHCP configuration for ports {'5e599779-403a-41e1-af63-c0b813d18ceb'} is completed
Dec 06 10:18:57 np0005548789.localdomain sudo[325161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:18:57 np0005548789.localdomain podman[325126]: 2025-12-06 10:18:57.781742871 +0000 UTC m=+0.166906344 container start 859e6766fde586991ea9a9ea8b31b56366fcfc45fa8fbfd4233ca17be599de96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 06 10:18:57 np0005548789.localdomain dnsmasq[325184]: started, version 2.85 cachesize 150
Dec 06 10:18:57 np0005548789.localdomain dnsmasq[325184]: DNS service limited to local subnets
Dec 06 10:18:57 np0005548789.localdomain dnsmasq[325184]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:57 np0005548789.localdomain dnsmasq[325184]: warning: no upstream servers configured
Dec 06 10:18:57 np0005548789.localdomain dnsmasq[325184]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:18:58 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:58.031 263652 INFO neutron.agent.dhcp.agent [None req-48da9735-826a-4eba-baeb-4b3cfee13085 - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:18:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:58.031 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:58 np0005548789.localdomain dnsmasq[325184]: exiting on receipt of SIGTERM
Dec 06 10:18:58 np0005548789.localdomain podman[325217]: 2025-12-06 10:18:58.205226736 +0000 UTC m=+0.043916294 container kill 859e6766fde586991ea9a9ea8b31b56366fcfc45fa8fbfd4233ca17be599de96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:18:58 np0005548789.localdomain systemd[1]: libpod-859e6766fde586991ea9a9ea8b31b56366fcfc45fa8fbfd4233ca17be599de96.scope: Deactivated successfully.
Dec 06 10:18:58 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:58.223 2 INFO neutron.agent.securitygroups_rpc [None req-dcd65c97-9e9b-434b-8250-c459c3b8a42b a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:18:58 np0005548789.localdomain podman[325236]: 2025-12-06 10:18:58.284151449 +0000 UTC m=+0.058635019 container died 859e6766fde586991ea9a9ea8b31b56366fcfc45fa8fbfd4233ca17be599de96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:18:58 np0005548789.localdomain podman[325236]: 2025-12-06 10:18:58.338619522 +0000 UTC m=+0.113103062 container remove 859e6766fde586991ea9a9ea8b31b56366fcfc45fa8fbfd4233ca17be599de96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:58 np0005548789.localdomain systemd[1]: libpod-conmon-859e6766fde586991ea9a9ea8b31b56366fcfc45fa8fbfd4233ca17be599de96.scope: Deactivated successfully.
Dec 06 10:18:58 np0005548789.localdomain ceph-mon[298582]: pgmap v283: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 4.1 KiB/s wr, 23 op/s
Dec 06 10:18:58 np0005548789.localdomain sudo[325161]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:58 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-eb4226fe08368ea2d11dcba02b5011cfc19a25160fc2fe36ef2d9719be1bb1a8-merged.mount: Deactivated successfully.
Dec 06 10:18:58 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-859e6766fde586991ea9a9ea8b31b56366fcfc45fa8fbfd4233ca17be599de96-userdata-shm.mount: Deactivated successfully.
Dec 06 10:18:58 np0005548789.localdomain sudo[325277]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:18:58 np0005548789.localdomain sudo[325277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:18:58 np0005548789.localdomain sudo[325277]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:59.181 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:18:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:18:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:18:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:18:59 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:18:59.696 263652 INFO neutron.agent.linux.ip_lib [None req-1209a318-25c3-4834-870a-ee0da04dc988 - - - - - -] Device tapf8d80242-07 cannot be used as it has no MAC address
Dec 06 10:18:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:59.726 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:59 np0005548789.localdomain kernel: device tapf8d80242-07 entered promiscuous mode
Dec 06 10:18:59 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016339.7343] manager: (tapf8d80242-07): new Generic device (/org/freedesktop/NetworkManager/Devices/57)
Dec 06 10:18:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:59.738 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:59 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:18:59.741 2 INFO neutron.agent.securitygroups_rpc [None req-39208e2c-f7c0-489d-b264-ceaa95c43793 1333c58cfc75447fad1b488a958549ce a269d8afc49848fbb8ce5cdb49ef37dc - - default default] Security group member updated ['a71b7ce5-d152-4b06-83b8-76d380ec29b6']
Dec 06 10:18:59 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:59Z|00342|binding|INFO|Claiming lport f8d80242-07b6-493f-8d5a-67d3b413e7c2 for this chassis.
Dec 06 10:18:59 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:59Z|00343|binding|INFO|f8d80242-07b6-493f-8d5a-67d3b413e7c2: Claiming unknown
Dec 06 10:18:59 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:59.762 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-b84b6f67-f6c6-431b-82dc-4d4f6b20b084', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b84b6f67-f6c6-431b-82dc-4d4f6b20b084', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a269d8afc49848fbb8ce5cdb49ef37dc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b3e9ece-98ce-425e-b1a7-ae8b3622954c, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=f8d80242-07b6-493f-8d5a-67d3b413e7c2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:18:59 np0005548789.localdomain podman[325351]: 
Dec 06 10:18:59 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:59.764 160509 INFO neutron.agent.ovn.metadata.agent [-] Port f8d80242-07b6-493f-8d5a-67d3b413e7c2 in datapath b84b6f67-f6c6-431b-82dc-4d4f6b20b084 bound to our chassis
Dec 06 10:18:59 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:59.766 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 1da81fc3-b0d7-4158-9ff3-f6eb5f1de696 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:18:59 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:59.766 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b84b6f67-f6c6-431b-82dc-4d4f6b20b084, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:18:59 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:18:59.767 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[763121e0-d652-4e64-8219-a8f2e31b16be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:18:59 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf8d80242-07: No such device
Dec 06 10:18:59 np0005548789.localdomain podman[325351]: 2025-12-06 10:18:59.77532294 +0000 UTC m=+0.104329076 container create 1356753d4b307eabcd1c93210192bf33b7751c5598848d4ac324142d8d20a976 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:59 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf8d80242-07: No such device
Dec 06 10:18:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:59.778 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:59 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:59Z|00344|binding|INFO|Setting lport f8d80242-07b6-493f-8d5a-67d3b413e7c2 ovn-installed in OVS
Dec 06 10:18:59 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:18:59Z|00345|binding|INFO|Setting lport f8d80242-07b6-493f-8d5a-67d3b413e7c2 up in Southbound
Dec 06 10:18:59 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf8d80242-07: No such device
Dec 06 10:18:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:59.791 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:59 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf8d80242-07: No such device
Dec 06 10:18:59 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf8d80242-07: No such device
Dec 06 10:18:59 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf8d80242-07: No such device
Dec 06 10:18:59 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf8d80242-07: No such device
Dec 06 10:18:59 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf8d80242-07: No such device
Dec 06 10:18:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:59.829 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:59 np0005548789.localdomain systemd[1]: Started libpod-conmon-1356753d4b307eabcd1c93210192bf33b7751c5598848d4ac324142d8d20a976.scope.
Dec 06 10:18:59 np0005548789.localdomain podman[325351]: 2025-12-06 10:18:59.730842381 +0000 UTC m=+0.059848517 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:18:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:18:59.861 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:18:59 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:18:59 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b64405891051d8c84369c2a60215779dd9c625690bbd5ae710c835d97b69be8e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:18:59 np0005548789.localdomain podman[325351]: 2025-12-06 10:18:59.881363006 +0000 UTC m=+0.210369122 container init 1356753d4b307eabcd1c93210192bf33b7751c5598848d4ac324142d8d20a976 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:18:59 np0005548789.localdomain podman[325351]: 2025-12-06 10:18:59.891151223 +0000 UTC m=+0.220157339 container start 1356753d4b307eabcd1c93210192bf33b7751c5598848d4ac324142d8d20a976 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:18:59 np0005548789.localdomain dnsmasq[325399]: started, version 2.85 cachesize 150
Dec 06 10:18:59 np0005548789.localdomain dnsmasq[325399]: DNS service limited to local subnets
Dec 06 10:18:59 np0005548789.localdomain dnsmasq[325399]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:18:59 np0005548789.localdomain dnsmasq[325399]: warning: no upstream servers configured
Dec 06 10:18:59 np0005548789.localdomain dnsmasq-dhcp[325399]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Dec 06 10:18:59 np0005548789.localdomain dnsmasq[325399]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 2 addresses
Dec 06 10:18:59 np0005548789.localdomain dnsmasq-dhcp[325399]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:18:59 np0005548789.localdomain dnsmasq-dhcp[325399]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:00 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:00.300 263652 INFO neutron.agent.dhcp.agent [None req-d3b0ffe0-b1fb-4593-929c-3e31bcad28f9 - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91', '7368b3e8-d58f-4836-b822-3889974b0257', 'bb0ffdc5-896f-4862-9530-52cf0e1e0cda'} is completed
Dec 06 10:19:00 np0005548789.localdomain dnsmasq[325399]: exiting on receipt of SIGTERM
Dec 06 10:19:00 np0005548789.localdomain podman[325435]: 2025-12-06 10:19:00.418958142 +0000 UTC m=+0.098563321 container kill 1356753d4b307eabcd1c93210192bf33b7751c5598848d4ac324142d8d20a976 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:19:00 np0005548789.localdomain systemd[1]: libpod-1356753d4b307eabcd1c93210192bf33b7751c5598848d4ac324142d8d20a976.scope: Deactivated successfully.
Dec 06 10:19:00 np0005548789.localdomain podman[325452]: 2025-12-06 10:19:00.494238145 +0000 UTC m=+0.060738323 container died 1356753d4b307eabcd1c93210192bf33b7751c5598848d4ac324142d8d20a976 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:19:00 np0005548789.localdomain podman[325452]: 2025-12-06 10:19:00.54318559 +0000 UTC m=+0.109685738 container cleanup 1356753d4b307eabcd1c93210192bf33b7751c5598848d4ac324142d8d20a976 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:19:00 np0005548789.localdomain systemd[1]: libpod-conmon-1356753d4b307eabcd1c93210192bf33b7751c5598848d4ac324142d8d20a976.scope: Deactivated successfully.
Dec 06 10:19:00 np0005548789.localdomain podman[325459]: 2025-12-06 10:19:00.582041528 +0000 UTC m=+0.129361624 container remove 1356753d4b307eabcd1c93210192bf33b7751c5598848d4ac324142d8d20a976 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:19:00 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:00Z|00346|binding|INFO|Releasing lport bb0ffdc5-896f-4862-9530-52cf0e1e0cda from this chassis (sb_readonly=0)
Dec 06 10:19:00 np0005548789.localdomain kernel: device tapbb0ffdc5-89 left promiscuous mode
Dec 06 10:19:00 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:00Z|00347|binding|INFO|Setting lport bb0ffdc5-896f-4862-9530-52cf0e1e0cda down in Southbound
Dec 06 10:19:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:00.593 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:00 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:00.601 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=bb0ffdc5-896f-4862-9530-52cf0e1e0cda) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:00 np0005548789.localdomain ceph-mon[298582]: pgmap v284: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 6.7 KiB/s wr, 35 op/s
Dec 06 10:19:00 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:00.602 160509 INFO neutron.agent.ovn.metadata.agent [-] Port bb0ffdc5-896f-4862-9530-52cf0e1e0cda in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:19:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-b64405891051d8c84369c2a60215779dd9c625690bbd5ae710c835d97b69be8e-merged.mount: Deactivated successfully.
Dec 06 10:19:00 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:00.603 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1356753d4b307eabcd1c93210192bf33b7751c5598848d4ac324142d8d20a976-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:00 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:00.603 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[f65736ca-fdd7-4132-b10c-e08c90589b19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:00.612 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:00 np0005548789.localdomain podman[325501]: 
Dec 06 10:19:00 np0005548789.localdomain podman[325501]: 2025-12-06 10:19:00.742140815 +0000 UTC m=+0.067822009 container create 0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b84b6f67-f6c6-431b-82dc-4d4f6b20b084, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:19:00 np0005548789.localdomain systemd[1]: Started libpod-conmon-0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f.scope.
Dec 06 10:19:00 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:00 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75d9cf4f7f0f4147182b036eddac499be359e70f248dd76b6f98dcb4d0de61ef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:00 np0005548789.localdomain podman[325501]: 2025-12-06 10:19:00.703818142 +0000 UTC m=+0.029499336 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:00 np0005548789.localdomain podman[325501]: 2025-12-06 10:19:00.804050513 +0000 UTC m=+0.129731707 container init 0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b84b6f67-f6c6-431b-82dc-4d4f6b20b084, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:19:00 np0005548789.localdomain podman[325501]: 2025-12-06 10:19:00.814932963 +0000 UTC m=+0.140614157 container start 0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b84b6f67-f6c6-431b-82dc-4d4f6b20b084, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:19:00 np0005548789.localdomain dnsmasq[325519]: started, version 2.85 cachesize 150
Dec 06 10:19:00 np0005548789.localdomain dnsmasq[325519]: DNS service limited to local subnets
Dec 06 10:19:00 np0005548789.localdomain dnsmasq[325519]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:00 np0005548789.localdomain dnsmasq[325519]: warning: no upstream servers configured
Dec 06 10:19:00 np0005548789.localdomain dnsmasq-dhcp[325519]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:19:00 np0005548789.localdomain dnsmasq[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/addn_hosts - 0 addresses
Dec 06 10:19:00 np0005548789.localdomain dnsmasq-dhcp[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/host
Dec 06 10:19:00 np0005548789.localdomain dnsmasq-dhcp[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/opts
Dec 06 10:19:00 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:00.887 263652 INFO neutron.agent.dhcp.agent [None req-584281be-9132-4dac-b7f4-2dada17bd960 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:59Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa40550>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbd3310>], id=e398fe9a-c93c-418f-8af0-1f4523efdbbb, ip_allocation=immediate, mac_address=fa:16:3e:e5:dc:d9, name=tempest-ExtraDHCPOptionsTestJSON-1296193310, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:56Z, description=, dns_domain=, id=b84b6f67-f6c6-431b-82dc-4d4f6b20b084, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsTestJSON-test-network-1729131517, port_security_enabled=True, project_id=a269d8afc49848fbb8ce5cdb49ef37dc, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34332, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1975, status=ACTIVE, subnets=['c069e8b4-a3d2-4787-b409-2897a52a3b9a'], tags=[], tenant_id=a269d8afc49848fbb8ce5cdb49ef37dc, updated_at=2025-12-06T10:18:57Z, vlan_transparent=None, network_id=b84b6f67-f6c6-431b-82dc-4d4f6b20b084, port_security_enabled=True, project_id=a269d8afc49848fbb8ce5cdb49ef37dc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a71b7ce5-d152-4b06-83b8-76d380ec29b6'], standard_attr_id=1997, status=DOWN, tags=[], tenant_id=a269d8afc49848fbb8ce5cdb49ef37dc, updated_at=2025-12-06T10:18:59Z on network b84b6f67-f6c6-431b-82dc-4d4f6b20b084
Dec 06 10:19:00 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:00.962 263652 INFO neutron.agent.dhcp.agent [None req-fb636907-d79c-4e5c-b427-aa5653be8ca5 - - - - - -] DHCP configuration for ports {'bc74c544-ee9e-41e3-9bd5-023a359c4f8c'} is completed
Dec 06 10:19:01 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:01.003 263652 INFO neutron.agent.dhcp.agent [None req-e3f33f0f-0988-46f8-babd-9c853e433e06 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:01 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:01.004 263652 INFO neutron.agent.dhcp.agent [None req-e3f33f0f-0988-46f8-babd-9c853e433e06 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:01 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:19:01 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:19:01 np0005548789.localdomain podman[325539]: 2025-12-06 10:19:01.014826896 +0000 UTC m=+0.052843444 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:19:01 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:19:01 np0005548789.localdomain dnsmasq[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/addn_hosts - 1 addresses
Dec 06 10:19:01 np0005548789.localdomain podman[325568]: 2025-12-06 10:19:01.150857012 +0000 UTC m=+0.056594708 container kill 0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b84b6f67-f6c6-431b-82dc-4d4f6b20b084, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:19:01 np0005548789.localdomain dnsmasq-dhcp[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/host
Dec 06 10:19:01 np0005548789.localdomain dnsmasq-dhcp[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/opts
Dec 06 10:19:01 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:01.285 2 INFO neutron.agent.securitygroups_rpc [None req-7d4e4684-fcf3-4893-8a23-2e4dea6b64ed 1333c58cfc75447fad1b488a958549ce a269d8afc49848fbb8ce5cdb49ef37dc - - default default] Security group member updated ['a71b7ce5-d152-4b06-83b8-76d380ec29b6']
Dec 06 10:19:01 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:01.358 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:00Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9a93d0>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9a9550>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9a9580>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9a9400>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9a9760>], id=ff894145-754e-443e-acc5-3e6fc6144ba8, ip_allocation=immediate, mac_address=fa:16:3e:c1:22:84, name=tempest-ExtraDHCPOptionsTestJSON-1755302425, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:18:56Z, description=, dns_domain=, id=b84b6f67-f6c6-431b-82dc-4d4f6b20b084, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsTestJSON-test-network-1729131517, port_security_enabled=True, project_id=a269d8afc49848fbb8ce5cdb49ef37dc, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34332, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1975, status=ACTIVE, subnets=['c069e8b4-a3d2-4787-b409-2897a52a3b9a'], tags=[], tenant_id=a269d8afc49848fbb8ce5cdb49ef37dc, updated_at=2025-12-06T10:18:57Z, vlan_transparent=None, network_id=b84b6f67-f6c6-431b-82dc-4d4f6b20b084, port_security_enabled=True, project_id=a269d8afc49848fbb8ce5cdb49ef37dc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a71b7ce5-d152-4b06-83b8-76d380ec29b6'], standard_attr_id=2005, status=DOWN, tags=[], tenant_id=a269d8afc49848fbb8ce5cdb49ef37dc, updated_at=2025-12-06T10:19:00Z on network b84b6f67-f6c6-431b-82dc-4d4f6b20b084
Dec 06 10:19:01 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:01.415 263652 INFO neutron.agent.dhcp.agent [None req-44acbcb1-23a6-438c-b439-cbc8cb560404 - - - - - -] DHCP configuration for ports {'e398fe9a-c93c-418f-8af0-1f4523efdbbb'} is completed
Dec 06 10:19:01 np0005548789.localdomain dnsmasq[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/addn_hosts - 2 addresses
Dec 06 10:19:01 np0005548789.localdomain podman[325611]: 2025-12-06 10:19:01.559135965 +0000 UTC m=+0.059395012 container kill 0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b84b6f67-f6c6-431b-82dc-4d4f6b20b084, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:01 np0005548789.localdomain dnsmasq-dhcp[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/host
Dec 06 10:19:01 np0005548789.localdomain dnsmasq-dhcp[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/opts
Dec 06 10:19:01 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:19:01 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:01.857 263652 INFO neutron.agent.dhcp.agent [None req-a6daa11e-a530-47c2-be2f-dac497fef326 - - - - - -] DHCP configuration for ports {'ff894145-754e-443e-acc5-3e6fc6144ba8'} is completed
Dec 06 10:19:02 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:02.059 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:01Z, description=, device_id=af23047c-2c71-405c-b139-618f66efd627, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9b1550>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9b12e0>], id=de89ee12-af27-4b3e-a063-945dcc47a32d, ip_allocation=immediate, mac_address=fa:16:3e:be:47:18, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2020, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:19:01Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:19:02 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:19:02 np0005548789.localdomain podman[325648]: 2025-12-06 10:19:02.279097643 +0000 UTC m=+0.064391514 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:19:02 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:19:02 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:19:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:19:02 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:02.317 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:3e:86 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9beccfed-6ce7-4343-a09a-a10df412729f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9beccfed-6ce7-4343-a09a-a10df412729f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd8b2850-e3e7-477f-8017-199231500400, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5e418b23-64fb-4cc3-b4f5-351454b6f675) old=Port_Binding(mac=['fa:16:3e:fe:3e:86 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9beccfed-6ce7-4343-a09a-a10df412729f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9beccfed-6ce7-4343-a09a-a10df412729f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:02 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:02.319 160509 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5e418b23-64fb-4cc3-b4f5-351454b6f675 in datapath 9beccfed-6ce7-4343-a09a-a10df412729f updated
Dec 06 10:19:02 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:02.322 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9beccfed-6ce7-4343-a09a-a10df412729f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:02 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:02.323 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[2be73684-346c-47fd-a8d5-71d0843bdd82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:02 np0005548789.localdomain podman[325661]: 2025-12-06 10:19:02.388276714 +0000 UTC m=+0.086552656 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:19:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:19:02 np0005548789.localdomain podman[325661]: 2025-12-06 10:19:02.419706047 +0000 UTC m=+0.117982009 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:02 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:19:02 np0005548789.localdomain podman[325685]: 2025-12-06 10:19:02.479223053 +0000 UTC m=+0.066413225 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:19:02 np0005548789.localdomain podman[325685]: 2025-12-06 10:19:02.488464903 +0000 UTC m=+0.075655095 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:19:02 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:19:02 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:02.548 263652 INFO neutron.agent.dhcp.agent [None req-ff7f66ba-a35f-4dd6-aee1-32b85ffe6de7 - - - - - -] DHCP configuration for ports {'de89ee12-af27-4b3e-a063-945dcc47a32d'} is completed
Dec 06 10:19:02 np0005548789.localdomain ceph-mon[298582]: pgmap v285: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 6.1 KiB/s wr, 29 op/s
Dec 06 10:19:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:19:02 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:02.686 2 INFO neutron.agent.securitygroups_rpc [None req-84b84473-0a2d-4cb4-b946-8bdffecc1ba7 1333c58cfc75447fad1b488a958549ce a269d8afc49848fbb8ce5cdb49ef37dc - - default default] Security group member updated ['a71b7ce5-d152-4b06-83b8-76d380ec29b6']
Dec 06 10:19:02 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:02.870 2 INFO neutron.agent.securitygroups_rpc [None req-9f09c577-620d-43b3-bb16-a6c9b188fc98 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:03 np0005548789.localdomain podman[325727]: 2025-12-06 10:19:03.017685446 +0000 UTC m=+0.061283910 container kill 0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b84b6f67-f6c6-431b-82dc-4d4f6b20b084, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:19:03 np0005548789.localdomain dnsmasq[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/addn_hosts - 1 addresses
Dec 06 10:19:03 np0005548789.localdomain dnsmasq-dhcp[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/host
Dec 06 10:19:03 np0005548789.localdomain dnsmasq-dhcp[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/opts
Dec 06 10:19:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:03.070 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:03 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:03.366 263652 INFO neutron.agent.linux.ip_lib [None req-37bfc739-3e5f-4189-852a-3ee87f21c002 - - - - - -] Device tap06b75261-40 cannot be used as it has no MAC address
Dec 06 10:19:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:03.503 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:03 np0005548789.localdomain kernel: device tap06b75261-40 entered promiscuous mode
Dec 06 10:19:03 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:03Z|00348|binding|INFO|Claiming lport 06b75261-40e0-4712-ac9e-63a2586b3a8c for this chassis.
Dec 06 10:19:03 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:03Z|00349|binding|INFO|06b75261-40e0-4712-ac9e-63a2586b3a8c: Claiming unknown
Dec 06 10:19:03 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016343.5099] manager: (tap06b75261-40): new Generic device (/org/freedesktop/NetworkManager/Devices/58)
Dec 06 10:19:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:03.511 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:03 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:03Z|00350|binding|INFO|Setting lport 06b75261-40e0-4712-ac9e-63a2586b3a8c ovn-installed in OVS
Dec 06 10:19:03 np0005548789.localdomain systemd-udevd[325760]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:19:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:03.516 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:03.518 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:03 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:03Z|00351|binding|INFO|Setting lport 06b75261-40e0-4712-ac9e-63a2586b3a8c up in Southbound
Dec 06 10:19:03 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:03.519 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe86:a151/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=06b75261-40e0-4712-ac9e-63a2586b3a8c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:03 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:03.521 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 06b75261-40e0-4712-ac9e-63a2586b3a8c in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:19:03 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:03.522 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 81d07f5f-20db-4fca-88f9-09f5f7d63de6 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:19:03 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:03.522 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:03 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:03.523 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc0a258-5b44-42a4-b663-94c18d7f1c2b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap06b75261-40: No such device
Dec 06 10:19:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap06b75261-40: No such device
Dec 06 10:19:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:03.545 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap06b75261-40: No such device
Dec 06 10:19:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap06b75261-40: No such device
Dec 06 10:19:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap06b75261-40: No such device
Dec 06 10:19:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap06b75261-40: No such device
Dec 06 10:19:03 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:03.570 2 INFO neutron.agent.securitygroups_rpc [None req-dcc65382-3d9e-4656-a85e-ef8650caf3cc b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap06b75261-40: No such device
Dec 06 10:19:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap06b75261-40: No such device
Dec 06 10:19:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:03.578 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:03.602 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:03 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:03.783 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:18:59Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa32550>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa328b0>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa32f40>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa320d0>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa32370>], id=e398fe9a-c93c-418f-8af0-1f4523efdbbb, ip_allocation=immediate, mac_address=fa:16:3e:e5:dc:d9, name=tempest-new-port-name-159393482, network_id=b84b6f67-f6c6-431b-82dc-4d4f6b20b084, port_security_enabled=True, project_id=a269d8afc49848fbb8ce5cdb49ef37dc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['a71b7ce5-d152-4b06-83b8-76d380ec29b6'], standard_attr_id=1997, status=DOWN, tags=[], tenant_id=a269d8afc49848fbb8ce5cdb49ef37dc, updated_at=2025-12-06T10:19:03Z on network b84b6f67-f6c6-431b-82dc-4d4f6b20b084
Dec 06 10:19:04 np0005548789.localdomain systemd[1]: tmp-crun.PEEiMd.mount: Deactivated successfully.
Dec 06 10:19:04 np0005548789.localdomain podman[325819]: 2025-12-06 10:19:04.038936091 +0000 UTC m=+0.069332993 container kill 0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b84b6f67-f6c6-431b-82dc-4d4f6b20b084, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:19:04 np0005548789.localdomain dnsmasq[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/addn_hosts - 1 addresses
Dec 06 10:19:04 np0005548789.localdomain dnsmasq-dhcp[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/host
Dec 06 10:19:04 np0005548789.localdomain dnsmasq-dhcp[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/opts
Dec 06 10:19:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:04.212 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:04 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:04.336 263652 INFO neutron.agent.dhcp.agent [None req-e3d4edda-8596-49dc-bcb8-365e58fdd54c - - - - - -] DHCP configuration for ports {'e398fe9a-c93c-418f-8af0-1f4523efdbbb'} is completed
Dec 06 10:19:04 np0005548789.localdomain podman[325869]: 
Dec 06 10:19:04 np0005548789.localdomain podman[325869]: 2025-12-06 10:19:04.43388751 +0000 UTC m=+0.078208582 container create 00206d12079198ff6a67ddc2240431deb4867e7f4507e18861ce3653f2eafae1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:04 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:04.433 2 INFO neutron.agent.securitygroups_rpc [None req-debf1305-ba6e-49c2-9083-8908dd68e972 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:19:04 np0005548789.localdomain podman[325869]: 2025-12-06 10:19:04.387432962 +0000 UTC m=+0.031754044 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:04 np0005548789.localdomain systemd[1]: Started libpod-conmon-00206d12079198ff6a67ddc2240431deb4867e7f4507e18861ce3653f2eafae1.scope.
Dec 06 10:19:04 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:04 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/faf3f490170fa54cdf29bff10f6748300c6eee29114bdae5d18931b9dc259de4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:04 np0005548789.localdomain podman[325869]: 2025-12-06 10:19:04.522695235 +0000 UTC m=+0.167016277 container init 00206d12079198ff6a67ddc2240431deb4867e7f4507e18861ce3653f2eafae1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:04 np0005548789.localdomain podman[325869]: 2025-12-06 10:19:04.531847562 +0000 UTC m=+0.176168604 container start 00206d12079198ff6a67ddc2240431deb4867e7f4507e18861ce3653f2eafae1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:04 np0005548789.localdomain dnsmasq[325887]: started, version 2.85 cachesize 150
Dec 06 10:19:04 np0005548789.localdomain dnsmasq[325887]: DNS service limited to local subnets
Dec 06 10:19:04 np0005548789.localdomain dnsmasq[325887]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:04 np0005548789.localdomain dnsmasq[325887]: warning: no upstream servers configured
Dec 06 10:19:04 np0005548789.localdomain dnsmasq-dhcp[325887]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:19:04 np0005548789.localdomain dnsmasq[325887]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:19:04 np0005548789.localdomain dnsmasq-dhcp[325887]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:19:04 np0005548789.localdomain dnsmasq-dhcp[325887]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:04 np0005548789.localdomain ceph-mon[298582]: pgmap v286: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 7.3 KiB/s wr, 39 op/s
Dec 06 10:19:04 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:04.704 263652 INFO neutron.agent.dhcp.agent [None req-3b43b9a3-cc0e-4969-b8ee-5f3a8d2a2048 - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:19:04 np0005548789.localdomain dnsmasq[325887]: exiting on receipt of SIGTERM
Dec 06 10:19:04 np0005548789.localdomain podman[325905]: 2025-12-06 10:19:04.93136565 +0000 UTC m=+0.069118228 container kill 00206d12079198ff6a67ddc2240431deb4867e7f4507e18861ce3653f2eafae1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:19:04 np0005548789.localdomain systemd[1]: libpod-00206d12079198ff6a67ddc2240431deb4867e7f4507e18861ce3653f2eafae1.scope: Deactivated successfully.
Dec 06 10:19:05 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:05.016 2 INFO neutron.agent.securitygroups_rpc [None req-91092cc5-6d84-4cc3-b0ed-55c483b81857 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:05 np0005548789.localdomain podman[325919]: 2025-12-06 10:19:05.018485403 +0000 UTC m=+0.068628182 container died 00206d12079198ff6a67ddc2240431deb4867e7f4507e18861ce3653f2eafae1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:05 np0005548789.localdomain systemd[1]: tmp-crun.w6McMt.mount: Deactivated successfully.
Dec 06 10:19:05 np0005548789.localdomain podman[325919]: 2025-12-06 10:19:05.080582326 +0000 UTC m=+0.130725065 container cleanup 00206d12079198ff6a67ddc2240431deb4867e7f4507e18861ce3653f2eafae1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:19:05 np0005548789.localdomain systemd[1]: libpod-conmon-00206d12079198ff6a67ddc2240431deb4867e7f4507e18861ce3653f2eafae1.scope: Deactivated successfully.
Dec 06 10:19:05 np0005548789.localdomain podman[325921]: 2025-12-06 10:19:05.103683777 +0000 UTC m=+0.145326749 container remove 00206d12079198ff6a67ddc2240431deb4867e7f4507e18861ce3653f2eafae1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:05 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:05.579 2 INFO neutron.agent.securitygroups_rpc [None req-03766a42-54e7-4e6a-a01a-d12c463a6613 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:19:05 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e158 e158: 6 total, 6 up, 6 in
Dec 06 10:19:05 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:05.723 2 INFO neutron.agent.securitygroups_rpc [None req-9d59c4b8-d3a8-40d9-8d73-2b90f45c1e12 1333c58cfc75447fad1b488a958549ce a269d8afc49848fbb8ce5cdb49ef37dc - - default default] Security group member updated ['a71b7ce5-d152-4b06-83b8-76d380ec29b6']
Dec 06 10:19:06 np0005548789.localdomain dnsmasq[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/addn_hosts - 0 addresses
Dec 06 10:19:06 np0005548789.localdomain dnsmasq-dhcp[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/host
Dec 06 10:19:06 np0005548789.localdomain dnsmasq-dhcp[325519]: read /var/lib/neutron/dhcp/b84b6f67-f6c6-431b-82dc-4d4f6b20b084/opts
Dec 06 10:19:06 np0005548789.localdomain podman[325981]: 2025-12-06 10:19:06.002647273 +0000 UTC m=+0.061324500 container kill 0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b84b6f67-f6c6-431b-82dc-4d4f6b20b084, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:19:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-faf3f490170fa54cdf29bff10f6748300c6eee29114bdae5d18931b9dc259de4-merged.mount: Deactivated successfully.
Dec 06 10:19:06 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-00206d12079198ff6a67ddc2240431deb4867e7f4507e18861ce3653f2eafae1-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:06 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:06.283 2 INFO neutron.agent.securitygroups_rpc [None req-13d35407-bef4-4c5e-baaa-9390a0fcd613 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:06 np0005548789.localdomain systemd[1]: tmp-crun.JCj6ZZ.mount: Deactivated successfully.
Dec 06 10:19:06 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:19:06 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:19:06 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:19:06 np0005548789.localdomain podman[326026]: 2025-12-06 10:19:06.311275086 +0000 UTC m=+0.077193043 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:19:06 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:19:06 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3936622661' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:06 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:19:06 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3936622661' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:06 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:06Z|00352|binding|INFO|Removing iface tapf8d80242-07 ovn-installed in OVS
Dec 06 10:19:06 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:06Z|00353|binding|INFO|Removing lport f8d80242-07b6-493f-8d5a-67d3b413e7c2 ovn-installed in OVS
Dec 06 10:19:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:06.570 160509 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 1da81fc3-b0d7-4158-9ff3-f6eb5f1de696 with type ""
Dec 06 10:19:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:06.601 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-b84b6f67-f6c6-431b-82dc-4d4f6b20b084', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b84b6f67-f6c6-431b-82dc-4d4f6b20b084', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a269d8afc49848fbb8ce5cdb49ef37dc', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b3e9ece-98ce-425e-b1a7-ae8b3622954c, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=f8d80242-07b6-493f-8d5a-67d3b413e7c2) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:06.601 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:06.602 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:06.603 160509 INFO neutron.agent.ovn.metadata.agent [-] Port f8d80242-07b6-493f-8d5a-67d3b413e7c2 in datapath b84b6f67-f6c6-431b-82dc-4d4f6b20b084 unbound from our chassis
Dec 06 10:19:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:06.605 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b84b6f67-f6c6-431b-82dc-4d4f6b20b084, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:06.605 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[176f7abd-d36a-4feb-8abd-f9867d93b368]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:06 np0005548789.localdomain dnsmasq[325519]: exiting on receipt of SIGTERM
Dec 06 10:19:06 np0005548789.localdomain podman[326080]: 2025-12-06 10:19:06.654110684 +0000 UTC m=+0.052875315 container kill 0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b84b6f67-f6c6-431b-82dc-4d4f6b20b084, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:19:06 np0005548789.localdomain systemd[1]: libpod-0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f.scope: Deactivated successfully.
Dec 06 10:19:06 np0005548789.localdomain ceph-mon[298582]: pgmap v287: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 4.5 KiB/s wr, 25 op/s
Dec 06 10:19:06 np0005548789.localdomain ceph-mon[298582]: osdmap e158: 6 total, 6 up, 6 in
Dec 06 10:19:06 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3936622661' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:06 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3936622661' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:06 np0005548789.localdomain podman[326095]: 
Dec 06 10:19:06 np0005548789.localdomain podman[326095]: 2025-12-06 10:19:06.703984447 +0000 UTC m=+0.075646296 container create 23114774a6fb8cff7210fda601c8d1bd2082af42df010baf7781c360d2ff8fa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 10:19:06 np0005548789.localdomain podman[326113]: 2025-12-06 10:19:06.740477744 +0000 UTC m=+0.061477386 container died 0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b84b6f67-f6c6-431b-82dc-4d4f6b20b084, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:19:06 np0005548789.localdomain systemd[1]: Started libpod-conmon-23114774a6fb8cff7210fda601c8d1bd2082af42df010baf7781c360d2ff8fa6.scope.
Dec 06 10:19:06 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:06 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84a2207a38c3ce9b023fb76311185a6d9de756fe5150fd8266ec626a8df1dbd6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:06 np0005548789.localdomain podman[326095]: 2025-12-06 10:19:06.666340624 +0000 UTC m=+0.038002474 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:06 np0005548789.localdomain podman[326095]: 2025-12-06 10:19:06.774567228 +0000 UTC m=+0.146229087 container init 23114774a6fb8cff7210fda601c8d1bd2082af42df010baf7781c360d2ff8fa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:19:06 np0005548789.localdomain podman[326095]: 2025-12-06 10:19:06.786663424 +0000 UTC m=+0.158325283 container start 23114774a6fb8cff7210fda601c8d1bd2082af42df010baf7781c360d2ff8fa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:06 np0005548789.localdomain dnsmasq[326142]: started, version 2.85 cachesize 150
Dec 06 10:19:06 np0005548789.localdomain dnsmasq[326142]: DNS service limited to local subnets
Dec 06 10:19:06 np0005548789.localdomain dnsmasq[326142]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:06 np0005548789.localdomain dnsmasq[326142]: warning: no upstream servers configured
Dec 06 10:19:06 np0005548789.localdomain dnsmasq-dhcp[326142]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Dec 06 10:19:06 np0005548789.localdomain dnsmasq-dhcp[326142]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:19:06 np0005548789.localdomain dnsmasq[326142]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 2 addresses
Dec 06 10:19:06 np0005548789.localdomain dnsmasq-dhcp[326142]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:19:06 np0005548789.localdomain dnsmasq-dhcp[326142]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:06 np0005548789.localdomain podman[326113]: 2025-12-06 10:19:06.838486676 +0000 UTC m=+0.159486238 container remove 0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b84b6f67-f6c6-431b-82dc-4d4f6b20b084, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:06 np0005548789.localdomain systemd[1]: libpod-conmon-0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f.scope: Deactivated successfully.
Dec 06 10:19:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:06.854 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:06 np0005548789.localdomain kernel: device tapf8d80242-07 left promiscuous mode
Dec 06 10:19:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:06.867 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:06 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:06.919 263652 INFO neutron.agent.dhcp.agent [None req-85610208-1faa-4255-90a4-b8674ea410b5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:06 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:06.921 263652 INFO neutron.agent.dhcp.agent [None req-85610208-1faa-4255-90a4-b8674ea410b5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:06 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:06Z|00354|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:19:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:06.973 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:07 np0005548789.localdomain systemd[1]: tmp-crun.4KrnKN.mount: Deactivated successfully.
Dec 06 10:19:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-75d9cf4f7f0f4147182b036eddac499be359e70f248dd76b6f98dcb4d0de61ef-merged.mount: Deactivated successfully.
Dec 06 10:19:07 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0816641e50d3344d04d04913a8061a709484ebf89102d281cfe008aeef60ed2f-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:07 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2db84b6f67\x2df6c6\x2d431b\x2d82dc\x2d4d4f6b20b084.mount: Deactivated successfully.
Dec 06 10:19:07 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:07.072 263652 INFO neutron.agent.dhcp.agent [None req-96df3089-74ee-416a-a56a-68f56bfaa1f4 - - - - - -] DHCP configuration for ports {'06b75261-40e0-4712-ac9e-63a2586b3a8c', '687d7abb-e6aa-4047-aa26-552c962fcc91', '96643b3a-2587-4ba3-bc78-4df17e224aeb'} is completed
Dec 06 10:19:07 np0005548789.localdomain dnsmasq[326142]: exiting on receipt of SIGTERM
Dec 06 10:19:07 np0005548789.localdomain podman[326165]: 2025-12-06 10:19:07.110085745 +0000 UTC m=+0.066858079 container kill 23114774a6fb8cff7210fda601c8d1bd2082af42df010baf7781c360d2ff8fa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:19:07 np0005548789.localdomain systemd[1]: libpod-23114774a6fb8cff7210fda601c8d1bd2082af42df010baf7781c360d2ff8fa6.scope: Deactivated successfully.
Dec 06 10:19:07 np0005548789.localdomain podman[326179]: 2025-12-06 10:19:07.181709206 +0000 UTC m=+0.059799064 container died 23114774a6fb8cff7210fda601c8d1bd2082af42df010baf7781c360d2ff8fa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:19:07 np0005548789.localdomain systemd[1]: tmp-crun.KBD7F5.mount: Deactivated successfully.
Dec 06 10:19:07 np0005548789.localdomain podman[326179]: 2025-12-06 10:19:07.223132953 +0000 UTC m=+0.101222771 container cleanup 23114774a6fb8cff7210fda601c8d1bd2082af42df010baf7781c360d2ff8fa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 10:19:07 np0005548789.localdomain systemd[1]: libpod-conmon-23114774a6fb8cff7210fda601c8d1bd2082af42df010baf7781c360d2ff8fa6.scope: Deactivated successfully.
Dec 06 10:19:07 np0005548789.localdomain podman[326186]: 2025-12-06 10:19:07.268961393 +0000 UTC m=+0.135045947 container remove 23114774a6fb8cff7210fda601c8d1bd2082af42df010baf7781c360d2ff8fa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:07 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:07Z|00355|binding|INFO|Releasing lport 06b75261-40e0-4712-ac9e-63a2586b3a8c from this chassis (sb_readonly=0)
Dec 06 10:19:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:07.284 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:07 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:07Z|00356|binding|INFO|Setting lport 06b75261-40e0-4712-ac9e-63a2586b3a8c down in Southbound
Dec 06 10:19:07 np0005548789.localdomain kernel: device tap06b75261-40 left promiscuous mode
Dec 06 10:19:07 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:07.292 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8::f816:3eff:fe86:a151/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=06b75261-40e0-4712-ac9e-63a2586b3a8c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:07 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:07.294 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 06b75261-40e0-4712-ac9e-63a2586b3a8c in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:19:07 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:07.297 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:07 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:07.297 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[aa961c16-70ac-48cb-95f9-f223cf1c9d1c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:07.307 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:07.308 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:07 np0005548789.localdomain sshd[326209]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.916 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.922 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2f9f95d-484c-4677-bb02-787c8d0e9731', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:19:07.918181', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ffe96546-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.167552279, 'message_signature': 'f6aeacf924948555e23a2bd9f6348258df0fa8d387a23da12c00f174211240b5'}]}, 'timestamp': '2025-12-06 10:19:07.922984', '_unique_id': 'e31ecbe835604ff38f61a1061a76491a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.926 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.944 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 17470000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfafb631-b885-424e-a8c6-d48159746871', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17470000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:19:07.927080', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'ffecbe30-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.193450295, 'message_signature': 'be4ae09f4ac18ce6a03f59e295fe61aefe11b2c462ceb0b24a2c5d8ba31deb96'}]}, 'timestamp': '2025-12-06 10:19:07.944816', '_unique_id': '345d8ab88cb34a51b16e3331824065ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.945 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.947 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.981 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.981 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '339b7501-3744-41e4-b1d2-7f581ea58dbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:19:07.947429', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fff26092-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.196807337, 'message_signature': 'b8edd53c700d6f1624588cdafba775a33025062f4369dcec11e2e6d7c81854f1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:19:07.947429', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fff27712-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.196807337, 'message_signature': '05dd2f7aea62eb74001a6d21572e871933f37659e70f1f0e3959eedd465786eb'}]}, 'timestamp': '2025-12-06 10:19:07.982228', '_unique_id': 'b51bca0332d142f2b893758024a66a72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.983 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.984 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.996 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.996 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd6220e1-841c-4e19-813e-3148c9b21301', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:19:07.984877', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fff4aa96-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.234253903, 'message_signature': 'd03c1e3dd5be741e2d047c4704397b57c7a424a8c3b3466eeb245ec6cadc6b94'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:19:07.984877', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fff4c120-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.234253903, 'message_signature': 'c3e5a564f72c900a80786c36f88accb5d88a82c8fae29a8d12249108e6509e08'}]}, 'timestamp': '2025-12-06 10:19:07.997240', '_unique_id': 'a5d9e361b2a54a9ebf055d039d2ae3b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.998 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.999 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:07.999 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1aa42e51-c552-4785-9d37-01cf736e54ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:19:07.999749', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'fff53d1c-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.167552279, 'message_signature': 'b62af848adb52288fe9cd5bc410412f02ce16e89971bba3eca3603c337188051'}]}, 'timestamp': '2025-12-06 10:19:08.000520', '_unique_id': '378b2001b45f4246800d53a5af820644'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.002 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.002 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.003 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a919f91-9035-4fb3-81a9-3a334487b80a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:19:08.002846', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fff5b17a-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.234253903, 'message_signature': '5d3b58493e19e195cf450d75496d7273f6bd83d3614ae4e9361590e4ae0fcc28'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:19:08.002846', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fff5c656-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.234253903, 'message_signature': '581500f7f0f38002a189581145c4dc6110332d34b8eca8f19ac9a1367515d093'}]}, 'timestamp': '2025-12-06 10:19:08.003995', '_unique_id': '54794c3d8d77494b99d61bcfbd73a4a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.005 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.006 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.006 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5e2b0a1-e337-4498-aac8-a60b7a8c9882', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:19:08.006347', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'fff63a46-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.167552279, 'message_signature': '7cf7a5b86e8c2766e37112cd18e47ad68223426a4e406600c5a87719c9cf2e6b'}]}, 'timestamp': '2025-12-06 10:19:08.006980', '_unique_id': '613afb679a394a3996e6240c36b73583'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.009 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.009 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.009 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dae32c01-7911-44f5-9de6-77f3d2ffbae2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:19:08.009263', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fff6abde-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.196807337, 'message_signature': 'e3044fd548e398700884e1b92def3b95d22d650dace5bc95f00c53fc025b0474'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:19:08.009263', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fff6c29a-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.196807337, 'message_signature': '4517e382ddcd91889af05c34529bf1f94874e9e063fcb4d8667aa6bcb588d15e'}]}, 'timestamp': '2025-12-06 10:19:08.010377', '_unique_id': '307fc986014843c882a9bcc7e7d458a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.012 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.012 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.013 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f67b1c54-79da-4e2f-811d-8b5e352bc629', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:19:08.012712', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fff73464-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.196807337, 'message_signature': '8046d7311fcdb1525c062ab99dfec0c1f53627d092fd44695f71f020be047120'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:19:08.012712', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fff747ba-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.196807337, 'message_signature': 'c22cb81ff697ec9701609e89afade7a218d7a84bf82df06988b595130d2d8d4b'}]}, 'timestamp': '2025-12-06 10:19:08.013835', '_unique_id': '822a66e13c4646308ad1f7a42f24d097'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.014 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.016 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '082205a4-8bf7-417c-a1e6-c2c1743ffacf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:19:08.016316', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'fff7bf9c-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.167552279, 'message_signature': '39b2e546eacb4306c48b930ae11cf7a3c17911ee7c0f5b17805a7db20bf7fbc0'}]}, 'timestamp': '2025-12-06 10:19:08.016978', '_unique_id': '53615a78db484d34a74b52ec593bb492'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.019 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.019 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.019 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9780f78c-a064-4c0e-94f3-d5f1290f1a0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:19:08.019630', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'fff8448a-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.167552279, 'message_signature': 'b95073e888c48c66143cfe6dc60c6d4116e64712f60fd250bd258ed8010e0789'}]}, 'timestamp': '2025-12-06 10:19:08.020404', '_unique_id': 'aca2592797c641bd96773fb68b0b09a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.022 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.022 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2643cbad-ed83-407f-94cc-1dec9e6b31db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:19:08.022661', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'fff8b94c-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.167552279, 'message_signature': '5e105f525e447ea790e5912aceea253477bad5fea49644cc27a902fbb76e6809'}]}, 'timestamp': '2025-12-06 10:19:08.023300', '_unique_id': 'd5aca57787d04639bddbce8e0f7504a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.024 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.025 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.025 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fdf6b6b-857c-41a4-a861-aa07db9ba4bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:19:08.025883', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'fff9357a-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.167552279, 'message_signature': 'f0940a554d7d9c05dee34bba90700cce57975eef3995c1bf5e53a23db0fe99f9'}]}, 'timestamp': '2025-12-06 10:19:08.026476', '_unique_id': 'a7b5cc97c153459a90f602f47eff9ded'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.027 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.028 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.028 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.029 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-84a2207a38c3ce9b023fb76311185a6d9de756fe5150fd8266ec626a8df1dbd6-merged.mount: Deactivated successfully.
Dec 06 10:19:08 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23114774a6fb8cff7210fda601c8d1bd2082af42df010baf7781c360d2ff8fa6-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6a9d47d-8dc8-4ec9-80d8-ef52280de585', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:19:08.028728', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fff9a5dc-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.234253903, 'message_signature': '2f54158307620dfd2b5bcbc62e99226fee34dd0f231a3e64909b3b231addd5be'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:19:08.028728', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fff9b9a0-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.234253903, 'message_signature': 'ea135b8ba2abc8423311cdafe03f756429385b50870a66f609f67c9d0bb0bff5'}]}, 'timestamp': '2025-12-06 10:19:08.029864', '_unique_id': '27880b051d114bb99870294b17ee0eb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.030 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.032 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.032 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.032 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a91cdd5c-b60f-46ae-8197-e714eaf29654', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:19:08.032258', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fffa2e1c-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.196807337, 'message_signature': '487b88f707e445507b54dd12e479cb061fe5bcdfe0d08c7e9b42817d2ded2770'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:19:08.032258', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fffa450a-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.196807337, 'message_signature': 'b47bd163846c6296e2106715b7f15405455097eae272de5001f4b95501c64c63'}]}, 'timestamp': '2025-12-06 10:19:08.033402', '_unique_id': '8742d053de364ed68cf543756aff3c47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.034 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.035 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.035 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.036 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f857c1b0-4c03-434f-8dfd-9da116182cf5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:19:08.036137', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'fffac3cc-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.167552279, 'message_signature': '65ab06bd55847a85b1105251809166c735a561d0b3df43e036fd88857ecd471d'}]}, 'timestamp': '2025-12-06 10:19:08.036653', '_unique_id': 'f7c7375318a44c23a89a9529f65f70da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.037 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.038 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.039 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f11a26cb-1aac-4ee2-b0de-373d9459be1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:19:08.039013', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'fffb33d4-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.167552279, 'message_signature': 'fad12845c02294857a8b706db9e3fd73330f19de6bed16b3dfa3e0bb79d8ad9a'}]}, 'timestamp': '2025-12-06 10:19:08.039518', '_unique_id': 'e0011b0e6ed04b9b9f689ecfb792f97e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.041 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.041 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.041 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bab36104-e0f1-4ad6-84a9-8450d56574ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:19:08.041413', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fffb8de8-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.196807337, 'message_signature': '4605e77cf9de0c4bce548fe8ef318bfc49f1561eee4c347795de099615de3a89'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:19:08.041413', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fffb98e2-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.196807337, 'message_signature': '526bc678867f909d232e1d4fcafe20ee952ef0632485139eeda9ef5ae8522bcc'}]}, 'timestamp': '2025-12-06 10:19:08.041980', '_unique_id': '34a41bbb7d9b4e5993c603ec082c8bbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.042 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.043 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.043 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30da0d22-728c-442b-bc3d-55d6a18154bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:19:08.043310', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'fffbd758-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.167552279, 'message_signature': '7859ad67059875dd4e58ca26820091daf3272f9866262ac0ce3723593c9a1e90'}]}, 'timestamp': '2025-12-06 10:19:08.043603', '_unique_id': 'b2c8f406e07c462c93d86b45ac3401f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.044 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c41677e5-26e6-4a86-8749-80e7b3bce979', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:19:08.044818', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fffc1092-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.196807337, 'message_signature': 'd654997788eae63922a3a7bae2b4f3426b5b23669a2055ba21834daa59fd14d1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:19:08.044818', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fffc17a4-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.196807337, 'message_signature': 'ce825789d8545c5434c17f0d480ec4fd06f731217f69a2357d5d0fd6cc6528a6'}]}, 'timestamp': '2025-12-06 10:19:08.045187', '_unique_id': 'a570bb79dd2b45d89b2febea495e3690'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.045 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fdbadda-abe2-4663-901f-5521df459efc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:19:08.046123', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'fffc43a0-d28c-11f0-aaf2-fa163e118844', 'monotonic_time': 12766.193450295, 'message_signature': 'f5accdfa69a366b8b0635b57529f5c5a7e389a5642c3fb03a29ed89cb9ecfdf7'}]}, 'timestamp': '2025-12-06 10:19:08.046319', '_unique_id': '8efd25b91dad4598a0a0f8c9cad54af6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:19:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:19:08.046 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:19:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:08.105 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:08 np0005548789.localdomain ceph-mon[298582]: pgmap v289: 177 pgs: 177 active+clean; 145 MiB data, 841 MiB used, 41 GiB / 42 GiB avail; 4.1 MiB/s rd, 5.4 KiB/s wr, 30 op/s
Dec 06 10:19:08 np0005548789.localdomain sshd[326209]: Received disconnect from 14.194.101.210 port 38240:11: Bye Bye [preauth]
Dec 06 10:19:08 np0005548789.localdomain sshd[326209]: Disconnected from authenticating user root 14.194.101.210 port 38240 [preauth]
Dec 06 10:19:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:09.252 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:09 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:19:09 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e159 e159: 6 total, 6 up, 6 in
Dec 06 10:19:09 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:09.941 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:09Z, description=, device_id=9e1f6eb1-3712-4bad-ae30-ab318957491e, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa36c70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa361c0>], id=9ebf0f4d-a252-46d9-a5a3-4880f54c157b, ip_allocation=immediate, mac_address=fa:16:3e:46:6a:0c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2038, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:19:09Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:19:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:19:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:19:10 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:10.206 263652 INFO neutron.agent.linux.ip_lib [None req-b83b2478-c8f5-4578-a617-ed1d2101d359 - - - - - -] Device tap71317000-7e cannot be used as it has no MAC address
Dec 06 10:19:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:10.231 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:10 np0005548789.localdomain kernel: device tap71317000-7e entered promiscuous mode
Dec 06 10:19:10 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016350.2396] manager: (tap71317000-7e): new Generic device (/org/freedesktop/NetworkManager/Devices/59)
Dec 06 10:19:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:10Z|00357|binding|INFO|Claiming lport 71317000-7e06-4580-adc9-235e7990a2e9 for this chassis.
Dec 06 10:19:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:10.241 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:10Z|00358|binding|INFO|71317000-7e06-4580-adc9-235e7990a2e9: Claiming unknown
Dec 06 10:19:10 np0005548789.localdomain systemd-udevd[326263]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:19:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:10Z|00359|binding|INFO|Setting lport 71317000-7e06-4580-adc9-235e7990a2e9 up in Southbound
Dec 06 10:19:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:10.253 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:10Z|00360|binding|INFO|Setting lport 71317000-7e06-4580-adc9-235e7990a2e9 ovn-installed in OVS
Dec 06 10:19:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:10.255 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:10.250 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=71317000-7e06-4580-adc9-235e7990a2e9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:10.252 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 71317000-7e06-4580-adc9-235e7990a2e9 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:19:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:10.254 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 71a44691-315d-461a-89ef-7b4a5310a873 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:19:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:10.254 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:10.255 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[93214776-a626-4728-8a2b-bb3efde2dfb8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:10.275 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap71317000-7e: No such device
Dec 06 10:19:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap71317000-7e: No such device
Dec 06 10:19:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap71317000-7e: No such device
Dec 06 10:19:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap71317000-7e: No such device
Dec 06 10:19:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap71317000-7e: No such device
Dec 06 10:19:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap71317000-7e: No such device
Dec 06 10:19:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap71317000-7e: No such device
Dec 06 10:19:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap71317000-7e: No such device
Dec 06 10:19:10 np0005548789.localdomain podman[326250]: 2025-12-06 10:19:10.310335573 +0000 UTC m=+0.114449612 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:10 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:19:10 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:19:10 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:19:10 np0005548789.localdomain podman[326222]: 2025-12-06 10:19:10.314938453 +0000 UTC m=+0.197152781 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Dec 06 10:19:10 np0005548789.localdomain podman[326223]: 2025-12-06 10:19:10.28584968 +0000 UTC m=+0.166386447 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:19:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:10.365 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:10 np0005548789.localdomain podman[326222]: 2025-12-06 10:19:10.37252694 +0000 UTC m=+0.254741238 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, distribution-scope=public, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 06 10:19:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:10.380 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:10 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:19:10 np0005548789.localdomain podman[326223]: 2025-12-06 10:19:10.397060684 +0000 UTC m=+0.277597421 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 10:19:10 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:19:10 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:10.637 263652 INFO neutron.agent.dhcp.agent [None req-58133c63-44aa-4ca6-9741-dcfa357cfbae - - - - - -] DHCP configuration for ports {'9ebf0f4d-a252-46d9-a5a3-4880f54c157b'} is completed
Dec 06 10:19:10 np0005548789.localdomain ceph-mon[298582]: pgmap v290: 177 pgs: 177 active+clean; 192 MiB data, 940 MiB used, 41 GiB / 42 GiB avail; 87 KiB/s rd, 4.3 MiB/s wr, 130 op/s
Dec 06 10:19:10 np0005548789.localdomain ceph-mon[298582]: osdmap e159: 6 total, 6 up, 6 in
Dec 06 10:19:10 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/961294606' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:10 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/961294606' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:10 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:19:10 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1079829945' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:10 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:19:10 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1079829945' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:11 np0005548789.localdomain podman[326361]: 
Dec 06 10:19:11 np0005548789.localdomain podman[326361]: 2025-12-06 10:19:11.244653803 +0000 UTC m=+0.083018509 container create 2fa7bd0f92c875d1bd08c0768368cc684a6096ec2ef786a9089f18e133c12dbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:19:11 np0005548789.localdomain podman[326361]: 2025-12-06 10:19:11.196081849 +0000 UTC m=+0.034446595 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:11 np0005548789.localdomain systemd[1]: Started libpod-conmon-2fa7bd0f92c875d1bd08c0768368cc684a6096ec2ef786a9089f18e133c12dbf.scope.
Dec 06 10:19:11 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:11 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25b77508fc5e8b97d0f383ba076d5d2275da596e03d557121924e9aa163151dc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:11 np0005548789.localdomain podman[326361]: 2025-12-06 10:19:11.326795684 +0000 UTC m=+0.165160400 container init 2fa7bd0f92c875d1bd08c0768368cc684a6096ec2ef786a9089f18e133c12dbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 06 10:19:11 np0005548789.localdomain podman[326361]: 2025-12-06 10:19:11.337356035 +0000 UTC m=+0.175720741 container start 2fa7bd0f92c875d1bd08c0768368cc684a6096ec2ef786a9089f18e133c12dbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:11 np0005548789.localdomain dnsmasq[326379]: started, version 2.85 cachesize 150
Dec 06 10:19:11 np0005548789.localdomain dnsmasq[326379]: DNS service limited to local subnets
Dec 06 10:19:11 np0005548789.localdomain dnsmasq[326379]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:11 np0005548789.localdomain dnsmasq[326379]: warning: no upstream servers configured
Dec 06 10:19:11 np0005548789.localdomain dnsmasq-dhcp[326379]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:19:11 np0005548789.localdomain dnsmasq[326379]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:19:11 np0005548789.localdomain dnsmasq-dhcp[326379]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:19:11 np0005548789.localdomain dnsmasq-dhcp[326379]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:11.403 263652 INFO neutron.agent.dhcp.agent [None req-b83b2478-c8f5-4578-a617-ed1d2101d359 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:03Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa6edf0>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa6eb20>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa25b50>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa253a0>], id=96643b3a-2587-4ba3-bc78-4df17e224aeb, ip_allocation=immediate, mac_address=fa:16:3e:78:6a:c8, name=tempest-NetworksTestDHCPv6-422539605, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=53, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['3be9e423-7cab-42de-a208-d0a663296188', '85ecd64d-20c3-4ae8-82ef-a956d1f47bf6'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:19:02Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=2026, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:19:03Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:19:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:11.538 263652 INFO neutron.agent.dhcp.agent [None req-b0518dc2-39d0-4260-824e-ca1a54b787ae - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:19:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:11.586 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:fa:29 2001:db8:0:1:f816:3eff:fe3f:fa29'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=687d7abb-e6aa-4047-aa26-552c962fcc91) old=Port_Binding(mac=['fa:16:3e:3f:fa:29 2001:db8::f816:3eff:fe3f:fa29'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe3f:fa29/64', 'neutron:device_id': 'ovnmeta-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:11.588 160509 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 687d7abb-e6aa-4047-aa26-552c962fcc91 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c updated
Dec 06 10:19:11 np0005548789.localdomain dnsmasq[326379]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 2 addresses
Dec 06 10:19:11 np0005548789.localdomain dnsmasq-dhcp[326379]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:19:11 np0005548789.localdomain dnsmasq-dhcp[326379]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:11.595 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 71a44691-315d-461a-89ef-7b4a5310a873 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:19:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:11.595 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:11 np0005548789.localdomain podman[326397]: 2025-12-06 10:19:11.595612588 +0000 UTC m=+0.067178519 container kill 2fa7bd0f92c875d1bd08c0768368cc684a6096ec2ef786a9089f18e133c12dbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:19:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:11.596 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[c0ac5b7f-936d-4279-8ebf-ee77d66080e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:11 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1079829945' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:11 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1079829945' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:11.855 263652 INFO neutron.agent.dhcp.agent [None req-80729c80-748d-478d-af43-dbc7d2f3b614 - - - - - -] DHCP configuration for ports {'96643b3a-2587-4ba3-bc78-4df17e224aeb'} is completed
Dec 06 10:19:11 np0005548789.localdomain dnsmasq[326379]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:19:11 np0005548789.localdomain podman[326434]: 2025-12-06 10:19:11.927164674 +0000 UTC m=+0.055850735 container kill 2fa7bd0f92c875d1bd08c0768368cc684a6096ec2ef786a9089f18e133c12dbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:19:11 np0005548789.localdomain dnsmasq-dhcp[326379]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:19:11 np0005548789.localdomain dnsmasq-dhcp[326379]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:11 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:11.986 2 INFO neutron.agent.securitygroups_rpc [None req-cdcfd119-194b-4de8-98ff-a5e7eedce5b7 7365839d5bca455283c571ca0abd33bb 12673f85bb004c3c946338dc70e565e7 - - default default] Security group member updated ['5a014cda-2333-483a-bcd0-2243e387c412']
Dec 06 10:19:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:12.093 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:12 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:12Z|00361|binding|INFO|Releasing lport 71317000-7e06-4580-adc9-235e7990a2e9 from this chassis (sb_readonly=0)
Dec 06 10:19:12 np0005548789.localdomain kernel: device tap71317000-7e left promiscuous mode
Dec 06 10:19:12 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:12Z|00362|binding|INFO|Setting lport 71317000-7e06-4580-adc9-235e7990a2e9 down in Southbound
Dec 06 10:19:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:12.105 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=71317000-7e06-4580-adc9-235e7990a2e9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:12.107 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 71317000-7e06-4580-adc9-235e7990a2e9 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:19:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:12.112 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:12.112 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5bbd6629-5286-41e1-bd80-a057ca582f16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:12.122 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:12.184 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:12.185 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:19:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:12.185 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:12 np0005548789.localdomain systemd[1]: tmp-crun.7ky53u.mount: Deactivated successfully.
Dec 06 10:19:12 np0005548789.localdomain dnsmasq[326379]: exiting on receipt of SIGTERM
Dec 06 10:19:12 np0005548789.localdomain podman[326474]: 2025-12-06 10:19:12.644076759 +0000 UTC m=+0.064103095 container kill 2fa7bd0f92c875d1bd08c0768368cc684a6096ec2ef786a9089f18e133c12dbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:19:12 np0005548789.localdomain systemd[1]: libpod-2fa7bd0f92c875d1bd08c0768368cc684a6096ec2ef786a9089f18e133c12dbf.scope: Deactivated successfully.
Dec 06 10:19:12 np0005548789.localdomain podman[326487]: 2025-12-06 10:19:12.702722348 +0000 UTC m=+0.045798540 container died 2fa7bd0f92c875d1bd08c0768368cc684a6096ec2ef786a9089f18e133c12dbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:12 np0005548789.localdomain ceph-mon[298582]: pgmap v292: 177 pgs: 177 active+clean; 192 MiB data, 940 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 5.3 MiB/s wr, 147 op/s
Dec 06 10:19:12 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e160 e160: 6 total, 6 up, 6 in
Dec 06 10:19:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2fa7bd0f92c875d1bd08c0768368cc684a6096ec2ef786a9089f18e133c12dbf-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:12 np0005548789.localdomain podman[326487]: 2025-12-06 10:19:12.792046608 +0000 UTC m=+0.135122760 container cleanup 2fa7bd0f92c875d1bd08c0768368cc684a6096ec2ef786a9089f18e133c12dbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:19:12 np0005548789.localdomain systemd[1]: libpod-conmon-2fa7bd0f92c875d1bd08c0768368cc684a6096ec2ef786a9089f18e133c12dbf.scope: Deactivated successfully.
Dec 06 10:19:12 np0005548789.localdomain podman[326489]: 2025-12-06 10:19:12.82280337 +0000 UTC m=+0.155556149 container remove 2fa7bd0f92c875d1bd08c0768368cc684a6096ec2ef786a9089f18e133c12dbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:19:12 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:12.884 263652 INFO neutron.agent.linux.ip_lib [None req-71a8dcce-ead2-401d-9227-29b367ede1fa - - - - - -] Device tap71317000-7e cannot be used as it has no MAC address
Dec 06 10:19:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:12.953 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:12 np0005548789.localdomain kernel: device tap71317000-7e entered promiscuous mode
Dec 06 10:19:12 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016352.9633] manager: (tap71317000-7e): new Generic device (/org/freedesktop/NetworkManager/Devices/60)
Dec 06 10:19:12 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:12Z|00363|binding|INFO|Claiming lport 71317000-7e06-4580-adc9-235e7990a2e9 for this chassis.
Dec 06 10:19:12 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:12Z|00364|binding|INFO|71317000-7e06-4580-adc9-235e7990a2e9: Claiming unknown
Dec 06 10:19:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:12.966 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:12.972 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe6c:348c/64 2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=71317000-7e06-4580-adc9-235e7990a2e9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:12.974 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 71317000-7e06-4580-adc9-235e7990a2e9 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c bound to our chassis
Dec 06 10:19:12 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:12Z|00365|binding|INFO|Setting lport 71317000-7e06-4580-adc9-235e7990a2e9 up in Southbound
Dec 06 10:19:12 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:12Z|00366|binding|INFO|Setting lport 71317000-7e06-4580-adc9-235e7990a2e9 ovn-installed in OVS
Dec 06 10:19:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:12.976 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:12.976 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 71a44691-315d-461a-89ef-7b4a5310a873 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:19:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:12.977 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:12.977 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[79484c5c-7005-4c11-a4ba-c41e157848fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:13.005 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:13.043 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:13.066 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:13.108 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-25b77508fc5e8b97d0f383ba076d5d2275da596e03d557121924e9aa163151dc-merged.mount: Deactivated successfully.
Dec 06 10:19:13 np0005548789.localdomain ceph-mon[298582]: osdmap e160: 6 total, 6 up, 6 in
Dec 06 10:19:13 np0005548789.localdomain ceph-mon[298582]: pgmap v294: 177 pgs: 2 active+clean+snaptrim, 175 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 157 KiB/s rd, 5.7 MiB/s wr, 230 op/s
Dec 06 10:19:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e161 e161: 6 total, 6 up, 6 in
Dec 06 10:19:13 np0005548789.localdomain podman[326573]: 
Dec 06 10:19:13 np0005548789.localdomain podman[326573]: 2025-12-06 10:19:13.879873873 +0000 UTC m=+0.099904611 container create 6858356f985a82d53b77c2c185ff099d70e1233b4e3a6899db120f14fb49b087 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:19:13 np0005548789.localdomain systemd[1]: Started libpod-conmon-6858356f985a82d53b77c2c185ff099d70e1233b4e3a6899db120f14fb49b087.scope.
Dec 06 10:19:13 np0005548789.localdomain podman[326573]: 2025-12-06 10:19:13.835428185 +0000 UTC m=+0.055458953 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:13 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:13 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/516e33f92756df3ac203a82b130353d936333b23948958938909e6ece2d0c964/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:13 np0005548789.localdomain podman[326573]: 2025-12-06 10:19:13.961509839 +0000 UTC m=+0.181540577 container init 6858356f985a82d53b77c2c185ff099d70e1233b4e3a6899db120f14fb49b087 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:19:13 np0005548789.localdomain podman[326573]: 2025-12-06 10:19:13.967889823 +0000 UTC m=+0.187920561 container start 6858356f985a82d53b77c2c185ff099d70e1233b4e3a6899db120f14fb49b087 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:19:13 np0005548789.localdomain dnsmasq[326591]: started, version 2.85 cachesize 150
Dec 06 10:19:13 np0005548789.localdomain dnsmasq[326591]: DNS service limited to local subnets
Dec 06 10:19:13 np0005548789.localdomain dnsmasq[326591]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:13 np0005548789.localdomain dnsmasq[326591]: warning: no upstream servers configured
Dec 06 10:19:13 np0005548789.localdomain dnsmasq-dhcp[326591]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:19:13 np0005548789.localdomain dnsmasq[326591]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:19:13 np0005548789.localdomain dnsmasq-dhcp[326591]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:19:13 np0005548789.localdomain dnsmasq-dhcp[326591]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:19:14 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:14.121 263652 INFO neutron.agent.dhcp.agent [None req-f2fa038d-9f03-4fd4-9bdf-2bdd29323423 - - - - - -] DHCP configuration for ports {'71317000-7e06-4580-adc9-235e7990a2e9', '687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:19:14 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:14.136 2 INFO neutron.agent.securitygroups_rpc [None req-a19ad948-85b8-4074-80f7-d1d223959ce7 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:19:14 np0005548789.localdomain podman[326592]: 2025-12-06 10:19:14.185254535 +0000 UTC m=+0.092078964 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:19:14 np0005548789.localdomain podman[326592]: 2025-12-06 10:19:14.197827937 +0000 UTC m=+0.104652436 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:14 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:19:14 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:14.222 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:13Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd30940>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd30250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd30910>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd30070>], id=5d1bde55-cf71-4361-9111-0d2bb82ecc07, ip_allocation=immediate, mac_address=fa:16:3e:22:52:10, name=tempest-NetworksTestDHCPv6-1010733462, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=57, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['59a88728-39f6-47b9-a901-d0a4fab9297e', '84b54cff-c31f-4590-8648-7c950796533e'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:19:09Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=2062, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:19:13Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:19:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:14.288 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:14 np0005548789.localdomain dnsmasq[326591]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 2 addresses
Dec 06 10:19:14 np0005548789.localdomain dnsmasq-dhcp[326591]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:19:14 np0005548789.localdomain dnsmasq-dhcp[326591]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:14 np0005548789.localdomain podman[326629]: 2025-12-06 10:19:14.416038736 +0000 UTC m=+0.060671111 container kill 6858356f985a82d53b77c2c185ff099d70e1233b4e3a6899db120f14fb49b087 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:19:14 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:14.690 263652 INFO neutron.agent.dhcp.agent [None req-67594386-b5ab-4512-8ce1-23ace02a9fb5 - - - - - -] DHCP configuration for ports {'5d1bde55-cf71-4361-9111-0d2bb82ecc07'} is completed
Dec 06 10:19:14 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:19:14 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:19:14 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:19:14 np0005548789.localdomain podman[326667]: 2025-12-06 10:19:14.698211264 +0000 UTC m=+0.056620028 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:14 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:19:14 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/932893422' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:14 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:19:14 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/932893422' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:14 np0005548789.localdomain ceph-mon[298582]: osdmap e161: 6 total, 6 up, 6 in
Dec 06 10:19:14 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/932893422' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:14 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/932893422' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:14 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:14.866 2 INFO neutron.agent.securitygroups_rpc [None req-83d5638e-2f4a-455b-b2d3-487dd6af4b6c a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:19:15 np0005548789.localdomain dnsmasq[326591]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:19:15 np0005548789.localdomain dnsmasq-dhcp[326591]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:19:15 np0005548789.localdomain podman[326707]: 2025-12-06 10:19:15.072190128 +0000 UTC m=+0.060289150 container kill 6858356f985a82d53b77c2c185ff099d70e1233b4e3a6899db120f14fb49b087 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:19:15 np0005548789.localdomain dnsmasq-dhcp[326591]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:15 np0005548789.localdomain ceph-mon[298582]: pgmap v296: 177 pgs: 2 active+clean+snaptrim, 175 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 3.5 KiB/s wr, 90 op/s
Dec 06 10:19:15 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:19:15 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1061312478' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:15 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:19:15 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1061312478' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:15 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:15.912 2 INFO neutron.agent.securitygroups_rpc [None req-ca0b48dc-f1ce-4207-b602-f1515b9dc7e0 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:16 np0005548789.localdomain systemd[1]: tmp-crun.9Tkel7.mount: Deactivated successfully.
Dec 06 10:19:16 np0005548789.localdomain podman[326743]: 2025-12-06 10:19:16.391859356 +0000 UTC m=+0.072457818 container kill 6858356f985a82d53b77c2c185ff099d70e1233b4e3a6899db120f14fb49b087 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:16 np0005548789.localdomain dnsmasq[326591]: exiting on receipt of SIGTERM
Dec 06 10:19:16 np0005548789.localdomain systemd[1]: libpod-6858356f985a82d53b77c2c185ff099d70e1233b4e3a6899db120f14fb49b087.scope: Deactivated successfully.
Dec 06 10:19:16 np0005548789.localdomain podman[326759]: 2025-12-06 10:19:16.453343061 +0000 UTC m=+0.042225701 container died 6858356f985a82d53b77c2c185ff099d70e1233b4e3a6899db120f14fb49b087 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:19:16 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6858356f985a82d53b77c2c185ff099d70e1233b4e3a6899db120f14fb49b087-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:16 np0005548789.localdomain podman[326759]: 2025-12-06 10:19:16.548621371 +0000 UTC m=+0.137504001 container remove 6858356f985a82d53b77c2c185ff099d70e1233b4e3a6899db120f14fb49b087 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:19:16 np0005548789.localdomain systemd[1]: libpod-conmon-6858356f985a82d53b77c2c185ff099d70e1233b4e3a6899db120f14fb49b087.scope: Deactivated successfully.
Dec 06 10:19:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:19:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:19:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:19:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:19:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:19:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:19:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:19:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:19:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:19:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:19:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:19:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:19:16 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1061312478' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:16 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1061312478' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:16 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3547940131' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:16 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3547940131' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:17.123 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:16Z, description=, device_id=99ad1734-7772-49ab-8fb4-302cb49814eb, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9ffd60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9ff580>], id=07d3794f-2f7e-46e2-ac9d-c21c65319152, ip_allocation=immediate, mac_address=fa:16:3e:49:5b:1d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2078, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:19:16Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:19:17 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:17.187 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:19:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-516e33f92756df3ac203a82b130353d936333b23948958938909e6ece2d0c964-merged.mount: Deactivated successfully.
Dec 06 10:19:17 np0005548789.localdomain systemd[1]: tmp-crun.CajdON.mount: Deactivated successfully.
Dec 06 10:19:17 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:19:17 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:19:17 np0005548789.localdomain podman[326833]: 2025-12-06 10:19:17.405879713 +0000 UTC m=+0.135283614 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:19:17 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:19:17 np0005548789.localdomain podman[326860]: 
Dec 06 10:19:17 np0005548789.localdomain podman[326860]: 2025-12-06 10:19:17.475209667 +0000 UTC m=+0.113615798 container create 1ddc436956cf71e3ab28ae1b4ad9296c7ca896a51f81c3e13b4e1305a0e2fdfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:19:17 np0005548789.localdomain systemd[1]: Started libpod-conmon-1ddc436956cf71e3ab28ae1b4ad9296c7ca896a51f81c3e13b4e1305a0e2fdfc.scope.
Dec 06 10:19:17 np0005548789.localdomain podman[326860]: 2025-12-06 10:19:17.423833357 +0000 UTC m=+0.062239528 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:17 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:17 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/82c9b7b4cd83264fe5f4adcacf7a3b93a9ed162b717afd8963740bae1eb50e45/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:17 np0005548789.localdomain podman[326860]: 2025-12-06 10:19:17.549165539 +0000 UTC m=+0.187571670 container init 1ddc436956cf71e3ab28ae1b4ad9296c7ca896a51f81c3e13b4e1305a0e2fdfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:19:17 np0005548789.localdomain podman[326860]: 2025-12-06 10:19:17.559616897 +0000 UTC m=+0.198023028 container start 1ddc436956cf71e3ab28ae1b4ad9296c7ca896a51f81c3e13b4e1305a0e2fdfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:19:17 np0005548789.localdomain dnsmasq[326889]: started, version 2.85 cachesize 150
Dec 06 10:19:17 np0005548789.localdomain dnsmasq[326889]: DNS service limited to local subnets
Dec 06 10:19:17 np0005548789.localdomain dnsmasq[326889]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:17 np0005548789.localdomain dnsmasq[326889]: warning: no upstream servers configured
Dec 06 10:19:17 np0005548789.localdomain dnsmasq-dhcp[326889]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:19:17 np0005548789.localdomain dnsmasq[326889]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:19:17 np0005548789.localdomain dnsmasq-dhcp[326889]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:19:17 np0005548789.localdomain dnsmasq-dhcp[326889]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:17 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:17.660 2 INFO neutron.agent.securitygroups_rpc [None req-09283ffa-3b28-4158-b02d-0d7572bb2b32 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:17.694 263652 INFO neutron.agent.dhcp.agent [None req-1f4df94e-b656-4f75-a32e-bd7515effd96 - - - - - -] DHCP configuration for ports {'07d3794f-2f7e-46e2-ac9d-c21c65319152'} is completed
Dec 06 10:19:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:19:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:17.858 263652 INFO neutron.agent.dhcp.agent [None req-521b72fe-5f63-4cbc-8de6-2fa1e9bfebea - - - - - -] DHCP configuration for ports {'71317000-7e06-4580-adc9-235e7990a2e9', '687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:19:17 np0005548789.localdomain podman[326907]: 2025-12-06 10:19:17.918898453 +0000 UTC m=+0.084008568 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:19:17 np0005548789.localdomain podman[326907]: 2025-12-06 10:19:17.93121851 +0000 UTC m=+0.096328635 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:19:17 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:19:18 np0005548789.localdomain dnsmasq[326889]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:19:18 np0005548789.localdomain dnsmasq-dhcp[326889]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:19:18 np0005548789.localdomain podman[326911]: 2025-12-06 10:19:18.005626354 +0000 UTC m=+0.159816816 container kill 1ddc436956cf71e3ab28ae1b4ad9296c7ca896a51f81c3e13b4e1305a0e2fdfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:19:18 np0005548789.localdomain dnsmasq-dhcp[326889]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:18.110 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:18 np0005548789.localdomain ceph-mon[298582]: pgmap v297: 177 pgs: 2 active+clean+snaptrim, 175 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 2.8 KiB/s wr, 72 op/s
Dec 06 10:19:18 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:18.247 2 INFO neutron.agent.securitygroups_rpc [None req-c5dcea7d-aa1b-4625-8fc4-dc86e9ad2a1a 7365839d5bca455283c571ca0abd33bb 12673f85bb004c3c946338dc70e565e7 - - default default] Security group member updated ['5a014cda-2333-483a-bcd0-2243e387c412']
Dec 06 10:19:18 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:18.277 263652 INFO neutron.agent.dhcp.agent [None req-6cefcc3e-a386-4a51-acd2-374cc98f4b85 - - - - - -] DHCP configuration for ports {'71317000-7e06-4580-adc9-235e7990a2e9', '687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:19:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:19 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:19.278 2 INFO neutron.agent.securitygroups_rpc [None req-7362e19c-e595-471d-b005-9585c5cb5a42 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:19.308 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:20 np0005548789.localdomain ceph-mon[298582]: pgmap v298: 177 pgs: 177 active+clean; 192 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 122 KiB/s rd, 5.1 KiB/s wr, 161 op/s
Dec 06 10:19:20 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:20.869 2 INFO neutron.agent.securitygroups_rpc [None req-8a7361c8-fe6c-42e8-b9eb-90548f1065a0 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:21 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:19:21 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1538185790' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:21 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:19:21 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1538185790' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:21 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1538185790' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:21 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1538185790' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:19:21 np0005548789.localdomain systemd[1]: tmp-crun.uWiS8D.mount: Deactivated successfully.
Dec 06 10:19:21 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:19:21 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:19:21 np0005548789.localdomain podman[326967]: 2025-12-06 10:19:21.861552432 +0000 UTC m=+0.076015746 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:21 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:19:21 np0005548789.localdomain podman[326978]: 2025-12-06 10:19:21.928289271 +0000 UTC m=+0.085492224 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:19:21 np0005548789.localdomain podman[326978]: 2025-12-06 10:19:21.999304483 +0000 UTC m=+0.156507446 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:19:22 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:19:22 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e162 e162: 6 total, 6 up, 6 in
Dec 06 10:19:22 np0005548789.localdomain ceph-mon[298582]: pgmap v299: 177 pgs: 177 active+clean; 192 MiB data, 935 MiB used, 41 GiB / 42 GiB avail; 116 KiB/s rd, 4.9 KiB/s wr, 154 op/s
Dec 06 10:19:22 np0005548789.localdomain ceph-mon[298582]: osdmap e162: 6 total, 6 up, 6 in
Dec 06 10:19:22 np0005548789.localdomain sshd[327014]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:19:22 np0005548789.localdomain dnsmasq[326889]: exiting on receipt of SIGTERM
Dec 06 10:19:22 np0005548789.localdomain podman[327033]: 2025-12-06 10:19:22.937806431 +0000 UTC m=+0.063775521 container kill 1ddc436956cf71e3ab28ae1b4ad9296c7ca896a51f81c3e13b4e1305a0e2fdfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:19:22 np0005548789.localdomain systemd[1]: libpod-1ddc436956cf71e3ab28ae1b4ad9296c7ca896a51f81c3e13b4e1305a0e2fdfc.scope: Deactivated successfully.
Dec 06 10:19:23 np0005548789.localdomain podman[327047]: 2025-12-06 10:19:23.019042014 +0000 UTC m=+0.055602571 container died 1ddc436956cf71e3ab28ae1b4ad9296c7ca896a51f81c3e13b4e1305a0e2fdfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:19:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1ddc436956cf71e3ab28ae1b4ad9296c7ca896a51f81c3e13b4e1305a0e2fdfc-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:23 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-82c9b7b4cd83264fe5f4adcacf7a3b93a9ed162b717afd8963740bae1eb50e45-merged.mount: Deactivated successfully.
Dec 06 10:19:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:23.115 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:23 np0005548789.localdomain podman[327047]: 2025-12-06 10:19:23.118399521 +0000 UTC m=+0.154960028 container remove 1ddc436956cf71e3ab28ae1b4ad9296c7ca896a51f81c3e13b4e1305a0e2fdfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:19:23 np0005548789.localdomain systemd[1]: libpod-conmon-1ddc436956cf71e3ab28ae1b4ad9296c7ca896a51f81c3e13b4e1305a0e2fdfc.scope: Deactivated successfully.
Dec 06 10:19:23 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3020127696' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:23 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3020127696' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:23 np0005548789.localdomain sshd[327014]: Received disconnect from 154.113.10.34 port 41002:11: Bye Bye [preauth]
Dec 06 10:19:23 np0005548789.localdomain sshd[327014]: Disconnected from authenticating user root 154.113.10.34 port 41002 [preauth]
Dec 06 10:19:23 np0005548789.localdomain sshd[327073]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:19:23 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:23.769 2 INFO neutron.agent.securitygroups_rpc [None req-7c634670-f27b-4241-a6ed-35c65bde0f68 a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:19:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:19:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:19:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:19:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:19:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:19:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19268 "" "Go-http-client/1.1"
Dec 06 10:19:24 np0005548789.localdomain ceph-mon[298582]: pgmap v301: 177 pgs: 177 active+clean; 192 MiB data, 939 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 4.1 KiB/s wr, 129 op/s
Dec 06 10:19:24 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3025114459' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:24 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3025114459' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:24.332 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:24 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:24.698 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:24Z, description=, device_id=5e763889-0545-429e-afe5-cf1946a7be48, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9febe0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9fe760>], id=4a3b4a33-39ad-4a3a-af8b-c09fba1ef46f, ip_allocation=immediate, mac_address=fa:16:3e:d7:10:85, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2100, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:19:24Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:19:24 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:19:24 np0005548789.localdomain podman[327119]: 2025-12-06 10:19:24.927439699 +0000 UTC m=+0.053098484 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:19:24 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:19:24 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:19:25 np0005548789.localdomain podman[327163]: 
Dec 06 10:19:25 np0005548789.localdomain podman[327163]: 2025-12-06 10:19:25.164350181 +0000 UTC m=+0.070155416 container create 2c89eb92c9d1573a87f913aba601dadadec7a0bfb9ab8a92d5709b06e4bf27c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:25 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:25.167 263652 INFO neutron.agent.dhcp.agent [None req-cd265b96-18c3-4e54-879c-7483f74d5edc - - - - - -] DHCP configuration for ports {'4a3b4a33-39ad-4a3a-af8b-c09fba1ef46f'} is completed
Dec 06 10:19:25 np0005548789.localdomain systemd[1]: Started libpod-conmon-2c89eb92c9d1573a87f913aba601dadadec7a0bfb9ab8a92d5709b06e4bf27c4.scope.
Dec 06 10:19:25 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:25 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/058ece422527a93ea9cfae4ee7d56c460b45c10e9d7892db0a314b3ee73ae7fb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:25 np0005548789.localdomain podman[327163]: 2025-12-06 10:19:25.131388073 +0000 UTC m=+0.037193308 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:25 np0005548789.localdomain podman[327163]: 2025-12-06 10:19:25.239814417 +0000 UTC m=+0.145619682 container init 2c89eb92c9d1573a87f913aba601dadadec7a0bfb9ab8a92d5709b06e4bf27c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:19:25 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:25.243 2 INFO neutron.agent.securitygroups_rpc [None req-e1143dbb-8340-4dac-af2c-b301e23bde0e a6a9256bca1441629c18003b71ba1c6f 34a17eee71de4bac8b71972a4b7b506c - - default default] Security group member updated ['d618a097-5989-47aa-9263-1c8a114ad269']
Dec 06 10:19:25 np0005548789.localdomain podman[327163]: 2025-12-06 10:19:25.248168692 +0000 UTC m=+0.153973927 container start 2c89eb92c9d1573a87f913aba601dadadec7a0bfb9ab8a92d5709b06e4bf27c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:19:25 np0005548789.localdomain dnsmasq[327181]: started, version 2.85 cachesize 150
Dec 06 10:19:25 np0005548789.localdomain dnsmasq[327181]: DNS service limited to local subnets
Dec 06 10:19:25 np0005548789.localdomain dnsmasq[327181]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:25 np0005548789.localdomain dnsmasq[327181]: warning: no upstream servers configured
Dec 06 10:19:25 np0005548789.localdomain dnsmasq-dhcp[327181]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Dec 06 10:19:25 np0005548789.localdomain dnsmasq-dhcp[327181]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:19:25 np0005548789.localdomain dnsmasq[327181]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:19:25 np0005548789.localdomain dnsmasq-dhcp[327181]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:19:25 np0005548789.localdomain dnsmasq-dhcp[327181]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:25 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:25.309 263652 INFO neutron.agent.dhcp.agent [None req-6e1ebb70-7474-4b90-b690-943c03b4357a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:19:23Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fac1fd0>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fac1af0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fac1070>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fac1370>], id=210a490d-79cd-4308-b6fe-935eca96b08e, ip_allocation=immediate, mac_address=fa:16:3e:4b:fd:6a, name=tempest-NetworksTestDHCPv6-825481970, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:17:28Z, description=, dns_domain=, id=43883dce-1590-48c4-987c-a21b63b82a1c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1975538139, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42818, qos_policy_id=None, revision_number=61, router:external=False, shared=False, standard_attr_id=1415, status=ACTIVE, subnets=['2555de70-b983-4d04-8a68-2427fd11842b', 'b1d2f2d6-9c9e-4054-996f-58f985b37644'], tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:19:18Z, vlan_transparent=None, network_id=43883dce-1590-48c4-987c-a21b63b82a1c, port_security_enabled=True, project_id=34a17eee71de4bac8b71972a4b7b506c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d618a097-5989-47aa-9263-1c8a114ad269'], standard_attr_id=2099, status=DOWN, tags=[], tenant_id=34a17eee71de4bac8b71972a4b7b506c, updated_at=2025-12-06T10:19:23Z on network 43883dce-1590-48c4-987c-a21b63b82a1c
Dec 06 10:19:25 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e163 e163: 6 total, 6 up, 6 in
Dec 06 10:19:25 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:25.476 263652 INFO neutron.agent.dhcp.agent [None req-0c72838e-6c4e-4108-ba4e-af2becef5d48 - - - - - -] DHCP configuration for ports {'687d7abb-e6aa-4047-aa26-552c962fcc91', '71317000-7e06-4580-adc9-235e7990a2e9'} is completed
Dec 06 10:19:25 np0005548789.localdomain dnsmasq[327181]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 2 addresses
Dec 06 10:19:25 np0005548789.localdomain dnsmasq-dhcp[327181]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:19:25 np0005548789.localdomain dnsmasq-dhcp[327181]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:25 np0005548789.localdomain podman[327201]: 2025-12-06 10:19:25.498808965 +0000 UTC m=+0.066176715 container kill 2c89eb92c9d1573a87f913aba601dadadec7a0bfb9ab8a92d5709b06e4bf27c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:19:25 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:25.516 2 INFO neutron.agent.securitygroups_rpc [None req-16b19944-c36d-4221-9d13-f63b2c9f61ac b30ee2b2ade74f9e80de3f1afc291bda 29c573bcf157448abe548893ad01e3d2 - - default default] Security group member updated ['96d14d8a-5f78-4831-ba37-3f88bccdbe58']
Dec 06 10:19:25 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:25.677 2 INFO neutron.agent.securitygroups_rpc [None req-16b19944-c36d-4221-9d13-f63b2c9f61ac b30ee2b2ade74f9e80de3f1afc291bda 29c573bcf157448abe548893ad01e3d2 - - default default] Security group member updated ['96d14d8a-5f78-4831-ba37-3f88bccdbe58']
Dec 06 10:19:25 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:25.692 263652 INFO neutron.agent.dhcp.agent [None req-670b6bc8-d0d9-45e0-9dfa-450619cb000a - - - - - -] DHCP configuration for ports {'210a490d-79cd-4308-b6fe-935eca96b08e'} is completed
Dec 06 10:19:25 np0005548789.localdomain dnsmasq[327181]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:19:25 np0005548789.localdomain dnsmasq-dhcp[327181]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:19:25 np0005548789.localdomain dnsmasq-dhcp[327181]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:25 np0005548789.localdomain podman[327239]: 2025-12-06 10:19:25.799333401 +0000 UTC m=+0.060125209 container kill 2c89eb92c9d1573a87f913aba601dadadec7a0bfb9ab8a92d5709b06e4bf27c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:19:26 np0005548789.localdomain ceph-mon[298582]: pgmap v302: 177 pgs: 177 active+clean; 192 MiB data, 939 MiB used, 41 GiB / 42 GiB avail; 91 KiB/s rd, 3.8 KiB/s wr, 121 op/s
Dec 06 10:19:26 np0005548789.localdomain ceph-mon[298582]: osdmap e163: 6 total, 6 up, 6 in
Dec 06 10:19:26 np0005548789.localdomain dnsmasq[327181]: exiting on receipt of SIGTERM
Dec 06 10:19:26 np0005548789.localdomain podman[327279]: 2025-12-06 10:19:26.569815022 +0000 UTC m=+0.066443942 container kill 2c89eb92c9d1573a87f913aba601dadadec7a0bfb9ab8a92d5709b06e4bf27c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:19:26 np0005548789.localdomain systemd[1]: libpod-2c89eb92c9d1573a87f913aba601dadadec7a0bfb9ab8a92d5709b06e4bf27c4.scope: Deactivated successfully.
Dec 06 10:19:26 np0005548789.localdomain podman[327292]: 2025-12-06 10:19:26.646133945 +0000 UTC m=+0.062823341 container died 2c89eb92c9d1573a87f913aba601dadadec7a0bfb9ab8a92d5709b06e4bf27c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 06 10:19:26 np0005548789.localdomain podman[327292]: 2025-12-06 10:19:26.680017201 +0000 UTC m=+0.096706557 container cleanup 2c89eb92c9d1573a87f913aba601dadadec7a0bfb9ab8a92d5709b06e4bf27c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:19:26 np0005548789.localdomain systemd[1]: libpod-conmon-2c89eb92c9d1573a87f913aba601dadadec7a0bfb9ab8a92d5709b06e4bf27c4.scope: Deactivated successfully.
Dec 06 10:19:26 np0005548789.localdomain podman[327299]: 2025-12-06 10:19:26.727021777 +0000 UTC m=+0.129115537 container remove 2c89eb92c9d1573a87f913aba601dadadec7a0bfb9ab8a92d5709b06e4bf27c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:19:26 np0005548789.localdomain sshd[327073]: Received disconnect from 123.160.164.187 port 38530:11: Bye Bye [preauth]
Dec 06 10:19:26 np0005548789.localdomain sshd[327073]: Disconnected from authenticating user root 123.160.164.187 port 38530 [preauth]
Dec 06 10:19:26 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:26.796 2 INFO neutron.agent.securitygroups_rpc [None req-6fcbbc2a-54c0-4eb0-a7e2-cb02681a4453 b30ee2b2ade74f9e80de3f1afc291bda 29c573bcf157448abe548893ad01e3d2 - - default default] Security group member updated ['96d14d8a-5f78-4831-ba37-3f88bccdbe58']
Dec 06 10:19:26 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:26.830 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-058ece422527a93ea9cfae4ee7d56c460b45c10e9d7892db0a314b3ee73ae7fb-merged.mount: Deactivated successfully.
Dec 06 10:19:26 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2c89eb92c9d1573a87f913aba601dadadec7a0bfb9ab8a92d5709b06e4bf27c4-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:27 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:27.176 2 INFO neutron.agent.securitygroups_rpc [None req-a0866618-9e73-4e70-a70b-4bf19bcc43ec b30ee2b2ade74f9e80de3f1afc291bda 29c573bcf157448abe548893ad01e3d2 - - default default] Security group member updated ['96d14d8a-5f78-4831-ba37-3f88bccdbe58']
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.207619) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016367207707, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2235, "num_deletes": 268, "total_data_size": 4244735, "memory_usage": 4323144, "flush_reason": "Manual Compaction"}
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016367222196, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 2755035, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24477, "largest_seqno": 26707, "table_properties": {"data_size": 2746461, "index_size": 5207, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 19127, "raw_average_key_size": 20, "raw_value_size": 2728650, "raw_average_value_size": 2991, "num_data_blocks": 225, "num_entries": 912, "num_filter_entries": 912, "num_deletions": 268, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016242, "oldest_key_time": 1765016242, "file_creation_time": 1765016367, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 14619 microseconds, and 4385 cpu microseconds.
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.222254) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 2755035 bytes OK
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.222282) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.224284) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.224314) EVENT_LOG_v1 {"time_micros": 1765016367224305, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.224345) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 4234450, prev total WAL file size 4234450, number of live WAL files 2.
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.225469) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303137' seq:72057594037927935, type:22 .. '6C6F676D0034323731' seq:0, type:0; will stop at (end)
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(2690KB)], [39(16MB)]
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016367225536, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 20324186, "oldest_snapshot_seqno": -1}
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 12944 keys, 19835981 bytes, temperature: kUnknown
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016367359250, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 19835981, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19760106, "index_size": 42430, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32389, "raw_key_size": 345814, "raw_average_key_size": 26, "raw_value_size": 19537821, "raw_average_value_size": 1509, "num_data_blocks": 1616, "num_entries": 12944, "num_filter_entries": 12944, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765016367, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.359670) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 19835981 bytes
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.362376) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.8 rd, 148.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 16.8 +0.0 blob) out(18.9 +0.0 blob), read-write-amplify(14.6) write-amplify(7.2) OK, records in: 13495, records dropped: 551 output_compression: NoCompression
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.362416) EVENT_LOG_v1 {"time_micros": 1765016367362397, "job": 22, "event": "compaction_finished", "compaction_time_micros": 133863, "compaction_time_cpu_micros": 54263, "output_level": 6, "num_output_files": 1, "total_output_size": 19835981, "num_input_records": 13495, "num_output_records": 12944, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016367362991, "job": 22, "event": "table_file_deletion", "file_number": 41}
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016367365818, "job": 22, "event": "table_file_deletion", "file_number": 39}
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.225336) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.365918) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.365927) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.365931) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.365934) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:27.365937) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:27 np0005548789.localdomain podman[327374]: 2025-12-06 10:19:27.581971263 +0000 UTC m=+0.094623634 container create bed0476a1e794dd5f1daab1ec80beaa8ead2b2df45feba8ef98898bb4e2ac0d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:27 np0005548789.localdomain systemd[1]: Started libpod-conmon-bed0476a1e794dd5f1daab1ec80beaa8ead2b2df45feba8ef98898bb4e2ac0d1.scope.
Dec 06 10:19:27 np0005548789.localdomain podman[327374]: 2025-12-06 10:19:27.540105433 +0000 UTC m=+0.052757844 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:27 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:27 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08ca1b43792c51b5d171ba38429a148b3fedf4393e22cecad6459d4f9db8f88a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:27 np0005548789.localdomain podman[327374]: 2025-12-06 10:19:27.67743093 +0000 UTC m=+0.190083291 container init bed0476a1e794dd5f1daab1ec80beaa8ead2b2df45feba8ef98898bb4e2ac0d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:19:27 np0005548789.localdomain podman[327374]: 2025-12-06 10:19:27.687706924 +0000 UTC m=+0.200359295 container start bed0476a1e794dd5f1daab1ec80beaa8ead2b2df45feba8ef98898bb4e2ac0d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:19:27 np0005548789.localdomain dnsmasq[327393]: started, version 2.85 cachesize 150
Dec 06 10:19:27 np0005548789.localdomain dnsmasq[327393]: DNS service limited to local subnets
Dec 06 10:19:27 np0005548789.localdomain dnsmasq[327393]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:27 np0005548789.localdomain dnsmasq[327393]: warning: no upstream servers configured
Dec 06 10:19:27 np0005548789.localdomain dnsmasq-dhcp[327393]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Dec 06 10:19:27 np0005548789.localdomain dnsmasq[327393]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/addn_hosts - 0 addresses
Dec 06 10:19:27 np0005548789.localdomain dnsmasq-dhcp[327393]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/host
Dec 06 10:19:27 np0005548789.localdomain dnsmasq-dhcp[327393]: read /var/lib/neutron/dhcp/43883dce-1590-48c4-987c-a21b63b82a1c/opts
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3594078665' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:19:27 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3594078665' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:28.067 263652 INFO neutron.agent.dhcp.agent [None req-69ed11c0-041e-4c0e-84e7-211f55860449 - - - - - -] DHCP configuration for ports {'71317000-7e06-4580-adc9-235e7990a2e9', '687d7abb-e6aa-4047-aa26-552c962fcc91'} is completed
Dec 06 10:19:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:28.115 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:28 np0005548789.localdomain dnsmasq[327393]: exiting on receipt of SIGTERM
Dec 06 10:19:28 np0005548789.localdomain podman[327411]: 2025-12-06 10:19:28.185011825 +0000 UTC m=+0.049592077 container kill bed0476a1e794dd5f1daab1ec80beaa8ead2b2df45feba8ef98898bb4e2ac0d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:19:28 np0005548789.localdomain systemd[1]: libpod-bed0476a1e794dd5f1daab1ec80beaa8ead2b2df45feba8ef98898bb4e2ac0d1.scope: Deactivated successfully.
Dec 06 10:19:28 np0005548789.localdomain podman[327425]: 2025-12-06 10:19:28.240108389 +0000 UTC m=+0.045002507 container died bed0476a1e794dd5f1daab1ec80beaa8ead2b2df45feba8ef98898bb4e2ac0d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:19:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:28 np0005548789.localdomain podman[327425]: 2025-12-06 10:19:28.327909653 +0000 UTC m=+0.132803721 container cleanup bed0476a1e794dd5f1daab1ec80beaa8ead2b2df45feba8ef98898bb4e2ac0d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:28 np0005548789.localdomain systemd[1]: libpod-conmon-bed0476a1e794dd5f1daab1ec80beaa8ead2b2df45feba8ef98898bb4e2ac0d1.scope: Deactivated successfully.
Dec 06 10:19:28 np0005548789.localdomain podman[327432]: 2025-12-06 10:19:28.348710869 +0000 UTC m=+0.139270829 container remove bed0476a1e794dd5f1daab1ec80beaa8ead2b2df45feba8ef98898bb4e2ac0d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-43883dce-1590-48c4-987c-a21b63b82a1c, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:28.361 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:28 np0005548789.localdomain kernel: device tap71317000-7e left promiscuous mode
Dec 06 10:19:28 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:28Z|00367|binding|INFO|Releasing lport 71317000-7e06-4580-adc9-235e7990a2e9 from this chassis (sb_readonly=0)
Dec 06 10:19:28 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:28Z|00368|binding|INFO|Setting lport 71317000-7e06-4580-adc9-235e7990a2e9 down in Southbound
Dec 06 10:19:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:28.373 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe6c:348c/64 2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-43883dce-1590-48c4-987c-a21b63b82a1c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34a17eee71de4bac8b71972a4b7b506c', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=82d32fcc-fa15-458b-9d3c-0c87258ef71d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=71317000-7e06-4580-adc9-235e7990a2e9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:28.375 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 71317000-7e06-4580-adc9-235e7990a2e9 in datapath 43883dce-1590-48c4-987c-a21b63b82a1c unbound from our chassis
Dec 06 10:19:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:28.377 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 43883dce-1590-48c4-987c-a21b63b82a1c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:28 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:28.378 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[a675313e-8b0e-4dea-8ba5-e2aeb7a94ed5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:28 np0005548789.localdomain ceph-mon[298582]: pgmap v304: 177 pgs: 177 active+clean; 192 MiB data, 939 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 2.2 KiB/s wr, 57 op/s
Dec 06 10:19:28 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3594078665' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:28 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3594078665' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:28.385 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:28.766 263652 INFO neutron.agent.dhcp.agent [None req-7309e927-c81a-4a1b-981d-b71b7dc7f8d0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:28 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:19:28 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:19:28 np0005548789.localdomain podman[327474]: 2025-12-06 10:19:28.773114922 +0000 UTC m=+0.064817592 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:19:28 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:19:28 np0005548789.localdomain systemd[1]: tmp-crun.u48Zvq.mount: Deactivated successfully.
Dec 06 10:19:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-08ca1b43792c51b5d171ba38429a148b3fedf4393e22cecad6459d4f9db8f88a-merged.mount: Deactivated successfully.
Dec 06 10:19:28 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bed0476a1e794dd5f1daab1ec80beaa8ead2b2df45feba8ef98898bb4e2ac0d1-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:28 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d43883dce\x2d1590\x2d48c4\x2d987c\x2da21b63b82a1c.mount: Deactivated successfully.
Dec 06 10:19:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:29.368 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:29 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:29.372 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:29 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3743484833' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:29 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3743484833' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:29 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:29Z|00369|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:19:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:29.607 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:30 np0005548789.localdomain ceph-mon[298582]: pgmap v305: 177 pgs: 177 active+clean; 192 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 84 KiB/s rd, 5.2 KiB/s wr, 116 op/s
Dec 06 10:19:30 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:19:30 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4147539828' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:30 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4147539828' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:31 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:31.272 2 INFO neutron.agent.securitygroups_rpc [None req-33076df9-23c5-4745-bba5-728ca02b1a7f b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:31 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3837e3e6-e965-4096-a547-bca7edbdb76b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:19:31 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3837e3e6-e965-4096-a547-bca7edbdb76b", "format": "json"}]: dispatch
Dec 06 10:19:32 np0005548789.localdomain sshd[327496]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:19:32 np0005548789.localdomain ceph-mon[298582]: pgmap v306: 177 pgs: 177 active+clean; 192 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 75 KiB/s rd, 4.7 KiB/s wr, 103 op/s
Dec 06 10:19:32 np0005548789.localdomain sshd[327496]: Received disconnect from 64.227.102.57 port 60232:11: Bye Bye [preauth]
Dec 06 10:19:32 np0005548789.localdomain sshd[327496]: Disconnected from authenticating user root 64.227.102.57 port 60232 [preauth]
Dec 06 10:19:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:19:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:19:32 np0005548789.localdomain podman[327499]: 2025-12-06 10:19:32.940913673 +0000 UTC m=+0.096498461 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:19:32 np0005548789.localdomain podman[327499]: 2025-12-06 10:19:32.978701088 +0000 UTC m=+0.134285846 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:19:32 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:19:32 np0005548789.localdomain podman[327498]: 2025-12-06 10:19:32.993793869 +0000 UTC m=+0.151836162 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 10:19:33 np0005548789.localdomain podman[327498]: 2025-12-06 10:19:33.001134164 +0000 UTC m=+0.159176477 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:19:33 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:19:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:33.118 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:33.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:33.221 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:19:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:33.222 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:19:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:33.222 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:19:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:33.222 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:19:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:33.222 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:19:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:19:33 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1956276203' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:33.684 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:19:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:33.765 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:19:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:33.766 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:19:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:33.974 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:19:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:33.976 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11239MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:19:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:33.977 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:19:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:33.977 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:19:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:34.084 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:19:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:34.085 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:19:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:34.085 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:19:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:34.138 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:19:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:34.370 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:34 np0005548789.localdomain ceph-mon[298582]: pgmap v307: 177 pgs: 177 active+clean; 192 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 5.6 KiB/s wr, 84 op/s
Dec 06 10:19:34 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1956276203' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:34 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:19:34 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2644755643' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:34.598 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:19:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:34.604 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:19:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:34.655 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:19:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:34.658 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:19:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:34.659 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:19:35 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:35.112 2 INFO neutron.agent.securitygroups_rpc [None req-dad18757-c8ae-4573-92a7-49e2b9f564ab a31577503edf4745abb112adc3113276 90bd35d6ab7c40c58d9d1d61ff7a12d3 - - default default] Security group member updated ['1fabbc74-497e-44d5-8d22-97b341de2968']
Dec 06 10:19:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:35.250 263652 INFO neutron.agent.linux.ip_lib [None req-273b9099-78e2-422b-a659-0c74343e45f5 - - - - - -] Device tap972f93d0-ef cannot be used as it has no MAC address
Dec 06 10:19:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:35.274 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:35 np0005548789.localdomain kernel: device tap972f93d0-ef entered promiscuous mode
Dec 06 10:19:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:35.282 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:35 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016375.2870] manager: (tap972f93d0-ef): new Generic device (/org/freedesktop/NetworkManager/Devices/61)
Dec 06 10:19:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:35Z|00370|binding|INFO|Claiming lport 972f93d0-ef12-4f24-a9a3-a699348b3358 for this chassis.
Dec 06 10:19:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:35Z|00371|binding|INFO|972f93d0-ef12-4f24-a9a3-a699348b3358: Claiming unknown
Dec 06 10:19:35 np0005548789.localdomain systemd-udevd[327593]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:19:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:35.309 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::1/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-5a779660-e992-4a3c-97a9-04be836f7fcf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5a779660-e992-4a3c-97a9-04be836f7fcf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '29c573bcf157448abe548893ad01e3d2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd5b3e84-a0b8-4106-a60c-6b065c1db991, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=972f93d0-ef12-4f24-a9a3-a699348b3358) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:35.311 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 972f93d0-ef12-4f24-a9a3-a699348b3358 in datapath 5a779660-e992-4a3c-97a9-04be836f7fcf bound to our chassis
Dec 06 10:19:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:35.314 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5a779660-e992-4a3c-97a9-04be836f7fcf or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:19:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:35.316 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[750721b6-fdfd-4310-bf44-bc060cf5bfdc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:35Z|00372|binding|INFO|Setting lport 972f93d0-ef12-4f24-a9a3-a699348b3358 ovn-installed in OVS
Dec 06 10:19:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:35Z|00373|binding|INFO|Setting lport 972f93d0-ef12-4f24-a9a3-a699348b3358 up in Southbound
Dec 06 10:19:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:35.319 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap972f93d0-ef: No such device
Dec 06 10:19:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap972f93d0-ef: No such device
Dec 06 10:19:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap972f93d0-ef: No such device
Dec 06 10:19:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap972f93d0-ef: No such device
Dec 06 10:19:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap972f93d0-ef: No such device
Dec 06 10:19:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap972f93d0-ef: No such device
Dec 06 10:19:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:35.356 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap972f93d0-ef: No such device
Dec 06 10:19:35 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap972f93d0-ef: No such device
Dec 06 10:19:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:35.387 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:35 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:35.418 2 INFO neutron.agent.securitygroups_rpc [None req-dad18757-c8ae-4573-92a7-49e2b9f564ab a31577503edf4745abb112adc3113276 90bd35d6ab7c40c58d9d1d61ff7a12d3 - - default default] Security group member updated ['1fabbc74-497e-44d5-8d22-97b341de2968']
Dec 06 10:19:35 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3837e3e6-e965-4096-a547-bca7edbdb76b", "format": "json"}]: dispatch
Dec 06 10:19:35 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3837e3e6-e965-4096-a547-bca7edbdb76b", "force": true, "format": "json"}]: dispatch
Dec 06 10:19:35 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2644755643' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:35Z|00374|binding|INFO|Removing iface tap972f93d0-ef ovn-installed in OVS
Dec 06 10:19:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:35Z|00375|binding|INFO|Removing lport 972f93d0-ef12-4f24-a9a3-a699348b3358 ovn-installed in OVS
Dec 06 10:19:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:35.913 160509 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 740f31b0-86f5-42a1-89fc-5a7cfe5f636e with type ""
Dec 06 10:19:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:35.915 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:35.915 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::1/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-5a779660-e992-4a3c-97a9-04be836f7fcf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5a779660-e992-4a3c-97a9-04be836f7fcf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '29c573bcf157448abe548893ad01e3d2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd5b3e84-a0b8-4106-a60c-6b065c1db991, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=972f93d0-ef12-4f24-a9a3-a699348b3358) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:35.918 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 972f93d0-ef12-4f24-a9a3-a699348b3358 in datapath 5a779660-e992-4a3c-97a9-04be836f7fcf unbound from our chassis
Dec 06 10:19:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:35.920 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5a779660-e992-4a3c-97a9-04be836f7fcf or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:19:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:35.921 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[f62d5ccc-e875-4e21-badf-38f0cba563e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:35.923 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:36 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:36.139 2 INFO neutron.agent.securitygroups_rpc [None req-792339ab-c7cd-409a-a342-ae21c75c2ee5 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:19:36 np0005548789.localdomain podman[327664]: 
Dec 06 10:19:36 np0005548789.localdomain podman[327664]: 2025-12-06 10:19:36.306224635 +0000 UTC m=+0.099043149 container create 1541aa106705a20c01b44e91351c5106244b5f8e047bafeec8275898714300ed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a779660-e992-4a3c-97a9-04be836f7fcf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:19:36 np0005548789.localdomain systemd[1]: Started libpod-conmon-1541aa106705a20c01b44e91351c5106244b5f8e047bafeec8275898714300ed.scope.
Dec 06 10:19:36 np0005548789.localdomain podman[327664]: 2025-12-06 10:19:36.258495656 +0000 UTC m=+0.051314210 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:36 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:36 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ac63c69befceb1700e6f815e6fb9f6bb1f8c8c5843c6564c9613824e209b1c6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:36 np0005548789.localdomain podman[327664]: 2025-12-06 10:19:36.378431092 +0000 UTC m=+0.171249596 container init 1541aa106705a20c01b44e91351c5106244b5f8e047bafeec8275898714300ed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a779660-e992-4a3c-97a9-04be836f7fcf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:19:36 np0005548789.localdomain podman[327664]: 2025-12-06 10:19:36.388066156 +0000 UTC m=+0.180884670 container start 1541aa106705a20c01b44e91351c5106244b5f8e047bafeec8275898714300ed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a779660-e992-4a3c-97a9-04be836f7fcf, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:19:36 np0005548789.localdomain dnsmasq[327682]: started, version 2.85 cachesize 150
Dec 06 10:19:36 np0005548789.localdomain dnsmasq[327682]: DNS service limited to local subnets
Dec 06 10:19:36 np0005548789.localdomain dnsmasq[327682]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:36 np0005548789.localdomain dnsmasq[327682]: warning: no upstream servers configured
Dec 06 10:19:36 np0005548789.localdomain dnsmasq-dhcp[327682]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Dec 06 10:19:36 np0005548789.localdomain dnsmasq[327682]: read /var/lib/neutron/dhcp/5a779660-e992-4a3c-97a9-04be836f7fcf/addn_hosts - 0 addresses
Dec 06 10:19:36 np0005548789.localdomain dnsmasq-dhcp[327682]: read /var/lib/neutron/dhcp/5a779660-e992-4a3c-97a9-04be836f7fcf/host
Dec 06 10:19:36 np0005548789.localdomain dnsmasq-dhcp[327682]: read /var/lib/neutron/dhcp/5a779660-e992-4a3c-97a9-04be836f7fcf/opts
Dec 06 10:19:36 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:36.456 263652 INFO neutron.agent.dhcp.agent [None req-90b892c8-0506-4dbe-af3a-d4a70d2244e9 - - - - - -] Synchronizing state
Dec 06 10:19:36 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:36.470 2 INFO neutron.agent.securitygroups_rpc [None req-21429926-074a-46a0-a4f4-611f2e364131 a31577503edf4745abb112adc3113276 90bd35d6ab7c40c58d9d1d61ff7a12d3 - - default default] Security group member updated ['1fabbc74-497e-44d5-8d22-97b341de2968']
Dec 06 10:19:36 np0005548789.localdomain ceph-mon[298582]: pgmap v308: 177 pgs: 177 active+clean; 192 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 5.6 KiB/s wr, 84 op/s
Dec 06 10:19:36 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:36Z|00376|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:19:36 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:36.564 263652 INFO neutron.agent.dhcp.agent [None req-e1eef1f3-ad01-420a-aeea-e1814b5e1031 - - - - - -] DHCP configuration for ports {'84119ae3-1fc0-42ee-88f4-2202a230930e'} is completed
Dec 06 10:19:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:36.580 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:36 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:36.757 263652 INFO neutron.agent.dhcp.agent [None req-cca5e3b7-9de6-4346-ba1d-d46e49c59ff8 - - - - - -] All active networks have been fetched through RPC.
Dec 06 10:19:36 np0005548789.localdomain dnsmasq[327682]: exiting on receipt of SIGTERM
Dec 06 10:19:36 np0005548789.localdomain podman[327700]: 2025-12-06 10:19:36.947511478 +0000 UTC m=+0.063637387 container kill 1541aa106705a20c01b44e91351c5106244b5f8e047bafeec8275898714300ed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a779660-e992-4a3c-97a9-04be836f7fcf, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 06 10:19:36 np0005548789.localdomain systemd[1]: libpod-1541aa106705a20c01b44e91351c5106244b5f8e047bafeec8275898714300ed.scope: Deactivated successfully.
Dec 06 10:19:37 np0005548789.localdomain podman[327713]: 2025-12-06 10:19:37.025601675 +0000 UTC m=+0.061641175 container died 1541aa106705a20c01b44e91351c5106244b5f8e047bafeec8275898714300ed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a779660-e992-4a3c-97a9-04be836f7fcf, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:19:37 np0005548789.localdomain podman[327713]: 2025-12-06 10:19:37.073502609 +0000 UTC m=+0.109542079 container cleanup 1541aa106705a20c01b44e91351c5106244b5f8e047bafeec8275898714300ed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a779660-e992-4a3c-97a9-04be836f7fcf, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:37 np0005548789.localdomain systemd[1]: libpod-conmon-1541aa106705a20c01b44e91351c5106244b5f8e047bafeec8275898714300ed.scope: Deactivated successfully.
Dec 06 10:19:37 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:37.107 2 INFO neutron.agent.securitygroups_rpc [None req-bb88ef2d-64f1-4b09-a81f-2bd8c1d4b6c6 a31577503edf4745abb112adc3113276 90bd35d6ab7c40c58d9d1d61ff7a12d3 - - default default] Security group member updated ['1fabbc74-497e-44d5-8d22-97b341de2968']
Dec 06 10:19:37 np0005548789.localdomain podman[327715]: 2025-12-06 10:19:37.150784121 +0000 UTC m=+0.178755396 container remove 1541aa106705a20c01b44e91351c5106244b5f8e047bafeec8275898714300ed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5a779660-e992-4a3c-97a9-04be836f7fcf, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:19:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:37.161 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:37 np0005548789.localdomain kernel: device tap972f93d0-ef left promiscuous mode
Dec 06 10:19:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:37.175 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:37.199 263652 INFO neutron.agent.dhcp.agent [-] Starting network 64b8068a-5126-4521-be60-754a588ea213 dhcp configuration
Dec 06 10:19:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:37.200 263652 INFO neutron.agent.dhcp.agent [-] Finished network 64b8068a-5126-4521-be60-754a588ea213 dhcp configuration
Dec 06 10:19:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:37.200 263652 INFO neutron.agent.dhcp.agent [-] Starting network 86cd7531-ca23-4747-83b1-28bcd175a277 dhcp configuration
Dec 06 10:19:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:37.201 263652 INFO neutron.agent.dhcp.agent [-] Finished network 86cd7531-ca23-4747-83b1-28bcd175a277 dhcp configuration
Dec 06 10:19:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:37.201 263652 INFO neutron.agent.dhcp.agent [-] Starting network ba9cb6a7-7d80-4f37-aa3f-eaee69fb8585 dhcp configuration
Dec 06 10:19:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:37.202 263652 INFO neutron.agent.dhcp.agent [-] Finished network ba9cb6a7-7d80-4f37-aa3f-eaee69fb8585 dhcp configuration
Dec 06 10:19:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:37.203 263652 INFO neutron.agent.dhcp.agent [None req-52fc0508-2b36-4e56-82d0-a682e93b4cc9 - - - - - -] Synchronizing state complete
Dec 06 10:19:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:37.203 263652 INFO neutron.agent.dhcp.agent [None req-273b9099-78e2-422b-a659-0c74343e45f5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:37.204 263652 INFO neutron.agent.dhcp.agent [None req-273b9099-78e2-422b-a659-0c74343e45f5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:37.204 263652 INFO neutron.agent.dhcp.agent [None req-ea612e7a-694f-480c-a463-f4911cd2ede5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:37.204 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2ac63c69befceb1700e6f815e6fb9f6bb1f8c8c5843c6564c9613824e209b1c6-merged.mount: Deactivated successfully.
Dec 06 10:19:37 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1541aa106705a20c01b44e91351c5106244b5f8e047bafeec8275898714300ed-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:37 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d5a779660\x2de992\x2d4a3c\x2d97a9\x2d04be836f7fcf.mount: Deactivated successfully.
Dec 06 10:19:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:37.358 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:37 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7a05360b-59a7-495e-a884-ff87c0880377", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:19:37 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7a05360b-59a7-495e-a884-ff87c0880377", "format": "json"}]: dispatch
Dec 06 10:19:37 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:19:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:38.121 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:38 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:38.294 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:38 np0005548789.localdomain ceph-mon[298582]: pgmap v309: 177 pgs: 177 active+clean; 192 MiB data, 923 MiB used, 41 GiB / 42 GiB avail; 1.8 MiB/s rd, 4.7 KiB/s wr, 71 op/s
Dec 06 10:19:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:38.657 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:39.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:39.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:19:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:39.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:19:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:39.419 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:39.543 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:19:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:39.543 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:19:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:39.544 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:19:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:39.544 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:19:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3512849964' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3512849964' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:40 np0005548789.localdomain ceph-mon[298582]: pgmap v310: 177 pgs: 177 active+clean; 238 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 1.8 MiB/s rd, 1.8 MiB/s wr, 107 op/s
Dec 06 10:19:40 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-659509012", "format": "json"} : dispatch
Dec 06 10:19:40 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-659509012", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a05360b-59a7-495e-a884-ff87c0880377/50397115-0c2d-4191-896c-db5fe71ed3ba", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a05360b-59a7-495e-a884-ff87c0880377", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:19:40 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-659509012", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a05360b-59a7-495e-a884-ff87c0880377/50397115-0c2d-4191-896c-db5fe71ed3ba", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a05360b-59a7-495e-a884-ff87c0880377", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:19:40 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-659509012", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a05360b-59a7-495e-a884-ff87c0880377/50397115-0c2d-4191-896c-db5fe71ed3ba", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a05360b-59a7-495e-a884-ff87c0880377", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:19:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:40.828 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:19:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:19:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:19:40 np0005548789.localdomain podman[327742]: 2025-12-06 10:19:40.93216835 +0000 UTC m=+0.089852267 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, build-date=2025-08-20T13:12:41, config_id=edpm, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=)
Dec 06 10:19:40 np0005548789.localdomain podman[327742]: 2025-12-06 10:19:40.948647844 +0000 UTC m=+0.106331731 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64)
Dec 06 10:19:40 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:19:41 np0005548789.localdomain podman[327743]: 2025-12-06 10:19:41.030752484 +0000 UTC m=+0.185788940 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:19:41 np0005548789.localdomain podman[327743]: 2025-12-06 10:19:41.07218819 +0000 UTC m=+0.227224626 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:19:41 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:19:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:41.136 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:19:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:41.137 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:19:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:41.138 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:41.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:41.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:41.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:41.180 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:19:41 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "7a05360b-59a7-495e-a884-ff87c0880377", "auth_id": "tempest-cephx-id-659509012", "tenant_id": "d694f30d513746329568207534277c9c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:19:41 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/467307692' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:41 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/467307692' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-659509012", "format": "json"} : dispatch
Dec 06 10:19:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-659509012"} : dispatch
Dec 06 10:19:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-659509012"} : dispatch
Dec 06 10:19:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-659509012"}]': finished
Dec 06 10:19:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:42.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:42 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:19:42 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/139921279' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:42 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:19:42 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/139921279' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:42 np0005548789.localdomain ceph-mon[298582]: pgmap v311: 177 pgs: 177 active+clean; 238 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 67 op/s
Dec 06 10:19:42 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "7a05360b-59a7-495e-a884-ff87c0880377", "auth_id": "tempest-cephx-id-659509012", "format": "json"}]: dispatch
Dec 06 10:19:42 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "7a05360b-59a7-495e-a884-ff87c0880377", "auth_id": "tempest-cephx-id-659509012", "format": "json"}]: dispatch
Dec 06 10:19:42 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7a05360b-59a7-495e-a884-ff87c0880377", "format": "json"}]: dispatch
Dec 06 10:19:42 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7a05360b-59a7-495e-a884-ff87c0880377", "force": true, "format": "json"}]: dispatch
Dec 06 10:19:42 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:19:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/139921279' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:19:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/139921279' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:19:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:43.154 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:43 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:43.623 2 INFO neutron.agent.securitygroups_rpc [None req-f21d32c3-41e3-465d-a5ba-39b4b631a0c1 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['ae1eaa44-7360-485a-b85b-f1bfb95ce20b']
Dec 06 10:19:43 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d55ecf85-9ebf-4a13-b414-b26993895e14", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:19:43 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d55ecf85-9ebf-4a13-b414-b26993895e14", "format": "json"}]: dispatch
Dec 06 10:19:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:44.454 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:44 np0005548789.localdomain ceph-mon[298582]: pgmap v312: 177 pgs: 177 active+clean; 192 MiB data, 927 MiB used, 41 GiB / 42 GiB avail; 1.8 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 06 10:19:44 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1258063083' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:19:44 np0005548789.localdomain podman[327777]: 2025-12-06 10:19:44.916679038 +0000 UTC m=+0.078472089 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2)
Dec 06 10:19:44 np0005548789.localdomain podman[327777]: 2025-12-06 10:19:44.955293499 +0000 UTC m=+0.117086560 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:44 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:19:45 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:45.006 263652 INFO neutron.agent.linux.ip_lib [None req-d2a896ef-8f87-4210-8e7e-42e780412f4b - - - - - -] Device tapf66469ad-cc cannot be used as it has no MAC address
Dec 06 10:19:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:45.034 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:45 np0005548789.localdomain kernel: device tapf66469ad-cc entered promiscuous mode
Dec 06 10:19:45 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:45Z|00377|binding|INFO|Claiming lport f66469ad-cca4-4e75-8ad1-16dcdb97964a for this chassis.
Dec 06 10:19:45 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:45Z|00378|binding|INFO|f66469ad-cca4-4e75-8ad1-16dcdb97964a: Claiming unknown
Dec 06 10:19:45 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016385.0424] manager: (tapf66469ad-cc): new Generic device (/org/freedesktop/NetworkManager/Devices/62)
Dec 06 10:19:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:45.042 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:45 np0005548789.localdomain systemd-udevd[327807]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:19:45 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf66469ad-cc: No such device
Dec 06 10:19:45 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf66469ad-cc: No such device
Dec 06 10:19:45 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:45Z|00379|binding|INFO|Setting lport f66469ad-cca4-4e75-8ad1-16dcdb97964a ovn-installed in OVS
Dec 06 10:19:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:45.081 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:45 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf66469ad-cc: No such device
Dec 06 10:19:45 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf66469ad-cc: No such device
Dec 06 10:19:45 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf66469ad-cc: No such device
Dec 06 10:19:45 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf66469ad-cc: No such device
Dec 06 10:19:45 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf66469ad-cc: No such device
Dec 06 10:19:45 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapf66469ad-cc: No such device
Dec 06 10:19:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:45.126 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:45.154 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:45.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:45 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:45Z|00380|binding|INFO|Setting lport f66469ad-cca4-4e75-8ad1-16dcdb97964a up in Southbound
Dec 06 10:19:45 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:45.482 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-3dc43717-9c00-4de5-8dc8-b5288e2abad9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3dc43717-9c00-4de5-8dc8-b5288e2abad9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90bd35d6ab7c40c58d9d1d61ff7a12d3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7698555-bf3e-4a92-a2b0-48becfd360ed, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=f66469ad-cca4-4e75-8ad1-16dcdb97964a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:45 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:45.486 160509 INFO neutron.agent.ovn.metadata.agent [-] Port f66469ad-cca4-4e75-8ad1-16dcdb97964a in datapath 3dc43717-9c00-4de5-8dc8-b5288e2abad9 bound to our chassis
Dec 06 10:19:45 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:45.489 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port aa67f9cf-8b12-49fc-b2c3-d95c1add9eae IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:19:45 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:45.489 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3dc43717-9c00-4de5-8dc8-b5288e2abad9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:45 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:45.493 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[6f43c383-e81b-4700-a6fb-65e143ad9531]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:45 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2619362809' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:46 np0005548789.localdomain podman[327878]: 
Dec 06 10:19:46 np0005548789.localdomain podman[327878]: 2025-12-06 10:19:46.105724716 +0000 UTC m=+0.098176952 container create c31077edc184a24e36da53f7b38bbed4198fa3943c8edd21da2fe4e57d6bbbfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3dc43717-9c00-4de5-8dc8-b5288e2abad9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:19:46 np0005548789.localdomain podman[327878]: 2025-12-06 10:19:46.055588413 +0000 UTC m=+0.048040699 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:46 np0005548789.localdomain systemd[1]: Started libpod-conmon-c31077edc184a24e36da53f7b38bbed4198fa3943c8edd21da2fe4e57d6bbbfc.scope.
Dec 06 10:19:46 np0005548789.localdomain systemd[1]: tmp-crun.Apo2WE.mount: Deactivated successfully.
Dec 06 10:19:46 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:46 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d40a863adfa6ac1229342b29110117480605cd9c7a0dd59af495be4287ab44f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:46 np0005548789.localdomain podman[327878]: 2025-12-06 10:19:46.207852417 +0000 UTC m=+0.200304663 container init c31077edc184a24e36da53f7b38bbed4198fa3943c8edd21da2fe4e57d6bbbfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3dc43717-9c00-4de5-8dc8-b5288e2abad9, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:46 np0005548789.localdomain podman[327878]: 2025-12-06 10:19:46.219717141 +0000 UTC m=+0.212169377 container start c31077edc184a24e36da53f7b38bbed4198fa3943c8edd21da2fe4e57d6bbbfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3dc43717-9c00-4de5-8dc8-b5288e2abad9, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:19:46 np0005548789.localdomain dnsmasq[327896]: started, version 2.85 cachesize 150
Dec 06 10:19:46 np0005548789.localdomain dnsmasq[327896]: DNS service limited to local subnets
Dec 06 10:19:46 np0005548789.localdomain dnsmasq[327896]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:46 np0005548789.localdomain dnsmasq[327896]: warning: no upstream servers configured
Dec 06 10:19:46 np0005548789.localdomain dnsmasq-dhcp[327896]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:19:46 np0005548789.localdomain dnsmasq[327896]: read /var/lib/neutron/dhcp/3dc43717-9c00-4de5-8dc8-b5288e2abad9/addn_hosts - 0 addresses
Dec 06 10:19:46 np0005548789.localdomain dnsmasq-dhcp[327896]: read /var/lib/neutron/dhcp/3dc43717-9c00-4de5-8dc8-b5288e2abad9/host
Dec 06 10:19:46 np0005548789.localdomain dnsmasq-dhcp[327896]: read /var/lib/neutron/dhcp/3dc43717-9c00-4de5-8dc8-b5288e2abad9/opts
Dec 06 10:19:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:19:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:19:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:19:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:19:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:19:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:19:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:19:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:19:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:19:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:19:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:19:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:19:46 np0005548789.localdomain ceph-mon[298582]: pgmap v313: 177 pgs: 177 active+clean; 192 MiB data, 927 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 1.8 MiB/s wr, 70 op/s
Dec 06 10:19:46 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d55ecf85-9ebf-4a13-b414-b26993895e14", "format": "json"}]: dispatch
Dec 06 10:19:46 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d55ecf85-9ebf-4a13-b414-b26993895e14", "force": true, "format": "json"}]: dispatch
Dec 06 10:19:46 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2321015285' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:46 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:46.789 263652 INFO neutron.agent.dhcp.agent [None req-40d3a584-ba15-47e8-92c6-15a36d08dec5 - - - - - -] DHCP configuration for ports {'62c4ed28-0829-462e-ad0f-ed2a041f9945'} is completed
Dec 06 10:19:47 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:47.244 263652 INFO neutron.agent.linux.ip_lib [None req-d91cf00b-8b62-45e5-aaf8-aca75f50b829 - - - - - -] Device tap1ca7855c-cd cannot be used as it has no MAC address
Dec 06 10:19:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:47.263 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:47 np0005548789.localdomain kernel: device tap1ca7855c-cd entered promiscuous mode
Dec 06 10:19:47 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016387.2717] manager: (tap1ca7855c-cd): new Generic device (/org/freedesktop/NetworkManager/Devices/63)
Dec 06 10:19:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:47Z|00381|binding|INFO|Claiming lport 1ca7855c-cd02-499a-a723-f901eb28ad76 for this chassis.
Dec 06 10:19:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:47Z|00382|binding|INFO|1ca7855c-cd02-499a-a723-f901eb28ad76: Claiming unknown
Dec 06 10:19:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:47.274 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1ca7855c-cd: No such device
Dec 06 10:19:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:47.308 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:19:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:47.308 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:19:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:47.309 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:19:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1ca7855c-cd: No such device
Dec 06 10:19:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:47Z|00383|binding|INFO|Setting lport 1ca7855c-cd02-499a-a723-f901eb28ad76 ovn-installed in OVS
Dec 06 10:19:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:47.314 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1ca7855c-cd: No such device
Dec 06 10:19:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1ca7855c-cd: No such device
Dec 06 10:19:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1ca7855c-cd: No such device
Dec 06 10:19:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1ca7855c-cd: No such device
Dec 06 10:19:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1ca7855c-cd: No such device
Dec 06 10:19:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1ca7855c-cd: No such device
Dec 06 10:19:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:47.351 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:47Z|00384|binding|INFO|Setting lport 1ca7855c-cd02-499a-a723-f901eb28ad76 up in Southbound
Dec 06 10:19:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:47.383 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-1898c940-0651-45db-aebd-630d54fbe329', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1898c940-0651-45db-aebd-630d54fbe329', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f00ab5f7d934f62991ed1e7e798e47e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c44abc1-74e7-483f-a478-b580dd3fd31f, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=1ca7855c-cd02-499a-a723-f901eb28ad76) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:47.385 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 1ca7855c-cd02-499a-a723-f901eb28ad76 in datapath 1898c940-0651-45db-aebd-630d54fbe329 bound to our chassis
Dec 06 10:19:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:47.386 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1898c940-0651-45db-aebd-630d54fbe329 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:19:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:47.387 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[b428d420-dbb6-4ae6-bc9b-0fd85ba32eff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:47.396 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:48 np0005548789.localdomain podman[327971]: 2025-12-06 10:19:48.038890379 +0000 UTC m=+0.074508469 container kill c31077edc184a24e36da53f7b38bbed4198fa3943c8edd21da2fe4e57d6bbbfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3dc43717-9c00-4de5-8dc8-b5288e2abad9, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:19:48 np0005548789.localdomain dnsmasq[327896]: exiting on receipt of SIGTERM
Dec 06 10:19:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:19:48 np0005548789.localdomain systemd[1]: tmp-crun.su1KFg.mount: Deactivated successfully.
Dec 06 10:19:48 np0005548789.localdomain systemd[1]: libpod-c31077edc184a24e36da53f7b38bbed4198fa3943c8edd21da2fe4e57d6bbbfc.scope: Deactivated successfully.
Dec 06 10:19:48 np0005548789.localdomain podman[327983]: 2025-12-06 10:19:48.12040348 +0000 UTC m=+0.064271835 container died c31077edc184a24e36da53f7b38bbed4198fa3943c8edd21da2fe4e57d6bbbfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3dc43717-9c00-4de5-8dc8-b5288e2abad9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:19:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c31077edc184a24e36da53f7b38bbed4198fa3943c8edd21da2fe4e57d6bbbfc-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:48 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4d40a863adfa6ac1229342b29110117480605cd9c7a0dd59af495be4287ab44f-merged.mount: Deactivated successfully.
Dec 06 10:19:48 np0005548789.localdomain podman[327983]: 2025-12-06 10:19:48.147695565 +0000 UTC m=+0.091563880 container cleanup c31077edc184a24e36da53f7b38bbed4198fa3943c8edd21da2fe4e57d6bbbfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3dc43717-9c00-4de5-8dc8-b5288e2abad9, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:19:48 np0005548789.localdomain systemd[1]: libpod-conmon-c31077edc184a24e36da53f7b38bbed4198fa3943c8edd21da2fe4e57d6bbbfc.scope: Deactivated successfully.
Dec 06 10:19:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:48.198 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:48 np0005548789.localdomain podman[327985]: 2025-12-06 10:19:48.218709865 +0000 UTC m=+0.153899405 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:19:48 np0005548789.localdomain ceph-mon[298582]: pgmap v314: 177 pgs: 177 active+clean; 192 MiB data, 927 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 1.8 MiB/s wr, 70 op/s
Dec 06 10:19:48 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2332530373' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:19:48 np0005548789.localdomain podman[327992]: 2025-12-06 10:19:48.304782267 +0000 UTC m=+0.231698694 container remove c31077edc184a24e36da53f7b38bbed4198fa3943c8edd21da2fe4e57d6bbbfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3dc43717-9c00-4de5-8dc8-b5288e2abad9, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:19:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:48.317 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:48 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:48Z|00385|binding|INFO|Releasing lport f66469ad-cca4-4e75-8ad1-16dcdb97964a from this chassis (sb_readonly=0)
Dec 06 10:19:48 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:48Z|00386|binding|INFO|Setting lport f66469ad-cca4-4e75-8ad1-16dcdb97964a down in Southbound
Dec 06 10:19:48 np0005548789.localdomain kernel: device tapf66469ad-cc left promiscuous mode
Dec 06 10:19:48 np0005548789.localdomain podman[327985]: 2025-12-06 10:19:48.328576844 +0000 UTC m=+0.263766424 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:19:48 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:19:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:48.350 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:48.366 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-3dc43717-9c00-4de5-8dc8-b5288e2abad9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3dc43717-9c00-4de5-8dc8-b5288e2abad9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90bd35d6ab7c40c58d9d1d61ff7a12d3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7698555-bf3e-4a92-a2b0-48becfd360ed, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=f66469ad-cca4-4e75-8ad1-16dcdb97964a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:48.368 160509 INFO neutron.agent.ovn.metadata.agent [-] Port f66469ad-cca4-4e75-8ad1-16dcdb97964a in datapath 3dc43717-9c00-4de5-8dc8-b5288e2abad9 unbound from our chassis
Dec 06 10:19:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:48.371 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3dc43717-9c00-4de5-8dc8-b5288e2abad9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:48.372 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[bce42a0b-acd6-4e53-9715-b083344cf114]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:48 np0005548789.localdomain podman[328080]: 2025-12-06 10:19:48.529236918 +0000 UTC m=+0.081100740 container create c1aa008254c45e53dc61ebe069ba0c87cd28f3a675b669f9e3eef43a4322839a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1898c940-0651-45db-aebd-630d54fbe329, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:19:48 np0005548789.localdomain systemd[1]: Started libpod-conmon-c1aa008254c45e53dc61ebe069ba0c87cd28f3a675b669f9e3eef43a4322839a.scope.
Dec 06 10:19:48 np0005548789.localdomain podman[328080]: 2025-12-06 10:19:48.485460079 +0000 UTC m=+0.037323931 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:48 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:48 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3a91e6d853e6221211e9d3c2ca40331c0a06702c1a305cff3b145b6afe2f5cd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:48.597 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:c3:b8 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=528a9f17-509a-4c49-a9ac-4a6363f2178f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=659e29bd-a84c-4733-b754-dbb7b70b98cc) old=Port_Binding(mac=['fa:16:3e:94:c3:b8 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:48.600 160509 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 659e29bd-a84c-4733-b754-dbb7b70b98cc in datapath 667a7cf2-00f8-4896-8e3d-8222fad7f397 updated
Dec 06 10:19:48 np0005548789.localdomain podman[328080]: 2025-12-06 10:19:48.602891489 +0000 UTC m=+0.154755321 container init c1aa008254c45e53dc61ebe069ba0c87cd28f3a675b669f9e3eef43a4322839a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1898c940-0651-45db-aebd-630d54fbe329, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:19:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:48.603 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 667a7cf2-00f8-4896-8e3d-8222fad7f397, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:19:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:48.604 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[acf880d5-73f1-462b-8d3c-a3d22efb3d7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:48 np0005548789.localdomain podman[328080]: 2025-12-06 10:19:48.61372512 +0000 UTC m=+0.165588952 container start c1aa008254c45e53dc61ebe069ba0c87cd28f3a675b669f9e3eef43a4322839a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1898c940-0651-45db-aebd-630d54fbe329, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 06 10:19:48 np0005548789.localdomain dnsmasq[328099]: started, version 2.85 cachesize 150
Dec 06 10:19:48 np0005548789.localdomain dnsmasq[328099]: DNS service limited to local subnets
Dec 06 10:19:48 np0005548789.localdomain dnsmasq[328099]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:48 np0005548789.localdomain dnsmasq[328099]: warning: no upstream servers configured
Dec 06 10:19:48 np0005548789.localdomain dnsmasq-dhcp[328099]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:19:48 np0005548789.localdomain dnsmasq[328099]: read /var/lib/neutron/dhcp/1898c940-0651-45db-aebd-630d54fbe329/addn_hosts - 0 addresses
Dec 06 10:19:48 np0005548789.localdomain dnsmasq-dhcp[328099]: read /var/lib/neutron/dhcp/1898c940-0651-45db-aebd-630d54fbe329/host
Dec 06 10:19:48 np0005548789.localdomain dnsmasq-dhcp[328099]: read /var/lib/neutron/dhcp/1898c940-0651-45db-aebd-630d54fbe329/opts
Dec 06 10:19:48 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:48Z|00387|binding|INFO|Removing iface tap1ca7855c-cd ovn-installed in OVS
Dec 06 10:19:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:48.646 160509 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 68df6cee-6a4d-4258-87de-8bc5bc40efa1 with type ""
Dec 06 10:19:48 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:48Z|00388|binding|INFO|Removing lport 1ca7855c-cd02-499a-a723-f901eb28ad76 ovn-installed in OVS
Dec 06 10:19:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:48.648 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-1898c940-0651-45db-aebd-630d54fbe329', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1898c940-0651-45db-aebd-630d54fbe329', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f00ab5f7d934f62991ed1e7e798e47e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8c44abc1-74e7-483f-a478-b580dd3fd31f, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=1ca7855c-cd02-499a-a723-f901eb28ad76) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:48.648 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:48.650 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 1ca7855c-cd02-499a-a723-f901eb28ad76 in datapath 1898c940-0651-45db-aebd-630d54fbe329 unbound from our chassis
Dec 06 10:19:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:48.651 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1898c940-0651-45db-aebd-630d54fbe329 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:19:48 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:48.652 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[286e55f1-7ceb-40fd-8491-fc78373c1a68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:48.654 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:48 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:48.980 263652 INFO neutron.agent.dhcp.agent [None req-153de0db-46c2-43a3-bd21-0a0173f33aeb - - - - - -] DHCP configuration for ports {'c00098c6-48f4-4539-8926-9f3b3e3be6e9'} is completed
Dec 06 10:19:49 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d3dc43717\x2d9c00\x2d4de5\x2d8dc8\x2db5288e2abad9.mount: Deactivated successfully.
Dec 06 10:19:49 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:49.151 263652 INFO neutron.agent.dhcp.agent [None req-5beb0022-7483-49e2-a0d6-eacfb6043978 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:49 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:49Z|00389|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:19:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:49.205 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:49 np0005548789.localdomain dnsmasq[328099]: exiting on receipt of SIGTERM
Dec 06 10:19:49 np0005548789.localdomain podman[328115]: 2025-12-06 10:19:49.254874609 +0000 UTC m=+0.050910287 container kill c1aa008254c45e53dc61ebe069ba0c87cd28f3a675b669f9e3eef43a4322839a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1898c940-0651-45db-aebd-630d54fbe329, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:19:49 np0005548789.localdomain systemd[1]: libpod-c1aa008254c45e53dc61ebe069ba0c87cd28f3a675b669f9e3eef43a4322839a.scope: Deactivated successfully.
Dec 06 10:19:49 np0005548789.localdomain podman[328128]: 2025-12-06 10:19:49.323165596 +0000 UTC m=+0.054011501 container died c1aa008254c45e53dc61ebe069ba0c87cd28f3a675b669f9e3eef43a4322839a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1898c940-0651-45db-aebd-630d54fbe329, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:19:49 np0005548789.localdomain podman[328128]: 2025-12-06 10:19:49.357987521 +0000 UTC m=+0.088833386 container cleanup c1aa008254c45e53dc61ebe069ba0c87cd28f3a675b669f9e3eef43a4322839a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1898c940-0651-45db-aebd-630d54fbe329, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:19:49 np0005548789.localdomain systemd[1]: libpod-conmon-c1aa008254c45e53dc61ebe069ba0c87cd28f3a675b669f9e3eef43a4322839a.scope: Deactivated successfully.
Dec 06 10:19:49 np0005548789.localdomain podman[328130]: 2025-12-06 10:19:49.409192446 +0000 UTC m=+0.131545862 container remove c1aa008254c45e53dc61ebe069ba0c87cd28f3a675b669f9e3eef43a4322839a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1898c940-0651-45db-aebd-630d54fbe329, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:19:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:49.462 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:49 np0005548789.localdomain kernel: device tap1ca7855c-cd left promiscuous mode
Dec 06 10:19:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:49.478 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:49 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:49.669 263652 INFO neutron.agent.dhcp.agent [None req-d109619d-c45f-4705-b2e4-ba28a4fe13d9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:49 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:49.671 263652 INFO neutron.agent.dhcp.agent [None req-d109619d-c45f-4705-b2e4-ba28a4fe13d9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e3a91e6d853e6221211e9d3c2ca40331c0a06702c1a305cff3b145b6afe2f5cd-merged.mount: Deactivated successfully.
Dec 06 10:19:50 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c1aa008254c45e53dc61ebe069ba0c87cd28f3a675b669f9e3eef43a4322839a-userdata-shm.mount: Deactivated successfully.
Dec 06 10:19:50 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d1898c940\x2d0651\x2d45db\x2daebd\x2d630d54fbe329.mount: Deactivated successfully.
Dec 06 10:19:50 np0005548789.localdomain ceph-mon[298582]: pgmap v315: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 44 KiB/s rd, 1.8 MiB/s wr, 72 op/s
Dec 06 10:19:50 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:50.922 2 INFO neutron.agent.securitygroups_rpc [None req-8ab54d3f-0dba-4adf-88cd-ebbf59b7b541 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['86fafa90-40d2-4e2b-87d7-dc3d530576aa', 'ae1eaa44-7360-485a-b85b-f1bfb95ce20b']
Dec 06 10:19:52 np0005548789.localdomain ceph-mon[298582]: pgmap v316: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 13 KiB/s wr, 35 op/s
Dec 06 10:19:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:19:52 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:52.389 263652 INFO neutron.agent.linux.ip_lib [None req-14bfe837-9375-4020-9865-030c147dcb1d - - - - - -] Device tap3adb2c37-0f cannot be used as it has no MAC address
Dec 06 10:19:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:52.411 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:52 np0005548789.localdomain systemd[1]: tmp-crun.bYEjzT.mount: Deactivated successfully.
Dec 06 10:19:52 np0005548789.localdomain kernel: device tap3adb2c37-0f entered promiscuous mode
Dec 06 10:19:52 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016392.4266] manager: (tap3adb2c37-0f): new Generic device (/org/freedesktop/NetworkManager/Devices/64)
Dec 06 10:19:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:52.427 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:52 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:52Z|00390|binding|INFO|Claiming lport 3adb2c37-0f70-478d-98be-4e26b3a4f4ff for this chassis.
Dec 06 10:19:52 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:52Z|00391|binding|INFO|3adb2c37-0f70-478d-98be-4e26b3a4f4ff: Claiming unknown
Dec 06 10:19:52 np0005548789.localdomain systemd-udevd[328185]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:19:52 np0005548789.localdomain podman[328161]: 2025-12-06 10:19:52.43338996 +0000 UTC m=+0.114351076 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 10:19:52 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap3adb2c37-0f: No such device
Dec 06 10:19:52 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:52Z|00392|binding|INFO|Setting lport 3adb2c37-0f70-478d-98be-4e26b3a4f4ff ovn-installed in OVS
Dec 06 10:19:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:52.470 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:52 np0005548789.localdomain podman[328161]: 2025-12-06 10:19:52.473256809 +0000 UTC m=+0.154217935 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:19:52 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap3adb2c37-0f: No such device
Dec 06 10:19:52 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap3adb2c37-0f: No such device
Dec 06 10:19:52 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap3adb2c37-0f: No such device
Dec 06 10:19:52 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:19:52 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap3adb2c37-0f: No such device
Dec 06 10:19:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:52.488 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.242/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-c68f9a6d-f183-4c32-ae20-3af5e94473b3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c68f9a6d-f183-4c32-ae20-3af5e94473b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa76bcfc789b4e53acf344cd0b1cd7c5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=caee8882-f3cb-4a2a-a1c8-8579f9a721cf, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=3adb2c37-0f70-478d-98be-4e26b3a4f4ff) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:19:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:52.490 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 3adb2c37-0f70-478d-98be-4e26b3a4f4ff in datapath c68f9a6d-f183-4c32-ae20-3af5e94473b3 bound to our chassis
Dec 06 10:19:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:52.490 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c68f9a6d-f183-4c32-ae20-3af5e94473b3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:19:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:19:52.491 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[126a3f1c-091c-401b-8662-29dcb32f9647]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:19:52 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:52Z|00393|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:19:52 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap3adb2c37-0f: No such device
Dec 06 10:19:52 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:19:52Z|00394|binding|INFO|Setting lport 3adb2c37-0f70-478d-98be-4e26b3a4f4ff up in Southbound
Dec 06 10:19:52 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap3adb2c37-0f: No such device
Dec 06 10:19:52 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap3adb2c37-0f: No such device
Dec 06 10:19:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:52.520 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:52.563 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:53 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:19:53.185 2 INFO neutron.agent.securitygroups_rpc [None req-cbc27e7f-4bef-4dfe-ad5b-dd1345427342 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['86fafa90-40d2-4e2b-87d7-dc3d530576aa']
Dec 06 10:19:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:53.239 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:53 np0005548789.localdomain podman[328263]: 2025-12-06 10:19:53.656647542 +0000 UTC m=+0.095499920 container create c2781052a3630cf50580874a9e8559954e4ce423e9575765f5c45967475b3fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c68f9a6d-f183-4c32-ae20-3af5e94473b3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 06 10:19:53 np0005548789.localdomain systemd[1]: Started libpod-conmon-c2781052a3630cf50580874a9e8559954e4ce423e9575765f5c45967475b3fb4.scope.
Dec 06 10:19:53 np0005548789.localdomain podman[328263]: 2025-12-06 10:19:53.611735819 +0000 UTC m=+0.050588207 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:19:53 np0005548789.localdomain systemd[1]: tmp-crun.4hhJ8a.mount: Deactivated successfully.
Dec 06 10:19:53 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:19:53 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d060ccaa9e70944fae8106a7cb62cb34c009da5ff146db45b624802025af9fb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:19:53 np0005548789.localdomain podman[328263]: 2025-12-06 10:19:53.750383038 +0000 UTC m=+0.189235436 container init c2781052a3630cf50580874a9e8559954e4ce423e9575765f5c45967475b3fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c68f9a6d-f183-4c32-ae20-3af5e94473b3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:19:53 np0005548789.localdomain podman[328263]: 2025-12-06 10:19:53.761258231 +0000 UTC m=+0.200110609 container start c2781052a3630cf50580874a9e8559954e4ce423e9575765f5c45967475b3fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c68f9a6d-f183-4c32-ae20-3af5e94473b3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:19:53 np0005548789.localdomain dnsmasq[328281]: started, version 2.85 cachesize 150
Dec 06 10:19:53 np0005548789.localdomain dnsmasq[328281]: DNS service limited to local subnets
Dec 06 10:19:53 np0005548789.localdomain dnsmasq[328281]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:19:53 np0005548789.localdomain dnsmasq[328281]: warning: no upstream servers configured
Dec 06 10:19:53 np0005548789.localdomain dnsmasq-dhcp[328281]: DHCP, static leases only on 10.100.255.240, lease time 1d
Dec 06 10:19:53 np0005548789.localdomain dnsmasq[328281]: read /var/lib/neutron/dhcp/c68f9a6d-f183-4c32-ae20-3af5e94473b3/addn_hosts - 0 addresses
Dec 06 10:19:53 np0005548789.localdomain dnsmasq-dhcp[328281]: read /var/lib/neutron/dhcp/c68f9a6d-f183-4c32-ae20-3af5e94473b3/host
Dec 06 10:19:53 np0005548789.localdomain dnsmasq-dhcp[328281]: read /var/lib/neutron/dhcp/c68f9a6d-f183-4c32-ae20-3af5e94473b3/opts
Dec 06 10:19:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:19:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:19:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:19:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157932 "" "Go-http-client/1.1"
Dec 06 10:19:53 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:53.969 263652 INFO neutron.agent.dhcp.agent [None req-7491ba73-f23b-433e-8431-90e61e1a197e - - - - - -] DHCP configuration for ports {'ffdb62da-674d-4d01-8db8-0f5fd1e913bf'} is completed
Dec 06 10:19:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:19:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19738 "" "Go-http-client/1.1"
Dec 06 10:19:54 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:54.122 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: pgmap v317: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 14 KiB/s wr, 36 op/s
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.319446) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016394319514, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 666, "num_deletes": 251, "total_data_size": 537930, "memory_usage": 550184, "flush_reason": "Manual Compaction"}
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016394324029, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 345511, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26712, "largest_seqno": 27373, "table_properties": {"data_size": 342325, "index_size": 1041, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8349, "raw_average_key_size": 20, "raw_value_size": 335709, "raw_average_value_size": 818, "num_data_blocks": 46, "num_entries": 410, "num_filter_entries": 410, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016367, "oldest_key_time": 1765016367, "file_creation_time": 1765016394, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 4652 microseconds, and 1897 cpu microseconds.
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.324098) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 345511 bytes OK
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.324126) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.326188) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.326210) EVENT_LOG_v1 {"time_micros": 1765016394326203, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.326233) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 534157, prev total WAL file size 534157, number of live WAL files 2.
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.326817) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. '7061786F73003132333030' seq:0, type:0; will stop at (end)
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(337KB)], [42(18MB)]
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016394326900, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 20181492, "oldest_snapshot_seqno": -1}
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 12835 keys, 18822323 bytes, temperature: kUnknown
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016394427947, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 18822323, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18748158, "index_size": 40976, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32133, "raw_key_size": 344198, "raw_average_key_size": 26, "raw_value_size": 18528952, "raw_average_value_size": 1443, "num_data_blocks": 1551, "num_entries": 12835, "num_filter_entries": 12835, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765016394, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.428307) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 18822323 bytes
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.430365) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 199.5 rd, 186.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 18.9 +0.0 blob) out(18.0 +0.0 blob), read-write-amplify(112.9) write-amplify(54.5) OK, records in: 13354, records dropped: 519 output_compression: NoCompression
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.430405) EVENT_LOG_v1 {"time_micros": 1765016394430388, "job": 24, "event": "compaction_finished", "compaction_time_micros": 101142, "compaction_time_cpu_micros": 50983, "output_level": 6, "num_output_files": 1, "total_output_size": 18822323, "num_input_records": 13354, "num_output_records": 12835, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016394430662, "job": 24, "event": "table_file_deletion", "file_number": 44}
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016394433497, "job": 24, "event": "table_file_deletion", "file_number": 42}
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.326661) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.433604) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.433612) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.433615) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.433618) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:54 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:19:54.433621) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:19:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:54.512 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:56 np0005548789.localdomain ceph-mon[298582]: pgmap v318: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 4.5 KiB/s wr, 2 op/s
Dec 06 10:19:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:58.285 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:19:58 np0005548789.localdomain ceph-mon[298582]: pgmap v319: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 4.5 KiB/s wr, 2 op/s
Dec 06 10:19:58 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:19:58.417 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:19:58 np0005548789.localdomain sudo[328282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:19:58 np0005548789.localdomain sudo[328282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:19:58 np0005548789.localdomain sudo[328282]: pam_unix(sudo:session): session closed for user root
Dec 06 10:19:59 np0005548789.localdomain sudo[328300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:19:59 np0005548789.localdomain sudo[328300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:19:59 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:19:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:19:59.515 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:19:59 np0005548789.localdomain sudo[328300]: pam_unix(sudo:session): session closed for user root
Dec 06 10:20:00 np0005548789.localdomain sudo[328351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:20:00 np0005548789.localdomain sudo[328351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:20:00 np0005548789.localdomain sudo[328351]: pam_unix(sudo:session): session closed for user root
Dec 06 10:20:00 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "aaafb223-94a1-4885-a088-5199e647d774", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:00 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "aaafb223-94a1-4885-a088-5199e647d774", "format": "json"}]: dispatch
Dec 06 10:20:00 np0005548789.localdomain ceph-mon[298582]: pgmap v320: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 5.0 KiB/s wr, 2 op/s
Dec 06 10:20:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:20:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:20:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:20:00 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:20:00 np0005548789.localdomain ceph-mon[298582]: overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:20:01 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:01.583 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:c3:b8 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=528a9f17-509a-4c49-a9ac-4a6363f2178f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=659e29bd-a84c-4733-b754-dbb7b70b98cc) old=Port_Binding(mac=['fa:16:3e:94:c3:b8 10.100.0.18 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:01 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:01.585 160509 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 659e29bd-a84c-4733-b754-dbb7b70b98cc in datapath 667a7cf2-00f8-4896-8e3d-8222fad7f397 updated
Dec 06 10:20:01 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:01.588 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 667a7cf2-00f8-4896-8e3d-8222fad7f397, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:01 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:01.589 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1fc5cd-9f32-4e61-84a9-cf7d3f2a6672]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:02 np0005548789.localdomain ceph-mon[298582]: pgmap v321: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s wr, 0 op/s
Dec 06 10:20:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:20:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:20:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:03.310 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "aaafb223-94a1-4885-a088-5199e647d774", "snap_name": "86b1b260-491a-4ff7-9199-adafe7f88c8f", "format": "json"}]: dispatch
Dec 06 10:20:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:20:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:20:03 np0005548789.localdomain podman[328370]: 2025-12-06 10:20:03.931249928 +0000 UTC m=+0.085993080 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:20:03 np0005548789.localdomain podman[328370]: 2025-12-06 10:20:03.944161623 +0000 UTC m=+0.098904825 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:20:03 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:20:04 np0005548789.localdomain podman[328369]: 2025-12-06 10:20:04.031313757 +0000 UTC m=+0.188769492 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:04 np0005548789.localdomain podman[328369]: 2025-12-06 10:20:04.04127306 +0000 UTC m=+0.198728825 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 06 10:20:04 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:20:04 np0005548789.localdomain sshd[328410]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:20:04 np0005548789.localdomain ceph-mon[298582]: pgmap v322: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 3.8 KiB/s wr, 1 op/s
Dec 06 10:20:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:04.573 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:05 np0005548789.localdomain sshd[328410]: Received disconnect from 118.219.234.233 port 34262:11: Bye Bye [preauth]
Dec 06 10:20:05 np0005548789.localdomain sshd[328410]: Disconnected from authenticating user root 118.219.234.233 port 34262 [preauth]
Dec 06 10:20:06 np0005548789.localdomain ceph-mon[298582]: pgmap v323: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 3.2 KiB/s wr, 1 op/s
Dec 06 10:20:06 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:06.796 263652 INFO neutron.agent.linux.ip_lib [None req-38231a6d-8e5c-41d2-a990-f1b82c8e9de5 - - - - - -] Device tap8502c635-ed cannot be used as it has no MAC address
Dec 06 10:20:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:06.856 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:06 np0005548789.localdomain kernel: device tap8502c635-ed entered promiscuous mode
Dec 06 10:20:06 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016406.8682] manager: (tap8502c635-ed): new Generic device (/org/freedesktop/NetworkManager/Devices/65)
Dec 06 10:20:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:06.868 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:06 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:06Z|00395|binding|INFO|Claiming lport 8502c635-ed1a-4597-9657-4577483e7713 for this chassis.
Dec 06 10:20:06 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:06Z|00396|binding|INFO|8502c635-ed1a-4597-9657-4577483e7713: Claiming unknown
Dec 06 10:20:06 np0005548789.localdomain systemd-udevd[328422]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:06.883 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-b2c47b1f-f8cf-41da-adf1-6c6404edb8e3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c47b1f-f8cf-41da-adf1-6c6404edb8e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '24086b701d6b4d4081d2e63578d18d24', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5059c6b1-bf63-4619-b361-4c64f7e8a30d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=8502c635-ed1a-4597-9657-4577483e7713) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:06.885 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 8502c635-ed1a-4597-9657-4577483e7713 in datapath b2c47b1f-f8cf-41da-adf1-6c6404edb8e3 bound to our chassis
Dec 06 10:20:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:06.886 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b2c47b1f-f8cf-41da-adf1-6c6404edb8e3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:06.887 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[f9cb6d72-175c-4538-8cb0-ce444dc3eabd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap8502c635-ed: No such device
Dec 06 10:20:06 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:06Z|00397|binding|INFO|Setting lport 8502c635-ed1a-4597-9657-4577483e7713 ovn-installed in OVS
Dec 06 10:20:06 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:06Z|00398|binding|INFO|Setting lport 8502c635-ed1a-4597-9657-4577483e7713 up in Southbound
Dec 06 10:20:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:06.905 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:06.908 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap8502c635-ed: No such device
Dec 06 10:20:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap8502c635-ed: No such device
Dec 06 10:20:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap8502c635-ed: No such device
Dec 06 10:20:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap8502c635-ed: No such device
Dec 06 10:20:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap8502c635-ed: No such device
Dec 06 10:20:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap8502c635-ed: No such device
Dec 06 10:20:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap8502c635-ed: No such device
Dec 06 10:20:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:06.948 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:06.976 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:07 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "aaafb223-94a1-4885-a088-5199e647d774", "snap_name": "86b1b260-491a-4ff7-9199-adafe7f88c8f_1b65360c-9474-4053-9db5-09821cc600f9", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:07 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "aaafb223-94a1-4885-a088-5199e647d774", "snap_name": "86b1b260-491a-4ff7-9199-adafe7f88c8f", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:07 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:07.587 2 INFO neutron.agent.securitygroups_rpc [None req-6d59f8dd-76ee-4672-86ac-2d91b87c0791 260dfc8941214c308c05293af65bdae9 24086b701d6b4d4081d2e63578d18d24 - - default default] Security group member updated ['ea587027-2c02-4165-a90f-98eaf0ce1ddb']
Dec 06 10:20:07 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:07.614 2 INFO neutron.agent.securitygroups_rpc [None req-1250ea59-7c13-4a58-b22f-38de2df53542 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['c05cd5e8-c5d4-4d05-80ba-b6a4af8b3ba8']
Dec 06 10:20:07 np0005548789.localdomain podman[328493]: 
Dec 06 10:20:07 np0005548789.localdomain podman[328493]: 2025-12-06 10:20:07.824293312 +0000 UTC m=+0.082418552 container create b702010cfde6ddd33cba348d96e0b98a1b493114b99e87a7ebe4d35b59d0803f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c47b1f-f8cf-41da-adf1-6c6404edb8e3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:20:07 np0005548789.localdomain systemd[1]: Started libpod-conmon-b702010cfde6ddd33cba348d96e0b98a1b493114b99e87a7ebe4d35b59d0803f.scope.
Dec 06 10:20:07 np0005548789.localdomain podman[328493]: 2025-12-06 10:20:07.78043439 +0000 UTC m=+0.038559670 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:07 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:07 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3566bd4719ade8695fae6a6e305e235adfa409a11feb22336d90853652527834/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:07 np0005548789.localdomain podman[328493]: 2025-12-06 10:20:07.906326929 +0000 UTC m=+0.164452169 container init b702010cfde6ddd33cba348d96e0b98a1b493114b99e87a7ebe4d35b59d0803f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c47b1f-f8cf-41da-adf1-6c6404edb8e3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:20:07 np0005548789.localdomain podman[328493]: 2025-12-06 10:20:07.91620008 +0000 UTC m=+0.174325330 container start b702010cfde6ddd33cba348d96e0b98a1b493114b99e87a7ebe4d35b59d0803f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c47b1f-f8cf-41da-adf1-6c6404edb8e3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:07 np0005548789.localdomain dnsmasq[328512]: started, version 2.85 cachesize 150
Dec 06 10:20:07 np0005548789.localdomain dnsmasq[328512]: DNS service limited to local subnets
Dec 06 10:20:07 np0005548789.localdomain dnsmasq[328512]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:07 np0005548789.localdomain dnsmasq[328512]: warning: no upstream servers configured
Dec 06 10:20:07 np0005548789.localdomain dnsmasq-dhcp[328512]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:20:07 np0005548789.localdomain dnsmasq[328512]: read /var/lib/neutron/dhcp/b2c47b1f-f8cf-41da-adf1-6c6404edb8e3/addn_hosts - 0 addresses
Dec 06 10:20:07 np0005548789.localdomain dnsmasq-dhcp[328512]: read /var/lib/neutron/dhcp/b2c47b1f-f8cf-41da-adf1-6c6404edb8e3/host
Dec 06 10:20:07 np0005548789.localdomain dnsmasq-dhcp[328512]: read /var/lib/neutron/dhcp/b2c47b1f-f8cf-41da-adf1-6c6404edb8e3/opts
Dec 06 10:20:07 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:07.983 263652 INFO neutron.agent.dhcp.agent [None req-38231a6d-8e5c-41d2-a990-f1b82c8e9de5 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:07Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcd57c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcaa880>], id=f254b4d6-48c8-4533-8087-a9ef7c023950, ip_allocation=immediate, mac_address=fa:16:3e:a8:a3:b9, name=tempest-RoutersIpV6Test-2080841993, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:20:00Z, description=, dns_domain=, id=b2c47b1f-f8cf-41da-adf1-6c6404edb8e3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-836254303, port_security_enabled=True, project_id=24086b701d6b4d4081d2e63578d18d24, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57617, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2265, status=ACTIVE, subnets=['e24a4bb6-1a63-4e1f-97ef-3bc7eb6a4ce8'], tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:05Z, vlan_transparent=None, network_id=b2c47b1f-f8cf-41da-adf1-6c6404edb8e3, port_security_enabled=True, project_id=24086b701d6b4d4081d2e63578d18d24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ea587027-2c02-4165-a90f-98eaf0ce1ddb'], standard_attr_id=2310, status=DOWN, tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:07Z on network b2c47b1f-f8cf-41da-adf1-6c6404edb8e3
Dec 06 10:20:08 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:08.118 263652 INFO neutron.agent.dhcp.agent [None req-491c8e88-3dac-4fc7-8bbd-6ccaae48b5bb - - - - - -] DHCP configuration for ports {'8be8da87-467b-4f9c-8bc8-059ed9eceb24'} is completed
Dec 06 10:20:08 np0005548789.localdomain dnsmasq[328512]: read /var/lib/neutron/dhcp/b2c47b1f-f8cf-41da-adf1-6c6404edb8e3/addn_hosts - 1 addresses
Dec 06 10:20:08 np0005548789.localdomain dnsmasq-dhcp[328512]: read /var/lib/neutron/dhcp/b2c47b1f-f8cf-41da-adf1-6c6404edb8e3/host
Dec 06 10:20:08 np0005548789.localdomain podman[328530]: 2025-12-06 10:20:08.181849381 +0000 UTC m=+0.061386698 container kill b702010cfde6ddd33cba348d96e0b98a1b493114b99e87a7ebe4d35b59d0803f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c47b1f-f8cf-41da-adf1-6c6404edb8e3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:20:08 np0005548789.localdomain dnsmasq-dhcp[328512]: read /var/lib/neutron/dhcp/b2c47b1f-f8cf-41da-adf1-6c6404edb8e3/opts
Dec 06 10:20:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:20:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:08.359 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:08 np0005548789.localdomain ceph-mon[298582]: pgmap v324: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 3.2 KiB/s wr, 1 op/s
Dec 06 10:20:08 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:08.523 263652 INFO neutron.agent.dhcp.agent [None req-b44f8ac8-6f06-4658-b123-14f741668d51 - - - - - -] DHCP configuration for ports {'f254b4d6-48c8-4533-8087-a9ef7c023950'} is completed
Dec 06 10:20:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:20:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 13K writes, 48K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.00 MB/s
                                                          Cumulative WAL: 13K writes, 4005 syncs, 3.29 writes per sync, written: 0.04 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 7263 writes, 22K keys, 7263 commit groups, 1.0 writes per commit group, ingest: 18.95 MB, 0.03 MB/s
                                                          Interval WAL: 7263 writes, 3188 syncs, 2.28 writes per sync, written: 0.02 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:20:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:09.602 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:09 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:09.670 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:07Z, description=, device_id=b00fbecf-d8af-4c63-88f1-d68107f5afd3, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcc7430>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fae6160>], id=f254b4d6-48c8-4533-8087-a9ef7c023950, ip_allocation=immediate, mac_address=fa:16:3e:a8:a3:b9, name=tempest-RoutersIpV6Test-2080841993, network_id=b2c47b1f-f8cf-41da-adf1-6c6404edb8e3, port_security_enabled=True, project_id=24086b701d6b4d4081d2e63578d18d24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['ea587027-2c02-4165-a90f-98eaf0ce1ddb'], standard_attr_id=2310, status=DOWN, tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:08Z on network b2c47b1f-f8cf-41da-adf1-6c6404edb8e3
Dec 06 10:20:09 np0005548789.localdomain podman[328568]: 2025-12-06 10:20:09.874473271 +0000 UTC m=+0.070508686 container kill b702010cfde6ddd33cba348d96e0b98a1b493114b99e87a7ebe4d35b59d0803f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c47b1f-f8cf-41da-adf1-6c6404edb8e3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 06 10:20:09 np0005548789.localdomain systemd[1]: tmp-crun.LZUfzJ.mount: Deactivated successfully.
Dec 06 10:20:09 np0005548789.localdomain dnsmasq[328512]: read /var/lib/neutron/dhcp/b2c47b1f-f8cf-41da-adf1-6c6404edb8e3/addn_hosts - 1 addresses
Dec 06 10:20:09 np0005548789.localdomain dnsmasq-dhcp[328512]: read /var/lib/neutron/dhcp/b2c47b1f-f8cf-41da-adf1-6c6404edb8e3/host
Dec 06 10:20:09 np0005548789.localdomain dnsmasq-dhcp[328512]: read /var/lib/neutron/dhcp/b2c47b1f-f8cf-41da-adf1-6c6404edb8e3/opts
Dec 06 10:20:10 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:10.110 263652 INFO neutron.agent.linux.ip_lib [None req-29c63b20-7da5-44f9-a255-d5f09d479629 - - - - - -] Device tap225f6418-78 cannot be used as it has no MAC address
Dec 06 10:20:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:10.132 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:10 np0005548789.localdomain kernel: device tap225f6418-78 entered promiscuous mode
Dec 06 10:20:10 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016410.1407] manager: (tap225f6418-78): new Generic device (/org/freedesktop/NetworkManager/Devices/66)
Dec 06 10:20:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:10Z|00399|binding|INFO|Claiming lport 225f6418-78e0-4a61-a073-a03b711b3e97 for this chassis.
Dec 06 10:20:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:10Z|00400|binding|INFO|225f6418-78e0-4a61-a073-a03b711b3e97: Claiming unknown
Dec 06 10:20:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:10.141 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:10 np0005548789.localdomain systemd-udevd[328599]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:10.157 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-25c4a3e3-dd82-4090-9ea0-aa2af92e22bf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25c4a3e3-dd82-4090-9ea0-aa2af92e22bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa76bcfc789b4e53acf344cd0b1cd7c5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c448e0c7-883f-4055-a342-20d4d6819f0c, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=225f6418-78e0-4a61-a073-a03b711b3e97) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:10.159 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 225f6418-78e0-4a61-a073-a03b711b3e97 in datapath 25c4a3e3-dd82-4090-9ea0-aa2af92e22bf bound to our chassis
Dec 06 10:20:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:10.162 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port dc92a6e7-934f-4ad1-bb78-6f002a01bc3a IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:20:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:10.162 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 25c4a3e3-dd82-4090-9ea0-aa2af92e22bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:10.163 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[6a6ab2ce-9f25-4e03-a999-04908075e354]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap225f6418-78: No such device
Dec 06 10:20:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:10Z|00401|binding|INFO|Setting lport 225f6418-78e0-4a61-a073-a03b711b3e97 ovn-installed in OVS
Dec 06 10:20:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:10Z|00402|binding|INFO|Setting lport 225f6418-78e0-4a61-a073-a03b711b3e97 up in Southbound
Dec 06 10:20:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:10.182 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap225f6418-78: No such device
Dec 06 10:20:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap225f6418-78: No such device
Dec 06 10:20:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap225f6418-78: No such device
Dec 06 10:20:10 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:10.198 263652 INFO neutron.agent.dhcp.agent [None req-0b84d4ff-4686-46c0-8f31-61d69a1a4b35 - - - - - -] DHCP configuration for ports {'f254b4d6-48c8-4533-8087-a9ef7c023950'} is completed
Dec 06 10:20:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap225f6418-78: No such device
Dec 06 10:20:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap225f6418-78: No such device
Dec 06 10:20:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap225f6418-78: No such device
Dec 06 10:20:10 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap225f6418-78: No such device
Dec 06 10:20:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:10.219 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:10.246 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:10Z|00403|binding|INFO|Removing iface tap225f6418-78 ovn-installed in OVS
Dec 06 10:20:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:10.411 160509 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port dc92a6e7-934f-4ad1-bb78-6f002a01bc3a with type ""
Dec 06 10:20:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:10Z|00404|binding|INFO|Removing lport 225f6418-78e0-4a61-a073-a03b711b3e97 ovn-installed in OVS
Dec 06 10:20:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:10.413 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-25c4a3e3-dd82-4090-9ea0-aa2af92e22bf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-25c4a3e3-dd82-4090-9ea0-aa2af92e22bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa76bcfc789b4e53acf344cd0b1cd7c5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c448e0c7-883f-4055-a342-20d4d6819f0c, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=225f6418-78e0-4a61-a073-a03b711b3e97) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:10.415 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 225f6418-78e0-4a61-a073-a03b711b3e97 in datapath 25c4a3e3-dd82-4090-9ea0-aa2af92e22bf unbound from our chassis
Dec 06 10:20:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:10.416 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:10.418 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 25c4a3e3-dd82-4090-9ea0-aa2af92e22bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:10.419 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[4745e0e8-09f8-4275-84ee-d2d1ad07670d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:10 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:10.428 2 INFO neutron.agent.securitygroups_rpc [None req-e7d31636-8439-4e6f-9785-a953cb0386af 260dfc8941214c308c05293af65bdae9 24086b701d6b4d4081d2e63578d18d24 - - default default] Security group member updated ['ea587027-2c02-4165-a90f-98eaf0ce1ddb']
Dec 06 10:20:10 np0005548789.localdomain ceph-mon[298582]: pgmap v325: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 7.3 KiB/s wr, 2 op/s
Dec 06 10:20:10 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "aaafb223-94a1-4885-a088-5199e647d774", "format": "json"}]: dispatch
Dec 06 10:20:10 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "aaafb223-94a1-4885-a088-5199e647d774", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:10 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e164 e164: 6 total, 6 up, 6 in
Dec 06 10:20:10 np0005548789.localdomain dnsmasq[328512]: read /var/lib/neutron/dhcp/b2c47b1f-f8cf-41da-adf1-6c6404edb8e3/addn_hosts - 0 addresses
Dec 06 10:20:10 np0005548789.localdomain dnsmasq-dhcp[328512]: read /var/lib/neutron/dhcp/b2c47b1f-f8cf-41da-adf1-6c6404edb8e3/host
Dec 06 10:20:10 np0005548789.localdomain dnsmasq-dhcp[328512]: read /var/lib/neutron/dhcp/b2c47b1f-f8cf-41da-adf1-6c6404edb8e3/opts
Dec 06 10:20:10 np0005548789.localdomain podman[328654]: 2025-12-06 10:20:10.647306945 +0000 UTC m=+0.066950258 container kill b702010cfde6ddd33cba348d96e0b98a1b493114b99e87a7ebe4d35b59d0803f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c47b1f-f8cf-41da-adf1-6c6404edb8e3, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:10Z|00405|binding|INFO|Releasing lport 8502c635-ed1a-4597-9657-4577483e7713 from this chassis (sb_readonly=0)
Dec 06 10:20:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:10Z|00406|binding|INFO|Setting lport 8502c635-ed1a-4597-9657-4577483e7713 down in Southbound
Dec 06 10:20:10 np0005548789.localdomain kernel: device tap8502c635-ed left promiscuous mode
Dec 06 10:20:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:10.859 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:10.864 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-b2c47b1f-f8cf-41da-adf1-6c6404edb8e3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c47b1f-f8cf-41da-adf1-6c6404edb8e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '24086b701d6b4d4081d2e63578d18d24', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5059c6b1-bf63-4619-b361-4c64f7e8a30d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=8502c635-ed1a-4597-9657-4577483e7713) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:10.866 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 8502c635-ed1a-4597-9657-4577483e7713 in datapath b2c47b1f-f8cf-41da-adf1-6c6404edb8e3 unbound from our chassis
Dec 06 10:20:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:10.867 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b2c47b1f-f8cf-41da-adf1-6c6404edb8e3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:10 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:10.868 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[4c93f3d3-6f19-4c02-bfab-a26fb627ab21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:10.878 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:11 np0005548789.localdomain podman[328706]: 
Dec 06 10:20:11 np0005548789.localdomain podman[328706]: 2025-12-06 10:20:11.156672095 +0000 UTC m=+0.073645021 container create c4d5652f0ca83ac56260b3c6e7d7bc8fce4934ba3122d2234540336c428473c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-25c4a3e3-dd82-4090-9ea0-aa2af92e22bf, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:20:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:20:11 np0005548789.localdomain systemd[1]: Started libpod-conmon-c4d5652f0ca83ac56260b3c6e7d7bc8fce4934ba3122d2234540336c428473c3.scope.
Dec 06 10:20:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:11.211 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:c3:b8 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=528a9f17-509a-4c49-a9ac-4a6363f2178f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=659e29bd-a84c-4733-b754-dbb7b70b98cc) old=Port_Binding(mac=['fa:16:3e:94:c3:b8 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667a7cf2-00f8-4896-8e3d-8222fad7f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5f8e1c4c589749b99178bbc7c2bea3f0', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:11.214 160509 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 659e29bd-a84c-4733-b754-dbb7b70b98cc in datapath 667a7cf2-00f8-4896-8e3d-8222fad7f397 updated
Dec 06 10:20:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:11.217 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 667a7cf2-00f8-4896-8e3d-8222fad7f397, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:11.219 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[a89a3d4e-5289-48a9-a1bb-32d10a0df882]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:11 np0005548789.localdomain podman[328706]: 2025-12-06 10:20:11.122579493 +0000 UTC m=+0.039552409 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:11 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:11 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0dec80bbf34eccee69cc589d9da96e60fb9d53df365ac4bfeaf1ca640c086573/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:11 np0005548789.localdomain podman[328706]: 2025-12-06 10:20:11.297082967 +0000 UTC m=+0.214055863 container init c4d5652f0ca83ac56260b3c6e7d7bc8fce4934ba3122d2234540336c428473c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-25c4a3e3-dd82-4090-9ea0-aa2af92e22bf, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 10:20:11 np0005548789.localdomain podman[328706]: 2025-12-06 10:20:11.303452193 +0000 UTC m=+0.220425099 container start c4d5652f0ca83ac56260b3c6e7d7bc8fce4934ba3122d2234540336c428473c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-25c4a3e3-dd82-4090-9ea0-aa2af92e22bf, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 06 10:20:11 np0005548789.localdomain dnsmasq[328758]: started, version 2.85 cachesize 150
Dec 06 10:20:11 np0005548789.localdomain dnsmasq[328758]: DNS service limited to local subnets
Dec 06 10:20:11 np0005548789.localdomain dnsmasq[328758]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:11 np0005548789.localdomain dnsmasq[328758]: warning: no upstream servers configured
Dec 06 10:20:11 np0005548789.localdomain dnsmasq-dhcp[328758]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:20:11 np0005548789.localdomain dnsmasq[328758]: read /var/lib/neutron/dhcp/25c4a3e3-dd82-4090-9ea0-aa2af92e22bf/addn_hosts - 0 addresses
Dec 06 10:20:11 np0005548789.localdomain dnsmasq-dhcp[328758]: read /var/lib/neutron/dhcp/25c4a3e3-dd82-4090-9ea0-aa2af92e22bf/host
Dec 06 10:20:11 np0005548789.localdomain dnsmasq-dhcp[328758]: read /var/lib/neutron/dhcp/25c4a3e3-dd82-4090-9ea0-aa2af92e22bf/opts
Dec 06 10:20:11 np0005548789.localdomain podman[328719]: 2025-12-06 10:20:11.310104556 +0000 UTC m=+0.104625169 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git)
Dec 06 10:20:11 np0005548789.localdomain podman[328719]: 2025-12-06 10:20:11.324141995 +0000 UTC m=+0.118662618 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:20:11 np0005548789.localdomain podman[328720]: 2025-12-06 10:20:11.285598207 +0000 UTC m=+0.085260407 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 06 10:20:11 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:20:11 np0005548789.localdomain podman[328720]: 2025-12-06 10:20:11.365235311 +0000 UTC m=+0.164897531 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:11 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:20:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:11.394 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:11 np0005548789.localdomain kernel: device tap225f6418-78 left promiscuous mode
Dec 06 10:20:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:11.411 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.431 263652 INFO neutron.agent.dhcp.agent [None req-ac73be23-35e0-4a50-8455-0b348b06b099 - - - - - -] DHCP configuration for ports {'1998c7e0-78ce-4456-9959-95d99e0050bc'} is completed
Dec 06 10:20:11 np0005548789.localdomain ceph-mon[298582]: osdmap e164: 6 total, 6 up, 6 in
Dec 06 10:20:11 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d4fe01f5-ec52-46f8-9434-cea0a18dbdb7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:11 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:11 np0005548789.localdomain dnsmasq[328758]: read /var/lib/neutron/dhcp/25c4a3e3-dd82-4090-9ea0-aa2af92e22bf/addn_hosts - 0 addresses
Dec 06 10:20:11 np0005548789.localdomain dnsmasq-dhcp[328758]: read /var/lib/neutron/dhcp/25c4a3e3-dd82-4090-9ea0-aa2af92e22bf/host
Dec 06 10:20:11 np0005548789.localdomain dnsmasq-dhcp[328758]: read /var/lib/neutron/dhcp/25c4a3e3-dd82-4090-9ea0-aa2af92e22bf/opts
Dec 06 10:20:11 np0005548789.localdomain podman[328780]: 2025-12-06 10:20:11.604410532 +0000 UTC m=+0.047827453 container kill c4d5652f0ca83ac56260b3c6e7d7bc8fce4934ba3122d2234540336c428473c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-25c4a3e3-dd82-4090-9ea0-aa2af92e22bf, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent [None req-dc16eb19-be2b-4ac1-b923-1d1738c06083 - - - - - -] Unable to reload_allocations dhcp for 25c4a3e3-dd82-4090-9ea0-aa2af92e22bf.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap225f6418-78 not found in namespace qdhcp-25c4a3e3-dd82-4090-9ea0-aa2af92e22bf.
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     return fut.result()
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     raise self._exception
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap225f6418-78 not found in namespace qdhcp-25c4a3e3-dd82-4090-9ea0-aa2af92e22bf.
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.634 263652 ERROR neutron.agent.dhcp.agent 
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.638 263652 INFO neutron.agent.dhcp.agent [None req-52fc0508-2b36-4e56-82d0-a682e93b4cc9 - - - - - -] Synchronizing state
Dec 06 10:20:11 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:11Z|00407|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:20:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:11.728 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:11.858 263652 INFO neutron.agent.dhcp.agent [None req-10e72291-99ff-473a-8e06-8c56fde98d4a - - - - - -] All active networks have been fetched through RPC.
Dec 06 10:20:12 np0005548789.localdomain dnsmasq[328758]: exiting on receipt of SIGTERM
Dec 06 10:20:12 np0005548789.localdomain podman[328812]: 2025-12-06 10:20:12.054033296 +0000 UTC m=+0.061400297 container kill c4d5652f0ca83ac56260b3c6e7d7bc8fce4934ba3122d2234540336c428473c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-25c4a3e3-dd82-4090-9ea0-aa2af92e22bf, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:20:12 np0005548789.localdomain systemd[1]: libpod-c4d5652f0ca83ac56260b3c6e7d7bc8fce4934ba3122d2234540336c428473c3.scope: Deactivated successfully.
Dec 06 10:20:12 np0005548789.localdomain podman[328826]: 2025-12-06 10:20:12.130748111 +0000 UTC m=+0.057619252 container died c4d5652f0ca83ac56260b3c6e7d7bc8fce4934ba3122d2234540336c428473c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-25c4a3e3-dd82-4090-9ea0-aa2af92e22bf, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:12 np0005548789.localdomain podman[328826]: 2025-12-06 10:20:12.158589332 +0000 UTC m=+0.085460453 container cleanup c4d5652f0ca83ac56260b3c6e7d7bc8fce4934ba3122d2234540336c428473c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-25c4a3e3-dd82-4090-9ea0-aa2af92e22bf, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-0dec80bbf34eccee69cc589d9da96e60fb9d53df365ac4bfeaf1ca640c086573-merged.mount: Deactivated successfully.
Dec 06 10:20:12 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c4d5652f0ca83ac56260b3c6e7d7bc8fce4934ba3122d2234540336c428473c3-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:12 np0005548789.localdomain systemd[1]: libpod-conmon-c4d5652f0ca83ac56260b3c6e7d7bc8fce4934ba3122d2234540336c428473c3.scope: Deactivated successfully.
Dec 06 10:20:12 np0005548789.localdomain podman[328827]: 2025-12-06 10:20:12.215596735 +0000 UTC m=+0.133925965 container remove c4d5652f0ca83ac56260b3c6e7d7bc8fce4934ba3122d2234540336c428473c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-25c4a3e3-dd82-4090-9ea0-aa2af92e22bf, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:20:12 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:12.220 2 INFO neutron.agent.securitygroups_rpc [None req-6d93ce58-a6ee-4351-b70e-4269edfdd4c8 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['81acd248-ff6c-407a-a3e7-57e59597aa28', 'c05cd5e8-c5d4-4d05-80ba-b6a4af8b3ba8', '1d275e53-d6a2-4014-8325-c04642bc5279']
Dec 06 10:20:12 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d25c4a3e3\x2ddd82\x2d4090\x2d9ea0\x2daa2af92e22bf.mount: Deactivated successfully.
Dec 06 10:20:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:12.249 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:12.250 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:12 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:12.251 263652 INFO neutron.agent.dhcp.agent [None req-5a5595b0-a0f8-44d7-9627-4542518c5211 - - - - - -] Synchronizing state complete
Dec 06 10:20:12 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:12.252 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:20:12 np0005548789.localdomain dnsmasq[328512]: exiting on receipt of SIGTERM
Dec 06 10:20:12 np0005548789.localdomain podman[328871]: 2025-12-06 10:20:12.523543488 +0000 UTC m=+0.066273187 container kill b702010cfde6ddd33cba348d96e0b98a1b493114b99e87a7ebe4d35b59d0803f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c47b1f-f8cf-41da-adf1-6c6404edb8e3, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:20:12 np0005548789.localdomain systemd[1]: libpod-b702010cfde6ddd33cba348d96e0b98a1b493114b99e87a7ebe4d35b59d0803f.scope: Deactivated successfully.
Dec 06 10:20:12 np0005548789.localdomain podman[328884]: 2025-12-06 10:20:12.579018904 +0000 UTC m=+0.046432251 container died b702010cfde6ddd33cba348d96e0b98a1b493114b99e87a7ebe4d35b59d0803f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c47b1f-f8cf-41da-adf1-6c6404edb8e3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:20:12 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d4fe01f5-ec52-46f8-9434-cea0a18dbdb7", "format": "json"}]: dispatch
Dec 06 10:20:12 np0005548789.localdomain ceph-mon[298582]: pgmap v327: 177 pgs: 177 active+clean; 192 MiB data, 912 MiB used, 41 GiB / 42 GiB avail; 8.2 KiB/s wr, 2 op/s
Dec 06 10:20:12 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:12 np0005548789.localdomain podman[328884]: 2025-12-06 10:20:12.605926936 +0000 UTC m=+0.073340263 container cleanup b702010cfde6ddd33cba348d96e0b98a1b493114b99e87a7ebe4d35b59d0803f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c47b1f-f8cf-41da-adf1-6c6404edb8e3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:20:12 np0005548789.localdomain systemd[1]: libpod-conmon-b702010cfde6ddd33cba348d96e0b98a1b493114b99e87a7ebe4d35b59d0803f.scope: Deactivated successfully.
Dec 06 10:20:12 np0005548789.localdomain podman[328891]: 2025-12-06 10:20:12.679543887 +0000 UTC m=+0.129643925 container remove b702010cfde6ddd33cba348d96e0b98a1b493114b99e87a7ebe4d35b59d0803f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c47b1f-f8cf-41da-adf1-6c6404edb8e3, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:20:12 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:12.730 263652 INFO neutron.agent.dhcp.agent [None req-95c11c3d-bd4a-43a9-8a7d-7e4eb6958961 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:12 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:12.731 263652 INFO neutron.agent.dhcp.agent [None req-95c11c3d-bd4a-43a9-8a7d-7e4eb6958961 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:12 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:12.871 2 INFO neutron.agent.securitygroups_rpc [None req-5ab424f2-09a3-4942-a99f-ad10877e0761 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['81acd248-ff6c-407a-a3e7-57e59597aa28', '1d275e53-d6a2-4014-8325-c04642bc5279']
Dec 06 10:20:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:20:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.2 total, 600.0 interval
                                                          Cumulative writes: 10K writes, 39K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s
                                                          Cumulative WAL: 10K writes, 3013 syncs, 3.41 writes per sync, written: 0.03 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 5151 writes, 16K keys, 5151 commit groups, 1.0 writes per commit group, ingest: 14.16 MB, 0.02 MB/s
                                                          Interval WAL: 5151 writes, 2234 syncs, 2.31 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:20:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3566bd4719ade8695fae6a6e305e235adfa409a11feb22336d90853652527834-merged.mount: Deactivated successfully.
Dec 06 10:20:13 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b702010cfde6ddd33cba348d96e0b98a1b493114b99e87a7ebe4d35b59d0803f-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:13 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2db2c47b1f\x2df8cf\x2d41da\x2dadf1\x2d6c6404edb8e3.mount: Deactivated successfully.
Dec 06 10:20:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e165 e165: 6 total, 6 up, 6 in
Dec 06 10:20:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:20:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:13.410 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:13 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7adc54b5-eca0-4f1a-9df6-2cf3e56ffc2b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:13 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7adc54b5-eca0-4f1a-9df6-2cf3e56ffc2b", "format": "json"}]: dispatch
Dec 06 10:20:13 np0005548789.localdomain ceph-mon[298582]: osdmap e165: 6 total, 6 up, 6 in
Dec 06 10:20:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:14.605 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:14 np0005548789.localdomain ceph-mon[298582]: pgmap v328: 177 pgs: 177 active+clean; 192 MiB data, 896 MiB used, 41 GiB / 42 GiB avail; 3.3 KiB/s rd, 15 KiB/s wr, 9 op/s
Dec 06 10:20:14 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:15 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:15.254 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:20:15 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4d6e1245-952e-4af4-91c8-d91b7174dcef", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:15 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4d6e1245-952e-4af4-91c8-d91b7174dcef", "format": "json"}]: dispatch
Dec 06 10:20:15 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e166 e166: 6 total, 6 up, 6 in
Dec 06 10:20:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:20:15 np0005548789.localdomain podman[328915]: 2025-12-06 10:20:15.929546904 +0000 UTC m=+0.089639151 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:20:15 np0005548789.localdomain podman[328915]: 2025-12-06 10:20:15.944239353 +0000 UTC m=+0.104331600 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3)
Dec 06 10:20:15 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:20:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:20:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:20:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:20:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:20:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:20:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:20:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:20:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:20:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:20:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:20:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:20:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:20:16 np0005548789.localdomain ceph-mon[298582]: pgmap v330: 177 pgs: 177 active+clean; 192 MiB data, 896 MiB used, 41 GiB / 42 GiB avail; 4.1 KiB/s rd, 19 KiB/s wr, 11 op/s
Dec 06 10:20:16 np0005548789.localdomain ceph-mon[298582]: osdmap e166: 6 total, 6 up, 6 in
Dec 06 10:20:16 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:16.701 2 INFO neutron.agent.securitygroups_rpc [None req-9395d16c-29ac-47bb-b03d-1c577d966648 b9f4d254b45d482eab7cfb178c231d9a 5f8e1c4c589749b99178bbc7c2bea3f0 - - default default] Security group member updated ['cd56abe4-204c-4363-ad64-0a6840260727']
Dec 06 10:20:17 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e167 e167: 6 total, 6 up, 6 in
Dec 06 10:20:18 np0005548789.localdomain ceph-mon[298582]: pgmap v332: 177 pgs: 177 active+clean; 192 MiB data, 896 MiB used, 41 GiB / 42 GiB avail; 5.0 KiB/s rd, 16 KiB/s wr, 11 op/s
Dec 06 10:20:18 np0005548789.localdomain ceph-mon[298582]: osdmap e167: 6 total, 6 up, 6 in
Dec 06 10:20:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:20:18 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:18.382 263652 INFO neutron.agent.linux.ip_lib [None req-9d5a4069-5851-4fe0-bc47-a6ba9dbe8444 - - - - - -] Device tap4df77c93-33 cannot be used as it has no MAC address
Dec 06 10:20:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:18.405 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:18 np0005548789.localdomain kernel: device tap4df77c93-33 entered promiscuous mode
Dec 06 10:20:18 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016418.4130] manager: (tap4df77c93-33): new Generic device (/org/freedesktop/NetworkManager/Devices/67)
Dec 06 10:20:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:18.416 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:18 np0005548789.localdomain systemd-udevd[328944]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:18.424 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:20:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:18.449 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:18.487 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:18.510 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:18 np0005548789.localdomain podman[328947]: 2025-12-06 10:20:18.536260746 +0000 UTC m=+0.078352836 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:20:18 np0005548789.localdomain systemd[1]: tmp-crun.h55rUG.mount: Deactivated successfully.
Dec 06 10:20:18 np0005548789.localdomain podman[328947]: 2025-12-06 10:20:18.546159719 +0000 UTC m=+0.088251859 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:20:18 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:20:19 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e168 e168: 6 total, 6 up, 6 in
Dec 06 10:20:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4d6e1245-952e-4af4-91c8-d91b7174dcef", "format": "json"}]: dispatch
Dec 06 10:20:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4d6e1245-952e-4af4-91c8-d91b7174dcef", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:19 np0005548789.localdomain podman[329023]: 
Dec 06 10:20:19 np0005548789.localdomain podman[329023]: 2025-12-06 10:20:19.325257915 +0000 UTC m=+0.081795411 container create 70ddb52186bceeb98bb0bc6042403e0e8607b5fa5ef68f5b2c3a3df973255395 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-080f9d33-1223-44cc-b553-017f3a017f1d, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:19 np0005548789.localdomain systemd[1]: Started libpod-conmon-70ddb52186bceeb98bb0bc6042403e0e8607b5fa5ef68f5b2c3a3df973255395.scope.
Dec 06 10:20:19 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:19 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/439f5a96e55010ba06d9a70e7060d6f3f7af4a09fb294473de0406a522f7e8e8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:19 np0005548789.localdomain podman[329023]: 2025-12-06 10:20:19.382425352 +0000 UTC m=+0.138962858 container init 70ddb52186bceeb98bb0bc6042403e0e8607b5fa5ef68f5b2c3a3df973255395 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-080f9d33-1223-44cc-b553-017f3a017f1d, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:20:19 np0005548789.localdomain podman[329023]: 2025-12-06 10:20:19.289274625 +0000 UTC m=+0.045812121 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:19 np0005548789.localdomain podman[329023]: 2025-12-06 10:20:19.389999334 +0000 UTC m=+0.146536830 container start 70ddb52186bceeb98bb0bc6042403e0e8607b5fa5ef68f5b2c3a3df973255395 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-080f9d33-1223-44cc-b553-017f3a017f1d, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:20:19 np0005548789.localdomain dnsmasq[329041]: started, version 2.85 cachesize 150
Dec 06 10:20:19 np0005548789.localdomain dnsmasq[329041]: DNS service limited to local subnets
Dec 06 10:20:19 np0005548789.localdomain dnsmasq[329041]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:19 np0005548789.localdomain dnsmasq[329041]: warning: no upstream servers configured
Dec 06 10:20:19 np0005548789.localdomain dnsmasq-dhcp[329041]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:20:19 np0005548789.localdomain dnsmasq[329041]: read /var/lib/neutron/dhcp/080f9d33-1223-44cc-b553-017f3a017f1d/addn_hosts - 0 addresses
Dec 06 10:20:19 np0005548789.localdomain dnsmasq-dhcp[329041]: read /var/lib/neutron/dhcp/080f9d33-1223-44cc-b553-017f3a017f1d/host
Dec 06 10:20:19 np0005548789.localdomain dnsmasq-dhcp[329041]: read /var/lib/neutron/dhcp/080f9d33-1223-44cc-b553-017f3a017f1d/opts
Dec 06 10:20:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:19.531 263652 INFO neutron.agent.dhcp.agent [None req-e1f0a4c4-a881-40d6-a9c3-786641c231ec - - - - - -] DHCP configuration for ports {'ae9783fa-c0a8-4028-9d3b-28cdd45ffbb6'} is completed
Dec 06 10:20:19 np0005548789.localdomain dnsmasq[329041]: exiting on receipt of SIGTERM
Dec 06 10:20:19 np0005548789.localdomain podman[329059]: 2025-12-06 10:20:19.592468673 +0000 UTC m=+0.038972972 container kill 70ddb52186bceeb98bb0bc6042403e0e8607b5fa5ef68f5b2c3a3df973255395 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-080f9d33-1223-44cc-b553-017f3a017f1d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:19 np0005548789.localdomain systemd[1]: libpod-70ddb52186bceeb98bb0bc6042403e0e8607b5fa5ef68f5b2c3a3df973255395.scope: Deactivated successfully.
Dec 06 10:20:19 np0005548789.localdomain podman[329072]: 2025-12-06 10:20:19.63721376 +0000 UTC m=+0.034932538 container died 70ddb52186bceeb98bb0bc6042403e0e8607b5fa5ef68f5b2c3a3df973255395 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-080f9d33-1223-44cc-b553-017f3a017f1d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:19.642 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:19 np0005548789.localdomain podman[329072]: 2025-12-06 10:20:19.667696902 +0000 UTC m=+0.065415650 container cleanup 70ddb52186bceeb98bb0bc6042403e0e8607b5fa5ef68f5b2c3a3df973255395 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-080f9d33-1223-44cc-b553-017f3a017f1d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:20:19 np0005548789.localdomain systemd[1]: libpod-conmon-70ddb52186bceeb98bb0bc6042403e0e8607b5fa5ef68f5b2c3a3df973255395.scope: Deactivated successfully.
Dec 06 10:20:19 np0005548789.localdomain podman[329079]: 2025-12-06 10:20:19.687962423 +0000 UTC m=+0.075081687 container remove 70ddb52186bceeb98bb0bc6042403e0e8607b5fa5ef68f5b2c3a3df973255395 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-080f9d33-1223-44cc-b553-017f3a017f1d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:19.697 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:19 np0005548789.localdomain kernel: device tap4df77c93-33 left promiscuous mode
Dec 06 10:20:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:19.724 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:19.759 263652 INFO neutron.agent.dhcp.agent [None req-d28f9adb-6d87-44f3-8ef6-5c1777b22602 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:19 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:19.759 263652 INFO neutron.agent.dhcp.agent [None req-d28f9adb-6d87-44f3-8ef6-5c1777b22602 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:20 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:20.199 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:19Z, description=, device_id=f5c53eaf-931c-42d6-8b97-9823f000abba, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa841c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fe38220>], id=2042a880-40b6-4791-9487-9eabb2033780, ip_allocation=immediate, mac_address=fa:16:3e:46:b1:83, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2383, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:20:19Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:20:20 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7adc54b5-eca0-4f1a-9df6-2cf3e56ffc2b", "format": "json"}]: dispatch
Dec 06 10:20:20 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7adc54b5-eca0-4f1a-9df6-2cf3e56ffc2b", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:20 np0005548789.localdomain ceph-mon[298582]: pgmap v334: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 19 KiB/s wr, 47 op/s
Dec 06 10:20:20 np0005548789.localdomain ceph-mon[298582]: osdmap e168: 6 total, 6 up, 6 in
Dec 06 10:20:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e169 e169: 6 total, 6 up, 6 in
Dec 06 10:20:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-439f5a96e55010ba06d9a70e7060d6f3f7af4a09fb294473de0406a522f7e8e8-merged.mount: Deactivated successfully.
Dec 06 10:20:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-70ddb52186bceeb98bb0bc6042403e0e8607b5fa5ef68f5b2c3a3df973255395-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:20 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d080f9d33\x2d1223\x2d44cc\x2db553\x2d017f3a017f1d.mount: Deactivated successfully.
Dec 06 10:20:20 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:20:20 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:20:20 np0005548789.localdomain podman[329120]: 2025-12-06 10:20:20.456900327 +0000 UTC m=+0.072405474 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:20:20 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:20:20 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:20.686 263652 INFO neutron.agent.dhcp.agent [None req-84d092d6-73e7-4993-89ec-d6ca5f2f8089 - - - - - -] DHCP configuration for ports {'2042a880-40b6-4791-9487-9eabb2033780'} is completed
Dec 06 10:20:21 np0005548789.localdomain ceph-mon[298582]: osdmap e169: 6 total, 6 up, 6 in
Dec 06 10:20:21 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:21 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e170 e170: 6 total, 6 up, 6 in
Dec 06 10:20:22 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e171 e171: 6 total, 6 up, 6 in
Dec 06 10:20:22 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "81753d92-4847-43cb-b357-c4adab052a83", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:22 np0005548789.localdomain ceph-mon[298582]: pgmap v337: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 21 KiB/s wr, 51 op/s
Dec 06 10:20:22 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "81753d92-4847-43cb-b357-c4adab052a83", "format": "json"}]: dispatch
Dec 06 10:20:22 np0005548789.localdomain ceph-mon[298582]: osdmap e170: 6 total, 6 up, 6 in
Dec 06 10:20:22 np0005548789.localdomain ceph-mon[298582]: osdmap e171: 6 total, 6 up, 6 in
Dec 06 10:20:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:20:22 np0005548789.localdomain systemd[1]: tmp-crun.PpAEGo.mount: Deactivated successfully.
Dec 06 10:20:22 np0005548789.localdomain podman[329142]: 2025-12-06 10:20:22.929829722 +0000 UTC m=+0.085605248 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:20:22 np0005548789.localdomain podman[329142]: 2025-12-06 10:20:22.968394051 +0000 UTC m=+0.124169617 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:20:22 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:20:22 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:22.996 263652 INFO neutron.agent.linux.ip_lib [None req-60cf68e3-5e8c-4911-9c07-5a4b824aa225 - - - - - -] Device tap7c8805e4-f0 cannot be used as it has no MAC address
Dec 06 10:20:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:23.023 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:23 np0005548789.localdomain kernel: device tap7c8805e4-f0 entered promiscuous mode
Dec 06 10:20:23 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016423.0310] manager: (tap7c8805e4-f0): new Generic device (/org/freedesktop/NetworkManager/Devices/68)
Dec 06 10:20:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:23Z|00408|binding|INFO|Claiming lport 7c8805e4-f06e-4359-b7b5-effc8da5aad8 for this chassis.
Dec 06 10:20:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:23.033 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:23Z|00409|binding|INFO|7c8805e4-f06e-4359-b7b5-effc8da5aad8: Claiming unknown
Dec 06 10:20:23 np0005548789.localdomain systemd-udevd[329176]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:23.044 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-941344ac-1e9e-4ba5-9592-4a1e73ea58e6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-941344ac-1e9e-4ba5-9592-4a1e73ea58e6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f00ab5f7d934f62991ed1e7e798e47e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f2c5640-07d2-4d8e-95f6-82e2d2dfdf54, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=1, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=7c8805e4-f06e-4359-b7b5-effc8da5aad8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:23.046 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 7c8805e4-f06e-4359-b7b5-effc8da5aad8 in datapath 941344ac-1e9e-4ba5-9592-4a1e73ea58e6 bound to our chassis
Dec 06 10:20:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:23.047 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 941344ac-1e9e-4ba5-9592-4a1e73ea58e6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:23.047 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[b055b9a8-5240-4166-b99e-829f96c39a2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7c8805e4-f0: No such device
Dec 06 10:20:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:23Z|00410|binding|INFO|Setting lport 7c8805e4-f06e-4359-b7b5-effc8da5aad8 ovn-installed in OVS
Dec 06 10:20:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:23Z|00411|binding|INFO|Setting lport 7c8805e4-f06e-4359-b7b5-effc8da5aad8 up in Southbound
Dec 06 10:20:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:23.075 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7c8805e4-f0: No such device
Dec 06 10:20:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7c8805e4-f0: No such device
Dec 06 10:20:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7c8805e4-f0: No such device
Dec 06 10:20:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7c8805e4-f0: No such device
Dec 06 10:20:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7c8805e4-f0: No such device
Dec 06 10:20:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7c8805e4-f0: No such device
Dec 06 10:20:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap7c8805e4-f0: No such device
Dec 06 10:20:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:23.110 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:23.132 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e171 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:20:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e172 e172: 6 total, 6 up, 6 in
Dec 06 10:20:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9cf2fd4c-90f2-4604-9f84-44511d67581f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:23.450 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:23 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:23.709 2 INFO neutron.agent.securitygroups_rpc [None req-048ea5a4-2d0a-4365-a398-a917ab48f027 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['eb426258-160f-4f74-a9d2-50e476134e75']
Dec 06 10:20:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:23Z|00412|binding|INFO|Removing iface tap7c8805e4-f0 ovn-installed in OVS
Dec 06 10:20:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:23Z|00413|binding|INFO|Removing lport 7c8805e4-f06e-4359-b7b5-effc8da5aad8 ovn-installed in OVS
Dec 06 10:20:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:23.797 160509 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 4c677c22-7fad-48c9-83cf-9d6b14ff5d3f with type ""
Dec 06 10:20:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:23.798 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:23.798 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-941344ac-1e9e-4ba5-9592-4a1e73ea58e6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-941344ac-1e9e-4ba5-9592-4a1e73ea58e6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f00ab5f7d934f62991ed1e7e798e47e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f2c5640-07d2-4d8e-95f6-82e2d2dfdf54, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=1, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=7c8805e4-f06e-4359-b7b5-effc8da5aad8) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:23.801 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:23.802 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 7c8805e4-f06e-4359-b7b5-effc8da5aad8 in datapath 941344ac-1e9e-4ba5-9592-4a1e73ea58e6 unbound from our chassis
Dec 06 10:20:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:23.803 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 941344ac-1e9e-4ba5-9592-4a1e73ea58e6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:23.804 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[90ed9ad9-08d1-4a14-adca-a4302cf751df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:23 np0005548789.localdomain podman[329245]: 2025-12-06 10:20:23.916845603 +0000 UTC m=+0.084213775 container create 5652484e04839d706157bf2db8d0c37ce6784ac6075c24bee3fa0a80f4ff462b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-941344ac-1e9e-4ba5-9592-4a1e73ea58e6, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:20:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:20:23 np0005548789.localdomain podman[329245]: 2025-12-06 10:20:23.886485605 +0000 UTC m=+0.053853807 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:23 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:23.991 2 INFO neutron.agent.securitygroups_rpc [None req-6df8f9bf-dce2-42c5-9279-2397b4b4c0d3 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['eb426258-160f-4f74-a9d2-50e476134e75']
Dec 06 10:20:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:23Z|00414|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:20:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:24.014 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:24 np0005548789.localdomain systemd[1]: Started libpod-conmon-5652484e04839d706157bf2db8d0c37ce6784ac6075c24bee3fa0a80f4ff462b.scope.
Dec 06 10:20:24 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:24 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6420eab620073affa12b038addfba3666aa3c4527d56e42cdcfa58ae57886adb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:24 np0005548789.localdomain podman[329245]: 2025-12-06 10:20:24.048956951 +0000 UTC m=+0.216325153 container init 5652484e04839d706157bf2db8d0c37ce6784ac6075c24bee3fa0a80f4ff462b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-941344ac-1e9e-4ba5-9592-4a1e73ea58e6, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 06 10:20:24 np0005548789.localdomain podman[329245]: 2025-12-06 10:20:24.056031788 +0000 UTC m=+0.223399990 container start 5652484e04839d706157bf2db8d0c37ce6784ac6075c24bee3fa0a80f4ff462b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-941344ac-1e9e-4ba5-9592-4a1e73ea58e6, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:20:24 np0005548789.localdomain dnsmasq[329286]: started, version 2.85 cachesize 150
Dec 06 10:20:24 np0005548789.localdomain dnsmasq[329286]: DNS service limited to local subnets
Dec 06 10:20:24 np0005548789.localdomain dnsmasq[329286]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:24 np0005548789.localdomain dnsmasq[329286]: warning: no upstream servers configured
Dec 06 10:20:24 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:20:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159746 "" "Go-http-client/1.1"
Dec 06 10:20:24 np0005548789.localdomain dnsmasq-dhcp[329286]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:20:24 np0005548789.localdomain dnsmasq[329286]: read /var/lib/neutron/dhcp/941344ac-1e9e-4ba5-9592-4a1e73ea58e6/addn_hosts - 0 addresses
Dec 06 10:20:24 np0005548789.localdomain dnsmasq-dhcp[329286]: read /var/lib/neutron/dhcp/941344ac-1e9e-4ba5-9592-4a1e73ea58e6/host
Dec 06 10:20:24 np0005548789.localdomain dnsmasq-dhcp[329286]: read /var/lib/neutron/dhcp/941344ac-1e9e-4ba5-9592-4a1e73ea58e6/opts
Dec 06 10:20:24 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:20:24 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20222 "" "Go-http-client/1.1"
Dec 06 10:20:24 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:20:24 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:20:24 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:20:24 np0005548789.localdomain podman[329279]: 2025-12-06 10:20:24.127243464 +0000 UTC m=+0.086553637 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:20:24 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:24.249 263652 INFO neutron.agent.dhcp.agent [None req-33d184c0-582c-48ac-8fa8-48de9fb994df - - - - - -] DHCP configuration for ports {'9608e96c-6e07-47bb-b306-34f8154f24ff'} is completed
Dec 06 10:20:24 np0005548789.localdomain dnsmasq[329286]: exiting on receipt of SIGTERM
Dec 06 10:20:24 np0005548789.localdomain podman[329313]: 2025-12-06 10:20:24.287471262 +0000 UTC m=+0.053044113 container kill 5652484e04839d706157bf2db8d0c37ce6784ac6075c24bee3fa0a80f4ff462b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-941344ac-1e9e-4ba5-9592-4a1e73ea58e6, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:20:24 np0005548789.localdomain systemd[1]: libpod-5652484e04839d706157bf2db8d0c37ce6784ac6075c24bee3fa0a80f4ff462b.scope: Deactivated successfully.
Dec 06 10:20:24 np0005548789.localdomain podman[329330]: 2025-12-06 10:20:24.342924817 +0000 UTC m=+0.044796520 container died 5652484e04839d706157bf2db8d0c37ce6784ac6075c24bee3fa0a80f4ff462b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-941344ac-1e9e-4ba5-9592-4a1e73ea58e6, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:24 np0005548789.localdomain podman[329330]: 2025-12-06 10:20:24.37932263 +0000 UTC m=+0.081194273 container cleanup 5652484e04839d706157bf2db8d0c37ce6784ac6075c24bee3fa0a80f4ff462b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-941344ac-1e9e-4ba5-9592-4a1e73ea58e6, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:20:24 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9cf2fd4c-90f2-4604-9f84-44511d67581f", "format": "json"}]: dispatch
Dec 06 10:20:24 np0005548789.localdomain ceph-mon[298582]: pgmap v340: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 80 KiB/s rd, 29 KiB/s wr, 113 op/s
Dec 06 10:20:24 np0005548789.localdomain ceph-mon[298582]: osdmap e172: 6 total, 6 up, 6 in
Dec 06 10:20:24 np0005548789.localdomain systemd[1]: libpod-conmon-5652484e04839d706157bf2db8d0c37ce6784ac6075c24bee3fa0a80f4ff462b.scope: Deactivated successfully.
Dec 06 10:20:24 np0005548789.localdomain podman[329332]: 2025-12-06 10:20:24.429840954 +0000 UTC m=+0.123376462 container remove 5652484e04839d706157bf2db8d0c37ce6784ac6075c24bee3fa0a80f4ff462b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-941344ac-1e9e-4ba5-9592-4a1e73ea58e6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:20:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:24.438 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:24 np0005548789.localdomain kernel: device tap7c8805e4-f0 left promiscuous mode
Dec 06 10:20:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:24.449 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:24 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:24.525 263652 INFO neutron.agent.dhcp.agent [None req-f1a2618a-d715-487f-8a7f-5aa8f033dbd7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:24 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:24.526 263652 INFO neutron.agent.dhcp.agent [None req-f1a2618a-d715-487f-8a7f-5aa8f033dbd7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:24.681 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-6420eab620073affa12b038addfba3666aa3c4527d56e42cdcfa58ae57886adb-merged.mount: Deactivated successfully.
Dec 06 10:20:24 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5652484e04839d706157bf2db8d0c37ce6784ac6075c24bee3fa0a80f4ff462b-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:24 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d941344ac\x2d1e9e\x2d4ba5\x2d9592\x2d4a1e73ea58e6.mount: Deactivated successfully.
Dec 06 10:20:25 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e173 e173: 6 total, 6 up, 6 in
Dec 06 10:20:26 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:26.135 2 INFO neutron.agent.securitygroups_rpc [None req-ef8cf78f-f1a9-46f9-a6c4-622f166b1f57 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:26 np0005548789.localdomain ceph-mon[298582]: pgmap v342: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 24 KiB/s wr, 93 op/s
Dec 06 10:20:26 np0005548789.localdomain ceph-mon[298582]: osdmap e173: 6 total, 6 up, 6 in
Dec 06 10:20:26 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:26 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:26.557 2 INFO neutron.agent.securitygroups_rpc [None req-6a0c3d03-5496-4b9d-aee6-2794cf73d3e3 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:26 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:26.778 2 INFO neutron.agent.securitygroups_rpc [None req-cb5ad30b-b885-42c3-a286-99bf227690f5 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:26 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:26.972 2 INFO neutron.agent.securitygroups_rpc [None req-415499aa-cb15-4206-8b55-b9a21ed2dc86 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:27 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:27.101 2 INFO neutron.agent.securitygroups_rpc [None req-2e89e39c-a441-4c37-8f6e-df561eb77ca2 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:27 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:27.211 2 INFO neutron.agent.securitygroups_rpc [None req-a03daea5-0289-4f98-a7ae-aa6379a3c0f5 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e174 e174: 6 total, 6 up, 6 in
Dec 06 10:20:27 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "81753d92-4847-43cb-b357-c4adab052a83", "format": "json"}]: dispatch
Dec 06 10:20:27 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "81753d92-4847-43cb-b357-c4adab052a83", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:27 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ffb9c87f-9478-478c-bceb-83016344d5b8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:27 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ffb9c87f-9478-478c-bceb-83016344d5b8", "format": "json"}]: dispatch
Dec 06 10:20:27 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:27 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "format": "json"}]: dispatch
Dec 06 10:20:27 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:27 np0005548789.localdomain ceph-mon[298582]: osdmap e174: 6 total, 6 up, 6 in
Dec 06 10:20:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:20:28 np0005548789.localdomain ceph-mon[298582]: pgmap v344: 177 pgs: 177 active+clean; 192 MiB data, 901 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 20 KiB/s wr, 77 op/s
Dec 06 10:20:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:28.488 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:29 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:29.040 2 INFO neutron.agent.securitygroups_rpc [None req-3c4f32bd-b684-4b66-9a34-68450fbeb73b 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:29 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:29.255 2 INFO neutron.agent.securitygroups_rpc [None req-03f78b6a-c5ab-4048-95e5-dac6933624ce 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:29 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e175 e175: 6 total, 6 up, 6 in
Dec 06 10:20:29 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:29 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:29.502 2 INFO neutron.agent.securitygroups_rpc [None req-bec77d6f-35e0-4121-9e3c-321d796fa6a3 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:29.683 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:29 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:29.696 2 INFO neutron.agent.securitygroups_rpc [None req-b83ddb52-2122-4d26-8d71-acc6737aed87 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['bd391031-3a56-454d-bf4f-b44b517a0aeb']
Dec 06 10:20:30 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c706f26e-f87f-4af4-a717-78c8b08cc789", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:30 np0005548789.localdomain ceph-mon[298582]: pgmap v346: 177 pgs: 177 active+clean; 192 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 19 KiB/s wr, 42 op/s
Dec 06 10:20:30 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c706f26e-f87f-4af4-a717-78c8b08cc789", "format": "json"}]: dispatch
Dec 06 10:20:30 np0005548789.localdomain ceph-mon[298582]: osdmap e175: 6 total, 6 up, 6 in
Dec 06 10:20:30 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "ffb9c87f-9478-478c-bceb-83016344d5b8", "snap_name": "bc9ed0e0-cf16-4288-b389-6bbd7bbfed8b", "format": "json"}]: dispatch
Dec 06 10:20:30 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:30.741 2 INFO neutron.agent.securitygroups_rpc [None req-1a7eb84d-a3b4-4a88-a0ae-062b0b90ebc4 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['ea4ca242-5187-4603-82cf-af66665b0039']
Dec 06 10:20:31 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:31.297 263652 INFO neutron.agent.linux.ip_lib [None req-9bd62092-9152-4a74-9bc1-8ba6ab839186 - - - - - -] Device tap902b329b-7b cannot be used as it has no MAC address
Dec 06 10:20:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:31.320 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:31 np0005548789.localdomain kernel: device tap902b329b-7b entered promiscuous mode
Dec 06 10:20:31 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:31Z|00415|binding|INFO|Claiming lport 902b329b-7b8a-46c2-a01e-dfe82eef6b46 for this chassis.
Dec 06 10:20:31 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:31Z|00416|binding|INFO|902b329b-7b8a-46c2-a01e-dfe82eef6b46: Claiming unknown
Dec 06 10:20:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:31.327 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:31 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016431.3306] manager: (tap902b329b-7b): new Generic device (/org/freedesktop/NetworkManager/Devices/69)
Dec 06 10:20:31 np0005548789.localdomain systemd-udevd[329370]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:31 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:31Z|00417|binding|INFO|Setting lport 902b329b-7b8a-46c2-a01e-dfe82eef6b46 ovn-installed in OVS
Dec 06 10:20:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:31.366 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:31 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:31Z|00418|binding|INFO|Setting lport 902b329b-7b8a-46c2-a01e-dfe82eef6b46 up in Southbound
Dec 06 10:20:31 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:31.401 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-e449a5e0-9225-4a29-ab74-be48f680b8f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e449a5e0-9225-4a29-ab74-be48f680b8f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa76bcfc789b4e53acf344cd0b1cd7c5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9489b31d-85bf-439b-b0c5-aab9e51b25ad, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=902b329b-7b8a-46c2-a01e-dfe82eef6b46) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:31 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:31.403 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 902b329b-7b8a-46c2-a01e-dfe82eef6b46 in datapath e449a5e0-9225-4a29-ab74-be48f680b8f1 bound to our chassis
Dec 06 10:20:31 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:31.405 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e449a5e0-9225-4a29-ab74-be48f680b8f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:31 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:31.406 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5a06b9c8-0085-444f-9897-df294b7b66db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:31.412 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:31.442 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:31 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "snap_name": "802dfa0a-e026-43d4-b3ba-67c6242d0ddb", "format": "json"}]: dispatch
Dec 06 10:20:31 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3989965424' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:20:31 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3989965424' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:20:32 np0005548789.localdomain podman[329423]: 
Dec 06 10:20:32 np0005548789.localdomain podman[329423]: 2025-12-06 10:20:32.227858195 +0000 UTC m=+0.078443949 container create 0099ac51aa4947a706ce8ce0a5fb406487f51dd73b2794c1ae5c1e761827e58b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e449a5e0-9225-4a29-ab74-be48f680b8f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:32 np0005548789.localdomain systemd[1]: Started libpod-conmon-0099ac51aa4947a706ce8ce0a5fb406487f51dd73b2794c1ae5c1e761827e58b.scope.
Dec 06 10:20:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e176 e176: 6 total, 6 up, 6 in
Dec 06 10:20:32 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:32 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1cca8e8acafd6a16f5767d7885b992dc7463fe2148062a0fa246de7d9c1c0baa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:32 np0005548789.localdomain podman[329423]: 2025-12-06 10:20:32.290775568 +0000 UTC m=+0.141361342 container init 0099ac51aa4947a706ce8ce0a5fb406487f51dd73b2794c1ae5c1e761827e58b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e449a5e0-9225-4a29-ab74-be48f680b8f1, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:20:32 np0005548789.localdomain podman[329423]: 2025-12-06 10:20:32.298872076 +0000 UTC m=+0.149457820 container start 0099ac51aa4947a706ce8ce0a5fb406487f51dd73b2794c1ae5c1e761827e58b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e449a5e0-9225-4a29-ab74-be48f680b8f1, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:32 np0005548789.localdomain podman[329423]: 2025-12-06 10:20:32.200894271 +0000 UTC m=+0.051480025 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:32 np0005548789.localdomain dnsmasq[329441]: started, version 2.85 cachesize 150
Dec 06 10:20:32 np0005548789.localdomain dnsmasq[329441]: DNS service limited to local subnets
Dec 06 10:20:32 np0005548789.localdomain dnsmasq[329441]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:32 np0005548789.localdomain dnsmasq[329441]: warning: no upstream servers configured
Dec 06 10:20:32 np0005548789.localdomain dnsmasq-dhcp[329441]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:20:32 np0005548789.localdomain dnsmasq[329441]: read /var/lib/neutron/dhcp/e449a5e0-9225-4a29-ab74-be48f680b8f1/addn_hosts - 0 addresses
Dec 06 10:20:32 np0005548789.localdomain dnsmasq-dhcp[329441]: read /var/lib/neutron/dhcp/e449a5e0-9225-4a29-ab74-be48f680b8f1/host
Dec 06 10:20:32 np0005548789.localdomain dnsmasq-dhcp[329441]: read /var/lib/neutron/dhcp/e449a5e0-9225-4a29-ab74-be48f680b8f1/opts
Dec 06 10:20:32 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:32.475 263652 INFO neutron.agent.dhcp.agent [None req-e0e6ba56-53d2-467f-b73d-41a12f165c94 - - - - - -] DHCP configuration for ports {'842ccd8b-2a03-40a8-82af-c1147248d29c'} is completed
Dec 06 10:20:32 np0005548789.localdomain ceph-mon[298582]: pgmap v348: 177 pgs: 177 active+clean; 192 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 19 KiB/s wr, 42 op/s
Dec 06 10:20:32 np0005548789.localdomain ceph-mon[298582]: osdmap e176: 6 total, 6 up, 6 in
Dec 06 10:20:32 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:32.640 2 INFO neutron.agent.securitygroups_rpc [None req-8df6a51f-2782-49f1-a34d-739b1e2f53d1 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['0223fd9f-7d67-4f35-8221-a118caed647f']
Dec 06 10:20:32 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:32.950 2 INFO neutron.agent.securitygroups_rpc [None req-c356675a-b0a4-4bc7-b431-054879bdecb2 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['0223fd9f-7d67-4f35-8221-a118caed647f']
Dec 06 10:20:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 10:20:33 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:33Z|00419|binding|INFO|Removing iface tap902b329b-7b ovn-installed in OVS
Dec 06 10:20:33 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:33Z|00420|binding|INFO|Removing lport 902b329b-7b8a-46c2-a01e-dfe82eef6b46 ovn-installed in OVS
Dec 06 10:20:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:33.518 160509 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 293179f1-663f-4453-88e7-13f4696f7c9d with type ""
Dec 06 10:20:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:33.520 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-e449a5e0-9225-4a29-ab74-be48f680b8f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e449a5e0-9225-4a29-ab74-be48f680b8f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa76bcfc789b4e53acf344cd0b1cd7c5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9489b31d-85bf-439b-b0c5-aab9e51b25ad, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=902b329b-7b8a-46c2-a01e-dfe82eef6b46) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:33.523 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 902b329b-7b8a-46c2-a01e-dfe82eef6b46 in datapath e449a5e0-9225-4a29-ab74-be48f680b8f1 unbound from our chassis
Dec 06 10:20:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:33.524 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:33 np0005548789.localdomain kernel: device tap902b329b-7b left promiscuous mode
Dec 06 10:20:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:33.533 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e449a5e0-9225-4a29-ab74-be48f680b8f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:33 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:33.535 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[3d767afb-b401-46bb-9909-fb5b28e2e99c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:33.541 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:33 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c706f26e-f87f-4af4-a717-78c8b08cc789", "format": "json"}]: dispatch
Dec 06 10:20:33 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c706f26e-f87f-4af4-a717-78c8b08cc789", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:34 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "snap_name": "802dfa0a-e026-43d4-b3ba-67c6242d0ddb", "target_sub_name": "6e6fd32a-770e-4afe-8d83-a1956fc630a7", "format": "json"}]: dispatch
Dec 06 10:20:34 np0005548789.localdomain ceph-mon[298582]: pgmap v350: 177 pgs: 177 active+clean; 192 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 62 KiB/s rd, 37 KiB/s wr, 91 op/s
Dec 06 10:20:34 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6e6fd32a-770e-4afe-8d83-a1956fc630a7", "format": "json"}]: dispatch
Dec 06 10:20:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:34.729 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:34 np0005548789.localdomain dnsmasq[329441]: read /var/lib/neutron/dhcp/e449a5e0-9225-4a29-ab74-be48f680b8f1/addn_hosts - 0 addresses
Dec 06 10:20:34 np0005548789.localdomain dnsmasq-dhcp[329441]: read /var/lib/neutron/dhcp/e449a5e0-9225-4a29-ab74-be48f680b8f1/host
Dec 06 10:20:34 np0005548789.localdomain podman[329462]: 2025-12-06 10:20:34.776842404 +0000 UTC m=+0.091414665 container kill 0099ac51aa4947a706ce8ce0a5fb406487f51dd73b2794c1ae5c1e761827e58b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e449a5e0-9225-4a29-ab74-be48f680b8f1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:20:34 np0005548789.localdomain dnsmasq-dhcp[329441]: read /var/lib/neutron/dhcp/e449a5e0-9225-4a29-ab74-be48f680b8f1/opts
Dec 06 10:20:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:20:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent [None req-94ddf741-81b5-4bd7-901a-2366724e3951 - - - - - -] Unable to reload_allocations dhcp for e449a5e0-9225-4a29-ab74-be48f680b8f1.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap902b329b-7b not found in namespace qdhcp-e449a5e0-9225-4a29-ab74-be48f680b8f1.
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     return fut.result()
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     raise self._exception
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap902b329b-7b not found in namespace qdhcp-e449a5e0-9225-4a29-ab74-be48f680b8f1.
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.803 263652 ERROR neutron.agent.dhcp.agent 
Dec 06 10:20:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:34.809 263652 INFO neutron.agent.dhcp.agent [None req-5a5595b0-a0f8-44d7-9627-4542518c5211 - - - - - -] Synchronizing state
Dec 06 10:20:34 np0005548789.localdomain systemd[1]: tmp-crun.Nx0391.mount: Deactivated successfully.
Dec 06 10:20:34 np0005548789.localdomain podman[329476]: 2025-12-06 10:20:34.905132306 +0000 UTC m=+0.093246562 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:34 np0005548789.localdomain podman[329476]: 2025-12-06 10:20:34.909971723 +0000 UTC m=+0.098086009 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:34 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:20:34 np0005548789.localdomain podman[329477]: 2025-12-06 10:20:34.988898176 +0000 UTC m=+0.179554540 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:20:34 np0005548789.localdomain podman[329477]: 2025-12-06 10:20:34.996746786 +0000 UTC m=+0.187403160 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:20:35 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:20:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:35.118 263652 INFO neutron.agent.dhcp.agent [None req-83cda96d-a56f-4426-a417-563d5a2ec11d - - - - - -] All active networks have been fetched through RPC.
Dec 06 10:20:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:35.119 263652 INFO neutron.agent.dhcp.agent [-] Starting network e449a5e0-9225-4a29-ab74-be48f680b8f1 dhcp configuration
Dec 06 10:20:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:35.120 263652 INFO neutron.agent.dhcp.agent [-] Finished network e449a5e0-9225-4a29-ab74-be48f680b8f1 dhcp configuration
Dec 06 10:20:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:35.121 263652 INFO neutron.agent.dhcp.agent [None req-83cda96d-a56f-4426-a417-563d5a2ec11d - - - - - -] Synchronizing state complete
Dec 06 10:20:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:35.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:35.412 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:20:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:35.413 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:20:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:35.413 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:20:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:35.413 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:20:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:35.413 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:20:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:35Z|00421|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:20:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:35.464 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:35 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:35.498 2 INFO neutron.agent.securitygroups_rpc [None req-70f019b0-4c49-406a-b078-506915b4f443 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['2bea4444-1a2f-4249-8686-d0a5b03f529f']
Dec 06 10:20:35 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6e6fd32a-770e-4afe-8d83-a1956fc630a7", "format": "json"}]: dispatch
Dec 06 10:20:35 np0005548789.localdomain dnsmasq[329441]: exiting on receipt of SIGTERM
Dec 06 10:20:35 np0005548789.localdomain systemd[1]: libpod-0099ac51aa4947a706ce8ce0a5fb406487f51dd73b2794c1ae5c1e761827e58b.scope: Deactivated successfully.
Dec 06 10:20:35 np0005548789.localdomain podman[329536]: 2025-12-06 10:20:35.632874642 +0000 UTC m=+0.069540927 container kill 0099ac51aa4947a706ce8ce0a5fb406487f51dd73b2794c1ae5c1e761827e58b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e449a5e0-9225-4a29-ab74-be48f680b8f1, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:20:35 np0005548789.localdomain podman[329569]: 2025-12-06 10:20:35.68447908 +0000 UTC m=+0.040283293 container died 0099ac51aa4947a706ce8ce0a5fb406487f51dd73b2794c1ae5c1e761827e58b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e449a5e0-9225-4a29-ab74-be48f680b8f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:35 np0005548789.localdomain podman[329569]: 2025-12-06 10:20:35.716731775 +0000 UTC m=+0.072535918 container cleanup 0099ac51aa4947a706ce8ce0a5fb406487f51dd73b2794c1ae5c1e761827e58b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e449a5e0-9225-4a29-ab74-be48f680b8f1, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:20:35 np0005548789.localdomain systemd[1]: libpod-conmon-0099ac51aa4947a706ce8ce0a5fb406487f51dd73b2794c1ae5c1e761827e58b.scope: Deactivated successfully.
Dec 06 10:20:35 np0005548789.localdomain podman[329571]: 2025-12-06 10:20:35.754074687 +0000 UTC m=+0.104367832 container remove 0099ac51aa4947a706ce8ce0a5fb406487f51dd73b2794c1ae5c1e761827e58b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e449a5e0-9225-4a29-ab74-be48f680b8f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:20:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-1cca8e8acafd6a16f5767d7885b992dc7463fe2148062a0fa246de7d9c1c0baa-merged.mount: Deactivated successfully.
Dec 06 10:20:35 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0099ac51aa4947a706ce8ce0a5fb406487f51dd73b2794c1ae5c1e761827e58b-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:35 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2de449a5e0\x2d9225\x2d4a29\x2dab74\x2dbe48f680b8f1.mount: Deactivated successfully.
Dec 06 10:20:35 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:20:35 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3962963758' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:35.890 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:20:35 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:35.894 2 INFO neutron.agent.securitygroups_rpc [None req-76169eb0-4558-4a54-88c6-853dfb7935a8 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['2bea4444-1a2f-4249-8686-d0a5b03f529f']
Dec 06 10:20:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:35.966 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:20:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:35.967 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:20:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:36.162 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:20:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:36.163 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11221MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:20:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:36.163 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:20:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:36.164 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:20:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:36.234 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:20:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:36.234 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:20:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:36.234 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:20:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:36.278 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:20:36 np0005548789.localdomain ceph-mon[298582]: pgmap v351: 177 pgs: 177 active+clean; 192 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 47 KiB/s rd, 28 KiB/s wr, 69 op/s
Dec 06 10:20:36 np0005548789.localdomain ceph-mon[298582]: mgrmap e49: np0005548790.kvkfyr(active, since 8m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:20:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e60ef5e5-d14b-4505-930a-87b6f2872763", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:36 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3962963758' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:36 np0005548789.localdomain sshd[329618]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:20:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:20:36 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1176083142' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:36.765 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:20:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:36.771 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:20:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:36.788 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:20:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:36.790 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:20:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:36.791 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:20:37 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e177 e177: 6 total, 6 up, 6 in
Dec 06 10:20:37 np0005548789.localdomain sshd[329618]: Received disconnect from 64.227.102.57 port 37536:11: Bye Bye [preauth]
Dec 06 10:20:37 np0005548789.localdomain sshd[329618]: Disconnected from authenticating user root 64.227.102.57 port 37536 [preauth]
Dec 06 10:20:37 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e60ef5e5-d14b-4505-930a-87b6f2872763", "format": "json"}]: dispatch
Dec 06 10:20:37 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "ffb9c87f-9478-478c-bceb-83016344d5b8", "snap_name": "bc9ed0e0-cf16-4288-b389-6bbd7bbfed8b_19292787-d6d4-497b-bcb5-105ffd3d6c15", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:37 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "ffb9c87f-9478-478c-bceb-83016344d5b8", "snap_name": "bc9ed0e0-cf16-4288-b389-6bbd7bbfed8b", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:37 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1176083142' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:37 np0005548789.localdomain ceph-mon[298582]: osdmap e177: 6 total, 6 up, 6 in
Dec 06 10:20:38 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:38.295 2 INFO neutron.agent.securitygroups_rpc [None req-10166391-fcb0-4201-a7d2-7443ab5c9b01 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['344ee16e-9c56-4a03-94c4-4549205a4025']
Dec 06 10:20:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:38.528 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:38 np0005548789.localdomain ceph-mon[298582]: pgmap v352: 177 pgs: 177 active+clean; 192 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 13 KiB/s wr, 36 op/s
Dec 06 10:20:38 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:38.765 2 INFO neutron.agent.securitygroups_rpc [None req-867b4687-c36d-47aa-8d2e-c76597d4a6cb 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['344ee16e-9c56-4a03-94c4-4549205a4025']
Dec 06 10:20:38 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:38.969 2 INFO neutron.agent.securitygroups_rpc [None req-80b8119e-1e57-4c4d-b95c-97abe74340b6 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['344ee16e-9c56-4a03-94c4-4549205a4025']
Dec 06 10:20:39 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:39.298 2 INFO neutron.agent.securitygroups_rpc [None req-335b324d-f81b-4cc4-b913-ab18408e9420 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['344ee16e-9c56-4a03-94c4-4549205a4025']
Dec 06 10:20:39 np0005548789.localdomain sshd[329622]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:20:39 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:39.648 2 INFO neutron.agent.securitygroups_rpc [None req-c2f184e7-70de-480a-95d7-35dc53af97f7 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['344ee16e-9c56-4a03-94c4-4549205a4025']
Dec 06 10:20:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2229978036' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:20:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2229978036' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:20:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2101341474' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:20:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2101341474' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:20:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:39.772 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:39.787 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:39.787 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:39.788 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:20:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:39.788 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:20:39 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:39.845 263652 INFO neutron.agent.linux.ip_lib [None req-1168e812-3032-4571-a4b2-f8e3c0ed8a19 - - - - - -] Device tap0769d4f9-3c cannot be used as it has no MAC address
Dec 06 10:20:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:39.874 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:20:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:39.878 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:20:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:39.878 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:20:39 np0005548789.localdomain kernel: device tap0769d4f9-3c entered promiscuous mode
Dec 06 10:20:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:39.879 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:20:39 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:39Z|00422|binding|INFO|Claiming lport 0769d4f9-3cf8-430d-87d0-faa554cf4d51 for this chassis.
Dec 06 10:20:39 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:39Z|00423|binding|INFO|0769d4f9-3cf8-430d-87d0-faa554cf4d51: Claiming unknown
Dec 06 10:20:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:39.884 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:39 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016439.8846] manager: (tap0769d4f9-3c): new Generic device (/org/freedesktop/NetworkManager/Devices/70)
Dec 06 10:20:39 np0005548789.localdomain systemd-udevd[329634]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:39 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:39.894 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-610fcf3f-6e70-4d5d-9d9e-df794ff4196d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-610fcf3f-6e70-4d5d-9d9e-df794ff4196d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '24086b701d6b4d4081d2e63578d18d24', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=22180776-fd75-4bd6-be28-febd70acf464, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=0769d4f9-3cf8-430d-87d0-faa554cf4d51) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:39 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:39.896 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 0769d4f9-3cf8-430d-87d0-faa554cf4d51 in datapath 610fcf3f-6e70-4d5d-9d9e-df794ff4196d bound to our chassis
Dec 06 10:20:39 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:39.898 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 610fcf3f-6e70-4d5d-9d9e-df794ff4196d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:39 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:39.898 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[26c5ff61-9ee7-4614-854a-696546a7904d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:39 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap0769d4f9-3c: No such device
Dec 06 10:20:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:39.918 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:39 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:39Z|00424|binding|INFO|Setting lport 0769d4f9-3cf8-430d-87d0-faa554cf4d51 ovn-installed in OVS
Dec 06 10:20:39 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:39Z|00425|binding|INFO|Setting lport 0769d4f9-3cf8-430d-87d0-faa554cf4d51 up in Southbound
Dec 06 10:20:39 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap0769d4f9-3c: No such device
Dec 06 10:20:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:39.925 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:39 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap0769d4f9-3c: No such device
Dec 06 10:20:39 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap0769d4f9-3c: No such device
Dec 06 10:20:39 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap0769d4f9-3c: No such device
Dec 06 10:20:39 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap0769d4f9-3c: No such device
Dec 06 10:20:39 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap0769d4f9-3c: No such device
Dec 06 10:20:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:39.957 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:39 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap0769d4f9-3c: No such device
Dec 06 10:20:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:39.987 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:40 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:40.453 2 INFO neutron.agent.securitygroups_rpc [None req-4752e8b4-a74a-419a-afa7-aac12ad63453 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['344ee16e-9c56-4a03-94c4-4549205a4025']
Dec 06 10:20:40 np0005548789.localdomain ceph-mon[298582]: pgmap v354: 177 pgs: 177 active+clean; 193 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 46 KiB/s wr, 46 op/s
Dec 06 10:20:40 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e178 e178: 6 total, 6 up, 6 in
Dec 06 10:20:40 np0005548789.localdomain sshd[329622]: Received disconnect from 14.194.101.210 port 51986:11: Bye Bye [preauth]
Dec 06 10:20:40 np0005548789.localdomain sshd[329622]: Disconnected from authenticating user root 14.194.101.210 port 51986 [preauth]
Dec 06 10:20:40 np0005548789.localdomain podman[329704]: 
Dec 06 10:20:40 np0005548789.localdomain podman[329704]: 2025-12-06 10:20:40.841812381 +0000 UTC m=+0.094896762 container create 195bb5c97a1927fb0566a11eeb513a05a2f050ad98f5e7dc5af8ef23feb91058 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-610fcf3f-6e70-4d5d-9d9e-df794ff4196d, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:40 np0005548789.localdomain systemd[1]: Started libpod-conmon-195bb5c97a1927fb0566a11eeb513a05a2f050ad98f5e7dc5af8ef23feb91058.scope.
Dec 06 10:20:40 np0005548789.localdomain podman[329704]: 2025-12-06 10:20:40.799672782 +0000 UTC m=+0.052757203 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:40 np0005548789.localdomain systemd[1]: tmp-crun.AIQ7iv.mount: Deactivated successfully.
Dec 06 10:20:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:40.925 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:20:40 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:40 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac8af9b2a52417f488cf9670e26c867e6ab859687c3a4f72e20580bf57f1cadb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:40 np0005548789.localdomain podman[329704]: 2025-12-06 10:20:40.943507689 +0000 UTC m=+0.196592060 container init 195bb5c97a1927fb0566a11eeb513a05a2f050ad98f5e7dc5af8ef23feb91058 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-610fcf3f-6e70-4d5d-9d9e-df794ff4196d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:20:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:40.952 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:20:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:40.953 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:20:40 np0005548789.localdomain podman[329704]: 2025-12-06 10:20:40.955810645 +0000 UTC m=+0.208895016 container start 195bb5c97a1927fb0566a11eeb513a05a2f050ad98f5e7dc5af8ef23feb91058 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-610fcf3f-6e70-4d5d-9d9e-df794ff4196d, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 10:20:40 np0005548789.localdomain dnsmasq[329722]: started, version 2.85 cachesize 150
Dec 06 10:20:40 np0005548789.localdomain dnsmasq[329722]: DNS service limited to local subnets
Dec 06 10:20:40 np0005548789.localdomain dnsmasq[329722]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:40 np0005548789.localdomain dnsmasq[329722]: warning: no upstream servers configured
Dec 06 10:20:40 np0005548789.localdomain dnsmasq-dhcp[329722]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:20:40 np0005548789.localdomain dnsmasq[329722]: read /var/lib/neutron/dhcp/610fcf3f-6e70-4d5d-9d9e-df794ff4196d/addn_hosts - 0 addresses
Dec 06 10:20:40 np0005548789.localdomain dnsmasq-dhcp[329722]: read /var/lib/neutron/dhcp/610fcf3f-6e70-4d5d-9d9e-df794ff4196d/host
Dec 06 10:20:40 np0005548789.localdomain dnsmasq-dhcp[329722]: read /var/lib/neutron/dhcp/610fcf3f-6e70-4d5d-9d9e-df794ff4196d/opts
Dec 06 10:20:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:41.081 263652 INFO neutron.agent.dhcp.agent [None req-1168e812-3032-4571-a4b2-f8e3c0ed8a19 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:39Z, description=, device_id=b4262675-0888-4a13-bb89-bb38cc732c6a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa84d00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa844c0>], id=36e3a010-4f1c-470f-b642-2ad82f1c412c, ip_allocation=immediate, mac_address=fa:16:3e:cf:00:44, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:20:36Z, description=, dns_domain=, id=610fcf3f-6e70-4d5d-9d9e-df794ff4196d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1607502893, port_security_enabled=True, project_id=24086b701d6b4d4081d2e63578d18d24, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=60833, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2475, status=ACTIVE, subnets=['a0f9db6e-5853-44b4-b7cb-c956203cda7f'], tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:38Z, vlan_transparent=None, network_id=610fcf3f-6e70-4d5d-9d9e-df794ff4196d, port_security_enabled=False, project_id=24086b701d6b4d4081d2e63578d18d24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2493, status=DOWN, tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:39Z on network 610fcf3f-6e70-4d5d-9d9e-df794ff4196d
Dec 06 10:20:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:41.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:41.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:41.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:41.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:20:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:41.190 263652 INFO neutron.agent.dhcp.agent [None req-62ad0b69-664f-49cd-a5e8-f16aafd2554b - - - - - -] DHCP configuration for ports {'e4db429b-394a-4ad3-95be-68082bed1436'} is completed
Dec 06 10:20:41 np0005548789.localdomain dnsmasq[329722]: read /var/lib/neutron/dhcp/610fcf3f-6e70-4d5d-9d9e-df794ff4196d/addn_hosts - 1 addresses
Dec 06 10:20:41 np0005548789.localdomain dnsmasq-dhcp[329722]: read /var/lib/neutron/dhcp/610fcf3f-6e70-4d5d-9d9e-df794ff4196d/host
Dec 06 10:20:41 np0005548789.localdomain podman[329741]: 2025-12-06 10:20:41.284470772 +0000 UTC m=+0.060276604 container kill 195bb5c97a1927fb0566a11eeb513a05a2f050ad98f5e7dc5af8ef23feb91058 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-610fcf3f-6e70-4d5d-9d9e-df794ff4196d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:20:41 np0005548789.localdomain dnsmasq-dhcp[329722]: read /var/lib/neutron/dhcp/610fcf3f-6e70-4d5d-9d9e-df794ff4196d/opts
Dec 06 10:20:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:41.442 263652 INFO neutron.agent.dhcp.agent [None req-1168e812-3032-4571-a4b2-f8e3c0ed8a19 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:39Z, description=, device_id=b4262675-0888-4a13-bb89-bb38cc732c6a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc06160>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc06640>], id=36e3a010-4f1c-470f-b642-2ad82f1c412c, ip_allocation=immediate, mac_address=fa:16:3e:cf:00:44, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:20:36Z, description=, dns_domain=, id=610fcf3f-6e70-4d5d-9d9e-df794ff4196d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1607502893, port_security_enabled=True, project_id=24086b701d6b4d4081d2e63578d18d24, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=60833, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2475, status=ACTIVE, subnets=['a0f9db6e-5853-44b4-b7cb-c956203cda7f'], tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:38Z, vlan_transparent=None, network_id=610fcf3f-6e70-4d5d-9d9e-df794ff4196d, port_security_enabled=False, project_id=24086b701d6b4d4081d2e63578d18d24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2493, status=DOWN, tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:39Z on network 610fcf3f-6e70-4d5d-9d9e-df794ff4196d
Dec 06 10:20:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:41.531 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:41.534 263652 INFO neutron.agent.dhcp.agent [None req-172e93d5-4bdf-47d5-9be3-3ec1864cf020 - - - - - -] DHCP configuration for ports {'36e3a010-4f1c-470f-b642-2ad82f1c412c'} is completed
Dec 06 10:20:41 np0005548789.localdomain dnsmasq[329722]: read /var/lib/neutron/dhcp/610fcf3f-6e70-4d5d-9d9e-df794ff4196d/addn_hosts - 1 addresses
Dec 06 10:20:41 np0005548789.localdomain dnsmasq-dhcp[329722]: read /var/lib/neutron/dhcp/610fcf3f-6e70-4d5d-9d9e-df794ff4196d/host
Dec 06 10:20:41 np0005548789.localdomain dnsmasq-dhcp[329722]: read /var/lib/neutron/dhcp/610fcf3f-6e70-4d5d-9d9e-df794ff4196d/opts
Dec 06 10:20:41 np0005548789.localdomain podman[329781]: 2025-12-06 10:20:41.631068456 +0000 UTC m=+0.061753799 container kill 195bb5c97a1927fb0566a11eeb513a05a2f050ad98f5e7dc5af8ef23feb91058 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-610fcf3f-6e70-4d5d-9d9e-df794ff4196d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:41 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ffb9c87f-9478-478c-bceb-83016344d5b8", "format": "json"}]: dispatch
Dec 06 10:20:41 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ffb9c87f-9478-478c-bceb-83016344d5b8", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:41 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e60ef5e5-d14b-4505-930a-87b6f2872763", "format": "json"}]: dispatch
Dec 06 10:20:41 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e60ef5e5-d14b-4505-930a-87b6f2872763", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:41 np0005548789.localdomain ceph-mon[298582]: osdmap e178: 6 total, 6 up, 6 in
Dec 06 10:20:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:20:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:20:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:41.891 263652 INFO neutron.agent.dhcp.agent [None req-70a6017e-0197-46bf-b006-34c5c8ec7b9b - - - - - -] DHCP configuration for ports {'36e3a010-4f1c-470f-b642-2ad82f1c412c'} is completed
Dec 06 10:20:41 np0005548789.localdomain podman[329802]: 2025-12-06 10:20:41.932866132 +0000 UTC m=+0.086539477 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Dec 06 10:20:41 np0005548789.localdomain podman[329802]: 2025-12-06 10:20:41.945174438 +0000 UTC m=+0.098847793 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Dec 06 10:20:41 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:20:42 np0005548789.localdomain podman[329801]: 2025-12-06 10:20:42.035608783 +0000 UTC m=+0.193193137 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:20:42 np0005548789.localdomain podman[329801]: 2025-12-06 10:20:42.054402447 +0000 UTC m=+0.211986801 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 06 10:20:42 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:20:42 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:20:42.095 2 INFO neutron.agent.securitygroups_rpc [None req-739af509-ab08-45e2-ba83-57dd6efc5660 95268a68b5c84162ba789100555874fb 7787060a7af94f168805e73d06841337 - - default default] Security group rule updated ['6ae4fdb3-8bab-4aac-9ae7-1f521287092b']
Dec 06 10:20:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:42.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:42 np0005548789.localdomain ceph-mon[298582]: pgmap v356: 177 pgs: 177 active+clean; 193 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 1.5 KiB/s rd, 32 KiB/s wr, 9 op/s
Dec 06 10:20:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:43.559 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:43 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:43.596 263652 INFO neutron.agent.linux.ip_lib [None req-8d1fb88d-36ee-4e82-9ff5-b0c494330ddd - - - - - -] Device tapb26e75de-36 cannot be used as it has no MAC address
Dec 06 10:20:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:43.612 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:43 np0005548789.localdomain kernel: device tapb26e75de-36 entered promiscuous mode
Dec 06 10:20:43 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016443.6189] manager: (tapb26e75de-36): new Generic device (/org/freedesktop/NetworkManager/Devices/71)
Dec 06 10:20:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:43.619 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:43 np0005548789.localdomain systemd-udevd[329848]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:43 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:43Z|00426|binding|INFO|Claiming lport b26e75de-365d-482e-b28d-740529191fa4 for this chassis.
Dec 06 10:20:43 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:43Z|00427|binding|INFO|b26e75de-365d-482e-b28d-740529191fa4: Claiming unknown
Dec 06 10:20:43 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:43.634 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-f06eaa72-d4d5-4c14-80ef-691411a95b29', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f06eaa72-d4d5-4c14-80ef-691411a95b29', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '24086b701d6b4d4081d2e63578d18d24', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ae8e12d-32c2-4a99-98bb-39bcacc37749, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=b26e75de-365d-482e-b28d-740529191fa4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:43 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:43.635 160509 INFO neutron.agent.ovn.metadata.agent [-] Port b26e75de-365d-482e-b28d-740529191fa4 in datapath f06eaa72-d4d5-4c14-80ef-691411a95b29 bound to our chassis
Dec 06 10:20:43 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:43.636 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f06eaa72-d4d5-4c14-80ef-691411a95b29 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:43 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:43.637 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[91b4cb0f-28f4-4e6f-bfd7-b8042a8d84c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:43 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb26e75de-36: No such device
Dec 06 10:20:43 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb26e75de-36: No such device
Dec 06 10:20:43 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:43Z|00428|binding|INFO|Setting lport b26e75de-365d-482e-b28d-740529191fa4 ovn-installed in OVS
Dec 06 10:20:43 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:43Z|00429|binding|INFO|Setting lport b26e75de-365d-482e-b28d-740529191fa4 up in Southbound
Dec 06 10:20:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:43.651 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:43 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb26e75de-36: No such device
Dec 06 10:20:43 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb26e75de-36: No such device
Dec 06 10:20:43 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb26e75de-36: No such device
Dec 06 10:20:43 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb26e75de-36: No such device
Dec 06 10:20:43 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb26e75de-36: No such device
Dec 06 10:20:43 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapb26e75de-36: No such device
Dec 06 10:20:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:43.683 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:43.705 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:44.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:44 np0005548789.localdomain podman[329919]: 
Dec 06 10:20:44 np0005548789.localdomain podman[329919]: 2025-12-06 10:20:44.540420099 +0000 UTC m=+0.093985184 container create d97fd00f3e78882d783a796607d4bd6632cdb7dd83587cae5671ab11021d365f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f06eaa72-d4d5-4c14-80ef-691411a95b29, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:20:44 np0005548789.localdomain systemd[1]: Started libpod-conmon-d97fd00f3e78882d783a796607d4bd6632cdb7dd83587cae5671ab11021d365f.scope.
Dec 06 10:20:44 np0005548789.localdomain podman[329919]: 2025-12-06 10:20:44.493303389 +0000 UTC m=+0.046868504 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:44 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:44 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5a61fe99463007b75615de9ea3a382eb9be1c48dd22fe90e27b8f895cfb3e13/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:44 np0005548789.localdomain podman[329919]: 2025-12-06 10:20:44.624296983 +0000 UTC m=+0.177862058 container init d97fd00f3e78882d783a796607d4bd6632cdb7dd83587cae5671ab11021d365f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f06eaa72-d4d5-4c14-80ef-691411a95b29, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:20:44 np0005548789.localdomain podman[329919]: 2025-12-06 10:20:44.636585438 +0000 UTC m=+0.190150523 container start d97fd00f3e78882d783a796607d4bd6632cdb7dd83587cae5671ab11021d365f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f06eaa72-d4d5-4c14-80ef-691411a95b29, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:20:44 np0005548789.localdomain dnsmasq[329937]: started, version 2.85 cachesize 150
Dec 06 10:20:44 np0005548789.localdomain dnsmasq[329937]: DNS service limited to local subnets
Dec 06 10:20:44 np0005548789.localdomain dnsmasq[329937]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:44 np0005548789.localdomain dnsmasq[329937]: warning: no upstream servers configured
Dec 06 10:20:44 np0005548789.localdomain dnsmasq-dhcp[329937]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d
Dec 06 10:20:44 np0005548789.localdomain dnsmasq[329937]: read /var/lib/neutron/dhcp/f06eaa72-d4d5-4c14-80ef-691411a95b29/addn_hosts - 0 addresses
Dec 06 10:20:44 np0005548789.localdomain dnsmasq-dhcp[329937]: read /var/lib/neutron/dhcp/f06eaa72-d4d5-4c14-80ef-691411a95b29/host
Dec 06 10:20:44 np0005548789.localdomain dnsmasq-dhcp[329937]: read /var/lib/neutron/dhcp/f06eaa72-d4d5-4c14-80ef-691411a95b29/opts
Dec 06 10:20:44 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:44.698 263652 INFO neutron.agent.dhcp.agent [None req-8d1fb88d-36ee-4e82-9ff5-b0c494330ddd - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:43Z, description=, device_id=b4262675-0888-4a13-bb89-bb38cc732c6a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa36c40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa36880>], id=2567678a-da62-49ba-a844-4eedfb76ec89, ip_allocation=immediate, mac_address=fa:16:3e:bd:71:69, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:20:41Z, description=, dns_domain=, id=f06eaa72-d4d5-4c14-80ef-691411a95b29, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-147281999, port_security_enabled=True, project_id=24086b701d6b4d4081d2e63578d18d24, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=2451, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2505, status=ACTIVE, subnets=['57e1c6ad-8a82-4f37-80d2-afc0882db070'], tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:42Z, vlan_transparent=None, network_id=f06eaa72-d4d5-4c14-80ef-691411a95b29, port_security_enabled=False, project_id=24086b701d6b4d4081d2e63578d18d24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2516, status=DOWN, tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:43Z on network f06eaa72-d4d5-4c14-80ef-691411a95b29
Dec 06 10:20:44 np0005548789.localdomain ceph-mon[298582]: pgmap v357: 177 pgs: 177 active+clean; 193 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 45 KiB/s wr, 33 op/s
Dec 06 10:20:44 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9cf2fd4c-90f2-4604-9f84-44511d67581f", "format": "json"}]: dispatch
Dec 06 10:20:44 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9cf2fd4c-90f2-4604-9f84-44511d67581f", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:44 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:20:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:44.814 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:44 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:44.837 263652 INFO neutron.agent.dhcp.agent [None req-8bbf9abd-4ac2-4e39-89bb-9ec51351c174 - - - - - -] DHCP configuration for ports {'60798a37-6f91-427b-9ed4-71f6e72da734'} is completed
Dec 06 10:20:44 np0005548789.localdomain dnsmasq[329937]: read /var/lib/neutron/dhcp/f06eaa72-d4d5-4c14-80ef-691411a95b29/addn_hosts - 1 addresses
Dec 06 10:20:44 np0005548789.localdomain dnsmasq-dhcp[329937]: read /var/lib/neutron/dhcp/f06eaa72-d4d5-4c14-80ef-691411a95b29/host
Dec 06 10:20:44 np0005548789.localdomain dnsmasq-dhcp[329937]: read /var/lib/neutron/dhcp/f06eaa72-d4d5-4c14-80ef-691411a95b29/opts
Dec 06 10:20:44 np0005548789.localdomain podman[329955]: 2025-12-06 10:20:44.927907844 +0000 UTC m=+0.066435962 container kill d97fd00f3e78882d783a796607d4bd6632cdb7dd83587cae5671ab11021d365f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f06eaa72-d4d5-4c14-80ef-691411a95b29, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:20:45 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:45.071 263652 INFO neutron.agent.dhcp.agent [None req-8d1fb88d-36ee-4e82-9ff5-b0c494330ddd - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:43Z, description=, device_id=b4262675-0888-4a13-bb89-bb38cc732c6a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbec970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbecb80>], id=2567678a-da62-49ba-a844-4eedfb76ec89, ip_allocation=immediate, mac_address=fa:16:3e:bd:71:69, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:20:41Z, description=, dns_domain=, id=f06eaa72-d4d5-4c14-80ef-691411a95b29, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-147281999, port_security_enabled=True, project_id=24086b701d6b4d4081d2e63578d18d24, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=2451, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2505, status=ACTIVE, subnets=['57e1c6ad-8a82-4f37-80d2-afc0882db070'], tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:42Z, vlan_transparent=None, network_id=f06eaa72-d4d5-4c14-80ef-691411a95b29, port_security_enabled=False, project_id=24086b701d6b4d4081d2e63578d18d24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2516, status=DOWN, tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:43Z on network f06eaa72-d4d5-4c14-80ef-691411a95b29
Dec 06 10:20:45 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:45.191 263652 INFO neutron.agent.dhcp.agent [None req-3617f720-88e4-4f19-9a4a-13fdec68d3e2 - - - - - -] DHCP configuration for ports {'2567678a-da62-49ba-a844-4eedfb76ec89'} is completed
Dec 06 10:20:45 np0005548789.localdomain dnsmasq[329937]: read /var/lib/neutron/dhcp/f06eaa72-d4d5-4c14-80ef-691411a95b29/addn_hosts - 1 addresses
Dec 06 10:20:45 np0005548789.localdomain dnsmasq-dhcp[329937]: read /var/lib/neutron/dhcp/f06eaa72-d4d5-4c14-80ef-691411a95b29/host
Dec 06 10:20:45 np0005548789.localdomain podman[329995]: 2025-12-06 10:20:45.266277717 +0000 UTC m=+0.056910270 container kill d97fd00f3e78882d783a796607d4bd6632cdb7dd83587cae5671ab11021d365f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f06eaa72-d4d5-4c14-80ef-691411a95b29, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:20:45 np0005548789.localdomain dnsmasq-dhcp[329937]: read /var/lib/neutron/dhcp/f06eaa72-d4d5-4c14-80ef-691411a95b29/opts
Dec 06 10:20:45 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:45.534 263652 INFO neutron.agent.dhcp.agent [None req-649b9c8f-2faf-49e9-bee9-80e451b0f7af - - - - - -] DHCP configuration for ports {'2567678a-da62-49ba-a844-4eedfb76ec89'} is completed
Dec 06 10:20:45 np0005548789.localdomain systemd[1]: tmp-crun.weHV3K.mount: Deactivated successfully.
Dec 06 10:20:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:46.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:46 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "046a7cd8-3e98-42c3-b5fe-95f442606a2a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:20:46 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "046a7cd8-3e98-42c3-b5fe-95f442606a2a", "format": "json"}]: dispatch
Dec 06 10:20:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:20:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:20:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:20:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:20:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:20:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:20:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:20:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:20:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:20:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:20:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:20:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:20:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:20:46 np0005548789.localdomain systemd[1]: tmp-crun.bGvbUQ.mount: Deactivated successfully.
Dec 06 10:20:46 np0005548789.localdomain podman[330016]: 2025-12-06 10:20:46.943033081 +0000 UTC m=+0.099807032 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Dec 06 10:20:46 np0005548789.localdomain podman[330016]: 2025-12-06 10:20:46.959406091 +0000 UTC m=+0.116180032 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 10:20:46 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:20:47 np0005548789.localdomain ceph-mon[298582]: pgmap v358: 177 pgs: 177 active+clean; 193 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 45 KiB/s wr, 33 op/s
Dec 06 10:20:47 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/4269218752' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:47 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2201144363' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:47 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:47.251 263652 INFO neutron.agent.linux.ip_lib [None req-a7aa87c8-1410-485c-8c21-9683e299cf3f - - - - - -] Device tap4064fcde-48 cannot be used as it has no MAC address
Dec 06 10:20:47 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:47.264 263652 INFO neutron.agent.linux.ip_lib [None req-2a790702-dd02-43f1-82b0-2834ceb4572d - - - - - -] Device tap879bec11-b0 cannot be used as it has no MAC address
Dec 06 10:20:47 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e179 e179: 6 total, 6 up, 6 in
Dec 06 10:20:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:47.338 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:20:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:47.339 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:20:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:47.339 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:20:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:47.340 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:47 np0005548789.localdomain kernel: device tap4064fcde-48 entered promiscuous mode
Dec 06 10:20:47 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016447.3481] manager: (tap4064fcde-48): new Generic device (/org/freedesktop/NetworkManager/Devices/72)
Dec 06 10:20:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:47.349 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:47 np0005548789.localdomain systemd-udevd[330053]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:20:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:47Z|00430|binding|INFO|Claiming lport 4064fcde-485d-4c6f-a000-947ac03218a2 for this chassis.
Dec 06 10:20:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:47Z|00431|binding|INFO|4064fcde-485d-4c6f-a000-947ac03218a2: Claiming unknown
Dec 06 10:20:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:47.353 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:47.362 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-f30bcf50-145e-4db3-b0dc-90655a633fb3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f30bcf50-145e-4db3-b0dc-90655a633fb3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '24086b701d6b4d4081d2e63578d18d24', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db91d3ff-70d4-4a4c-b9c4-2f176cf1a088, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=4064fcde-485d-4c6f-a000-947ac03218a2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:47.366 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 4064fcde-485d-4c6f-a000-947ac03218a2 in datapath f30bcf50-145e-4db3-b0dc-90655a633fb3 bound to our chassis
Dec 06 10:20:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:47.368 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f30bcf50-145e-4db3-b0dc-90655a633fb3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:47.369 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[81ff2373-92e0-4671-8151-1725cb8ca154]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4064fcde-48: No such device
Dec 06 10:20:47 np0005548789.localdomain kernel: device tap879bec11-b0 entered promiscuous mode
Dec 06 10:20:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4064fcde-48: No such device
Dec 06 10:20:47 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016447.3887] manager: (tap879bec11-b0): new Generic device (/org/freedesktop/NetworkManager/Devices/73)
Dec 06 10:20:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:47Z|00432|binding|INFO|Setting lport 4064fcde-485d-4c6f-a000-947ac03218a2 ovn-installed in OVS
Dec 06 10:20:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:47Z|00433|binding|INFO|Setting lport 4064fcde-485d-4c6f-a000-947ac03218a2 up in Southbound
Dec 06 10:20:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:47Z|00434|if_status|INFO|Not updating pb chassis for 879bec11-b088-469d-aa6e-3244ad4a6eaa now as sb is readonly
Dec 06 10:20:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:47.390 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4064fcde-48: No such device
Dec 06 10:20:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:47.395 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:47Z|00435|binding|INFO|Claiming lport 879bec11-b088-469d-aa6e-3244ad4a6eaa for this chassis.
Dec 06 10:20:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:47Z|00436|binding|INFO|879bec11-b088-469d-aa6e-3244ad4a6eaa: Claiming unknown
Dec 06 10:20:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:47Z|00437|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:20:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4064fcde-48: No such device
Dec 06 10:20:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4064fcde-48: No such device
Dec 06 10:20:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:47.412 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-f7147ce7-da0c-41ba-a4e4-4f73649998e1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7147ce7-da0c-41ba-a4e4-4f73649998e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa76bcfc789b4e53acf344cd0b1cd7c5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7af776d-5c3a-4693-aedf-d81ded3ff511, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=879bec11-b088-469d-aa6e-3244ad4a6eaa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:47.414 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 879bec11-b088-469d-aa6e-3244ad4a6eaa in datapath f7147ce7-da0c-41ba-a4e4-4f73649998e1 bound to our chassis
Dec 06 10:20:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:47.416 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f7147ce7-da0c-41ba-a4e4-4f73649998e1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:47.416 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[9914e9b1-4ca1-4877-966b-c05554c03825]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:47Z|00438|binding|INFO|Setting lport 879bec11-b088-469d-aa6e-3244ad4a6eaa ovn-installed in OVS
Dec 06 10:20:47 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:47Z|00439|binding|INFO|Setting lport 879bec11-b088-469d-aa6e-3244ad4a6eaa up in Southbound
Dec 06 10:20:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:47.422 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4064fcde-48: No such device
Dec 06 10:20:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4064fcde-48: No such device
Dec 06 10:20:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:47.432 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:47 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4064fcde-48: No such device
Dec 06 10:20:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:47.436 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:47.469 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:47.476 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:47.509 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:48.177 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:48 np0005548789.localdomain ceph-mon[298582]: pgmap v359: 177 pgs: 177 active+clean; 193 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 37 KiB/s wr, 27 op/s
Dec 06 10:20:48 np0005548789.localdomain ceph-mon[298582]: osdmap e179: 6 total, 6 up, 6 in
Dec 06 10:20:48 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2851515333' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:48 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:48.339 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:48 np0005548789.localdomain podman[330161]: 
Dec 06 10:20:48 np0005548789.localdomain podman[330161]: 2025-12-06 10:20:48.513952471 +0000 UTC m=+0.164904071 container create ef8cce1d55af1c1fc54e83e6459f29a59a45ba66244ff2947d8423eb55d75c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f30bcf50-145e-4db3-b0dc-90655a633fb3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:20:48 np0005548789.localdomain podman[330161]: 2025-12-06 10:20:48.437049889 +0000 UTC m=+0.088001469 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:48 np0005548789.localdomain podman[330191]: 
Dec 06 10:20:48 np0005548789.localdomain podman[330191]: 2025-12-06 10:20:48.55871784 +0000 UTC m=+0.066976449 container create 8789c720cf321fea4f8f7abd3ba6824d0b89d586574bc84c29a33fbaac0101b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f7147ce7-da0c-41ba-a4e4-4f73649998e1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 06 10:20:48 np0005548789.localdomain systemd[1]: Started libpod-conmon-ef8cce1d55af1c1fc54e83e6459f29a59a45ba66244ff2947d8423eb55d75c4a.scope.
Dec 06 10:20:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:20:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:48.596 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:48 np0005548789.localdomain systemd[1]: Started libpod-conmon-8789c720cf321fea4f8f7abd3ba6824d0b89d586574bc84c29a33fbaac0101b3.scope.
Dec 06 10:20:48 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:48 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57032ec42d0979663c7485c872a98c14b639222c748bbc12b89800ff18f850d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:48 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:20:48 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f9d7a870fbdefd40cdca250115d626184dfc997615aca80df8a4bcabb41e6a0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:20:48 np0005548789.localdomain podman[330191]: 2025-12-06 10:20:48.52732988 +0000 UTC m=+0.035588499 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:20:48 np0005548789.localdomain podman[330161]: 2025-12-06 10:20:48.62937525 +0000 UTC m=+0.280326870 container init ef8cce1d55af1c1fc54e83e6459f29a59a45ba66244ff2947d8423eb55d75c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f30bcf50-145e-4db3-b0dc-90655a633fb3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:48 np0005548789.localdomain podman[330161]: 2025-12-06 10:20:48.636326912 +0000 UTC m=+0.287278532 container start ef8cce1d55af1c1fc54e83e6459f29a59a45ba66244ff2947d8423eb55d75c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f30bcf50-145e-4db3-b0dc-90655a633fb3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:20:48 np0005548789.localdomain dnsmasq[330223]: started, version 2.85 cachesize 150
Dec 06 10:20:48 np0005548789.localdomain dnsmasq[330223]: DNS service limited to local subnets
Dec 06 10:20:48 np0005548789.localdomain dnsmasq[330223]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:48 np0005548789.localdomain dnsmasq[330223]: warning: no upstream servers configured
Dec 06 10:20:48 np0005548789.localdomain dnsmasq-dhcp[330223]: DHCPv6, static leases only on 2001:db8:2::, lease time 1d
Dec 06 10:20:48 np0005548789.localdomain dnsmasq[330223]: read /var/lib/neutron/dhcp/f30bcf50-145e-4db3-b0dc-90655a633fb3/addn_hosts - 0 addresses
Dec 06 10:20:48 np0005548789.localdomain dnsmasq-dhcp[330223]: read /var/lib/neutron/dhcp/f30bcf50-145e-4db3-b0dc-90655a633fb3/host
Dec 06 10:20:48 np0005548789.localdomain dnsmasq-dhcp[330223]: read /var/lib/neutron/dhcp/f30bcf50-145e-4db3-b0dc-90655a633fb3/opts
Dec 06 10:20:48 np0005548789.localdomain podman[330191]: 2025-12-06 10:20:48.680944166 +0000 UTC m=+0.189202785 container init 8789c720cf321fea4f8f7abd3ba6824d0b89d586574bc84c29a33fbaac0101b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f7147ce7-da0c-41ba-a4e4-4f73649998e1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:48 np0005548789.localdomain podman[330208]: 2025-12-06 10:20:48.685190476 +0000 UTC m=+0.080264014 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:20:48 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:48.690 263652 INFO neutron.agent.dhcp.agent [None req-a7aa87c8-1410-485c-8c21-9683e299cf3f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:46Z, description=, device_id=b4262675-0888-4a13-bb89-bb38cc732c6a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbec100>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbecb50>], id=a886cd7f-2a4b-48cf-a81c-6f1b5f31d906, ip_allocation=immediate, mac_address=fa:16:3e:2b:09:0d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:20:45Z, description=, dns_domain=, id=f30bcf50-145e-4db3-b0dc-90655a633fb3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-2123937341, port_security_enabled=True, project_id=24086b701d6b4d4081d2e63578d18d24, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34481, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2528, status=ACTIVE, subnets=['62e86bb9-9b29-497a-b55d-2943cb7f321e'], tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:46Z, vlan_transparent=None, network_id=f30bcf50-145e-4db3-b0dc-90655a633fb3, port_security_enabled=False, project_id=24086b701d6b4d4081d2e63578d18d24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2536, status=DOWN, tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:46Z on network f30bcf50-145e-4db3-b0dc-90655a633fb3
Dec 06 10:20:48 np0005548789.localdomain podman[330191]: 2025-12-06 10:20:48.692405176 +0000 UTC m=+0.200663785 container start 8789c720cf321fea4f8f7abd3ba6824d0b89d586574bc84c29a33fbaac0101b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f7147ce7-da0c-41ba-a4e4-4f73649998e1, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec 06 10:20:48 np0005548789.localdomain dnsmasq[330236]: started, version 2.85 cachesize 150
Dec 06 10:20:48 np0005548789.localdomain dnsmasq[330236]: DNS service limited to local subnets
Dec 06 10:20:48 np0005548789.localdomain dnsmasq[330236]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:20:48 np0005548789.localdomain dnsmasq[330236]: warning: no upstream servers configured
Dec 06 10:20:48 np0005548789.localdomain dnsmasq-dhcp[330236]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:20:48 np0005548789.localdomain dnsmasq[330236]: read /var/lib/neutron/dhcp/f7147ce7-da0c-41ba-a4e4-4f73649998e1/addn_hosts - 0 addresses
Dec 06 10:20:48 np0005548789.localdomain dnsmasq-dhcp[330236]: read /var/lib/neutron/dhcp/f7147ce7-da0c-41ba-a4e4-4f73649998e1/host
Dec 06 10:20:48 np0005548789.localdomain dnsmasq-dhcp[330236]: read /var/lib/neutron/dhcp/f7147ce7-da0c-41ba-a4e4-4f73649998e1/opts
Dec 06 10:20:48 np0005548789.localdomain podman[330208]: 2025-12-06 10:20:48.701127933 +0000 UTC m=+0.096201461 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:20:48 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:20:48 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:48.826 263652 INFO neutron.agent.dhcp.agent [None req-cbfa8911-e9bf-4b29-b644-58576e753796 - - - - - -] DHCP configuration for ports {'c1c8c312-4919-48a9-9e6a-6fe48d28b732', '7e8acf71-084a-4d21-b509-2ae5cdca884c'} is completed
Dec 06 10:20:48 np0005548789.localdomain dnsmasq[330223]: read /var/lib/neutron/dhcp/f30bcf50-145e-4db3-b0dc-90655a633fb3/addn_hosts - 1 addresses
Dec 06 10:20:48 np0005548789.localdomain dnsmasq-dhcp[330223]: read /var/lib/neutron/dhcp/f30bcf50-145e-4db3-b0dc-90655a633fb3/host
Dec 06 10:20:48 np0005548789.localdomain dnsmasq-dhcp[330223]: read /var/lib/neutron/dhcp/f30bcf50-145e-4db3-b0dc-90655a633fb3/opts
Dec 06 10:20:48 np0005548789.localdomain podman[330256]: 2025-12-06 10:20:48.841405231 +0000 UTC m=+0.037435285 container kill ef8cce1d55af1c1fc54e83e6459f29a59a45ba66244ff2947d8423eb55d75c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f30bcf50-145e-4db3-b0dc-90655a633fb3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:20:48 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:48.962 263652 INFO neutron.agent.dhcp.agent [None req-a7aa87c8-1410-485c-8c21-9683e299cf3f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:20:46Z, description=, device_id=b4262675-0888-4a13-bb89-bb38cc732c6a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd304c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb0de50>], id=a886cd7f-2a4b-48cf-a81c-6f1b5f31d906, ip_allocation=immediate, mac_address=fa:16:3e:2b:09:0d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:20:45Z, description=, dns_domain=, id=f30bcf50-145e-4db3-b0dc-90655a633fb3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-2123937341, port_security_enabled=True, project_id=24086b701d6b4d4081d2e63578d18d24, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34481, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2528, status=ACTIVE, subnets=['62e86bb9-9b29-497a-b55d-2943cb7f321e'], tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:46Z, vlan_transparent=None, network_id=f30bcf50-145e-4db3-b0dc-90655a633fb3, port_security_enabled=False, project_id=24086b701d6b4d4081d2e63578d18d24, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2536, status=DOWN, tags=[], tenant_id=24086b701d6b4d4081d2e63578d18d24, updated_at=2025-12-06T10:20:46Z on network f30bcf50-145e-4db3-b0dc-90655a633fb3
Dec 06 10:20:49 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:49.086 263652 INFO neutron.agent.dhcp.agent [None req-f523fe2a-b847-425f-b8e6-d5718fd58313 - - - - - -] DHCP configuration for ports {'a886cd7f-2a4b-48cf-a81c-6f1b5f31d906'} is completed
Dec 06 10:20:49 np0005548789.localdomain dnsmasq[330223]: read /var/lib/neutron/dhcp/f30bcf50-145e-4db3-b0dc-90655a633fb3/addn_hosts - 1 addresses
Dec 06 10:20:49 np0005548789.localdomain dnsmasq-dhcp[330223]: read /var/lib/neutron/dhcp/f30bcf50-145e-4db3-b0dc-90655a633fb3/host
Dec 06 10:20:49 np0005548789.localdomain dnsmasq-dhcp[330223]: read /var/lib/neutron/dhcp/f30bcf50-145e-4db3-b0dc-90655a633fb3/opts
Dec 06 10:20:49 np0005548789.localdomain podman[330297]: 2025-12-06 10:20:49.088334779 +0000 UTC m=+0.040424016 container kill ef8cce1d55af1c1fc54e83e6459f29a59a45ba66244ff2947d8423eb55d75c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f30bcf50-145e-4db3-b0dc-90655a633fb3, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:49 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2477604507' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:20:49 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3778715811' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:20:49 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3778715811' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:20:49 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:49.333 263652 INFO neutron.agent.dhcp.agent [None req-44a13889-df8f-4b3b-938a-00ed0b900ce8 - - - - - -] DHCP configuration for ports {'a886cd7f-2a4b-48cf-a81c-6f1b5f31d906'} is completed
Dec 06 10:20:49 np0005548789.localdomain kernel: device tap879bec11-b0 left promiscuous mode
Dec 06 10:20:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:49.353 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:49 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:49Z|00440|binding|INFO|Releasing lport 879bec11-b088-469d-aa6e-3244ad4a6eaa from this chassis (sb_readonly=0)
Dec 06 10:20:49 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:49Z|00441|binding|INFO|Setting lport 879bec11-b088-469d-aa6e-3244ad4a6eaa down in Southbound
Dec 06 10:20:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:49.362 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-f7147ce7-da0c-41ba-a4e4-4f73649998e1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f7147ce7-da0c-41ba-a4e4-4f73649998e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa76bcfc789b4e53acf344cd0b1cd7c5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7af776d-5c3a-4693-aedf-d81ded3ff511, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=879bec11-b088-469d-aa6e-3244ad4a6eaa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:49.364 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 879bec11-b088-469d-aa6e-3244ad4a6eaa in datapath f7147ce7-da0c-41ba-a4e4-4f73649998e1 unbound from our chassis
Dec 06 10:20:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:49.368 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f7147ce7-da0c-41ba-a4e4-4f73649998e1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:49.369 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[5e10df99-f3f0-4b59-95b6-86e8dda150b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:49.372 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:49.852 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:50 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "046a7cd8-3e98-42c3-b5fe-95f442606a2a", "format": "json"}]: dispatch
Dec 06 10:20:50 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "046a7cd8-3e98-42c3-b5fe-95f442606a2a", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:50 np0005548789.localdomain ceph-mon[298582]: pgmap v361: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 26 KiB/s wr, 31 op/s
Dec 06 10:20:50 np0005548789.localdomain dnsmasq[330236]: read /var/lib/neutron/dhcp/f7147ce7-da0c-41ba-a4e4-4f73649998e1/addn_hosts - 0 addresses
Dec 06 10:20:50 np0005548789.localdomain podman[330339]: 2025-12-06 10:20:50.621850926 +0000 UTC m=+0.060345826 container kill 8789c720cf321fea4f8f7abd3ba6824d0b89d586574bc84c29a33fbaac0101b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f7147ce7-da0c-41ba-a4e4-4f73649998e1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:20:50 np0005548789.localdomain dnsmasq-dhcp[330236]: read /var/lib/neutron/dhcp/f7147ce7-da0c-41ba-a4e4-4f73649998e1/host
Dec 06 10:20:50 np0005548789.localdomain dnsmasq-dhcp[330236]: read /var/lib/neutron/dhcp/f7147ce7-da0c-41ba-a4e4-4f73649998e1/opts
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent [None req-eaf9eb16-a44c-4397-80fe-f0265a5d2f08 - - - - - -] Unable to reload_allocations dhcp for f7147ce7-da0c-41ba-a4e4-4f73649998e1.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap879bec11-b0 not found in namespace qdhcp-f7147ce7-da0c-41ba-a4e4-4f73649998e1.
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     return fut.result()
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     raise self._exception
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap879bec11-b0 not found in namespace qdhcp-f7147ce7-da0c-41ba-a4e4-4f73649998e1.
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.646 263652 ERROR neutron.agent.dhcp.agent 
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.650 263652 INFO neutron.agent.dhcp.agent [None req-83cda96d-a56f-4426-a417-563d5a2ec11d - - - - - -] Synchronizing state
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.795 263652 INFO neutron.agent.dhcp.agent [None req-9c335b9a-62ba-4c9c-9bab-e73d42c01cf5 - - - - - -] All active networks have been fetched through RPC.
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.796 263652 INFO neutron.agent.dhcp.agent [-] Starting network f7147ce7-da0c-41ba-a4e4-4f73649998e1 dhcp configuration
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.797 263652 INFO neutron.agent.dhcp.agent [-] Finished network f7147ce7-da0c-41ba-a4e4-4f73649998e1 dhcp configuration
Dec 06 10:20:50 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:50.797 263652 INFO neutron.agent.dhcp.agent [None req-9c335b9a-62ba-4c9c-9bab-e73d42c01cf5 - - - - - -] Synchronizing state complete
Dec 06 10:20:50 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:50Z|00442|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:20:51 np0005548789.localdomain systemd[1]: tmp-crun.UfWM2f.mount: Deactivated successfully.
Dec 06 10:20:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:51.061 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:51 np0005548789.localdomain dnsmasq[330236]: exiting on receipt of SIGTERM
Dec 06 10:20:51 np0005548789.localdomain podman[330369]: 2025-12-06 10:20:51.063632291 +0000 UTC m=+0.109617162 container kill 8789c720cf321fea4f8f7abd3ba6824d0b89d586574bc84c29a33fbaac0101b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f7147ce7-da0c-41ba-a4e4-4f73649998e1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:51 np0005548789.localdomain systemd[1]: libpod-8789c720cf321fea4f8f7abd3ba6824d0b89d586574bc84c29a33fbaac0101b3.scope: Deactivated successfully.
Dec 06 10:20:51 np0005548789.localdomain podman[330381]: 2025-12-06 10:20:51.138645064 +0000 UTC m=+0.060669465 container died 8789c720cf321fea4f8f7abd3ba6824d0b89d586574bc84c29a33fbaac0101b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f7147ce7-da0c-41ba-a4e4-4f73649998e1, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:20:51 np0005548789.localdomain podman[330381]: 2025-12-06 10:20:51.167822326 +0000 UTC m=+0.089846697 container cleanup 8789c720cf321fea4f8f7abd3ba6824d0b89d586574bc84c29a33fbaac0101b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f7147ce7-da0c-41ba-a4e4-4f73649998e1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:20:51 np0005548789.localdomain systemd[1]: libpod-conmon-8789c720cf321fea4f8f7abd3ba6824d0b89d586574bc84c29a33fbaac0101b3.scope: Deactivated successfully.
Dec 06 10:20:51 np0005548789.localdomain podman[330383]: 2025-12-06 10:20:51.222795556 +0000 UTC m=+0.135959947 container remove 8789c720cf321fea4f8f7abd3ba6824d0b89d586574bc84c29a33fbaac0101b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f7147ce7-da0c-41ba-a4e4-4f73649998e1, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:20:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-2f9d7a870fbdefd40cdca250115d626184dfc997615aca80df8a4bcabb41e6a0-merged.mount: Deactivated successfully.
Dec 06 10:20:51 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8789c720cf321fea4f8f7abd3ba6824d0b89d586574bc84c29a33fbaac0101b3-userdata-shm.mount: Deactivated successfully.
Dec 06 10:20:51 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2df7147ce7\x2dda0c\x2d41ba\x2da4e4\x2d4f73649998e1.mount: Deactivated successfully.
Dec 06 10:20:52 np0005548789.localdomain ceph-mon[298582]: pgmap v362: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 22 KiB/s wr, 26 op/s
Dec 06 10:20:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:53.501 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:53.601 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:20:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:20:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:20:53 np0005548789.localdomain systemd[1]: tmp-crun.4NSusJ.mount: Deactivated successfully.
Dec 06 10:20:53 np0005548789.localdomain podman[330411]: 2025-12-06 10:20:53.929583737 +0000 UTC m=+0.091435316 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Dec 06 10:20:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:20:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 163386 "" "Go-http-client/1.1"
Dec 06 10:20:54 np0005548789.localdomain podman[330411]: 2025-12-06 10:20:54.063790119 +0000 UTC m=+0.225641708 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:20:54 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:20:54 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:20:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 21172 "" "Go-http-client/1.1"
Dec 06 10:20:54 np0005548789.localdomain ceph-mon[298582]: pgmap v363: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 17 KiB/s wr, 21 op/s
Dec 06 10:20:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:54.885 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:55 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d4fe01f5-ec52-46f8-9434-cea0a18dbdb7", "format": "json"}]: dispatch
Dec 06 10:20:55 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d4fe01f5-ec52-46f8-9434-cea0a18dbdb7", "force": true, "format": "json"}]: dispatch
Dec 06 10:20:56 np0005548789.localdomain ceph-mon[298582]: pgmap v364: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 17 KiB/s wr, 21 op/s
Dec 06 10:20:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:58 np0005548789.localdomain ceph-mon[298582]: pgmap v365: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 17 KiB/s wr, 21 op/s
Dec 06 10:20:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:58.646 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:20:58 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4153025700' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:20:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:20:58 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4153025700' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:20:58 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:58Z|00443|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:20:58 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:58.999 160509 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 9c79be57-161f-4ebe-adf7-dbbe338f4139 with type ""
Dec 06 10:20:59 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:59.001 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.242/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-c68f9a6d-f183-4c32-ae20-3af5e94473b3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c68f9a6d-f183-4c32-ae20-3af5e94473b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa76bcfc789b4e53acf344cd0b1cd7c5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=caee8882-f3cb-4a2a-a1c8-8579f9a721cf, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=3adb2c37-0f70-478d-98be-4e26b3a4f4ff) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:59 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:59.003 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 3adb2c37-0f70-478d-98be-4e26b3a4f4ff in datapath c68f9a6d-f183-4c32-ae20-3af5e94473b3 unbound from our chassis
Dec 06 10:20:59 np0005548789.localdomain podman[330452]: 2025-12-06 10:20:59.00422524 +0000 UTC m=+0.068159224 container kill c2781052a3630cf50580874a9e8559954e4ce423e9575765f5c45967475b3fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c68f9a6d-f183-4c32-ae20-3af5e94473b3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:20:59 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:59.005 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c68f9a6d-f183-4c32-ae20-3af5e94473b3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:20:59 np0005548789.localdomain dnsmasq[328281]: exiting on receipt of SIGTERM
Dec 06 10:20:59 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:59.006 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[83a4ac06-624c-40f1-981e-2599c247e2fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:59 np0005548789.localdomain systemd[1]: libpod-c2781052a3630cf50580874a9e8559954e4ce423e9575765f5c45967475b3fb4.scope: Deactivated successfully.
Dec 06 10:20:59 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:59Z|00444|binding|INFO|Removing iface tap3adb2c37-0f ovn-installed in OVS
Dec 06 10:20:59 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:59Z|00445|binding|INFO|Removing lport 3adb2c37-0f70-478d-98be-4e26b3a4f4ff ovn-installed in OVS
Dec 06 10:20:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:59.010 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:59.013 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:59.020 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:59 np0005548789.localdomain podman[330465]: 2025-12-06 10:20:59.079097819 +0000 UTC m=+0.056964143 container died c2781052a3630cf50580874a9e8559954e4ce423e9575765f5c45967475b3fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c68f9a6d-f183-4c32-ae20-3af5e94473b3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:20:59 np0005548789.localdomain podman[330465]: 2025-12-06 10:20:59.114894924 +0000 UTC m=+0.092761238 container cleanup c2781052a3630cf50580874a9e8559954e4ce423e9575765f5c45967475b3fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c68f9a6d-f183-4c32-ae20-3af5e94473b3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:20:59 np0005548789.localdomain systemd[1]: libpod-conmon-c2781052a3630cf50580874a9e8559954e4ce423e9575765f5c45967475b3fb4.scope: Deactivated successfully.
Dec 06 10:20:59 np0005548789.localdomain podman[330467]: 2025-12-06 10:20:59.158975521 +0000 UTC m=+0.129918512 container remove c2781052a3630cf50580874a9e8559954e4ce423e9575765f5c45967475b3fb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c68f9a6d-f183-4c32-ae20-3af5e94473b3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:20:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:59.169 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:59 np0005548789.localdomain kernel: device tap3adb2c37-0f left promiscuous mode
Dec 06 10:20:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:59.184 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:59 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:59.204 263652 INFO neutron.agent.dhcp.agent [None req-db069261-bad8-4c83-9f4e-4725ac28e4b3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:59 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:20:59.205 263652 INFO neutron.agent.dhcp.agent [None req-db069261-bad8-4c83-9f4e-4725ac28e4b3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:20:59 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4153025700' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:20:59 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4153025700' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:20:59 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:59Z|00446|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:20:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:59.460 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:59 np0005548789.localdomain dnsmasq[330223]: read /var/lib/neutron/dhcp/f30bcf50-145e-4db3-b0dc-90655a633fb3/addn_hosts - 0 addresses
Dec 06 10:20:59 np0005548789.localdomain dnsmasq-dhcp[330223]: read /var/lib/neutron/dhcp/f30bcf50-145e-4db3-b0dc-90655a633fb3/host
Dec 06 10:20:59 np0005548789.localdomain dnsmasq-dhcp[330223]: read /var/lib/neutron/dhcp/f30bcf50-145e-4db3-b0dc-90655a633fb3/opts
Dec 06 10:20:59 np0005548789.localdomain podman[330512]: 2025-12-06 10:20:59.653802906 +0000 UTC m=+0.046605926 container kill ef8cce1d55af1c1fc54e83e6459f29a59a45ba66244ff2947d8423eb55d75c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f30bcf50-145e-4db3-b0dc-90655a633fb3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:20:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:59.884 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:59 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:59Z|00447|binding|INFO|Releasing lport 4064fcde-485d-4c6f-a000-947ac03218a2 from this chassis (sb_readonly=0)
Dec 06 10:20:59 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:20:59Z|00448|binding|INFO|Setting lport 4064fcde-485d-4c6f-a000-947ac03218a2 down in Southbound
Dec 06 10:20:59 np0005548789.localdomain kernel: device tap4064fcde-48 left promiscuous mode
Dec 06 10:20:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:59.887 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:20:59 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:59.897 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-f30bcf50-145e-4db3-b0dc-90655a633fb3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f30bcf50-145e-4db3-b0dc-90655a633fb3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '24086b701d6b4d4081d2e63578d18d24', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db91d3ff-70d4-4a4c-b9c4-2f176cf1a088, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=4064fcde-485d-4c6f-a000-947ac03218a2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:20:59 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:59.899 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 4064fcde-485d-4c6f-a000-947ac03218a2 in datapath f30bcf50-145e-4db3-b0dc-90655a633fb3 unbound from our chassis
Dec 06 10:20:59 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:59.900 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f30bcf50-145e-4db3-b0dc-90655a633fb3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:20:59 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:20:59.902 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[fa06f38f-5ebd-477a-8519-36e63904ee44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:20:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:20:59.912 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-6d060ccaa9e70944fae8106a7cb62cb34c009da5ff146db45b624802025af9fb-merged.mount: Deactivated successfully.
Dec 06 10:21:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c2781052a3630cf50580874a9e8559954e4ce423e9575765f5c45967475b3fb4-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:00 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2dc68f9a6d\x2df183\x2d4c32\x2dae20\x2d3af5e94473b3.mount: Deactivated successfully.
Dec 06 10:21:00 np0005548789.localdomain sudo[330534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:21:00 np0005548789.localdomain sudo[330534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:21:00 np0005548789.localdomain sudo[330534]: pam_unix(sudo:session): session closed for user root
Dec 06 10:21:00 np0005548789.localdomain sudo[330561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:21:00 np0005548789.localdomain sudo[330561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:21:00 np0005548789.localdomain dnsmasq[330223]: exiting on receipt of SIGTERM
Dec 06 10:21:00 np0005548789.localdomain systemd[1]: libpod-ef8cce1d55af1c1fc54e83e6459f29a59a45ba66244ff2947d8423eb55d75c4a.scope: Deactivated successfully.
Dec 06 10:21:00 np0005548789.localdomain podman[330586]: 2025-12-06 10:21:00.32652679 +0000 UTC m=+0.047817352 container kill ef8cce1d55af1c1fc54e83e6459f29a59a45ba66244ff2947d8423eb55d75c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f30bcf50-145e-4db3-b0dc-90655a633fb3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:21:00 np0005548789.localdomain podman[330607]: 2025-12-06 10:21:00.395135028 +0000 UTC m=+0.041805519 container died ef8cce1d55af1c1fc54e83e6459f29a59a45ba66244ff2947d8423eb55d75c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f30bcf50-145e-4db3-b0dc-90655a633fb3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:21:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ef8cce1d55af1c1fc54e83e6459f29a59a45ba66244ff2947d8423eb55d75c4a-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-57032ec42d0979663c7485c872a98c14b639222c748bbc12b89800ff18f850d9-merged.mount: Deactivated successfully.
Dec 06 10:21:00 np0005548789.localdomain podman[330607]: 2025-12-06 10:21:00.442401912 +0000 UTC m=+0.089072353 container remove ef8cce1d55af1c1fc54e83e6459f29a59a45ba66244ff2947d8423eb55d75c4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f30bcf50-145e-4db3-b0dc-90655a633fb3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:21:00 np0005548789.localdomain ceph-mon[298582]: pgmap v366: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 21 KiB/s wr, 20 op/s
Dec 06 10:21:00 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/11180419' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:00 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/11180419' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:00 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:00.473 263652 INFO neutron.agent.dhcp.agent [None req-0d6bffb3-fd31-4fe0-9497-5449b3d5b3ab - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:00 np0005548789.localdomain systemd[1]: libpod-conmon-ef8cce1d55af1c1fc54e83e6459f29a59a45ba66244ff2947d8423eb55d75c4a.scope: Deactivated successfully.
Dec 06 10:21:00 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:00.521 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:00 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:00Z|00449|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:21:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:00.719 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:00 np0005548789.localdomain sudo[330561]: pam_unix(sudo:session): session closed for user root
Dec 06 10:21:01 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2df30bcf50\x2d145e\x2d4db3\x2db0dc\x2d90655a633fb3.mount: Deactivated successfully.
Dec 06 10:21:01 np0005548789.localdomain sudo[330661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:21:01 np0005548789.localdomain sudo[330661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:21:01 np0005548789.localdomain sudo[330661]: pam_unix(sudo:session): session closed for user root
Dec 06 10:21:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:21:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:21:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:21:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:21:01 np0005548789.localdomain dnsmasq[329937]: read /var/lib/neutron/dhcp/f06eaa72-d4d5-4c14-80ef-691411a95b29/addn_hosts - 0 addresses
Dec 06 10:21:01 np0005548789.localdomain podman[330694]: 2025-12-06 10:21:01.78683607 +0000 UTC m=+0.069538286 container kill d97fd00f3e78882d783a796607d4bd6632cdb7dd83587cae5671ab11021d365f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f06eaa72-d4d5-4c14-80ef-691411a95b29, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 10:21:01 np0005548789.localdomain dnsmasq-dhcp[329937]: read /var/lib/neutron/dhcp/f06eaa72-d4d5-4c14-80ef-691411a95b29/host
Dec 06 10:21:01 np0005548789.localdomain dnsmasq-dhcp[329937]: read /var/lib/neutron/dhcp/f06eaa72-d4d5-4c14-80ef-691411a95b29/opts
Dec 06 10:21:01 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:01Z|00450|binding|INFO|Releasing lport b26e75de-365d-482e-b28d-740529191fa4 from this chassis (sb_readonly=0)
Dec 06 10:21:01 np0005548789.localdomain kernel: device tapb26e75de-36 left promiscuous mode
Dec 06 10:21:01 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:01Z|00451|binding|INFO|Setting lport b26e75de-365d-482e-b28d-740529191fa4 down in Southbound
Dec 06 10:21:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:01.955 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:01 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:01.970 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-f06eaa72-d4d5-4c14-80ef-691411a95b29', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f06eaa72-d4d5-4c14-80ef-691411a95b29', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '24086b701d6b4d4081d2e63578d18d24', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ae8e12d-32c2-4a99-98bb-39bcacc37749, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=b26e75de-365d-482e-b28d-740529191fa4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:01 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:01.972 160509 INFO neutron.agent.ovn.metadata.agent [-] Port b26e75de-365d-482e-b28d-740529191fa4 in datapath f06eaa72-d4d5-4c14-80ef-691411a95b29 unbound from our chassis
Dec 06 10:21:01 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:01.974 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f06eaa72-d4d5-4c14-80ef-691411a95b29 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:21:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:01.974 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:01 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:01.976 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[2aa683a4-7a4e-4ad5-aff3-8c30525169e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:02 np0005548789.localdomain sshd[330727]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:21:02 np0005548789.localdomain dnsmasq[329937]: exiting on receipt of SIGTERM
Dec 06 10:21:02 np0005548789.localdomain systemd[1]: tmp-crun.vnXDLf.mount: Deactivated successfully.
Dec 06 10:21:02 np0005548789.localdomain podman[330737]: 2025-12-06 10:21:02.499896427 +0000 UTC m=+0.053908478 container kill d97fd00f3e78882d783a796607d4bd6632cdb7dd83587cae5671ab11021d365f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f06eaa72-d4d5-4c14-80ef-691411a95b29, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:21:02 np0005548789.localdomain systemd[1]: libpod-d97fd00f3e78882d783a796607d4bd6632cdb7dd83587cae5671ab11021d365f.scope: Deactivated successfully.
Dec 06 10:21:02 np0005548789.localdomain podman[330751]: 2025-12-06 10:21:02.546170221 +0000 UTC m=+0.038113655 container died d97fd00f3e78882d783a796607d4bd6632cdb7dd83587cae5671ab11021d365f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f06eaa72-d4d5-4c14-80ef-691411a95b29, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:02 np0005548789.localdomain ceph-mon[298582]: pgmap v367: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 8.7 KiB/s rd, 11 KiB/s wr, 14 op/s
Dec 06 10:21:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:21:02 np0005548789.localdomain podman[330751]: 2025-12-06 10:21:02.624504776 +0000 UTC m=+0.116448180 container cleanup d97fd00f3e78882d783a796607d4bd6632cdb7dd83587cae5671ab11021d365f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f06eaa72-d4d5-4c14-80ef-691411a95b29, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 06 10:21:02 np0005548789.localdomain systemd[1]: libpod-conmon-d97fd00f3e78882d783a796607d4bd6632cdb7dd83587cae5671ab11021d365f.scope: Deactivated successfully.
Dec 06 10:21:02 np0005548789.localdomain podman[330756]: 2025-12-06 10:21:02.646869699 +0000 UTC m=+0.130940642 container remove d97fd00f3e78882d783a796607d4bd6632cdb7dd83587cae5671ab11021d365f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f06eaa72-d4d5-4c14-80ef-691411a95b29, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:02 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:02.675 263652 INFO neutron.agent.dhcp.agent [None req-f0da0179-f0c2-4644-8abf-3e5f20050ff3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:02 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:02.675 263652 INFO neutron.agent.dhcp.agent [None req-f0da0179-f0c2-4644-8abf-3e5f20050ff3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:02 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:02Z|00452|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:21:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:02.806 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:03 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:03.217 263652 INFO neutron.agent.linux.ip_lib [None req-b27466b5-66c1-43f6-bb93-5c3ac987bdd4 - - - - - -] Device tap48aef5f2-0c cannot be used as it has no MAC address
Dec 06 10:21:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:03.243 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:03 np0005548789.localdomain kernel: device tap48aef5f2-0c entered promiscuous mode
Dec 06 10:21:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:03.250 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:03 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:03Z|00453|binding|INFO|Claiming lport 48aef5f2-0c04-4b34-bd2d-f71862404e37 for this chassis.
Dec 06 10:21:03 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:03Z|00454|binding|INFO|48aef5f2-0c04-4b34-bd2d-f71862404e37: Claiming unknown
Dec 06 10:21:03 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016463.2531] manager: (tap48aef5f2-0c): new Generic device (/org/freedesktop/NetworkManager/Devices/74)
Dec 06 10:21:03 np0005548789.localdomain systemd-udevd[330788]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:21:03 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:03.266 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-5986df1f-13f3-42c1-bcc4-79dcf74a49a9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5986df1f-13f3-42c1-bcc4-79dcf74a49a9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e82deaff368b4feea9fec0f06459a6ca', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d5413063-0727-4de9-8605-e62b7d56e9f4, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=48aef5f2-0c04-4b34-bd2d-f71862404e37) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:03 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:03.268 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 48aef5f2-0c04-4b34-bd2d-f71862404e37 in datapath 5986df1f-13f3-42c1-bcc4-79dcf74a49a9 bound to our chassis
Dec 06 10:21:03 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:03.271 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5986df1f-13f3-42c1-bcc4-79dcf74a49a9 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:21:03 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:03.272 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[16fe7515-fb92-4450-9b52-cfd23738e8ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap48aef5f2-0c: No such device
Dec 06 10:21:03 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:03Z|00455|binding|INFO|Setting lport 48aef5f2-0c04-4b34-bd2d-f71862404e37 ovn-installed in OVS
Dec 06 10:21:03 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:03Z|00456|binding|INFO|Setting lport 48aef5f2-0c04-4b34-bd2d-f71862404e37 up in Southbound
Dec 06 10:21:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:03.290 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap48aef5f2-0c: No such device
Dec 06 10:21:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap48aef5f2-0c: No such device
Dec 06 10:21:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap48aef5f2-0c: No such device
Dec 06 10:21:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap48aef5f2-0c: No such device
Dec 06 10:21:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap48aef5f2-0c: No such device
Dec 06 10:21:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap48aef5f2-0c: No such device
Dec 06 10:21:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:03 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap48aef5f2-0c: No such device
Dec 06 10:21:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:03.334 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:03.361 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e5a61fe99463007b75615de9ea3a382eb9be1c48dd22fe90e27b8f895cfb3e13-merged.mount: Deactivated successfully.
Dec 06 10:21:03 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d97fd00f3e78882d783a796607d4bd6632cdb7dd83587cae5671ab11021d365f-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:03 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2df06eaa72\x2dd4d5\x2d4c14\x2d80ef\x2d691411a95b29.mount: Deactivated successfully.
Dec 06 10:21:03 np0005548789.localdomain sshd[330727]: Received disconnect from 179.33.210.213 port 47270:11: Bye Bye [preauth]
Dec 06 10:21:03 np0005548789.localdomain sshd[330727]: Disconnected from authenticating user root 179.33.210.213 port 47270 [preauth]
Dec 06 10:21:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:03.700 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:03 np0005548789.localdomain podman[330852]: 2025-12-06 10:21:03.814703288 +0000 UTC m=+0.036753205 container kill 195bb5c97a1927fb0566a11eeb513a05a2f050ad98f5e7dc5af8ef23feb91058 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-610fcf3f-6e70-4d5d-9d9e-df794ff4196d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:03 np0005548789.localdomain dnsmasq[329722]: read /var/lib/neutron/dhcp/610fcf3f-6e70-4d5d-9d9e-df794ff4196d/addn_hosts - 0 addresses
Dec 06 10:21:03 np0005548789.localdomain dnsmasq-dhcp[329722]: read /var/lib/neutron/dhcp/610fcf3f-6e70-4d5d-9d9e-df794ff4196d/host
Dec 06 10:21:03 np0005548789.localdomain dnsmasq-dhcp[329722]: read /var/lib/neutron/dhcp/610fcf3f-6e70-4d5d-9d9e-df794ff4196d/opts
Dec 06 10:21:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:04.117 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:04 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:04Z|00457|binding|INFO|Releasing lport 0769d4f9-3cf8-430d-87d0-faa554cf4d51 from this chassis (sb_readonly=0)
Dec 06 10:21:04 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:04Z|00458|binding|INFO|Setting lport 0769d4f9-3cf8-430d-87d0-faa554cf4d51 down in Southbound
Dec 06 10:21:04 np0005548789.localdomain kernel: device tap0769d4f9-3c left promiscuous mode
Dec 06 10:21:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:04.127 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-610fcf3f-6e70-4d5d-9d9e-df794ff4196d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-610fcf3f-6e70-4d5d-9d9e-df794ff4196d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '24086b701d6b4d4081d2e63578d18d24', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=22180776-fd75-4bd6-be28-febd70acf464, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=0769d4f9-3cf8-430d-87d0-faa554cf4d51) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:04.129 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 0769d4f9-3cf8-430d-87d0-faa554cf4d51 in datapath 610fcf3f-6e70-4d5d-9d9e-df794ff4196d unbound from our chassis
Dec 06 10:21:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:04.130 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 610fcf3f-6e70-4d5d-9d9e-df794ff4196d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:21:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:04.133 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[65fddb1c-4123-4f9a-a5e5-2c19aae4ea83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:04.140 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:04 np0005548789.localdomain podman[330898]: 
Dec 06 10:21:04 np0005548789.localdomain podman[330898]: 2025-12-06 10:21:04.174324081 +0000 UTC m=+0.080308906 container create 4f9a605bde30cb63d1258bafc5b6357b3e8a6178585b25833a2d7aa12944aa53 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5986df1f-13f3-42c1-bcc4-79dcf74a49a9, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:21:04 np0005548789.localdomain systemd[1]: Started libpod-conmon-4f9a605bde30cb63d1258bafc5b6357b3e8a6178585b25833a2d7aa12944aa53.scope.
Dec 06 10:21:04 np0005548789.localdomain podman[330898]: 2025-12-06 10:21:04.127429968 +0000 UTC m=+0.033414843 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:21:04 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:21:04 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ed23ebe6a60fad4889f2ca706b63873c667d977225c0a5045f8dfc5999e212b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:21:04 np0005548789.localdomain podman[330898]: 2025-12-06 10:21:04.266466137 +0000 UTC m=+0.172450932 container init 4f9a605bde30cb63d1258bafc5b6357b3e8a6178585b25833a2d7aa12944aa53 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5986df1f-13f3-42c1-bcc4-79dcf74a49a9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:21:04 np0005548789.localdomain podman[330898]: 2025-12-06 10:21:04.276673519 +0000 UTC m=+0.182658314 container start 4f9a605bde30cb63d1258bafc5b6357b3e8a6178585b25833a2d7aa12944aa53 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5986df1f-13f3-42c1-bcc4-79dcf74a49a9, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:04 np0005548789.localdomain dnsmasq[330917]: started, version 2.85 cachesize 150
Dec 06 10:21:04 np0005548789.localdomain dnsmasq[330917]: DNS service limited to local subnets
Dec 06 10:21:04 np0005548789.localdomain dnsmasq[330917]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:21:04 np0005548789.localdomain dnsmasq[330917]: warning: no upstream servers configured
Dec 06 10:21:04 np0005548789.localdomain dnsmasq-dhcp[330917]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:21:04 np0005548789.localdomain dnsmasq[330917]: read /var/lib/neutron/dhcp/5986df1f-13f3-42c1-bcc4-79dcf74a49a9/addn_hosts - 0 addresses
Dec 06 10:21:04 np0005548789.localdomain dnsmasq-dhcp[330917]: read /var/lib/neutron/dhcp/5986df1f-13f3-42c1-bcc4-79dcf74a49a9/host
Dec 06 10:21:04 np0005548789.localdomain dnsmasq-dhcp[330917]: read /var/lib/neutron/dhcp/5986df1f-13f3-42c1-bcc4-79dcf74a49a9/opts
Dec 06 10:21:04 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:04.430 263652 INFO neutron.agent.dhcp.agent [None req-ef5cbd84-a521-453c-b2ba-f80430a0d2b1 - - - - - -] DHCP configuration for ports {'a0b3edf2-b200-4d5e-9f11-a2af9c2d7b08'} is completed
Dec 06 10:21:04 np0005548789.localdomain systemd[1]: tmp-crun.rOlUzt.mount: Deactivated successfully.
Dec 06 10:21:04 np0005548789.localdomain ceph-mon[298582]: pgmap v368: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 12 KiB/s wr, 31 op/s
Dec 06 10:21:04 np0005548789.localdomain sshd[330918]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:21:04 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:04.617 2 INFO neutron.agent.securitygroups_rpc [None req-a8006eb6-87bf-4e51-be14-a8e4ed75c69e a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:04 np0005548789.localdomain podman[330937]: 2025-12-06 10:21:04.882202819 +0000 UTC m=+0.055210208 container kill 195bb5c97a1927fb0566a11eeb513a05a2f050ad98f5e7dc5af8ef23feb91058 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-610fcf3f-6e70-4d5d-9d9e-df794ff4196d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 06 10:21:04 np0005548789.localdomain systemd[1]: tmp-crun.c9bQGp.mount: Deactivated successfully.
Dec 06 10:21:04 np0005548789.localdomain dnsmasq[329722]: exiting on receipt of SIGTERM
Dec 06 10:21:04 np0005548789.localdomain systemd[1]: libpod-195bb5c97a1927fb0566a11eeb513a05a2f050ad98f5e7dc5af8ef23feb91058.scope: Deactivated successfully.
Dec 06 10:21:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:04.917 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:04 np0005548789.localdomain podman[330951]: 2025-12-06 10:21:04.979240156 +0000 UTC m=+0.054394804 container died 195bb5c97a1927fb0566a11eeb513a05a2f050ad98f5e7dc5af8ef23feb91058 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-610fcf3f-6e70-4d5d-9d9e-df794ff4196d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:21:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:21:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:21:05 np0005548789.localdomain podman[330951]: 2025-12-06 10:21:05.093190379 +0000 UTC m=+0.168344977 container cleanup 195bb5c97a1927fb0566a11eeb513a05a2f050ad98f5e7dc5af8ef23feb91058 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-610fcf3f-6e70-4d5d-9d9e-df794ff4196d, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:21:05 np0005548789.localdomain systemd[1]: libpod-conmon-195bb5c97a1927fb0566a11eeb513a05a2f050ad98f5e7dc5af8ef23feb91058.scope: Deactivated successfully.
Dec 06 10:21:05 np0005548789.localdomain podman[330952]: 2025-12-06 10:21:05.123088242 +0000 UTC m=+0.195111334 container remove 195bb5c97a1927fb0566a11eeb513a05a2f050ad98f5e7dc5af8ef23feb91058 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-610fcf3f-6e70-4d5d-9d9e-df794ff4196d, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:21:05 np0005548789.localdomain podman[330978]: 2025-12-06 10:21:05.173629657 +0000 UTC m=+0.137916916 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:21:05 np0005548789.localdomain podman[330978]: 2025-12-06 10:21:05.183210711 +0000 UTC m=+0.147497990 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 10:21:05 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:05.196 263652 INFO neutron.agent.dhcp.agent [None req-5b4974ab-1455-46f4-ace8-29cbc2e54a8a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:05 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:21:05 np0005548789.localdomain podman[330979]: 2025-12-06 10:21:05.238347816 +0000 UTC m=+0.195685162 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:21:05 np0005548789.localdomain podman[330979]: 2025-12-06 10:21:05.26954698 +0000 UTC m=+0.226884296 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:21:05 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:21:05 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:05.441 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:05 np0005548789.localdomain systemd[1]: tmp-crun.w2zIry.mount: Deactivated successfully.
Dec 06 10:21:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-ac8af9b2a52417f488cf9670e26c867e6ab859687c3a4f72e20580bf57f1cadb-merged.mount: Deactivated successfully.
Dec 06 10:21:05 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-195bb5c97a1927fb0566a11eeb513a05a2f050ad98f5e7dc5af8ef23feb91058-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:05 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d610fcf3f\x2d6e70\x2d4d5d\x2d9d9e\x2ddf794ff4196d.mount: Deactivated successfully.
Dec 06 10:21:05 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:05Z|00459|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:21:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:05.729 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:05 np0005548789.localdomain sshd[330918]: Received disconnect from 154.113.10.34 port 41238:11: Bye Bye [preauth]
Dec 06 10:21:05 np0005548789.localdomain sshd[330918]: Disconnected from authenticating user root 154.113.10.34 port 41238 [preauth]
Dec 06 10:21:06 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:06.329 2 INFO neutron.agent.securitygroups_rpc [None req-58c814b2-70d6-487c-b572-f1a393accc35 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:06 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:06.563 2 INFO neutron.agent.securitygroups_rpc [None req-58c814b2-70d6-487c-b572-f1a393accc35 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:06 np0005548789.localdomain ceph-mon[298582]: pgmap v369: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 7.7 KiB/s wr, 19 op/s
Dec 06 10:21:07 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:07.387 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:06Z, description=, device_id=7ecd23ba-4ca3-4eae-9829-cff158a165a0, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc5f490>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc5f400>], id=3f7091ee-adf1-4a41-bf51-535f147c89c5, ip_allocation=immediate, mac_address=fa:16:3e:67:09:c2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:00Z, description=, dns_domain=, id=5986df1f-13f3-42c1-bcc4-79dcf74a49a9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1321854165, port_security_enabled=True, project_id=e82deaff368b4feea9fec0f06459a6ca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10820, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2609, status=ACTIVE, subnets=['b8481c35-3dcb-4ca8-9bae-441805cdac62'], tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:21:02Z, vlan_transparent=None, network_id=5986df1f-13f3-42c1-bcc4-79dcf74a49a9, port_security_enabled=False, project_id=e82deaff368b4feea9fec0f06459a6ca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2656, status=DOWN, tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:21:07Z on network 5986df1f-13f3-42c1-bcc4-79dcf74a49a9
Dec 06 10:21:07 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:07.529 2 INFO neutron.agent.securitygroups_rpc [None req-beefe55f-e6d8-4aae-b3a4-9c077707e8ab a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:07 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:07.549 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:07 np0005548789.localdomain dnsmasq[330917]: read /var/lib/neutron/dhcp/5986df1f-13f3-42c1-bcc4-79dcf74a49a9/addn_hosts - 1 addresses
Dec 06 10:21:07 np0005548789.localdomain dnsmasq-dhcp[330917]: read /var/lib/neutron/dhcp/5986df1f-13f3-42c1-bcc4-79dcf74a49a9/host
Dec 06 10:21:07 np0005548789.localdomain dnsmasq-dhcp[330917]: read /var/lib/neutron/dhcp/5986df1f-13f3-42c1-bcc4-79dcf74a49a9/opts
Dec 06 10:21:07 np0005548789.localdomain podman[331040]: 2025-12-06 10:21:07.604930078 +0000 UTC m=+0.048839103 container kill 4f9a605bde30cb63d1258bafc5b6357b3e8a6178585b25833a2d7aa12944aa53 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5986df1f-13f3-42c1-bcc4-79dcf74a49a9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:07 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:21:07 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3418725974' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:07 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:21:07 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3418725974' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.915 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.916 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.920 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ff64df5-1d46-42f0-9389-4835c5ed86a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:21:07.916486', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '476fb4f6-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.165819636, 'message_signature': 'a8116c88aaac6ed31d27cdbfcf86bf1a073c0810c45244d9aa4ca4b4edfbacb6'}]}, 'timestamp': '2025-12-06 10:21:07.921374', '_unique_id': '3063b1d99ec4418a8e6e128a1714f575'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.923 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.924 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.924 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd700e681-36d9-4445-8925-1da916dd1656', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:21:07.924832', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '47705028-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.165819636, 'message_signature': '281a09345acc4ceb66e259964dfd5d0588406e3fabac0e4b003e776c5a64db8c'}]}, 'timestamp': '2025-12-06 10:21:07.925184', '_unique_id': '7f4dbdfab1d7479eafe392419801374d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.937 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.938 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dce0d41e-c09e-4687-a251-66c959482fa5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:21:07.926788', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '47724978-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.176148892, 'message_signature': '69142bf732726df01f02411a8436af9e2ce0abcfc283d1d0560b3b7bc316ae7f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:21:07.926788', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '47726458-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.176148892, 'message_signature': 'd72c1b4a26daf25f85018b4fe9ce8bbc98faef546d1945f2a94ad38447bb984f'}]}, 'timestamp': '2025-12-06 10:21:07.938893', '_unique_id': 'b4d3dc24355a4fd5bbc921b3381f292f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.940 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.942 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.942 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b46c9a5-33a1-4075-9f19-4f7b793a820a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:21:07.942162', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '4772fa1c-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.165819636, 'message_signature': '2e9594678f7c711123d7d4cd0a466be63a8bd01798738a54f935c2c7b1138695'}]}, 'timestamp': '2025-12-06 10:21:07.942712', '_unique_id': 'a7e6ab283d3a42cf8919d2797ba15019'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.944 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.946 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.946 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:07 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:07.945 263652 INFO neutron.agent.dhcp.agent [None req-7a18cd85-4f16-4020-a5c4-3b18d05ef74b - - - - - -] DHCP configuration for ports {'3f7091ee-adf1-4a41-bf51-535f147c89c5'} is completed
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a46e9bf0-dd55-4ef8-8715-9848a2dee6f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:21:07.946529', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '4773aa52-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.165819636, 'message_signature': '38c9c1121bcf7f2e2e1d876c3dca37ce491df8239e40bb23a6444f80b17d042a'}]}, 'timestamp': '2025-12-06 10:21:07.947460', '_unique_id': '2261033e4f1d44ebb0f1a3df0385ac5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.948 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.950 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.950 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df33f971-93bf-48b8-a2a8-f9bd5919e83f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:21:07.950585', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '47744638-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.165819636, 'message_signature': 'ece13a0ee25152f7285c339508311f11f9f8414d03d0e21af989ad7bf3245815'}]}, 'timestamp': '2025-12-06 10:21:07.951312', '_unique_id': '245f9ffae1c74f1998e5e07ac8b69f64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.952 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.954 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.986 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.987 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3aa9fc0e-d9cd-49e8-a9e0-fa229d486e4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:21:07.954692', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4779be6a-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.20423423, 'message_signature': '4b003e5dd74879618f3a156e75c82a1456ccab535b0d079102e99d6e6ed8400b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:21:07.954692', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4779d292-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.20423423, 'message_signature': 'f6579e9ba956858112bb9ad6f06253984960bd862fdae01e202f577b3e34eba5'}]}, 'timestamp': '2025-12-06 10:21:07.987546', '_unique_id': '382482c07233401caab11686e0f21326'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.989 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.990 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.991 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56815010-2f2a-4e6f-ac1c-32eff6764621', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:21:07.991026', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '477a6d92-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.165819636, 'message_signature': 'bd60571b238dd35b35ed66301f654b3721ed77a922fd572f1740eb2783105e84'}]}, 'timestamp': '2025-12-06 10:21:07.991672', '_unique_id': 'ec3fb65239e24d1dba6cef9ce56a49a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:07.994 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.010 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7fd8bc16-cf2f-477c-9f23-d13d70c15939', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:21:07.994711', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '477d5eee-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.259447318, 'message_signature': '360bcb06b3d64126591c5a04364fdd6a319534899409c25d8ffd4ba37849f3db'}]}, 'timestamp': '2025-12-06 10:21:08.010914', '_unique_id': '3da0899143144767acc586d66cd3cd08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.012 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.013 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.013 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.014 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51e6ca14-cee8-4c8b-b783-c1ed65a82bd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:21:08.013875', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '477ded8c-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.176148892, 'message_signature': '04ef3857d19f81e388a3208e1d4df35f643e8a07dae003e7c9ea606c8b250d8b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:21:08.013875', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '477e00ce-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.176148892, 'message_signature': '70562f9024703b0109965949997ec90feeb7f538958375b691bb5cf417fe5592'}]}, 'timestamp': '2025-12-06 10:21:08.014999', '_unique_id': '2b7313cb68fc45008a1418ff6ff82b0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.016 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.017 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.017 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.018 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5902eb1a-7ea2-494c-a02d-aa93517a49a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:21:08.017529', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '477e7a22-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.20423423, 'message_signature': '9b874b6dbb53fa3e40f8cc3ef7ae038922a007136ffd4f99b10d91380a822d79'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:21:08.017529', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '477e8be8-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.20423423, 'message_signature': 'd8f50070d704eca3a189e0fd7ddd730fe92cf65abc3c4282d63cc54f9522abde'}]}, 'timestamp': '2025-12-06 10:21:08.018497', '_unique_id': 'b9896e64673d4e689ec7b0771bfedf2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.019 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.020 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.020 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.021 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2f4c53a-a6e2-4c2a-be11-e78e00e19131', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:21:08.020916', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '477efd1c-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.20423423, 'message_signature': '4095b4d897fed516baa3fd440bc44352f39208d74edef3113033e164e0ce74f0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:21:08.020916', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '477f10a4-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.20423423, 'message_signature': '501c685a1258f21f51ef2125eba117f9df9016356dd74df7689793290d16bda7'}]}, 'timestamp': '2025-12-06 10:21:08.021929', '_unique_id': '3c35314656ef444aaeed4bb727cfcb9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.022 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.024 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.024 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.024 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 18120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16b7913b-666c-4f87-b11c-33d0021df07e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18120000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:21:08.024923', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '477f992a-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.259447318, 'message_signature': '6a42028a5ad482f765fe66ecb8c0cdce02cf7b7a98eea37d741bab456f011849'}]}, 'timestamp': '2025-12-06 10:21:08.025444', '_unique_id': '74b2031bea154dbdba05e6b33da6e47f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.027 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.027 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '307ce6bf-eb8f-4fce-baa2-b3bccec308ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:21:08.027799', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '47800a72-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.165819636, 'message_signature': 'fb053d1eda83b3b6dcd127eb2d6a6b8b83ab28f2010acfff5d1f1140ad507077'}]}, 'timestamp': '2025-12-06 10:21:08.028324', '_unique_id': '1670a4c12b704344a147b5c99c218c7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.029 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.030 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.030 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.031 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5cafd48-f1f5-497d-a435-0f7df2a32ad4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:21:08.030856', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4780816e-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.20423423, 'message_signature': '2b7f5ddd0cb5d117140ebadf3865b008a42a0c247b58c554ca2e6bcccba74429'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:21:08.030856', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '47809302-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.20423423, 'message_signature': '1584d3a6d74c625965a15a239bc26b816b6339d638b9f1703288bcfc6194d9c6'}]}, 'timestamp': '2025-12-06 10:21:08.031852', '_unique_id': '07cd2ad0d33045a085e05bf9bf0b1377'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.032 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.034 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.034 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.034 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98dfa394-d37e-4e33-a2a8-87dc52b6ecd8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:21:08.034706', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '47811f7a-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.165819636, 'message_signature': '10fd1d11dcc4c2fd735823b1ecb6d105e11bbec351158368a510c707bab3e1c6'}]}, 'timestamp': '2025-12-06 10:21:08.035520', '_unique_id': 'ba817ad0a29e4cf4b8468a5b5f912124'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.038 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.038 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.039 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9db31de4-ea3c-42ff-856f-ee16995f158f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:21:08.038564', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4781b2e6-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.176148892, 'message_signature': 'c6df7914702b5af2cfbe9602a20e2d97f72d1afcb17fb332a0529746979dc690'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:21:08.038564', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4781c7d6-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.176148892, 'message_signature': '1fae707ab140ce786f6dcfcc4dc35d612399b2f4d6062d0609766e0e35c31a97'}]}, 'timestamp': '2025-12-06 10:21:08.039713', '_unique_id': '06567edceb3f41e8941dbcd0397d8b40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.042 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.042 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.042 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7fb265a-85bd-4c90-a44e-f203189c8abe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:21:08.042626', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '478252b4-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.165819636, 'message_signature': '0e0bf24c1723e38bb4c8593d5f77531e4888f27fe5fd6dc41efbbde6d3075ea3'}]}, 'timestamp': '2025-12-06 10:21:08.043284', '_unique_id': '1a9046cb48ea41b19f10e8dcee0bcca4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.045 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.046 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8c16fde-761d-4a2d-86c0-306a8d868df7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:21:08.045794', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4782cc1c-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.20423423, 'message_signature': '5a8bbb6b739940245c72cd4acadcd26708eaa757f180b3434fa7aa66ae9de6c1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:21:08.045794', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4782dc34-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.20423423, 'message_signature': '935c7054c800ceef23c2d5329ff24ceffab127871b61ca4a134cf3bd8004c7bc'}]}, 'timestamp': '2025-12-06 10:21:08.046685', '_unique_id': 'e6225dd6b33a423ca1eccc088eeb1482'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.047 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.048 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.048 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.048 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bae18df-68f0-4020-aca4-a8771fa54657', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:21:08.048165', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '47832360-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.20423423, 'message_signature': '83781222efa8d8575414aa46d658de500d6edb8954947b0709b5245972e7d677'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:21:08.048165', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4783312a-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.20423423, 'message_signature': '86ef0f6ba7ead2905ebf685d53365a1f27d8702c6aa6deffce6e9162ad9ab83b'}]}, 'timestamp': '2025-12-06 10:21:08.048923', '_unique_id': '0e93c74b021844569753cae5e9e693bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.049 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.050 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.050 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97514aaa-cd39-4be8-908d-e7bd2d893c02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:21:08.050908', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '47838eae-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 12886.165819636, 'message_signature': '5723cfa44abd4559504f1f79117ef6ce238b9fdc3424adf0e5e1d9f12e634140'}]}, 'timestamp': '2025-12-06 10:21:08.051290', '_unique_id': 'fff598fd3096457185c4eb98a91e1348'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:21:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:21:08.051 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:21:08 np0005548789.localdomain ceph-mon[298582]: pgmap v370: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 7.7 KiB/s wr, 19 op/s
Dec 06 10:21:08 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2520375608' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:08 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2520375608' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:08 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3418725974' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:08 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3418725974' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:08 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:08.502 2 INFO neutron.agent.securitygroups_rpc [None req-4469b72b-a2a8-46c0-b170-fb645c70fec6 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:21:08 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/126387762' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:21:08 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/126387762' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:08.735 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:21:08 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3307122424' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:21:08 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3307122424' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:09 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:09.277 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:06Z, description=, device_id=7ecd23ba-4ca3-4eae-9829-cff158a165a0, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa40c40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc17940>], id=3f7091ee-adf1-4a41-bf51-535f147c89c5, ip_allocation=immediate, mac_address=fa:16:3e:67:09:c2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:00Z, description=, dns_domain=, id=5986df1f-13f3-42c1-bcc4-79dcf74a49a9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1321854165, port_security_enabled=True, project_id=e82deaff368b4feea9fec0f06459a6ca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10820, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2609, status=ACTIVE, subnets=['b8481c35-3dcb-4ca8-9bae-441805cdac62'], tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:21:02Z, vlan_transparent=None, network_id=5986df1f-13f3-42c1-bcc4-79dcf74a49a9, port_security_enabled=False, project_id=e82deaff368b4feea9fec0f06459a6ca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2656, status=DOWN, tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:21:07Z on network 5986df1f-13f3-42c1-bcc4-79dcf74a49a9
Dec 06 10:21:09 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:09.314 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:09 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "279e0b5d-0e40-4978-82df-d3bcd56c5a3c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:21:09 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "279e0b5d-0e40-4978-82df-d3bcd56c5a3c", "format": "json"}]: dispatch
Dec 06 10:21:09 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:09 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/126387762' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:09 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/126387762' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:09 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3307122424' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:09 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3307122424' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:09 np0005548789.localdomain dnsmasq[330917]: read /var/lib/neutron/dhcp/5986df1f-13f3-42c1-bcc4-79dcf74a49a9/addn_hosts - 1 addresses
Dec 06 10:21:09 np0005548789.localdomain dnsmasq-dhcp[330917]: read /var/lib/neutron/dhcp/5986df1f-13f3-42c1-bcc4-79dcf74a49a9/host
Dec 06 10:21:09 np0005548789.localdomain dnsmasq-dhcp[330917]: read /var/lib/neutron/dhcp/5986df1f-13f3-42c1-bcc4-79dcf74a49a9/opts
Dec 06 10:21:09 np0005548789.localdomain podman[331079]: 2025-12-06 10:21:09.520891877 +0000 UTC m=+0.070032722 container kill 4f9a605bde30cb63d1258bafc5b6357b3e8a6178585b25833a2d7aa12944aa53 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5986df1f-13f3-42c1-bcc4-79dcf74a49a9, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:21:09 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:09.824 263652 INFO neutron.agent.dhcp.agent [None req-83c88103-66c6-4f19-88b2-1aa7358b69a2 - - - - - -] DHCP configuration for ports {'3f7091ee-adf1-4a41-bf51-535f147c89c5'} is completed
Dec 06 10:21:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:09.967 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:10 np0005548789.localdomain ceph-mon[298582]: pgmap v371: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 9.9 KiB/s wr, 62 op/s
Dec 06 10:21:10 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1826059961' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:10 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1826059961' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:12 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:12.055 2 INFO neutron.agent.securitygroups_rpc [None req-a8044dd6-257e-4d6a-a5a6-1617984725a4 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:12 np0005548789.localdomain ceph-mon[298582]: pgmap v372: 177 pgs: 177 active+clean; 193 MiB data, 904 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 3.7 KiB/s wr, 60 op/s
Dec 06 10:21:12 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:12.762 2 INFO neutron.agent.securitygroups_rpc [None req-ce6b28f9-e93f-45d2-8a9b-cc88bd3abac1 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:21:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:21:12 np0005548789.localdomain podman[331101]: 2025-12-06 10:21:12.930693768 +0000 UTC m=+0.087217337 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 06 10:21:12 np0005548789.localdomain systemd[1]: tmp-crun.FXH0zB.mount: Deactivated successfully.
Dec 06 10:21:12 np0005548789.localdomain podman[331101]: 2025-12-06 10:21:12.972332171 +0000 UTC m=+0.128855770 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:12 np0005548789.localdomain podman[331100]: 2025-12-06 10:21:12.979608053 +0000 UTC m=+0.139057252 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, vcs-type=git, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=)
Dec 06 10:21:12 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:21:12 np0005548789.localdomain podman[331100]: 2025-12-06 10:21:12.996128459 +0000 UTC m=+0.155577668 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git)
Dec 06 10:21:13 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:21:13 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:13.079 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:13 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:13.080 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:21:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:13.081 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:13 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "279e0b5d-0e40-4978-82df-d3bcd56c5a3c", "snap_name": "2eb048c9-c193-4307-94d1-af1469cf7b46", "format": "json"}]: dispatch
Dec 06 10:21:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:13.773 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:14 np0005548789.localdomain ceph-mon[298582]: pgmap v373: 177 pgs: 177 active+clean; 193 MiB data, 908 MiB used, 41 GiB / 42 GiB avail; 72 KiB/s rd, 8.5 KiB/s wr, 101 op/s
Dec 06 10:21:14 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:14.547 2 INFO neutron.agent.securitygroups_rpc [None req-ff4891e2-327c-4b51-b5fc-5a809ca2f304 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['903bc67e-3e9d-4b17-b669-c51b1cfd9fe6']
Dec 06 10:21:14 np0005548789.localdomain dnsmasq[330917]: read /var/lib/neutron/dhcp/5986df1f-13f3-42c1-bcc4-79dcf74a49a9/addn_hosts - 0 addresses
Dec 06 10:21:14 np0005548789.localdomain dnsmasq-dhcp[330917]: read /var/lib/neutron/dhcp/5986df1f-13f3-42c1-bcc4-79dcf74a49a9/host
Dec 06 10:21:14 np0005548789.localdomain podman[331154]: 2025-12-06 10:21:14.601838573 +0000 UTC m=+0.062925315 container kill 4f9a605bde30cb63d1258bafc5b6357b3e8a6178585b25833a2d7aa12944aa53 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5986df1f-13f3-42c1-bcc4-79dcf74a49a9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:14 np0005548789.localdomain dnsmasq-dhcp[330917]: read /var/lib/neutron/dhcp/5986df1f-13f3-42c1-bcc4-79dcf74a49a9/opts
Dec 06 10:21:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:14.799 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:14 np0005548789.localdomain kernel: device tap48aef5f2-0c left promiscuous mode
Dec 06 10:21:14 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:14Z|00460|binding|INFO|Releasing lport 48aef5f2-0c04-4b34-bd2d-f71862404e37 from this chassis (sb_readonly=0)
Dec 06 10:21:14 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:14Z|00461|binding|INFO|Setting lport 48aef5f2-0c04-4b34-bd2d-f71862404e37 down in Southbound
Dec 06 10:21:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:14.814 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-5986df1f-13f3-42c1-bcc4-79dcf74a49a9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5986df1f-13f3-42c1-bcc4-79dcf74a49a9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e82deaff368b4feea9fec0f06459a6ca', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d5413063-0727-4de9-8605-e62b7d56e9f4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=48aef5f2-0c04-4b34-bd2d-f71862404e37) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:14.817 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 48aef5f2-0c04-4b34-bd2d-f71862404e37 in datapath 5986df1f-13f3-42c1-bcc4-79dcf74a49a9 unbound from our chassis
Dec 06 10:21:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:14.819 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5986df1f-13f3-42c1-bcc4-79dcf74a49a9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:21:14 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:14.820 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[42af6f00-5eee-4c55-91dd-5fc8f7ae0dea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:14.821 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:14 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:14.888 2 INFO neutron.agent.securitygroups_rpc [None req-4caeb4e0-6839-4739-a438-dff319ba5ebb 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['903bc67e-3e9d-4b17-b669-c51b1cfd9fe6']
Dec 06 10:21:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:14.969 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:15 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:15.813 2 INFO neutron.agent.securitygroups_rpc [None req-90100f1d-f80a-4a04-b568-2870d38561f7 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:16 np0005548789.localdomain dnsmasq[330917]: exiting on receipt of SIGTERM
Dec 06 10:21:16 np0005548789.localdomain podman[331194]: 2025-12-06 10:21:16.086816606 +0000 UTC m=+0.058151869 container kill 4f9a605bde30cb63d1258bafc5b6357b3e8a6178585b25833a2d7aa12944aa53 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5986df1f-13f3-42c1-bcc4-79dcf74a49a9, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:16 np0005548789.localdomain systemd[1]: libpod-4f9a605bde30cb63d1258bafc5b6357b3e8a6178585b25833a2d7aa12944aa53.scope: Deactivated successfully.
Dec 06 10:21:16 np0005548789.localdomain podman[331206]: 2025-12-06 10:21:16.165330006 +0000 UTC m=+0.064722690 container died 4f9a605bde30cb63d1258bafc5b6357b3e8a6178585b25833a2d7aa12944aa53 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5986df1f-13f3-42c1-bcc4-79dcf74a49a9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:21:16 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:16.206 2 INFO neutron.agent.securitygroups_rpc [None req-1962285c-ec71-4abe-922e-2812550a1f59 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:16 np0005548789.localdomain podman[331206]: 2025-12-06 10:21:16.235980725 +0000 UTC m=+0.135373379 container cleanup 4f9a605bde30cb63d1258bafc5b6357b3e8a6178585b25833a2d7aa12944aa53 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5986df1f-13f3-42c1-bcc4-79dcf74a49a9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:21:16 np0005548789.localdomain systemd[1]: libpod-conmon-4f9a605bde30cb63d1258bafc5b6357b3e8a6178585b25833a2d7aa12944aa53.scope: Deactivated successfully.
Dec 06 10:21:16 np0005548789.localdomain podman[331208]: 2025-12-06 10:21:16.26034701 +0000 UTC m=+0.153367699 container remove 4f9a605bde30cb63d1258bafc5b6357b3e8a6178585b25833a2d7aa12944aa53 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5986df1f-13f3-42c1-bcc4-79dcf74a49a9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:21:16 np0005548789.localdomain ceph-mon[298582]: pgmap v374: 177 pgs: 177 active+clean; 193 MiB data, 908 MiB used, 41 GiB / 42 GiB avail; 61 KiB/s rd, 7.0 KiB/s wr, 84 op/s
Dec 06 10:21:16 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:16.444 263652 INFO neutron.agent.dhcp.agent [None req-108f80ac-61fb-4ef4-8ace-74d386acc748 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:21:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:21:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:21:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:21:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:21:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:21:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:21:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:21:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:21:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:21:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:21:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:21:16 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:16.656 2 INFO neutron.agent.securitygroups_rpc [None req-62279be6-b47d-4f82-9bae-6dcf35358e74 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:16 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:16.875 2 INFO neutron.agent.securitygroups_rpc [None req-c061e6e9-3e5b-41ac-9ef0-5285d101333c 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:21:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-4ed23ebe6a60fad4889f2ca706b63873c667d977225c0a5045f8dfc5999e212b-merged.mount: Deactivated successfully.
Dec 06 10:21:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4f9a605bde30cb63d1258bafc5b6357b3e8a6178585b25833a2d7aa12944aa53-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:17 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d5986df1f\x2d13f3\x2d42c1\x2dbcc4\x2d79dcf74a49a9.mount: Deactivated successfully.
Dec 06 10:21:17 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:17.119 2 INFO neutron.agent.securitygroups_rpc [None req-2a7b6fd7-f3e3-4d7a-9d68-bfb25de0babb 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:17 np0005548789.localdomain podman[331237]: 2025-12-06 10:21:17.179334142 +0000 UTC m=+0.083251306 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:21:17 np0005548789.localdomain podman[331237]: 2025-12-06 10:21:17.191101822 +0000 UTC m=+0.095018966 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:17 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:21:17 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:17.221 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:17 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:17.299 2 INFO neutron.agent.securitygroups_rpc [None req-a7542a85-dea5-4041-a20f-6eded131077b 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:17 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:17Z|00462|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:21:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:17.450 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:17 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "279e0b5d-0e40-4978-82df-d3bcd56c5a3c", "snap_name": "2eb048c9-c193-4307-94d1-af1469cf7b46_f41ba66f-ce64-424e-90ff-4fce011eb0df", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:17 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "279e0b5d-0e40-4978-82df-d3bcd56c5a3c", "snap_name": "2eb048c9-c193-4307-94d1-af1469cf7b46", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:17 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:17.519 2 INFO neutron.agent.securitygroups_rpc [None req-2962579a-8cbd-4a57-b13b-f9182cbc39c2 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:18 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:18.070 2 INFO neutron.agent.securitygroups_rpc [None req-c4974b4a-fd74-4b95-bbd1-546aef178ffe 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:18 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:18.081 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:21:18 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:18.195 2 INFO neutron.agent.securitygroups_rpc [None req-3a1797f5-55dd-437d-aa3c-dfbf79d9d8b2 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:18 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:18.212 2 INFO neutron.agent.securitygroups_rpc [None req-44e28314-226f-4847-9798-44b59d6b4b35 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:18 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:18.334 2 INFO neutron.agent.securitygroups_rpc [None req-b55ae0b1-9a25-4f2d-ae6a-9ce8bc1e6fe6 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['0559edeb-45f4-4489-a671-f1d11bd0b93c']
Dec 06 10:21:18 np0005548789.localdomain ceph-mon[298582]: pgmap v375: 177 pgs: 177 active+clean; 193 MiB data, 908 MiB used, 41 GiB / 42 GiB avail; 61 KiB/s rd, 7.0 KiB/s wr, 84 op/s
Dec 06 10:21:18 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:18.576 2 INFO neutron.agent.securitygroups_rpc [None req-b097e5fe-7591-4c88-9845-0ee25de4ff5d a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:18.803 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:21:18 np0005548789.localdomain systemd[1]: tmp-crun.jTBrqu.mount: Deactivated successfully.
Dec 06 10:21:18 np0005548789.localdomain podman[331257]: 2025-12-06 10:21:18.922491247 +0000 UTC m=+0.084651079 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:21:18 np0005548789.localdomain podman[331257]: 2025-12-06 10:21:18.934319568 +0000 UTC m=+0.096479380 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:21:18 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:21:19 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:19.021 2 INFO neutron.agent.securitygroups_rpc [None req-634a6650-1d18-4bf0-bb6f-8096dc9b484b a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:19 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:19.197 2 INFO neutron.agent.securitygroups_rpc [None req-8740ee6b-b668-48f2-97f8-ee50cd2a4f10 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['3e4cda00-96df-465b-a218-fdd9aa158162']
Dec 06 10:21:19 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:19.439 2 INFO neutron.agent.securitygroups_rpc [None req-44297d43-b394-4987-b7fa-1e1c5c65d7e5 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:19 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1072020705' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:19 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1072020705' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:19 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:19.802 2 INFO neutron.agent.securitygroups_rpc [None req-ec21728d-7a8b-4886-a802-a9f2fd832d5f a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:20.005 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:20 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:20.071 2 INFO neutron.agent.securitygroups_rpc [None req-ad373519-aa70-4a24-9c68-f8b3d34f07d1 05cea3733946411abb747782f855ad13 e82deaff368b4feea9fec0f06459a6ca - - default default] Security group member updated ['49ffd6de-2ba3-48fb-87b6-b485622383ee']
Dec 06 10:21:20 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:20.105 2 INFO neutron.agent.securitygroups_rpc [None req-387cce7d-30a2-4fff-b469-9e054d53578e 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['2dafdbdc-1eca-4442-97f4-c504a138db8a']
Dec 06 10:21:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e180 e180: 6 total, 6 up, 6 in
Dec 06 10:21:20 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "279e0b5d-0e40-4978-82df-d3bcd56c5a3c", "format": "json"}]: dispatch
Dec 06 10:21:20 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "279e0b5d-0e40-4978-82df-d3bcd56c5a3c", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:20 np0005548789.localdomain ceph-mon[298582]: pgmap v376: 177 pgs: 177 active+clean; 193 MiB data, 909 MiB used, 41 GiB / 42 GiB avail; 62 KiB/s rd, 14 KiB/s wr, 87 op/s
Dec 06 10:21:20 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:20.585 2 INFO neutron.agent.securitygroups_rpc [None req-d0a643e4-c9dc-4942-bc89-0b7141ebc4ce 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['2dafdbdc-1eca-4442-97f4-c504a138db8a']
Dec 06 10:21:20 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:20.877 2 INFO neutron.agent.securitygroups_rpc [None req-46e2ac08-cde4-4c43-95df-6267eb9b2508 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:21 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:21.309 2 INFO neutron.agent.securitygroups_rpc [None req-7161a7af-e2d1-4aef-85f9-991736510e62 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['9ec4be56-d6a3-492b-8ed5-b0e035114ef3']
Dec 06 10:21:21 np0005548789.localdomain ceph-mon[298582]: osdmap e180: 6 total, 6 up, 6 in
Dec 06 10:21:21 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:21.955 2 INFO neutron.agent.securitygroups_rpc [None req-5446130a-9f42-445f-9ada-e84f5ef55ebf 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['9ec4be56-d6a3-492b-8ed5-b0e035114ef3']
Dec 06 10:21:22 np0005548789.localdomain sshd[331278]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:21:22 np0005548789.localdomain ceph-mon[298582]: pgmap v378: 177 pgs: 177 active+clean; 193 MiB data, 909 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 14 KiB/s wr, 52 op/s
Dec 06 10:21:23 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:23.029 2 INFO neutron.agent.securitygroups_rpc [None req-027b5fbe-7eaf-4a47-82ae-b4c37a4f7304 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['e92480e8-6d06-4c62-9df4-cfcbb83c433c']
Dec 06 10:21:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:21:23 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1202839971' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:21:23 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1202839971' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:23 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:23.384 2 INFO neutron.agent.securitygroups_rpc [None req-045a3cff-be52-4b3f-bacc-9f67cad8a71e 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['e92480e8-6d06-4c62-9df4-cfcbb83c433c']
Dec 06 10:21:23 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1202839971' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:23 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1202839971' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:23 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:23.735 263652 INFO neutron.agent.linux.ip_lib [None req-c5e83136-da2e-4565-b830-7a6712a783bf - - - - - -] Device tap07894cb1-b1 cannot be used as it has no MAC address
Dec 06 10:21:23 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:23.765 2 INFO neutron.agent.securitygroups_rpc [None req-7256d2df-f723-495d-9fc4-3babfe884545 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['e92480e8-6d06-4c62-9df4-cfcbb83c433c']
Dec 06 10:21:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:23.795 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:23 np0005548789.localdomain kernel: device tap07894cb1-b1 entered promiscuous mode
Dec 06 10:21:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:23Z|00463|binding|INFO|Claiming lport 07894cb1-b1eb-4745-bfa0-45277bc8102d for this chassis.
Dec 06 10:21:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:23Z|00464|binding|INFO|07894cb1-b1eb-4745-bfa0-45277bc8102d: Claiming unknown
Dec 06 10:21:23 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016483.8085] manager: (tap07894cb1-b1): new Generic device (/org/freedesktop/NetworkManager/Devices/75)
Dec 06 10:21:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:23.806 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:23 np0005548789.localdomain systemd-udevd[331289]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:21:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:23.811 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:23.818 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a18f82f0d09644c7b6d23e2fece8be4f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1bca2059-94f6-4a4c-a1db-49e48579cd24, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=07894cb1-b1eb-4745-bfa0-45277bc8102d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:23.820 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 07894cb1-b1eb-4745-bfa0-45277bc8102d in datapath 6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4 bound to our chassis
Dec 06 10:21:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:23.822 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:21:23 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:23.822 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[913eb280-36a1-4c26-8b79-41ecf66743a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap07894cb1-b1: No such device
Dec 06 10:21:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap07894cb1-b1: No such device
Dec 06 10:21:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:23Z|00465|binding|INFO|Setting lport 07894cb1-b1eb-4745-bfa0-45277bc8102d ovn-installed in OVS
Dec 06 10:21:23 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:23Z|00466|binding|INFO|Setting lport 07894cb1-b1eb-4745-bfa0-45277bc8102d up in Southbound
Dec 06 10:21:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap07894cb1-b1: No such device
Dec 06 10:21:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:23.851 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap07894cb1-b1: No such device
Dec 06 10:21:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap07894cb1-b1: No such device
Dec 06 10:21:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap07894cb1-b1: No such device
Dec 06 10:21:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap07894cb1-b1: No such device
Dec 06 10:21:23 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap07894cb1-b1: No such device
Dec 06 10:21:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:23.888 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:21:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:21:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:23.920 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:21:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:21:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:21:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19272 "" "Go-http-client/1.1"
Dec 06 10:21:24 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:24.017 2 INFO neutron.agent.securitygroups_rpc [None req-41802285-db31-4507-ae87-65b8788f6146 05cea3733946411abb747782f855ad13 e82deaff368b4feea9fec0f06459a6ca - - default default] Security group member updated ['49ffd6de-2ba3-48fb-87b6-b485622383ee']
Dec 06 10:21:24 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:24.194 2 INFO neutron.agent.securitygroups_rpc [None req-2554a227-dcf4-4ba9-820f-f0670d58a0fd 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['e92480e8-6d06-4c62-9df4-cfcbb83c433c']
Dec 06 10:21:24 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:24.513 2 INFO neutron.agent.securitygroups_rpc [None req-59f267c4-4d57-4206-9795-2fa0d0bba04b a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:24 np0005548789.localdomain ceph-mon[298582]: pgmap v379: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 16 KiB/s wr, 38 op/s
Dec 06 10:21:24 np0005548789.localdomain podman[331360]: 
Dec 06 10:21:24 np0005548789.localdomain podman[331360]: 2025-12-06 10:21:24.754834911 +0000 UTC m=+0.094317125 container create 871e5234add3e791e06f36666b5266db5c5bf235c01c63595bbc267f8da5356b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 06 10:21:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:21:24 np0005548789.localdomain systemd[1]: Started libpod-conmon-871e5234add3e791e06f36666b5266db5c5bf235c01c63595bbc267f8da5356b.scope.
Dec 06 10:21:24 np0005548789.localdomain podman[331360]: 2025-12-06 10:21:24.711482316 +0000 UTC m=+0.050964540 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:21:24 np0005548789.localdomain systemd[1]: tmp-crun.HUxFGA.mount: Deactivated successfully.
Dec 06 10:21:24 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:21:24 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c32e1a0810f6a82ee408c06255873f9870f9500df5b5b512ff2b2f1018e06472/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:21:24 np0005548789.localdomain podman[331360]: 2025-12-06 10:21:24.852161976 +0000 UTC m=+0.191644190 container init 871e5234add3e791e06f36666b5266db5c5bf235c01c63595bbc267f8da5356b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:24 np0005548789.localdomain podman[331360]: 2025-12-06 10:21:24.86374636 +0000 UTC m=+0.203228574 container start 871e5234add3e791e06f36666b5266db5c5bf235c01c63595bbc267f8da5356b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec 06 10:21:24 np0005548789.localdomain dnsmasq[331390]: started, version 2.85 cachesize 150
Dec 06 10:21:24 np0005548789.localdomain dnsmasq[331390]: DNS service limited to local subnets
Dec 06 10:21:24 np0005548789.localdomain dnsmasq[331390]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:21:24 np0005548789.localdomain dnsmasq[331390]: warning: no upstream servers configured
Dec 06 10:21:24 np0005548789.localdomain dnsmasq-dhcp[331390]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:21:24 np0005548789.localdomain dnsmasq[331390]: read /var/lib/neutron/dhcp/6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4/addn_hosts - 0 addresses
Dec 06 10:21:24 np0005548789.localdomain dnsmasq-dhcp[331390]: read /var/lib/neutron/dhcp/6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4/host
Dec 06 10:21:24 np0005548789.localdomain dnsmasq-dhcp[331390]: read /var/lib/neutron/dhcp/6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4/opts
Dec 06 10:21:24 np0005548789.localdomain podman[331375]: 2025-12-06 10:21:24.901645539 +0000 UTC m=+0.097119100 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Dec 06 10:21:24 np0005548789.localdomain podman[331375]: 2025-12-06 10:21:24.973130784 +0000 UTC m=+0.168604355 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:21:24 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:21:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:25.007 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:25 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:25.009 2 INFO neutron.agent.securitygroups_rpc [None req-e2ff1412-56d0-4e73-b6d0-7b3abe3cbdda 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['e92480e8-6d06-4c62-9df4-cfcbb83c433c']
Dec 06 10:21:25 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:25.111 263652 INFO neutron.agent.dhcp.agent [None req-8564d077-3b52-421d-b7d5-566575246df5 - - - - - -] DHCP configuration for ports {'b209dc58-7d4e-4e85-a9ad-fc7f3eb9fd41'} is completed
Dec 06 10:21:25 np0005548789.localdomain dnsmasq[331390]: exiting on receipt of SIGTERM
Dec 06 10:21:25 np0005548789.localdomain podman[331420]: 2025-12-06 10:21:25.253630338 +0000 UTC m=+0.067151974 container kill 871e5234add3e791e06f36666b5266db5c5bf235c01c63595bbc267f8da5356b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:25 np0005548789.localdomain systemd[1]: libpod-871e5234add3e791e06f36666b5266db5c5bf235c01c63595bbc267f8da5356b.scope: Deactivated successfully.
Dec 06 10:21:25 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:25.307 2 INFO neutron.agent.securitygroups_rpc [None req-502d7035-1806-4b5f-89df-e87b8d86a05e 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['e92480e8-6d06-4c62-9df4-cfcbb83c433c']
Dec 06 10:21:25 np0005548789.localdomain podman[331434]: 2025-12-06 10:21:25.330477798 +0000 UTC m=+0.061681967 container died 871e5234add3e791e06f36666b5266db5c5bf235c01c63595bbc267f8da5356b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:25 np0005548789.localdomain podman[331434]: 2025-12-06 10:21:25.357993768 +0000 UTC m=+0.089197887 container cleanup 871e5234add3e791e06f36666b5266db5c5bf235c01c63595bbc267f8da5356b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:21:25 np0005548789.localdomain systemd[1]: libpod-conmon-871e5234add3e791e06f36666b5266db5c5bf235c01c63595bbc267f8da5356b.scope: Deactivated successfully.
Dec 06 10:21:25 np0005548789.localdomain podman[331435]: 2025-12-06 10:21:25.400310882 +0000 UTC m=+0.126709945 container remove 871e5234add3e791e06f36666b5266db5c5bf235c01c63595bbc267f8da5356b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:21:25 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:25.523 2 INFO neutron.agent.securitygroups_rpc [None req-4635dd6b-d978-4577-967f-9e6194773640 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c32e1a0810f6a82ee408c06255873f9870f9500df5b5b512ff2b2f1018e06472-merged.mount: Deactivated successfully.
Dec 06 10:21:25 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-871e5234add3e791e06f36666b5266db5c5bf235c01c63595bbc267f8da5356b-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:25 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:21:25 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/921077461' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:25 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:21:25 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/921077461' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:26 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:26.028 2 INFO neutron.agent.securitygroups_rpc [None req-47fed9fe-9d6c-4fec-9c42-e4b99f91a654 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:26 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:26.084 2 INFO neutron.agent.securitygroups_rpc [None req-23962b7d-779d-4700-97cd-a1c3604fb216 14db54f428d84c88aba32e6937011f75 969bd3cedbbc4b03a4546a8b852f13f2 - - default default] Security group rule updated ['4227525c-3196-4e1b-83f0-62a3222dd04d']
Dec 06 10:21:26 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:26.540 2 INFO neutron.agent.securitygroups_rpc [None req-79d1c16c-d523-499a-9021-4bdb2b87f9e2 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:26 np0005548789.localdomain ceph-mon[298582]: pgmap v380: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 16 KiB/s wr, 38 op/s
Dec 06 10:21:26 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/921077461' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:26 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/921077461' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:26 np0005548789.localdomain podman[331511]: 
Dec 06 10:21:26 np0005548789.localdomain podman[331511]: 2025-12-06 10:21:26.806828897 +0000 UTC m=+0.083605847 container create 50f88f562d9e110cd7c73bab40c97e60533228fd1ce1f9aa890b9537a1d43ea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec 06 10:21:26 np0005548789.localdomain systemd[1]: Started libpod-conmon-50f88f562d9e110cd7c73bab40c97e60533228fd1ce1f9aa890b9537a1d43ea4.scope.
Dec 06 10:21:26 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:21:26 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/751c4114672752fe297d8f8b90618e47d59106a9537bfb331c2c081878c120d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:21:26 np0005548789.localdomain podman[331511]: 2025-12-06 10:21:26.764920846 +0000 UTC m=+0.041697846 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:21:26 np0005548789.localdomain podman[331511]: 2025-12-06 10:21:26.871303718 +0000 UTC m=+0.148080678 container init 50f88f562d9e110cd7c73bab40c97e60533228fd1ce1f9aa890b9537a1d43ea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:21:26 np0005548789.localdomain podman[331511]: 2025-12-06 10:21:26.880084206 +0000 UTC m=+0.156861156 container start 50f88f562d9e110cd7c73bab40c97e60533228fd1ce1f9aa890b9537a1d43ea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:26 np0005548789.localdomain dnsmasq[331529]: started, version 2.85 cachesize 150
Dec 06 10:21:26 np0005548789.localdomain dnsmasq[331529]: DNS service limited to local subnets
Dec 06 10:21:26 np0005548789.localdomain dnsmasq[331529]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:21:26 np0005548789.localdomain dnsmasq[331529]: warning: no upstream servers configured
Dec 06 10:21:26 np0005548789.localdomain dnsmasq-dhcp[331529]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:21:26 np0005548789.localdomain dnsmasq-dhcp[331529]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Dec 06 10:21:26 np0005548789.localdomain dnsmasq[331529]: read /var/lib/neutron/dhcp/6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4/addn_hosts - 2 addresses
Dec 06 10:21:26 np0005548789.localdomain dnsmasq-dhcp[331529]: read /var/lib/neutron/dhcp/6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4/host
Dec 06 10:21:26 np0005548789.localdomain dnsmasq-dhcp[331529]: read /var/lib/neutron/dhcp/6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4/opts
Dec 06 10:21:26 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:26.938 263652 INFO neutron.agent.dhcp.agent [None req-57d6ff30-5daa-4f69-9fc3-53cf5f818934 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:24Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc7e940>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc7ee50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa770d0>, <neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc7e2e0>], id=a00dffc7-6f3c-4c71-a5e8-356fd7271314, ip_allocation=immediate, mac_address=fa:16:3e:31:78:60, name=tempest-PortsIpV6TestJSON-1981047382, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:21Z, description=, dns_domain=, id=6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-2107693199, port_security_enabled=True, project_id=a18f82f0d09644c7b6d23e2fece8be4f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45674, qos_policy_id=None, revision_number=3, router:external=False, shared=False, standard_attr_id=2771, status=ACTIVE, subnets=['156375fe-1af8-48ec-b1e8-48e2e596ca8a', '15947303-a9dd-45df-aac2-e0a5c586be61'], tags=[], tenant_id=a18f82f0d09644c7b6d23e2fece8be4f, updated_at=2025-12-06T10:21:23Z, vlan_transparent=None, network_id=6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, port_security_enabled=True, project_id=a18f82f0d09644c7b6d23e2fece8be4f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f89459b7-5955-49a9-980d-ccf671c641e2'], standard_attr_id=2802, 
status=DOWN, tags=[], tenant_id=a18f82f0d09644c7b6d23e2fece8be4f, updated_at=2025-12-06T10:21:24Z on network 6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4
Dec 06 10:21:27 np0005548789.localdomain dnsmasq[331529]: read /var/lib/neutron/dhcp/6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4/addn_hosts - 2 addresses
Dec 06 10:21:27 np0005548789.localdomain dnsmasq-dhcp[331529]: read /var/lib/neutron/dhcp/6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4/host
Dec 06 10:21:27 np0005548789.localdomain dnsmasq-dhcp[331529]: read /var/lib/neutron/dhcp/6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4/opts
Dec 06 10:21:27 np0005548789.localdomain podman[331547]: 2025-12-06 10:21:27.139216017 +0000 UTC m=+0.066481963 container kill 50f88f562d9e110cd7c73bab40c97e60533228fd1ce1f9aa890b9537a1d43ea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:21:27 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:27.197 263652 INFO neutron.agent.dhcp.agent [None req-68974d9d-792f-4d9e-9829-90267bdb6192 - - - - - -] DHCP configuration for ports {'07894cb1-b1eb-4745-bfa0-45277bc8102d', 'a00dffc7-6f3c-4c71-a5e8-356fd7271314', 'b209dc58-7d4e-4e85-a9ad-fc7f3eb9fd41'} is completed
Dec 06 10:21:27 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:27.320 263652 INFO neutron.agent.dhcp.agent [None req-2d5f6596-b50a-47f0-887f-a8c8ac8e535a - - - - - -] DHCP configuration for ports {'a00dffc7-6f3c-4c71-a5e8-356fd7271314'} is completed
Dec 06 10:21:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e181 e181: 6 total, 6 up, 6 in
Dec 06 10:21:27 np0005548789.localdomain dnsmasq[331529]: exiting on receipt of SIGTERM
Dec 06 10:21:27 np0005548789.localdomain podman[331586]: 2025-12-06 10:21:27.550837399 +0000 UTC m=+0.069776414 container kill 50f88f562d9e110cd7c73bab40c97e60533228fd1ce1f9aa890b9537a1d43ea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:27 np0005548789.localdomain systemd[1]: libpod-50f88f562d9e110cd7c73bab40c97e60533228fd1ce1f9aa890b9537a1d43ea4.scope: Deactivated successfully.
Dec 06 10:21:27 np0005548789.localdomain podman[331599]: 2025-12-06 10:21:27.623227162 +0000 UTC m=+0.059443358 container died 50f88f562d9e110cd7c73bab40c97e60533228fd1ce1f9aa890b9537a1d43ea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:27 np0005548789.localdomain podman[331599]: 2025-12-06 10:21:27.657629114 +0000 UTC m=+0.093845270 container cleanup 50f88f562d9e110cd7c73bab40c97e60533228fd1ce1f9aa890b9537a1d43ea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:27 np0005548789.localdomain systemd[1]: libpod-conmon-50f88f562d9e110cd7c73bab40c97e60533228fd1ce1f9aa890b9537a1d43ea4.scope: Deactivated successfully.
Dec 06 10:21:27 np0005548789.localdomain podman[331601]: 2025-12-06 10:21:27.695092569 +0000 UTC m=+0.125365813 container remove 50f88f562d9e110cd7c73bab40c97e60533228fd1ce1f9aa890b9537a1d43ea4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:21:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:27.708 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:27 np0005548789.localdomain kernel: device tap07894cb1-b1 left promiscuous mode
Dec 06 10:21:27 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:27Z|00467|binding|INFO|Releasing lport 07894cb1-b1eb-4745-bfa0-45277bc8102d from this chassis (sb_readonly=0)
Dec 06 10:21:27 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:27Z|00468|binding|INFO|Setting lport 07894cb1-b1eb-4745-bfa0-45277bc8102d down in Southbound
Dec 06 10:21:27 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:27.724 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a18f82f0d09644c7b6d23e2fece8be4f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1bca2059-94f6-4a4c-a1db-49e48579cd24, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=07894cb1-b1eb-4745-bfa0-45277bc8102d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:27 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:27.726 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 07894cb1-b1eb-4745-bfa0-45277bc8102d in datapath 6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4 unbound from our chassis
Dec 06 10:21:27 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:27.727 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6d1e1b6f-f3ba-4f93-818d-9da1cd142ed4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:21:27 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:27.728 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[19e30417-38c3-4fe4-99cf-78d2872cb7a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:27.738 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-751c4114672752fe297d8f8b90618e47d59106a9537bfb331c2c081878c120d1-merged.mount: Deactivated successfully.
Dec 06 10:21:27 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-50f88f562d9e110cd7c73bab40c97e60533228fd1ce1f9aa890b9537a1d43ea4-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:28.074 263652 INFO neutron.agent.dhcp.agent [None req-5806e92b-5bf2-4aa5-a473-94a90813d877 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:28.075 263652 INFO neutron.agent.dhcp.agent [None req-5806e92b-5bf2-4aa5-a473-94a90813d877 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:28 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d6d1e1b6f\x2df3ba\x2d4f93\x2d818d\x2d9da1cd142ed4.mount: Deactivated successfully.
Dec 06 10:21:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:28.076 263652 INFO neutron.agent.dhcp.agent [None req-5806e92b-5bf2-4aa5-a473-94a90813d877 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:28.300 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:28 np0005548789.localdomain ceph-mon[298582]: pgmap v381: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 16 KiB/s wr, 38 op/s
Dec 06 10:21:28 np0005548789.localdomain ceph-mon[298582]: osdmap e181: 6 total, 6 up, 6 in
Dec 06 10:21:28 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:28Z|00469|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:21:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:28.518 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:28.850 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:29 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "586fac81-a866-4e62-9321-4e3fa9e20434", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:21:29 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "586fac81-a866-4e62-9321-4e3fa9e20434", "format": "json"}]: dispatch
Dec 06 10:21:29 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:30.052 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:30 np0005548789.localdomain ceph-mon[298582]: pgmap v383: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 16 KiB/s wr, 61 op/s
Dec 06 10:21:30 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:30.643 2 INFO neutron.agent.securitygroups_rpc [None req-650d6f16-dff9-4e9f-a532-00c1d7cd9ac8 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:31 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3604592233' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:31 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3604592233' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:31 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:31.378 2 INFO neutron.agent.securitygroups_rpc [None req-1b25accf-a6fc-48da-91f7-6e8e14cb52bd a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:31 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:31.438 2 INFO neutron.agent.securitygroups_rpc [None req-253b8277-210a-4573-82a4-3ce2f38be71e cc9a0aebc5df40baa5d30408481c8824 5ea98fc77f0c4728a4c2d7a5429d8129 - - default default] Security group rule updated ['113d3ef2-1b05-41a6-846b-b981d95adda0']
Dec 06 10:21:31 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:31.887 2 INFO neutron.agent.securitygroups_rpc [None req-0a2b9713-3499-411d-9f50-733d3731fca5 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:32 np0005548789.localdomain ceph-mon[298582]: pgmap v384: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 14 KiB/s wr, 53 op/s
Dec 06 10:21:32 np0005548789.localdomain sshd[331629]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:21:32 np0005548789.localdomain sshd[331278]: error: kex_exchange_identification: read: Connection timed out
Dec 06 10:21:32 np0005548789.localdomain sshd[331278]: banner exchange: Connection from 124.163.255.210 port 40135: Connection timed out
Dec 06 10:21:32 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:32.510 2 INFO neutron.agent.securitygroups_rpc [None req-9523440b-4459-4eba-b7c1-0671f42e7aa4 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:33 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "586fac81-a866-4e62-9321-4e3fa9e20434", "format": "json"}]: dispatch
Dec 06 10:21:33 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "586fac81-a866-4e62-9321-4e3fa9e20434", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:21:33 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/620957375' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:21:33 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/620957375' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:33.888 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:34.363 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:34Z, description=, device_id=d3712577-f8e8-4c55-825d-3cfca431b537, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa5e220>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcbdb20>], id=ef99384a-2dd1-472a-bed4-f4d953c92a45, ip_allocation=immediate, mac_address=fa:16:3e:2f:bd:38, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2873, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:21:34Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:21:34 np0005548789.localdomain ceph-mon[298582]: pgmap v385: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 15 KiB/s wr, 42 op/s
Dec 06 10:21:34 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/620957375' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:34 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/620957375' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:34 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:21:34 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:21:34 np0005548789.localdomain podman[331647]: 2025-12-06 10:21:34.579282255 +0000 UTC m=+0.054149486 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:34 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:21:34 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:34.822 263652 INFO neutron.agent.dhcp.agent [None req-23df8721-1a62-4d30-9cf3-cc03f9ef211d - - - - - -] DHCP configuration for ports {'ef99384a-2dd1-472a-bed4-f4d953c92a45'} is completed
Dec 06 10:21:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:35.097 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:35 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e182 e182: 6 total, 6 up, 6 in
Dec 06 10:21:35 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6e6fd32a-770e-4afe-8d83-a1956fc630a7", "format": "json"}]: dispatch
Dec 06 10:21:35 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2748713948' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:21:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:21:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:35.580 263652 INFO neutron.agent.linux.ip_lib [None req-2028a43a-dd13-41c7-b6a9-90eb06345c1d - - - - - -] Device tap7a29805b-bc cannot be used as it has no MAC address
Dec 06 10:21:35 np0005548789.localdomain kernel: device tap7a29805b-bc entered promiscuous mode
Dec 06 10:21:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:35Z|00470|binding|INFO|Claiming lport 7a29805b-bc7f-4a1f-9655-ca6a5d1f8ed8 for this chassis.
Dec 06 10:21:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:35Z|00471|binding|INFO|7a29805b-bc7f-4a1f-9655-ca6a5d1f8ed8: Claiming unknown
Dec 06 10:21:35 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016495.6171] manager: (tap7a29805b-bc): new Generic device (/org/freedesktop/NetworkManager/Devices/76)
Dec 06 10:21:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:35.606 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:35.617 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:35 np0005548789.localdomain systemd-udevd[331705]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:21:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:35.625 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-1ceb092b-0721-47ff-8a32-871f82b9c9c0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ceb092b-0721-47ff-8a32-871f82b9c9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a18f82f0d09644c7b6d23e2fece8be4f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc3e50f0-1643-4b6d-9347-eecb3387f433, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=7a29805b-bc7f-4a1f-9655-ca6a5d1f8ed8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:35.628 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 7a29805b-bc7f-4a1f-9655-ca6a5d1f8ed8 in datapath 1ceb092b-0721-47ff-8a32-871f82b9c9c0 bound to our chassis
Dec 06 10:21:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:35.629 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1ceb092b-0721-47ff-8a32-871f82b9c9c0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:21:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:35.630 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[14f42e30-baed-47dd-9f8d-1c01595d48c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:35Z|00472|binding|INFO|Setting lport 7a29805b-bc7f-4a1f-9655-ca6a5d1f8ed8 ovn-installed in OVS
Dec 06 10:21:35 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:35Z|00473|binding|INFO|Setting lport 7a29805b-bc7f-4a1f-9655-ca6a5d1f8ed8 up in Southbound
Dec 06 10:21:35 np0005548789.localdomain podman[331673]: 2025-12-06 10:21:35.675364251 +0000 UTC m=+0.147118608 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:21:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:35.679 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:35 np0005548789.localdomain podman[331671]: 2025-12-06 10:21:35.633874963 +0000 UTC m=+0.112080277 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:21:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:35.703 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:35 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:35.706 2 INFO neutron.agent.securitygroups_rpc [None req-b9509f86-2a19-49fe-82cf-8c23b9e8fca9 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:35 np0005548789.localdomain podman[331673]: 2025-12-06 10:21:35.707916946 +0000 UTC m=+0.179671283 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:21:35 np0005548789.localdomain podman[331671]: 2025-12-06 10:21:35.714004102 +0000 UTC m=+0.192209406 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 10:21:35 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:21:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:35.733 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:35 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.225 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.225 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.226 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.226 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.227 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:21:36 np0005548789.localdomain ceph-mon[298582]: pgmap v386: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 15 KiB/s wr, 42 op/s
Dec 06 10:21:36 np0005548789.localdomain ceph-mon[298582]: osdmap e182: 6 total, 6 up, 6 in
Dec 06 10:21:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6e6fd32a-770e-4afe-8d83-a1956fc630a7", "format": "json"}]: dispatch
Dec 06 10:21:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e183 e183: 6 total, 6 up, 6 in
Dec 06 10:21:36 np0005548789.localdomain podman[331792]: 
Dec 06 10:21:36 np0005548789.localdomain podman[331792]: 2025-12-06 10:21:36.610886048 +0000 UTC m=+0.106768725 container create 20c8dd0cc628914b02b73de177e93e44e95860183716d594861a309709f95dd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ceb092b-0721-47ff-8a32-871f82b9c9c0, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:21:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:21:36 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2846497516' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.644 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:21:36 np0005548789.localdomain podman[331792]: 2025-12-06 10:21:36.552751511 +0000 UTC m=+0.048634218 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:21:36 np0005548789.localdomain systemd[1]: Started libpod-conmon-20c8dd0cc628914b02b73de177e93e44e95860183716d594861a309709f95dd2.scope.
Dec 06 10:21:36 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:21:36 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f16e1d58a341217a815476c881a76cf024f00fba0caf0e395a3447aa5ff6a5ef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:21:36 np0005548789.localdomain podman[331792]: 2025-12-06 10:21:36.701031843 +0000 UTC m=+0.196914510 container init 20c8dd0cc628914b02b73de177e93e44e95860183716d594861a309709f95dd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ceb092b-0721-47ff-8a32-871f82b9c9c0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:21:36 np0005548789.localdomain podman[331792]: 2025-12-06 10:21:36.70745836 +0000 UTC m=+0.203341007 container start 20c8dd0cc628914b02b73de177e93e44e95860183716d594861a309709f95dd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ceb092b-0721-47ff-8a32-871f82b9c9c0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:36 np0005548789.localdomain dnsmasq[331812]: started, version 2.85 cachesize 150
Dec 06 10:21:36 np0005548789.localdomain dnsmasq[331812]: DNS service limited to local subnets
Dec 06 10:21:36 np0005548789.localdomain dnsmasq[331812]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:21:36 np0005548789.localdomain dnsmasq[331812]: warning: no upstream servers configured
Dec 06 10:21:36 np0005548789.localdomain dnsmasq-dhcp[331812]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:21:36 np0005548789.localdomain dnsmasq[331812]: read /var/lib/neutron/dhcp/1ceb092b-0721-47ff-8a32-871f82b9c9c0/addn_hosts - 0 addresses
Dec 06 10:21:36 np0005548789.localdomain dnsmasq-dhcp[331812]: read /var/lib/neutron/dhcp/1ceb092b-0721-47ff-8a32-871f82b9c9c0/host
Dec 06 10:21:36 np0005548789.localdomain dnsmasq-dhcp[331812]: read /var/lib/neutron/dhcp/1ceb092b-0721-47ff-8a32-871f82b9c9c0/opts
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.733 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.734 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:21:36 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:36.760 263652 INFO neutron.agent.dhcp.agent [None req-2028a43a-dd13-41c7-b6a9-90eb06345c1d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:35Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbf65e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbf6550>], id=90f7a28e-a8f2-4712-a6e0-d2bc46b5745f, ip_allocation=immediate, mac_address=fa:16:3e:09:74:62, name=tempest-PortsIpV6TestJSON-1589482234, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:33Z, description=, dns_domain=, id=1ceb092b-0721-47ff-8a32-871f82b9c9c0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-2073272237, port_security_enabled=True, project_id=a18f82f0d09644c7b6d23e2fece8be4f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35875, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2870, status=ACTIVE, subnets=['07d85612-b0dd-4672-9a31-4432908cf38c'], tags=[], tenant_id=a18f82f0d09644c7b6d23e2fece8be4f, updated_at=2025-12-06T10:21:34Z, vlan_transparent=None, network_id=1ceb092b-0721-47ff-8a32-871f82b9c9c0, port_security_enabled=True, project_id=a18f82f0d09644c7b6d23e2fece8be4f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f89459b7-5955-49a9-980d-ccf671c641e2'], standard_attr_id=2885, status=DOWN, tags=[], tenant_id=a18f82f0d09644c7b6d23e2fece8be4f, updated_at=2025-12-06T10:21:35Z on network 1ceb092b-0721-47ff-8a32-871f82b9c9c0
Dec 06 10:21:36 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:36.813 263652 INFO neutron.agent.dhcp.agent [None req-6ffbc636-7bf6-4525-bc7c-10147ebbc29b - - - - - -] DHCP configuration for ports {'9cdca5e6-0cc0-4c47-8039-0c19344c47c0'} is completed
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.887 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.888 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11163MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.889 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.889 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:21:36 np0005548789.localdomain dnsmasq[331812]: read /var/lib/neutron/dhcp/1ceb092b-0721-47ff-8a32-871f82b9c9c0/addn_hosts - 1 addresses
Dec 06 10:21:36 np0005548789.localdomain dnsmasq-dhcp[331812]: read /var/lib/neutron/dhcp/1ceb092b-0721-47ff-8a32-871f82b9c9c0/host
Dec 06 10:21:36 np0005548789.localdomain dnsmasq-dhcp[331812]: read /var/lib/neutron/dhcp/1ceb092b-0721-47ff-8a32-871f82b9c9c0/opts
Dec 06 10:21:36 np0005548789.localdomain podman[331831]: 2025-12-06 10:21:36.930363223 +0000 UTC m=+0.061436249 container kill 20c8dd0cc628914b02b73de177e93e44e95860183716d594861a309709f95dd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ceb092b-0721-47ff-8a32-871f82b9c9c0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.957 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.958 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.958 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:21:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:36.986 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:21:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:37.154 263652 INFO neutron.agent.dhcp.agent [None req-030c8905-92f4-4789-b960-81f70de49bdf - - - - - -] DHCP configuration for ports {'90f7a28e-a8f2-4712-a6e0-d2bc46b5745f'} is completed
Dec 06 10:21:37 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:21:37 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1607464631' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:37.445 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:21:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:37.452 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:21:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:37.467 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:21:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:37.470 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:21:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:37.470 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:21:37 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4689547c-b07c-49f8-ab25-594d5c576c89", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:21:37 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4689547c-b07c-49f8-ab25-594d5c576c89", "format": "json"}]: dispatch
Dec 06 10:21:37 np0005548789.localdomain ceph-mon[298582]: osdmap e183: 6 total, 6 up, 6 in
Dec 06 10:21:37 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2846497516' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:37 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1607464631' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:37 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e184 e184: 6 total, 6 up, 6 in
Dec 06 10:21:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:38 np0005548789.localdomain ceph-mon[298582]: pgmap v389: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 13 KiB/s wr, 29 op/s
Dec 06 10:21:38 np0005548789.localdomain ceph-mon[298582]: osdmap e184: 6 total, 6 up, 6 in
Dec 06 10:21:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e185 e185: 6 total, 6 up, 6 in
Dec 06 10:21:38 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:38.822 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:35Z, description=, device_id=07d96743-59c2-40bb-9e6f-79db48bad162, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa5e970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd30160>], id=90f7a28e-a8f2-4712-a6e0-d2bc46b5745f, ip_allocation=immediate, mac_address=fa:16:3e:09:74:62, name=tempest-PortsIpV6TestJSON-1589482234, network_id=1ceb092b-0721-47ff-8a32-871f82b9c9c0, port_security_enabled=True, project_id=a18f82f0d09644c7b6d23e2fece8be4f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['f89459b7-5955-49a9-980d-ccf671c641e2'], standard_attr_id=2885, status=ACTIVE, tags=[], tenant_id=a18f82f0d09644c7b6d23e2fece8be4f, updated_at=2025-12-06T10:21:37Z on network 1ceb092b-0721-47ff-8a32-871f82b9c9c0
Dec 06 10:21:38 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:21:38 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:21:38 np0005548789.localdomain podman[331891]: 2025-12-06 10:21:38.933018442 +0000 UTC m=+0.081951846 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:21:38 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:21:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:38.936 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:39 np0005548789.localdomain dnsmasq[331812]: read /var/lib/neutron/dhcp/1ceb092b-0721-47ff-8a32-871f82b9c9c0/addn_hosts - 1 addresses
Dec 06 10:21:39 np0005548789.localdomain dnsmasq-dhcp[331812]: read /var/lib/neutron/dhcp/1ceb092b-0721-47ff-8a32-871f82b9c9c0/host
Dec 06 10:21:39 np0005548789.localdomain dnsmasq-dhcp[331812]: read /var/lib/neutron/dhcp/1ceb092b-0721-47ff-8a32-871f82b9c9c0/opts
Dec 06 10:21:39 np0005548789.localdomain podman[331920]: 2025-12-06 10:21:39.02191105 +0000 UTC m=+0.049136293 container kill 20c8dd0cc628914b02b73de177e93e44e95860183716d594861a309709f95dd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ceb092b-0721-47ff-8a32-871f82b9c9c0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:21:39 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:39.330 263652 INFO neutron.agent.dhcp.agent [None req-56fe9e9a-a0a1-4cf0-ae91-907ac3f34e5e - - - - - -] DHCP configuration for ports {'90f7a28e-a8f2-4712-a6e0-d2bc46b5745f'} is completed
Dec 06 10:21:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:39.466 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:39 np0005548789.localdomain ceph-mon[298582]: osdmap e185: 6 total, 6 up, 6 in
Dec 06 10:21:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/929112309' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/929112309' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:39 np0005548789.localdomain sshd[331947]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:21:39 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:39.902 2 INFO neutron.agent.securitygroups_rpc [None req-07bddad5-b128-4210-8a04-c1adeb45eb18 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:40.144 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:40 np0005548789.localdomain dnsmasq[331812]: read /var/lib/neutron/dhcp/1ceb092b-0721-47ff-8a32-871f82b9c9c0/addn_hosts - 0 addresses
Dec 06 10:21:40 np0005548789.localdomain dnsmasq-dhcp[331812]: read /var/lib/neutron/dhcp/1ceb092b-0721-47ff-8a32-871f82b9c9c0/host
Dec 06 10:21:40 np0005548789.localdomain dnsmasq-dhcp[331812]: read /var/lib/neutron/dhcp/1ceb092b-0721-47ff-8a32-871f82b9c9c0/opts
Dec 06 10:21:40 np0005548789.localdomain podman[331966]: 2025-12-06 10:21:40.149342702 +0000 UTC m=+0.102335189 container kill 20c8dd0cc628914b02b73de177e93e44e95860183716d594861a309709f95dd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ceb092b-0721-47ff-8a32-871f82b9c9c0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:21:40 np0005548789.localdomain sshd[331947]: Received disconnect from 64.227.102.57 port 33970:11: Bye Bye [preauth]
Dec 06 10:21:40 np0005548789.localdomain sshd[331947]: Disconnected from authenticating user root 64.227.102.57 port 33970 [preauth]
Dec 06 10:21:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:40.327 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:40Z|00474|binding|INFO|Releasing lport 7a29805b-bc7f-4a1f-9655-ca6a5d1f8ed8 from this chassis (sb_readonly=0)
Dec 06 10:21:40 np0005548789.localdomain kernel: device tap7a29805b-bc left promiscuous mode
Dec 06 10:21:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:40Z|00475|binding|INFO|Setting lport 7a29805b-bc7f-4a1f-9655-ca6a5d1f8ed8 down in Southbound
Dec 06 10:21:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:40.355 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-1ceb092b-0721-47ff-8a32-871f82b9c9c0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ceb092b-0721-47ff-8a32-871f82b9c9c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a18f82f0d09644c7b6d23e2fece8be4f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc3e50f0-1643-4b6d-9347-eecb3387f433, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=7a29805b-bc7f-4a1f-9655-ca6a5d1f8ed8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:40.356 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:40.358 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 7a29805b-bc7f-4a1f-9655-ca6a5d1f8ed8 in datapath 1ceb092b-0721-47ff-8a32-871f82b9c9c0 unbound from our chassis
Dec 06 10:21:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:40.360 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1ceb092b-0721-47ff-8a32-871f82b9c9c0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:21:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:40.362 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[372e837b-2a2b-4ff4-9cc9-061ae7283fd4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:40 np0005548789.localdomain ceph-mon[298582]: pgmap v392: 177 pgs: 177 active+clean; 193 MiB data, 931 MiB used, 41 GiB / 42 GiB avail; 47 KiB/s rd, 25 KiB/s wr, 71 op/s
Dec 06 10:21:40 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6e6fd32a-770e-4afe-8d83-a1956fc630a7", "format": "json"}]: dispatch
Dec 06 10:21:40 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6e6fd32a-770e-4afe-8d83-a1956fc630a7", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:40 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4689547c-b07c-49f8-ab25-594d5c576c89", "format": "json"}]: dispatch
Dec 06 10:21:40 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4689547c-b07c-49f8-ab25-594d5c576c89", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:40 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e186 e186: 6 total, 6 up, 6 in
Dec 06 10:21:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:41.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:41.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:21:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:41.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:21:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:41.292 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:21:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:41.293 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:21:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:41.293 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:21:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:41.294 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:21:41 np0005548789.localdomain dnsmasq[331812]: exiting on receipt of SIGTERM
Dec 06 10:21:41 np0005548789.localdomain podman[332004]: 2025-12-06 10:21:41.557799817 +0000 UTC m=+0.061451320 container kill 20c8dd0cc628914b02b73de177e93e44e95860183716d594861a309709f95dd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ceb092b-0721-47ff-8a32-871f82b9c9c0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:41 np0005548789.localdomain systemd[1]: libpod-20c8dd0cc628914b02b73de177e93e44e95860183716d594861a309709f95dd2.scope: Deactivated successfully.
Dec 06 10:21:41 np0005548789.localdomain podman[332018]: 2025-12-06 10:21:41.61251502 +0000 UTC m=+0.045338337 container died 20c8dd0cc628914b02b73de177e93e44e95860183716d594861a309709f95dd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ceb092b-0721-47ff-8a32-871f82b9c9c0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 06 10:21:41 np0005548789.localdomain ceph-mon[298582]: osdmap e186: 6 total, 6 up, 6 in
Dec 06 10:21:41 np0005548789.localdomain podman[332018]: 2025-12-06 10:21:41.663823138 +0000 UTC m=+0.096646385 container cleanup 20c8dd0cc628914b02b73de177e93e44e95860183716d594861a309709f95dd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ceb092b-0721-47ff-8a32-871f82b9c9c0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:21:41 np0005548789.localdomain systemd[1]: libpod-conmon-20c8dd0cc628914b02b73de177e93e44e95860183716d594861a309709f95dd2.scope: Deactivated successfully.
Dec 06 10:21:41 np0005548789.localdomain sshd[332045]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:21:41 np0005548789.localdomain podman[332020]: 2025-12-06 10:21:41.692209466 +0000 UTC m=+0.114653996 container remove 20c8dd0cc628914b02b73de177e93e44e95860183716d594861a309709f95dd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ceb092b-0721-47ff-8a32-871f82b9c9c0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:41.718 263652 INFO neutron.agent.dhcp.agent [None req-89bfc89d-0180-4220-9f43-2f27efa1794d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:41.721 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:21:42 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:42Z|00476|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:21:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:42.060 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:42.142 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:21:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:42.174 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:21:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:42.174 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:21:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:42.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:42.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:42 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e187 e187: 6 total, 6 up, 6 in
Dec 06 10:21:42 np0005548789.localdomain systemd[1]: tmp-crun.mniz3H.mount: Deactivated successfully.
Dec 06 10:21:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-f16e1d58a341217a815476c881a76cf024f00fba0caf0e395a3447aa5ff6a5ef-merged.mount: Deactivated successfully.
Dec 06 10:21:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-20c8dd0cc628914b02b73de177e93e44e95860183716d594861a309709f95dd2-userdata-shm.mount: Deactivated successfully.
Dec 06 10:21:42 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d1ceb092b\x2d0721\x2d47ff\x2d8a32\x2d871f82b9c9c0.mount: Deactivated successfully.
Dec 06 10:21:42 np0005548789.localdomain ceph-mon[298582]: pgmap v394: 177 pgs: 177 active+clean; 193 MiB data, 931 MiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 22 KiB/s wr, 60 op/s
Dec 06 10:21:42 np0005548789.localdomain ceph-mon[298582]: osdmap e187: 6 total, 6 up, 6 in
Dec 06 10:21:42 np0005548789.localdomain sshd[331629]: error: kex_exchange_identification: read: Connection timed out
Dec 06 10:21:42 np0005548789.localdomain sshd[331629]: banner exchange: Connection from 123.160.164.187 port 35264: Connection timed out
Dec 06 10:21:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:21:43 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/518841331' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:21:43 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/518841331' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:43 np0005548789.localdomain sshd[332045]: Received disconnect from 118.219.234.233 port 36032:11: Bye Bye [preauth]
Dec 06 10:21:43 np0005548789.localdomain sshd[332045]: Disconnected from authenticating user root 118.219.234.233 port 36032 [preauth]
Dec 06 10:21:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:21:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:21:43 np0005548789.localdomain podman[332049]: 2025-12-06 10:21:43.143154088 +0000 UTC m=+0.088841466 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible)
Dec 06 10:21:43 np0005548789.localdomain podman[332049]: 2025-12-06 10:21:43.15662107 +0000 UTC m=+0.102308478 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1755695350, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, version=9.6)
Dec 06 10:21:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:43.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:43.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:43.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:21:43 np0005548789.localdomain systemd[1]: tmp-crun.8diATm.mount: Deactivated successfully.
Dec 06 10:21:43 np0005548789.localdomain podman[332050]: 2025-12-06 10:21:43.205513594 +0000 UTC m=+0.148072716 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true)
Dec 06 10:21:43 np0005548789.localdomain podman[332050]: 2025-12-06 10:21:43.214614093 +0000 UTC m=+0.157173245 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0)
Dec 06 10:21:43 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:21:43 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:21:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:43 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "snap_name": "802dfa0a-e026-43d4-b3ba-67c6242d0ddb_a3496f8c-73ba-4d3a-8401-faf81aff8654", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:43 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "snap_name": "802dfa0a-e026-43d4-b3ba-67c6242d0ddb", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/518841331' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/518841331' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:43.940 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:44 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:44.320 2 INFO neutron.agent.securitygroups_rpc [None req-e0373cd9-ce28-4174-b7a6-2d0d511546d5 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['ecf618e7-df48-4fb6-89e3-d9952de70569']
Dec 06 10:21:44 np0005548789.localdomain ceph-mon[298582]: pgmap v396: 177 pgs: 177 active+clean; 193 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 52 KiB/s wr, 140 op/s
Dec 06 10:21:44 np0005548789.localdomain ceph-mon[298582]: mgrmap e50: np0005548790.kvkfyr(active, since 10m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:21:44 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:44.870 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:44Z, description=, device_id=ce829857-8272-489e-9d8d-e074f2d58a5d, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbdfdf0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbdfac0>], id=cb29ec91-f044-43bb-99c4-9e015acf18ae, ip_allocation=immediate, mac_address=fa:16:3e:31:4a:d6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2962, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:21:44Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:21:45 np0005548789.localdomain podman[332104]: 2025-12-06 10:21:45.079409106 +0000 UTC m=+0.056620331 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:21:45 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:21:45 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:21:45 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:21:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:45.179 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:45 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:45.315 263652 INFO neutron.agent.dhcp.agent [None req-2fa99148-40fa-4aeb-809c-22e93103a036 - - - - - -] DHCP configuration for ports {'cb29ec91-f044-43bb-99c4-9e015acf18ae'} is completed
Dec 06 10:21:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:46.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:46.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:46.450 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:46 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:46.597 2 INFO neutron.agent.securitygroups_rpc [None req-f9e919cd-0c1a-4346-a33d-5b1944f06d7b a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['d5c7cc25-d8ea-40c9-b1b2-04cf074e64bb', 'ecf618e7-df48-4fb6-89e3-d9952de70569']
Dec 06 10:21:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:21:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:21:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:21:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:21:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:21:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:21:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:21:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:21:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:21:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:21:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:21:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:21:47 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:47.250 2 INFO neutron.agent.securitygroups_rpc [None req-cf0a2394-56b4-45a3-97ca-aedf09b47c4d a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['d5c7cc25-d8ea-40c9-b1b2-04cf074e64bb']
Dec 06 10:21:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:47.339 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:21:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:47.340 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:21:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:47.341 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:21:47 np0005548789.localdomain ceph-mon[298582]: pgmap v397: 177 pgs: 177 active+clean; 193 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 29 KiB/s wr, 77 op/s
Dec 06 10:21:47 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "format": "json"}]: dispatch
Dec 06 10:21:47 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1ddc97c2-406f-4edf-8fc4-61a5ba2286e5", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:47 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3433747594' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:47 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e188 e188: 6 total, 6 up, 6 in
Dec 06 10:21:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:21:47 np0005548789.localdomain podman[332124]: 2025-12-06 10:21:47.758063257 +0000 UTC m=+0.082178483 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:21:47 np0005548789.localdomain podman[332124]: 2025-12-06 10:21:47.792072488 +0000 UTC m=+0.116187694 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:21:47 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:21:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:48 np0005548789.localdomain ceph-mon[298582]: pgmap v398: 177 pgs: 177 active+clean; 193 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 46 KiB/s rd, 24 KiB/s wr, 64 op/s
Dec 06 10:21:48 np0005548789.localdomain ceph-mon[298582]: osdmap e188: 6 total, 6 up, 6 in
Dec 06 10:21:48 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/4095004814' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:48 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9d3cab8c-98e7-4693-a92b-d356598b900a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:21:48 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9d3cab8c-98e7-4693-a92b-d356598b900a", "format": "json"}]: dispatch
Dec 06 10:21:48 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:48 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3407317111' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e189 e189: 6 total, 6 up, 6 in
Dec 06 10:21:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:49.057 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:49 np0005548789.localdomain ceph-mon[298582]: osdmap e189: 6 total, 6 up, 6 in
Dec 06 10:21:49 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2519745187' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:21:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:21:49 np0005548789.localdomain podman[332143]: 2025-12-06 10:21:49.925456182 +0000 UTC m=+0.086807814 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:21:49 np0005548789.localdomain podman[332143]: 2025-12-06 10:21:49.934165218 +0000 UTC m=+0.095516800 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:21:49 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:21:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:49.968 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:50.210 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:50 np0005548789.localdomain ceph-mon[298582]: pgmap v401: 177 pgs: 177 active+clean; 194 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 47 KiB/s wr, 78 op/s
Dec 06 10:21:51 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:51.079 2 INFO neutron.agent.securitygroups_rpc [None req-9aa7129f-1e5c-4b11-befb-2e4631e91223 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['2129cf94-a39f-4e4e-ab36-0d488acfdae6']
Dec 06 10:21:52 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e190 e190: 6 total, 6 up, 6 in
Dec 06 10:21:52 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "9d3cab8c-98e7-4693-a92b-d356598b900a", "snap_name": "34ca82e6-47fe-4edf-98f0-b49a22c3b971", "format": "json"}]: dispatch
Dec 06 10:21:52 np0005548789.localdomain ceph-mon[298582]: pgmap v402: 177 pgs: 177 active+clean; 194 MiB data, 932 MiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 17 KiB/s wr, 3 op/s
Dec 06 10:21:52 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:52 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:52.768 2 INFO neutron.agent.securitygroups_rpc [None req-1911ca26-173a-428d-a937-0c8a9ec17a18 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['2129cf94-a39f-4e4e-ab36-0d488acfdae6', '398ea86c-4672-439c-a6e6-0b07306b07fb', 'fb8e4f1d-f459-47af-ae16-0205f1ef2540']
Dec 06 10:21:53 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:53.251 2 INFO neutron.agent.securitygroups_rpc [None req-e01a41a9-2c25-4070-a80a-0e758300cea8 a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['398ea86c-4672-439c-a6e6-0b07306b07fb', 'fb8e4f1d-f459-47af-ae16-0205f1ef2540']
Dec 06 10:21:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:53 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "58c822ee-e1d1-46ef-8c6f-1d173f1e10cf", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:21:53 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "58c822ee-e1d1-46ef-8c6f-1d173f1e10cf", "format": "json"}]: dispatch
Dec 06 10:21:53 np0005548789.localdomain ceph-mon[298582]: osdmap e190: 6 total, 6 up, 6 in
Dec 06 10:21:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:21:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:21:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:21:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:21:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:21:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19275 "" "Go-http-client/1.1"
Dec 06 10:21:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:54.119 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:54 np0005548789.localdomain ceph-mon[298582]: pgmap v404: 177 pgs: 177 active+clean; 194 MiB data, 937 MiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 37 KiB/s wr, 53 op/s
Dec 06 10:21:54 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:21:54 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1393965120' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:54 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:21:54 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1393965120' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:54 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:54.554 263652 INFO neutron.agent.linux.ip_lib [None req-eb07b3ed-4dbd-48c8-9b8c-33ed0e899d4c - - - - - -] Device tap72ee7f1b-13 cannot be used as it has no MAC address
Dec 06 10:21:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:54.575 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:54 np0005548789.localdomain kernel: device tap72ee7f1b-13 entered promiscuous mode
Dec 06 10:21:54 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016514.5842] manager: (tap72ee7f1b-13): new Generic device (/org/freedesktop/NetworkManager/Devices/77)
Dec 06 10:21:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:54.583 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:54 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:54Z|00477|binding|INFO|Claiming lport 72ee7f1b-132c-4087-bcf6-0dc2886ccb24 for this chassis.
Dec 06 10:21:54 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:54Z|00478|binding|INFO|72ee7f1b-132c-4087-bcf6-0dc2886ccb24: Claiming unknown
Dec 06 10:21:54 np0005548789.localdomain systemd-udevd[332175]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:21:54 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:54.602 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-23332176-d495-43f0-b960-60f576e19db9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23332176-d495-43f0-b960-60f576e19db9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e82deaff368b4feea9fec0f06459a6ca', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8461d03-d92a-4a33-8802-d634072db402, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=72ee7f1b-132c-4087-bcf6-0dc2886ccb24) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:21:54 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:54.605 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 72ee7f1b-132c-4087-bcf6-0dc2886ccb24 in datapath 23332176-d495-43f0-b960-60f576e19db9 bound to our chassis
Dec 06 10:21:54 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:54.607 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6dcc78be-11ab-40d5-a0de-d91ea8357630 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:21:54 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:54.607 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 23332176-d495-43f0-b960-60f576e19db9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:21:54 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:21:54.609 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[58cb6de8-6f04-4fa7-a6b9-6035080d04fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:21:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap72ee7f1b-13: No such device
Dec 06 10:21:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:54.616 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:54 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:54Z|00479|binding|INFO|Setting lport 72ee7f1b-132c-4087-bcf6-0dc2886ccb24 ovn-installed in OVS
Dec 06 10:21:54 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:21:54Z|00480|binding|INFO|Setting lport 72ee7f1b-132c-4087-bcf6-0dc2886ccb24 up in Southbound
Dec 06 10:21:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:54.622 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap72ee7f1b-13: No such device
Dec 06 10:21:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:54.625 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap72ee7f1b-13: No such device
Dec 06 10:21:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap72ee7f1b-13: No such device
Dec 06 10:21:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap72ee7f1b-13: No such device
Dec 06 10:21:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap72ee7f1b-13: No such device
Dec 06 10:21:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap72ee7f1b-13: No such device
Dec 06 10:21:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:54.649 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:54 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap72ee7f1b-13: No such device
Dec 06 10:21:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:54.678 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:55.244 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:55 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:21:55.376 2 INFO neutron.agent.securitygroups_rpc [None req-41969300-114c-4f41-96fa-3fc73bdf2a0b a2f6f80b9d5a42ccb727340d59efb967 a18f82f0d09644c7b6d23e2fece8be4f - - default default] Security group member updated ['f89459b7-5955-49a9-980d-ccf671c641e2']
Dec 06 10:21:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1393965120' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1393965120' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:55 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "67d6cee7-d14d-4a25-bb5e-de16ccdb07ed", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:21:55 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:21:55 np0005548789.localdomain podman[332245]: 
Dec 06 10:21:55 np0005548789.localdomain podman[332245]: 2025-12-06 10:21:55.587445409 +0000 UTC m=+0.064994577 container create 071dbc5c9b86072f2d5bda447e10edc83b5e25677868e8ab3267454575f41827 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-23332176-d495-43f0-b960-60f576e19db9, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:21:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:21:55 np0005548789.localdomain systemd[1]: Started libpod-conmon-071dbc5c9b86072f2d5bda447e10edc83b5e25677868e8ab3267454575f41827.scope.
Dec 06 10:21:55 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:21:55 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/988891ef40577becefdf38dadb1a51a6fd48e58dae4cb75c05349582a2ed8956/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:21:55 np0005548789.localdomain podman[332245]: 2025-12-06 10:21:55.558211636 +0000 UTC m=+0.035760824 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:21:55 np0005548789.localdomain podman[332245]: 2025-12-06 10:21:55.664841306 +0000 UTC m=+0.142390494 container init 071dbc5c9b86072f2d5bda447e10edc83b5e25677868e8ab3267454575f41827 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-23332176-d495-43f0-b960-60f576e19db9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:55 np0005548789.localdomain dnsmasq[332273]: started, version 2.85 cachesize 150
Dec 06 10:21:55 np0005548789.localdomain dnsmasq[332273]: DNS service limited to local subnets
Dec 06 10:21:55 np0005548789.localdomain dnsmasq[332273]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:21:55 np0005548789.localdomain dnsmasq[332273]: warning: no upstream servers configured
Dec 06 10:21:55 np0005548789.localdomain dnsmasq-dhcp[332273]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:21:55 np0005548789.localdomain dnsmasq[332273]: read /var/lib/neutron/dhcp/23332176-d495-43f0-b960-60f576e19db9/addn_hosts - 0 addresses
Dec 06 10:21:55 np0005548789.localdomain dnsmasq-dhcp[332273]: read /var/lib/neutron/dhcp/23332176-d495-43f0-b960-60f576e19db9/host
Dec 06 10:21:55 np0005548789.localdomain dnsmasq-dhcp[332273]: read /var/lib/neutron/dhcp/23332176-d495-43f0-b960-60f576e19db9/opts
Dec 06 10:21:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:55.701 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:55 np0005548789.localdomain podman[332258]: 2025-12-06 10:21:55.713444971 +0000 UTC m=+0.084276767 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 06 10:21:55 np0005548789.localdomain podman[332245]: 2025-12-06 10:21:55.738179797 +0000 UTC m=+0.215728995 container start 071dbc5c9b86072f2d5bda447e10edc83b5e25677868e8ab3267454575f41827 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-23332176-d495-43f0-b960-60f576e19db9, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:55 np0005548789.localdomain podman[332258]: 2025-12-06 10:21:55.761207741 +0000 UTC m=+0.132039577 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:55 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:21:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:55.794 263652 INFO neutron.agent.dhcp.agent [None req-b3262e7d-0aad-4820-bcd0-9f351ab7bcd0 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:54Z, description=, device_id=2b50982b-9c62-4665-9d93-c7bc12ad6e52, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbf44f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbf4190>], id=9b798f40-f638-4713-ad94-f68eb006cdb3, ip_allocation=immediate, mac_address=fa:16:3e:b6:af:bd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:51Z, description=, dns_domain=, id=23332176-d495-43f0-b960-60f576e19db9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1157192730, port_security_enabled=True, project_id=e82deaff368b4feea9fec0f06459a6ca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31598, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3002, status=ACTIVE, subnets=['02f360c2-c764-4a10-aa33-ee333a17b366'], tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:21:52Z, vlan_transparent=None, network_id=23332176-d495-43f0-b960-60f576e19db9, port_security_enabled=False, project_id=e82deaff368b4feea9fec0f06459a6ca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3014, status=DOWN, tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:21:54Z on network 23332176-d495-43f0-b960-60f576e19db9
Dec 06 10:21:55 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:21:55 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/880620488' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:55 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:21:55 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/880620488' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:55.882 263652 INFO neutron.agent.dhcp.agent [None req-09210cce-daf8-4170-849d-ab83d4d5b6c9 - - - - - -] DHCP configuration for ports {'94818738-f83b-4c5a-b7b7-2e0ae5a8c2de'} is completed
Dec 06 10:21:56 np0005548789.localdomain dnsmasq[332273]: read /var/lib/neutron/dhcp/23332176-d495-43f0-b960-60f576e19db9/addn_hosts - 1 addresses
Dec 06 10:21:56 np0005548789.localdomain dnsmasq-dhcp[332273]: read /var/lib/neutron/dhcp/23332176-d495-43f0-b960-60f576e19db9/host
Dec 06 10:21:56 np0005548789.localdomain dnsmasq-dhcp[332273]: read /var/lib/neutron/dhcp/23332176-d495-43f0-b960-60f576e19db9/opts
Dec 06 10:21:56 np0005548789.localdomain podman[332306]: 2025-12-06 10:21:56.050637149 +0000 UTC m=+0.059066466 container kill 071dbc5c9b86072f2d5bda447e10edc83b5e25677868e8ab3267454575f41827 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-23332176-d495-43f0-b960-60f576e19db9, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:21:56 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:56.229 263652 INFO neutron.agent.dhcp.agent [None req-244435d4-e5a4-45ab-b7ed-171608c3b4e2 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:21:54Z, description=, device_id=2b50982b-9c62-4665-9d93-c7bc12ad6e52, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa32880>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa32550>], id=9b798f40-f638-4713-ad94-f68eb006cdb3, ip_allocation=immediate, mac_address=fa:16:3e:b6:af:bd, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:51Z, description=, dns_domain=, id=23332176-d495-43f0-b960-60f576e19db9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1157192730, port_security_enabled=True, project_id=e82deaff368b4feea9fec0f06459a6ca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31598, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3002, status=ACTIVE, subnets=['02f360c2-c764-4a10-aa33-ee333a17b366'], tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:21:52Z, vlan_transparent=None, network_id=23332176-d495-43f0-b960-60f576e19db9, port_security_enabled=False, project_id=e82deaff368b4feea9fec0f06459a6ca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3014, status=DOWN, tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:21:54Z on network 23332176-d495-43f0-b960-60f576e19db9
Dec 06 10:21:56 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:56.393 263652 INFO neutron.agent.dhcp.agent [None req-fcb1b711-ad49-4536-bf4a-87c0b06c49fb - - - - - -] DHCP configuration for ports {'9b798f40-f638-4713-ad94-f68eb006cdb3'} is completed
Dec 06 10:21:56 np0005548789.localdomain dnsmasq[332273]: read /var/lib/neutron/dhcp/23332176-d495-43f0-b960-60f576e19db9/addn_hosts - 1 addresses
Dec 06 10:21:56 np0005548789.localdomain dnsmasq-dhcp[332273]: read /var/lib/neutron/dhcp/23332176-d495-43f0-b960-60f576e19db9/host
Dec 06 10:21:56 np0005548789.localdomain podman[332344]: 2025-12-06 10:21:56.582123205 +0000 UTC m=+0.054643210 container kill 071dbc5c9b86072f2d5bda447e10edc83b5e25677868e8ab3267454575f41827 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-23332176-d495-43f0-b960-60f576e19db9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:21:56 np0005548789.localdomain dnsmasq-dhcp[332273]: read /var/lib/neutron/dhcp/23332176-d495-43f0-b960-60f576e19db9/opts
Dec 06 10:21:56 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "67d6cee7-d14d-4a25-bb5e-de16ccdb07ed", "format": "json"}]: dispatch
Dec 06 10:21:56 np0005548789.localdomain ceph-mon[298582]: pgmap v405: 177 pgs: 177 active+clean; 194 MiB data, 937 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 28 KiB/s wr, 40 op/s
Dec 06 10:21:56 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "58c822ee-e1d1-46ef-8c6f-1d173f1e10cf", "snap_name": "80ddbc86-47b7-40f0-8588-929d2cc3feb6", "format": "json"}]: dispatch
Dec 06 10:21:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/880620488' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:21:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/880620488' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:21:56 np0005548789.localdomain systemd[1]: tmp-crun.x5AqMl.mount: Deactivated successfully.
Dec 06 10:21:56 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:21:56.892 263652 INFO neutron.agent.dhcp.agent [None req-52fd6933-6149-4c44-aba7-ea2f5b1b0450 - - - - - -] DHCP configuration for ports {'9b798f40-f638-4713-ad94-f68eb006cdb3'} is completed
Dec 06 10:21:57 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e191 e191: 6 total, 6 up, 6 in
Dec 06 10:21:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e191 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:58 np0005548789.localdomain ceph-mon[298582]: pgmap v406: 177 pgs: 177 active+clean; 194 MiB data, 937 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 10 KiB/s wr, 33 op/s
Dec 06 10:21:58 np0005548789.localdomain ceph-mon[298582]: osdmap e191: 6 total, 6 up, 6 in
Dec 06 10:21:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:59.125 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:21:59 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "67d6cee7-d14d-4a25-bb5e-de16ccdb07ed", "format": "json"}]: dispatch
Dec 06 10:21:59 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "67d6cee7-d14d-4a25-bb5e-de16ccdb07ed", "force": true, "format": "json"}]: dispatch
Dec 06 10:21:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:21:59.697 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:00.246 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:00 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "58c822ee-e1d1-46ef-8c6f-1d173f1e10cf", "snap_name": "80ddbc86-47b7-40f0-8588-929d2cc3feb6_17d9a7bb-1ff4-4445-bd23-f0d930d9f87b", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:00 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "58c822ee-e1d1-46ef-8c6f-1d173f1e10cf", "snap_name": "80ddbc86-47b7-40f0-8588-929d2cc3feb6", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:00 np0005548789.localdomain ceph-mon[298582]: pgmap v408: 177 pgs: 177 active+clean; 282 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 73 KiB/s rd, 11 MiB/s wr, 105 op/s
Dec 06 10:22:00 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0512b167-8d53-4c3b-a204-1d35afbfc3ad", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:00 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0512b167-8d53-4c3b-a204-1d35afbfc3ad", "format": "json"}]: dispatch
Dec 06 10:22:00 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:00 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e192 e192: 6 total, 6 up, 6 in
Dec 06 10:22:01 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Dec 06 10:22:01 np0005548789.localdomain sudo[332366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:22:01 np0005548789.localdomain sudo[332366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:22:01 np0005548789.localdomain sudo[332366]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:01 np0005548789.localdomain sudo[332384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:22:01 np0005548789.localdomain sudo[332384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:22:01 np0005548789.localdomain ceph-mon[298582]: osdmap e192: 6 total, 6 up, 6 in
Dec 06 10:22:01 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:01.945 263652 INFO neutron.agent.linux.ip_lib [None req-258ecf09-4f1d-4e9a-94a9-a7b71bd1d829 - - - - - -] Device tap27deb0c2-77 cannot be used as it has no MAC address
Dec 06 10:22:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:01.972 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:01 np0005548789.localdomain kernel: device tap27deb0c2-77 entered promiscuous mode
Dec 06 10:22:01 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016521.9815] manager: (tap27deb0c2-77): new Generic device (/org/freedesktop/NetworkManager/Devices/78)
Dec 06 10:22:01 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:01Z|00481|binding|INFO|Claiming lport 27deb0c2-77bb-47c5-bae8-ebe8fdf091d3 for this chassis.
Dec 06 10:22:01 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:01Z|00482|binding|INFO|27deb0c2-77bb-47c5-bae8-ebe8fdf091d3: Claiming unknown
Dec 06 10:22:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:01.986 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:01 np0005548789.localdomain systemd-udevd[332439]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:22:02 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap27deb0c2-77: No such device
Dec 06 10:22:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:02.018 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:02 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:02Z|00483|binding|INFO|Setting lport 27deb0c2-77bb-47c5-bae8-ebe8fdf091d3 ovn-installed in OVS
Dec 06 10:22:02 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap27deb0c2-77: No such device
Dec 06 10:22:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:02.021 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:02.023 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:02 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap27deb0c2-77: No such device
Dec 06 10:22:02 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap27deb0c2-77: No such device
Dec 06 10:22:02 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:02Z|00484|binding|INFO|Setting lport 27deb0c2-77bb-47c5-bae8-ebe8fdf091d3 up in Southbound
Dec 06 10:22:02 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:02.034 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-b212e2be-4a1a-42c0-a3cd-d5a930c0c30a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b212e2be-4a1a-42c0-a3cd-d5a930c0c30a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4daafaf0e264da6a728bdd60c5d6377', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=788a81a8-1f71-42fc-a0bb-d48b0228b6cf, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=27deb0c2-77bb-47c5-bae8-ebe8fdf091d3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:02 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:02.036 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 27deb0c2-77bb-47c5-bae8-ebe8fdf091d3 in datapath b212e2be-4a1a-42c0-a3cd-d5a930c0c30a bound to our chassis
Dec 06 10:22:02 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:02.038 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b212e2be-4a1a-42c0-a3cd-d5a930c0c30a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:22:02 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:02.039 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[409e442c-c054-40eb-bd60-1f4f65121ede]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:22:02 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap27deb0c2-77: No such device
Dec 06 10:22:02 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap27deb0c2-77: No such device
Dec 06 10:22:02 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap27deb0c2-77: No such device
Dec 06 10:22:02 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap27deb0c2-77: No such device
Dec 06 10:22:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:02.065 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:02.098 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.379044) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016522379103, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 2406, "num_deletes": 261, "total_data_size": 4407873, "memory_usage": 4468232, "flush_reason": "Manual Compaction"}
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016522414267, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 2373234, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27379, "largest_seqno": 29779, "table_properties": {"data_size": 2365392, "index_size": 4347, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 21482, "raw_average_key_size": 22, "raw_value_size": 2347822, "raw_average_value_size": 2443, "num_data_blocks": 187, "num_entries": 961, "num_filter_entries": 961, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016395, "oldest_key_time": 1765016395, "file_creation_time": 1765016522, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 35276 microseconds, and 4843 cpu microseconds.
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.414319) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 2373234 bytes OK
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.414348) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.417874) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.417910) EVENT_LOG_v1 {"time_micros": 1765016522417900, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.417935) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 4396686, prev total WAL file size 4397435, number of live WAL files 2.
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.418964) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303034' seq:72057594037927935, type:22 .. '6D6772737461740034323536' seq:0, type:0; will stop at (end)
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(2317KB)], [45(17MB)]
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016522418998, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 21195557, "oldest_snapshot_seqno": -1}
Dec 06 10:22:02 np0005548789.localdomain systemd[1]: tmp-crun.ymtRvn.mount: Deactivated successfully.
Dec 06 10:22:02 np0005548789.localdomain podman[332521]: 2025-12-06 10:22:02.456983669 +0000 UTC m=+0.126142757 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, vendor=Red Hat, Inc., ceph=True, version=7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.41.4, release=1763362218, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public)
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 13318 keys, 19558016 bytes, temperature: kUnknown
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016522498531, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 19558016, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19482747, "index_size": 40864, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33349, "raw_key_size": 355670, "raw_average_key_size": 26, "raw_value_size": 19257210, "raw_average_value_size": 1445, "num_data_blocks": 1544, "num_entries": 13318, "num_filter_entries": 13318, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765016522, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.498842) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 19558016 bytes
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.501139) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 266.2 rd, 245.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 18.0 +0.0 blob) out(18.7 +0.0 blob), read-write-amplify(17.2) write-amplify(8.2) OK, records in: 13796, records dropped: 478 output_compression: NoCompression
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.501164) EVENT_LOG_v1 {"time_micros": 1765016522501153, "job": 26, "event": "compaction_finished", "compaction_time_micros": 79627, "compaction_time_cpu_micros": 31108, "output_level": 6, "num_output_files": 1, "total_output_size": 19558016, "num_input_records": 13796, "num_output_records": 13318, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016522501488, "job": 26, "event": "table_file_deletion", "file_number": 47}
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016522503205, "job": 26, "event": "table_file_deletion", "file_number": 45}
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.418852) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.503281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.503292) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.503297) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.503301) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:02.503306) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:02 np0005548789.localdomain podman[332521]: 2025-12-06 10:22:02.605458137 +0000 UTC m=+0.274617195 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, release=1763362218, distribution-scope=public, build-date=2025-11-26T19:44:28Z)
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: pgmap v410: 177 pgs: 177 active+clean; 282 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 47 KiB/s rd, 11 MiB/s wr, 68 op/s
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1c1b8ab9-c6b5-4cb1-94fb-5001df83d93e", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1c1b8ab9-c6b5-4cb1-94fb-5001df83d93e", "format": "json"}]: dispatch
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "58c822ee-e1d1-46ef-8c6f-1d173f1e10cf", "format": "json"}]: dispatch
Dec 06 10:22:02 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "58c822ee-e1d1-46ef-8c6f-1d173f1e10cf", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:02 np0005548789.localdomain podman[332621]: 
Dec 06 10:22:02 np0005548789.localdomain podman[332621]: 2025-12-06 10:22:02.928532664 +0000 UTC m=+0.087192726 container create 98149fe39736f2b06b84af45fc052baa53e53a5b040ce189b48bcd27e51b9b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b212e2be-4a1a-42c0-a3cd-d5a930c0c30a, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:22:02 np0005548789.localdomain systemd[1]: Started libpod-conmon-98149fe39736f2b06b84af45fc052baa53e53a5b040ce189b48bcd27e51b9b5b.scope.
Dec 06 10:22:02 np0005548789.localdomain podman[332621]: 2025-12-06 10:22:02.890208142 +0000 UTC m=+0.048868184 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:22:02 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:22:03 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf868e69c05e71759c1e3d850cc649a6050ae655aa7438d5cc3256a2e4a2d10a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:22:03 np0005548789.localdomain podman[332621]: 2025-12-06 10:22:03.015899854 +0000 UTC m=+0.174559936 container init 98149fe39736f2b06b84af45fc052baa53e53a5b040ce189b48bcd27e51b9b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b212e2be-4a1a-42c0-a3cd-d5a930c0c30a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:22:03 np0005548789.localdomain dnsmasq[332659]: started, version 2.85 cachesize 150
Dec 06 10:22:03 np0005548789.localdomain dnsmasq[332659]: DNS service limited to local subnets
Dec 06 10:22:03 np0005548789.localdomain dnsmasq[332659]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:22:03 np0005548789.localdomain dnsmasq[332659]: warning: no upstream servers configured
Dec 06 10:22:03 np0005548789.localdomain dnsmasq-dhcp[332659]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 06 10:22:03 np0005548789.localdomain podman[332621]: 2025-12-06 10:22:03.044583341 +0000 UTC m=+0.203243403 container start 98149fe39736f2b06b84af45fc052baa53e53a5b040ce189b48bcd27e51b9b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b212e2be-4a1a-42c0-a3cd-d5a930c0c30a, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:22:03 np0005548789.localdomain dnsmasq[332659]: read /var/lib/neutron/dhcp/b212e2be-4a1a-42c0-a3cd-d5a930c0c30a/addn_hosts - 0 addresses
Dec 06 10:22:03 np0005548789.localdomain dnsmasq-dhcp[332659]: read /var/lib/neutron/dhcp/b212e2be-4a1a-42c0-a3cd-d5a930c0c30a/host
Dec 06 10:22:03 np0005548789.localdomain dnsmasq-dhcp[332659]: read /var/lib/neutron/dhcp/b212e2be-4a1a-42c0-a3cd-d5a930c0c30a/opts
Dec 06 10:22:03 np0005548789.localdomain sudo[332384]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:03 np0005548789.localdomain sudo[332693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:22:03 np0005548789.localdomain sudo[332693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:22:03 np0005548789.localdomain sudo[332693]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:03 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:03.463 263652 INFO neutron.agent.dhcp.agent [None req-4949f628-6e62-4804-8a88-e673699185ca - - - - - -] DHCP configuration for ports {'9cdb81a3-442c-4dfd-a5c8-275c2f6a56c6'} is completed
Dec 06 10:22:03 np0005548789.localdomain sudo[332711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:22:03 np0005548789.localdomain sudo[332711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:22:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:04 np0005548789.localdomain ceph-mon[298582]: pgmap v411: 177 pgs: 177 active+clean; 394 MiB data, 1.5 GiB used, 40 GiB / 42 GiB avail; 59 KiB/s rd, 25 MiB/s wr, 92 op/s
Dec 06 10:22:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:04.157 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:04 np0005548789.localdomain sudo[332711]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:04.312 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:04 np0005548789.localdomain sudo[332760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:22:04 np0005548789.localdomain sudo[332760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:22:04 np0005548789.localdomain sudo[332760]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0512b167-8d53-4c3b-a204-1d35afbfc3ad", "format": "json"}]: dispatch
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0512b167-8d53-4c3b-a204-1d35afbfc3ad", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:22:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:05.270 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:22:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:22:05 np0005548789.localdomain podman[332779]: 2025-12-06 10:22:05.936154151 +0000 UTC m=+0.080906594 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:22:05 np0005548789.localdomain podman[332779]: 2025-12-06 10:22:05.942210227 +0000 UTC m=+0.086962670 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:22:05 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:22:05 np0005548789.localdomain systemd[1]: tmp-crun.NmVjtt.mount: Deactivated successfully.
Dec 06 10:22:05 np0005548789.localdomain podman[332778]: 2025-12-06 10:22:05.997988151 +0000 UTC m=+0.143061474 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec 06 10:22:06 np0005548789.localdomain podman[332778]: 2025-12-06 10:22:06.006013546 +0000 UTC m=+0.151086889 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:22:06 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:22:06 np0005548789.localdomain ceph-mon[298582]: pgmap v412: 177 pgs: 177 active+clean; 394 MiB data, 1.5 GiB used, 40 GiB / 42 GiB avail; 59 KiB/s rd, 25 MiB/s wr, 92 op/s
Dec 06 10:22:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1c1b8ab9-c6b5-4cb1-94fb-5001df83d93e", "format": "json"}]: dispatch
Dec 06 10:22:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1c1b8ab9-c6b5-4cb1-94fb-5001df83d93e", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4a25353d-60c6-4e9f-96d0-37cd3eb987b5", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:06 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:06.346 263652 INFO neutron.agent.linux.ip_lib [None req-5a0f97f5-1431-492f-9642-7c750c8f561f - - - - - -] Device tap4430879c-2d cannot be used as it has no MAC address
Dec 06 10:22:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:06.404 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:06 np0005548789.localdomain kernel: device tap4430879c-2d entered promiscuous mode
Dec 06 10:22:06 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016526.4135] manager: (tap4430879c-2d): new Generic device (/org/freedesktop/NetworkManager/Devices/79)
Dec 06 10:22:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:06.416 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:06 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:06Z|00485|binding|INFO|Claiming lport 4430879c-2d53-4fe9-afc4-ecc4737ece3d for this chassis.
Dec 06 10:22:06 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:06Z|00486|binding|INFO|4430879c-2d53-4fe9-afc4-ecc4737ece3d: Claiming unknown
Dec 06 10:22:06 np0005548789.localdomain systemd-udevd[332828]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:22:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4430879c-2d: No such device
Dec 06 10:22:06 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:06Z|00487|binding|INFO|Setting lport 4430879c-2d53-4fe9-afc4-ecc4737ece3d ovn-installed in OVS
Dec 06 10:22:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4430879c-2d: No such device
Dec 06 10:22:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:06.456 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4430879c-2d: No such device
Dec 06 10:22:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4430879c-2d: No such device
Dec 06 10:22:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4430879c-2d: No such device
Dec 06 10:22:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4430879c-2d: No such device
Dec 06 10:22:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4430879c-2d: No such device
Dec 06 10:22:06 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap4430879c-2d: No such device
Dec 06 10:22:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:06.492 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-0f63818b-46da-4610-917f-48a4c73bfa86', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f63818b-46da-4610-917f-48a4c73bfa86', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e82deaff368b4feea9fec0f06459a6ca', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c4628e25-25c8-4fdc-a3c4-cda346229624, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=4430879c-2d53-4fe9-afc4-ecc4737ece3d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:06 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:06Z|00488|binding|INFO|Setting lport 4430879c-2d53-4fe9-afc4-ecc4737ece3d up in Southbound
Dec 06 10:22:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:06.493 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 4430879c-2d53-4fe9-afc4-ecc4737ece3d in datapath 0f63818b-46da-4610-917f-48a4c73bfa86 bound to our chassis
Dec 06 10:22:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:06.495 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0f63818b-46da-4610-917f-48a4c73bfa86 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:22:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:06.496 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:06 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:06.496 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[c75aec0d-369b-4e52-b649-e5a87afb6df1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:22:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:06.523 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4a25353d-60c6-4e9f-96d0-37cd3eb987b5", "format": "json"}]: dispatch
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e193 e193: 6 total, 6 up, 6 in
Dec 06 10:22:07 np0005548789.localdomain podman[332899]: 
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.406521) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016527406604, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 447, "num_deletes": 253, "total_data_size": 409840, "memory_usage": 419176, "flush_reason": "Manual Compaction"}
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Dec 06 10:22:07 np0005548789.localdomain podman[332899]: 2025-12-06 10:22:07.407530308 +0000 UTC m=+0.054399314 container create 294ed2ade3cdaf5dc40b3998609fd207e11e78054502a44b21052a2751a92747 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f63818b-46da-4610-917f-48a4c73bfa86, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016527411400, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 270596, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29784, "largest_seqno": 30226, "table_properties": {"data_size": 267814, "index_size": 765, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 6684, "raw_average_key_size": 18, "raw_value_size": 261976, "raw_average_value_size": 727, "num_data_blocks": 29, "num_entries": 360, "num_filter_entries": 360, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016522, "oldest_key_time": 1765016522, "file_creation_time": 1765016527, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 4927 microseconds, and 2009 cpu microseconds.
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.411453) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 270596 bytes OK
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.411483) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.416572) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.416598) EVENT_LOG_v1 {"time_micros": 1765016527416590, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.416623) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 406914, prev total WAL file size 406914, number of live WAL files 2.
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.417498) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031353336' seq:72057594037927935, type:22 .. '6B760031373930' seq:0, type:0; will stop at (end)
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(264KB)], [48(18MB)]
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016527417555, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 19828612, "oldest_snapshot_seqno": -1}
Dec 06 10:22:07 np0005548789.localdomain systemd[1]: Started libpod-conmon-294ed2ade3cdaf5dc40b3998609fd207e11e78054502a44b21052a2751a92747.scope.
Dec 06 10:22:07 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:22:07 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53d7169593b190df1c1e48dc9de962f6b6ad6b884e0167968ce6145110e64e36/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:22:07 np0005548789.localdomain podman[332899]: 2025-12-06 10:22:07.379478861 +0000 UTC m=+0.026347867 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:22:07 np0005548789.localdomain podman[332899]: 2025-12-06 10:22:07.508887957 +0000 UTC m=+0.155756973 container init 294ed2ade3cdaf5dc40b3998609fd207e11e78054502a44b21052a2751a92747 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f63818b-46da-4610-917f-48a4c73bfa86, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 13146 keys, 18733015 bytes, temperature: kUnknown
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016527510078, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 18733015, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18659953, "index_size": 39105, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32901, "raw_key_size": 353794, "raw_average_key_size": 26, "raw_value_size": 18438154, "raw_average_value_size": 1402, "num_data_blocks": 1451, "num_entries": 13146, "num_filter_entries": 13146, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765016527, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.510573) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 18733015 bytes
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.513793) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 213.7 rd, 201.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 18.7 +0.0 blob) out(17.9 +0.0 blob), read-write-amplify(142.5) write-amplify(69.2) OK, records in: 13678, records dropped: 532 output_compression: NoCompression
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.513824) EVENT_LOG_v1 {"time_micros": 1765016527513809, "job": 28, "event": "compaction_finished", "compaction_time_micros": 92777, "compaction_time_cpu_micros": 47590, "output_level": 6, "num_output_files": 1, "total_output_size": 18733015, "num_input_records": 13678, "num_output_records": 13146, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016527514221, "job": 28, "event": "table_file_deletion", "file_number": 50}
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016527516508, "job": 28, "event": "table_file_deletion", "file_number": 48}
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.417214) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.516593) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.516599) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.516601) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.516602) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:07 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:07.516604) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:07 np0005548789.localdomain podman[332899]: 2025-12-06 10:22:07.519633945 +0000 UTC m=+0.166502991 container start 294ed2ade3cdaf5dc40b3998609fd207e11e78054502a44b21052a2751a92747 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f63818b-46da-4610-917f-48a4c73bfa86, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:22:07 np0005548789.localdomain dnsmasq[332917]: started, version 2.85 cachesize 150
Dec 06 10:22:07 np0005548789.localdomain dnsmasq[332917]: DNS service limited to local subnets
Dec 06 10:22:07 np0005548789.localdomain dnsmasq[332917]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:22:07 np0005548789.localdomain dnsmasq[332917]: warning: no upstream servers configured
Dec 06 10:22:07 np0005548789.localdomain dnsmasq-dhcp[332917]: DHCP, static leases only on 10.103.0.0, lease time 1d
Dec 06 10:22:07 np0005548789.localdomain dnsmasq[332917]: read /var/lib/neutron/dhcp/0f63818b-46da-4610-917f-48a4c73bfa86/addn_hosts - 0 addresses
Dec 06 10:22:07 np0005548789.localdomain dnsmasq-dhcp[332917]: read /var/lib/neutron/dhcp/0f63818b-46da-4610-917f-48a4c73bfa86/host
Dec 06 10:22:07 np0005548789.localdomain dnsmasq-dhcp[332917]: read /var/lib/neutron/dhcp/0f63818b-46da-4610-917f-48a4c73bfa86/opts
Dec 06 10:22:07 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:07.684 263652 INFO neutron.agent.dhcp.agent [None req-c1f47ad4-f6b2-4230-a523-c11a694e1c6c - - - - - -] DHCP configuration for ports {'891ff69b-32bd-4c75-bfdb-8e56dbfa03a3'} is completed
Dec 06 10:22:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:08 np0005548789.localdomain ceph-mon[298582]: pgmap v413: 177 pgs: 177 active+clean; 394 MiB data, 1.5 GiB used, 40 GiB / 42 GiB avail; 48 KiB/s rd, 20 MiB/s wr, 74 op/s
Dec 06 10:22:08 np0005548789.localdomain ceph-mon[298582]: osdmap e193: 6 total, 6 up, 6 in
Dec 06 10:22:08 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:08.940 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:08Z, description=, device_id=2b50982b-9c62-4665-9d93-c7bc12ad6e52, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc661f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc66190>], id=705ae66d-542f-4ec2-ac8f-4e6b6578ea8b, ip_allocation=immediate, mac_address=fa:16:3e:e4:07:d4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:22:04Z, description=, dns_domain=, id=0f63818b-46da-4610-917f-48a4c73bfa86, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1667240543, port_security_enabled=True, project_id=e82deaff368b4feea9fec0f06459a6ca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16595, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3081, status=ACTIVE, subnets=['b8628baa-738e-4823-9d4e-66c46fc00679'], tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:22:05Z, vlan_transparent=None, network_id=0f63818b-46da-4610-917f-48a4c73bfa86, port_security_enabled=False, project_id=e82deaff368b4feea9fec0f06459a6ca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3111, status=DOWN, tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:22:08Z on network 0f63818b-46da-4610-917f-48a4c73bfa86
Dec 06 10:22:08 np0005548789.localdomain sshd[332918]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:22:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:09.210 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:09 np0005548789.localdomain systemd[1]: tmp-crun.bue4S9.mount: Deactivated successfully.
Dec 06 10:22:09 np0005548789.localdomain dnsmasq[332917]: read /var/lib/neutron/dhcp/0f63818b-46da-4610-917f-48a4c73bfa86/addn_hosts - 1 addresses
Dec 06 10:22:09 np0005548789.localdomain podman[332935]: 2025-12-06 10:22:09.230287956 +0000 UTC m=+0.111863710 container kill 294ed2ade3cdaf5dc40b3998609fd207e11e78054502a44b21052a2751a92747 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f63818b-46da-4610-917f-48a4c73bfa86, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:22:09 np0005548789.localdomain dnsmasq-dhcp[332917]: read /var/lib/neutron/dhcp/0f63818b-46da-4610-917f-48a4c73bfa86/host
Dec 06 10:22:09 np0005548789.localdomain dnsmasq-dhcp[332917]: read /var/lib/neutron/dhcp/0f63818b-46da-4610-917f-48a4c73bfa86/opts
Dec 06 10:22:09 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:09.305 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:09Z, description=, device_id=d826eb4e-7fd5-4cdb-9ce2-8323b958c600, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb47b50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb474f0>], id=1292f95c-a471-4d24-82b6-dc839c334a0e, ip_allocation=immediate, mac_address=fa:16:3e:f4:55:31, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:59Z, description=, dns_domain=, id=b212e2be-4a1a-42c0-a3cd-d5a930c0c30a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-1471924923, port_security_enabled=True, project_id=b4daafaf0e264da6a728bdd60c5d6377, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56487, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3055, status=ACTIVE, subnets=['d1ff6954-2d67-457c-b6ca-44990d6a79f2'], tags=[], tenant_id=b4daafaf0e264da6a728bdd60c5d6377, updated_at=2025-12-06T10:22:00Z, vlan_transparent=None, network_id=b212e2be-4a1a-42c0-a3cd-d5a930c0c30a, port_security_enabled=False, project_id=b4daafaf0e264da6a728bdd60c5d6377, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3112, status=DOWN, tags=[], tenant_id=b4daafaf0e264da6a728bdd60c5d6377, updated_at=2025-12-06T10:22:09Z on network b212e2be-4a1a-42c0-a3cd-d5a930c0c30a
Dec 06 10:22:09 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:09 np0005548789.localdomain dnsmasq[332659]: read /var/lib/neutron/dhcp/b212e2be-4a1a-42c0-a3cd-d5a930c0c30a/addn_hosts - 1 addresses
Dec 06 10:22:09 np0005548789.localdomain dnsmasq-dhcp[332659]: read /var/lib/neutron/dhcp/b212e2be-4a1a-42c0-a3cd-d5a930c0c30a/host
Dec 06 10:22:09 np0005548789.localdomain podman[332974]: 2025-12-06 10:22:09.480215447 +0000 UTC m=+0.047331798 container kill 98149fe39736f2b06b84af45fc052baa53e53a5b040ce189b48bcd27e51b9b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b212e2be-4a1a-42c0-a3cd-d5a930c0c30a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:22:09 np0005548789.localdomain dnsmasq-dhcp[332659]: read /var/lib/neutron/dhcp/b212e2be-4a1a-42c0-a3cd-d5a930c0c30a/opts
Dec 06 10:22:09 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:09.550 263652 INFO neutron.agent.dhcp.agent [None req-09e42bbe-5a3f-47a8-8c31-0b8c43871525 - - - - - -] DHCP configuration for ports {'705ae66d-542f-4ec2-ac8f-4e6b6578ea8b'} is completed
Dec 06 10:22:09 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:09.815 263652 INFO neutron.agent.dhcp.agent [None req-e4152a86-7c49-4f40-b4c4-9a9caddbab89 - - - - - -] DHCP configuration for ports {'1292f95c-a471-4d24-82b6-dc839c334a0e'} is completed
Dec 06 10:22:09 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:09.925 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:08Z, description=, device_id=2b50982b-9c62-4665-9d93-c7bc12ad6e52, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb42ca0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbdccd0>], id=705ae66d-542f-4ec2-ac8f-4e6b6578ea8b, ip_allocation=immediate, mac_address=fa:16:3e:e4:07:d4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:22:04Z, description=, dns_domain=, id=0f63818b-46da-4610-917f-48a4c73bfa86, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1667240543, port_security_enabled=True, project_id=e82deaff368b4feea9fec0f06459a6ca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16595, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3081, status=ACTIVE, subnets=['b8628baa-738e-4823-9d4e-66c46fc00679'], tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:22:05Z, vlan_transparent=None, network_id=0f63818b-46da-4610-917f-48a4c73bfa86, port_security_enabled=False, project_id=e82deaff368b4feea9fec0f06459a6ca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3111, status=DOWN, tags=[], tenant_id=e82deaff368b4feea9fec0f06459a6ca, updated_at=2025-12-06T10:22:08Z on network 0f63818b-46da-4610-917f-48a4c73bfa86
Dec 06 10:22:09 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:09.978 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:09Z, description=, device_id=d826eb4e-7fd5-4cdb-9ce2-8323b958c600, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9d4160>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fbec1f0>], id=1292f95c-a471-4d24-82b6-dc839c334a0e, ip_allocation=immediate, mac_address=fa:16:3e:f4:55:31, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:21:59Z, description=, dns_domain=, id=b212e2be-4a1a-42c0-a3cd-d5a930c0c30a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-1471924923, port_security_enabled=True, project_id=b4daafaf0e264da6a728bdd60c5d6377, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56487, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3055, status=ACTIVE, subnets=['d1ff6954-2d67-457c-b6ca-44990d6a79f2'], tags=[], tenant_id=b4daafaf0e264da6a728bdd60c5d6377, updated_at=2025-12-06T10:22:00Z, vlan_transparent=None, network_id=b212e2be-4a1a-42c0-a3cd-d5a930c0c30a, port_security_enabled=False, project_id=b4daafaf0e264da6a728bdd60c5d6377, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3112, status=DOWN, tags=[], tenant_id=b4daafaf0e264da6a728bdd60c5d6377, updated_at=2025-12-06T10:22:09Z on network b212e2be-4a1a-42c0-a3cd-d5a930c0c30a
Dec 06 10:22:10 np0005548789.localdomain dnsmasq[332917]: read /var/lib/neutron/dhcp/0f63818b-46da-4610-917f-48a4c73bfa86/addn_hosts - 1 addresses
Dec 06 10:22:10 np0005548789.localdomain dnsmasq-dhcp[332917]: read /var/lib/neutron/dhcp/0f63818b-46da-4610-917f-48a4c73bfa86/host
Dec 06 10:22:10 np0005548789.localdomain dnsmasq-dhcp[332917]: read /var/lib/neutron/dhcp/0f63818b-46da-4610-917f-48a4c73bfa86/opts
Dec 06 10:22:10 np0005548789.localdomain podman[333028]: 2025-12-06 10:22:10.163250025 +0000 UTC m=+0.074723514 container kill 294ed2ade3cdaf5dc40b3998609fd207e11e78054502a44b21052a2751a92747 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f63818b-46da-4610-917f-48a4c73bfa86, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:22:10 np0005548789.localdomain dnsmasq[332659]: read /var/lib/neutron/dhcp/b212e2be-4a1a-42c0-a3cd-d5a930c0c30a/addn_hosts - 1 addresses
Dec 06 10:22:10 np0005548789.localdomain dnsmasq-dhcp[332659]: read /var/lib/neutron/dhcp/b212e2be-4a1a-42c0-a3cd-d5a930c0c30a/host
Dec 06 10:22:10 np0005548789.localdomain dnsmasq-dhcp[332659]: read /var/lib/neutron/dhcp/b212e2be-4a1a-42c0-a3cd-d5a930c0c30a/opts
Dec 06 10:22:10 np0005548789.localdomain podman[333035]: 2025-12-06 10:22:10.196844972 +0000 UTC m=+0.084780142 container kill 98149fe39736f2b06b84af45fc052baa53e53a5b040ce189b48bcd27e51b9b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b212e2be-4a1a-42c0-a3cd-d5a930c0c30a, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:22:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:10.307 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:10 np0005548789.localdomain sshd[332918]: Received disconnect from 14.194.101.210 port 53908:11: Bye Bye [preauth]
Dec 06 10:22:10 np0005548789.localdomain sshd[332918]: Disconnected from authenticating user root 14.194.101.210 port 53908 [preauth]
Dec 06 10:22:10 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4c1f4920-d346-409d-af49-370c9b85e205", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:10 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4c1f4920-d346-409d-af49-370c9b85e205", "format": "json"}]: dispatch
Dec 06 10:22:10 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4a25353d-60c6-4e9f-96d0-37cd3eb987b5", "format": "json"}]: dispatch
Dec 06 10:22:10 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4a25353d-60c6-4e9f-96d0-37cd3eb987b5", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:10 np0005548789.localdomain ceph-mon[298582]: pgmap v415: 177 pgs: 177 active+clean; 514 MiB data, 1.8 GiB used, 40 GiB / 42 GiB avail; 22 KiB/s rd, 27 MiB/s wr, 44 op/s
Dec 06 10:22:10 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:10.542 263652 INFO neutron.agent.dhcp.agent [None req-c1d2e46e-b923-4709-a0e9-a869c04844fa - - - - - -] DHCP configuration for ports {'705ae66d-542f-4ec2-ac8f-4e6b6578ea8b'} is completed
Dec 06 10:22:10 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:10.788 263652 INFO neutron.agent.dhcp.agent [None req-5faa22a5-138a-4cec-b7c2-55a9e2c93096 - - - - - -] DHCP configuration for ports {'1292f95c-a471-4d24-82b6-dc839c334a0e'} is completed
Dec 06 10:22:11 np0005548789.localdomain dnsmasq[332659]: read /var/lib/neutron/dhcp/b212e2be-4a1a-42c0-a3cd-d5a930c0c30a/addn_hosts - 0 addresses
Dec 06 10:22:11 np0005548789.localdomain podman[333087]: 2025-12-06 10:22:11.384852349 +0000 UTC m=+0.045375328 container kill 98149fe39736f2b06b84af45fc052baa53e53a5b040ce189b48bcd27e51b9b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b212e2be-4a1a-42c0-a3cd-d5a930c0c30a, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:22:11 np0005548789.localdomain dnsmasq-dhcp[332659]: read /var/lib/neutron/dhcp/b212e2be-4a1a-42c0-a3cd-d5a930c0c30a/host
Dec 06 10:22:11 np0005548789.localdomain dnsmasq-dhcp[332659]: read /var/lib/neutron/dhcp/b212e2be-4a1a-42c0-a3cd-d5a930c0c30a/opts
Dec 06 10:22:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:11.581 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:11 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:11Z|00489|binding|INFO|Releasing lport 27deb0c2-77bb-47c5-bae8-ebe8fdf091d3 from this chassis (sb_readonly=0)
Dec 06 10:22:11 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:11Z|00490|binding|INFO|Setting lport 27deb0c2-77bb-47c5-bae8-ebe8fdf091d3 down in Southbound
Dec 06 10:22:11 np0005548789.localdomain kernel: device tap27deb0c2-77 left promiscuous mode
Dec 06 10:22:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:11.598 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-b212e2be-4a1a-42c0-a3cd-d5a930c0c30a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b212e2be-4a1a-42c0-a3cd-d5a930c0c30a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4daafaf0e264da6a728bdd60c5d6377', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=788a81a8-1f71-42fc-a0bb-d48b0228b6cf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=27deb0c2-77bb-47c5-bae8-ebe8fdf091d3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:11.600 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 27deb0c2-77bb-47c5-bae8-ebe8fdf091d3 in datapath b212e2be-4a1a-42c0-a3cd-d5a930c0c30a unbound from our chassis
Dec 06 10:22:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:11.602 160509 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b212e2be-4a1a-42c0-a3cd-d5a930c0c30a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 06 10:22:11 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:11.603 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[e42a30eb-d1aa-4bfb-a519-198dcf7174c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:22:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:11.608 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:12 np0005548789.localdomain ceph-mon[298582]: pgmap v416: 177 pgs: 177 active+clean; 514 MiB data, 1.8 GiB used, 40 GiB / 42 GiB avail; 19 KiB/s rd, 23 MiB/s wr, 37 op/s
Dec 06 10:22:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:13 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:13.354 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:13.355 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:13 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:13.355 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:22:13 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4c1f4920-d346-409d-af49-370c9b85e205", "format": "json"}]: dispatch
Dec 06 10:22:13 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4c1f4920-d346-409d-af49-370c9b85e205", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:13 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:13 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "format": "json"}]: dispatch
Dec 06 10:22:13 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:13 np0005548789.localdomain systemd[1]: tmp-crun.0zYZXP.mount: Deactivated successfully.
Dec 06 10:22:13 np0005548789.localdomain dnsmasq[332917]: read /var/lib/neutron/dhcp/0f63818b-46da-4610-917f-48a4c73bfa86/addn_hosts - 0 addresses
Dec 06 10:22:13 np0005548789.localdomain dnsmasq-dhcp[332917]: read /var/lib/neutron/dhcp/0f63818b-46da-4610-917f-48a4c73bfa86/host
Dec 06 10:22:13 np0005548789.localdomain podman[333126]: 2025-12-06 10:22:13.632714274 +0000 UTC m=+0.046234175 container kill 294ed2ade3cdaf5dc40b3998609fd207e11e78054502a44b21052a2751a92747 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f63818b-46da-4610-917f-48a4c73bfa86, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:22:13 np0005548789.localdomain dnsmasq-dhcp[332917]: read /var/lib/neutron/dhcp/0f63818b-46da-4610-917f-48a4c73bfa86/opts
Dec 06 10:22:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:22:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:22:13 np0005548789.localdomain podman[333139]: 2025-12-06 10:22:13.710088158 +0000 UTC m=+0.053539697 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, maintainer=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 06 10:22:13 np0005548789.localdomain podman[333139]: 2025-12-06 10:22:13.7281328 +0000 UTC m=+0.071584359 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc.)
Dec 06 10:22:13 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:22:13 np0005548789.localdomain podman[333141]: 2025-12-06 10:22:13.773356432 +0000 UTC m=+0.115966705 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:22:13 np0005548789.localdomain podman[333141]: 2025-12-06 10:22:13.779378126 +0000 UTC m=+0.121988379 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:22:13 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:22:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:13.913 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:13 np0005548789.localdomain kernel: device tap4430879c-2d left promiscuous mode
Dec 06 10:22:13 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:13Z|00491|binding|INFO|Releasing lport 4430879c-2d53-4fe9-afc4-ecc4737ece3d from this chassis (sb_readonly=0)
Dec 06 10:22:13 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:13Z|00492|binding|INFO|Setting lport 4430879c-2d53-4fe9-afc4-ecc4737ece3d down in Southbound
Dec 06 10:22:13 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:13.933 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-0f63818b-46da-4610-917f-48a4c73bfa86', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f63818b-46da-4610-917f-48a4c73bfa86', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e82deaff368b4feea9fec0f06459a6ca', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c4628e25-25c8-4fdc-a3c4-cda346229624, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=4430879c-2d53-4fe9-afc4-ecc4737ece3d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:13 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:13.935 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 4430879c-2d53-4fe9-afc4-ecc4737ece3d in datapath 0f63818b-46da-4610-917f-48a4c73bfa86 unbound from our chassis
Dec 06 10:22:13 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:13.939 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0f63818b-46da-4610-917f-48a4c73bfa86, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:22:13 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:13.939 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[36f81f07-71a7-4ed9-aa2d-d35d2903324b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:22:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:13.978 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:14.213 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:14 np0005548789.localdomain ceph-mon[298582]: pgmap v417: 177 pgs: 177 active+clean; 642 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 18 KiB/s rd, 25 MiB/s wr, 36 op/s
Dec 06 10:22:14 np0005548789.localdomain dnsmasq[332917]: exiting on receipt of SIGTERM
Dec 06 10:22:14 np0005548789.localdomain podman[333200]: 2025-12-06 10:22:14.523338968 +0000 UTC m=+0.064364679 container kill 294ed2ade3cdaf5dc40b3998609fd207e11e78054502a44b21052a2751a92747 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f63818b-46da-4610-917f-48a4c73bfa86, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:22:14 np0005548789.localdomain systemd[1]: libpod-294ed2ade3cdaf5dc40b3998609fd207e11e78054502a44b21052a2751a92747.scope: Deactivated successfully.
Dec 06 10:22:14 np0005548789.localdomain podman[333214]: 2025-12-06 10:22:14.598473805 +0000 UTC m=+0.058689455 container died 294ed2ade3cdaf5dc40b3998609fd207e11e78054502a44b21052a2751a92747 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f63818b-46da-4610-917f-48a4c73bfa86, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:22:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-53d7169593b190df1c1e48dc9de962f6b6ad6b884e0167968ce6145110e64e36-merged.mount: Deactivated successfully.
Dec 06 10:22:14 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-294ed2ade3cdaf5dc40b3998609fd207e11e78054502a44b21052a2751a92747-userdata-shm.mount: Deactivated successfully.
Dec 06 10:22:14 np0005548789.localdomain podman[333214]: 2025-12-06 10:22:14.686496295 +0000 UTC m=+0.146711885 container remove 294ed2ade3cdaf5dc40b3998609fd207e11e78054502a44b21052a2751a92747 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f63818b-46da-4610-917f-48a4c73bfa86, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Dec 06 10:22:14 np0005548789.localdomain systemd[1]: libpod-conmon-294ed2ade3cdaf5dc40b3998609fd207e11e78054502a44b21052a2751a92747.scope: Deactivated successfully.
Dec 06 10:22:14 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d0f63818b\x2d46da\x2d4610\x2d917f\x2d48a4c73bfa86.mount: Deactivated successfully.
Dec 06 10:22:14 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:14.939 263652 INFO neutron.agent.dhcp.agent [None req-ff50923f-6bc2-49b4-aee2-4c9179c6c120 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:22:14 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:14.966 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:22:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:15.310 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:15 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:15.507 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:22:15 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:15Z|00493|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:22:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:15.756 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:16 np0005548789.localdomain systemd[1]: tmp-crun.J1nZ1R.mount: Deactivated successfully.
Dec 06 10:22:16 np0005548789.localdomain dnsmasq[332659]: exiting on receipt of SIGTERM
Dec 06 10:22:16 np0005548789.localdomain podman[333257]: 2025-12-06 10:22:16.108628157 +0000 UTC m=+0.079963336 container kill 98149fe39736f2b06b84af45fc052baa53e53a5b040ce189b48bcd27e51b9b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b212e2be-4a1a-42c0-a3cd-d5a930c0c30a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:22:16 np0005548789.localdomain systemd[1]: libpod-98149fe39736f2b06b84af45fc052baa53e53a5b040ce189b48bcd27e51b9b5b.scope: Deactivated successfully.
Dec 06 10:22:16 np0005548789.localdomain podman[333271]: 2025-12-06 10:22:16.186846128 +0000 UTC m=+0.065044960 container died 98149fe39736f2b06b84af45fc052baa53e53a5b040ce189b48bcd27e51b9b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b212e2be-4a1a-42c0-a3cd-d5a930c0c30a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:22:16 np0005548789.localdomain podman[333271]: 2025-12-06 10:22:16.219835556 +0000 UTC m=+0.098034338 container cleanup 98149fe39736f2b06b84af45fc052baa53e53a5b040ce189b48bcd27e51b9b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b212e2be-4a1a-42c0-a3cd-d5a930c0c30a, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:22:16 np0005548789.localdomain systemd[1]: libpod-conmon-98149fe39736f2b06b84af45fc052baa53e53a5b040ce189b48bcd27e51b9b5b.scope: Deactivated successfully.
Dec 06 10:22:16 np0005548789.localdomain podman[333273]: 2025-12-06 10:22:16.296483689 +0000 UTC m=+0.165808329 container remove 98149fe39736f2b06b84af45fc052baa53e53a5b040ce189b48bcd27e51b9b5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b212e2be-4a1a-42c0-a3cd-d5a930c0c30a, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:22:16 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:16.326 263652 INFO neutron.agent.dhcp.agent [None req-401e84f8-0097-4620-913f-c43e89ccdc7b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:22:16 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:16.327 263652 INFO neutron.agent.dhcp.agent [None req-401e84f8-0097-4620-913f-c43e89ccdc7b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: pgmap v418: 177 pgs: 177 active+clean; 642 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 18 KiB/s rd, 25 MiB/s wr, 36 op/s
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e4229277-1c8f-4f16-a3c9-ae90dd92f437", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e4229277-1c8f-4f16-a3c9-ae90dd92f437", "format": "json"}]: dispatch
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.578979) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016536579291, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 377, "num_deletes": 251, "total_data_size": 194848, "memory_usage": 203224, "flush_reason": "Manual Compaction"}
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016536582614, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 125501, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30231, "largest_seqno": 30603, "table_properties": {"data_size": 123249, "index_size": 363, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6109, "raw_average_key_size": 19, "raw_value_size": 118707, "raw_average_value_size": 384, "num_data_blocks": 16, "num_entries": 309, "num_filter_entries": 309, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016528, "oldest_key_time": 1765016528, "file_creation_time": 1765016536, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 3679 microseconds, and 1690 cpu microseconds.
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.582673) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 125501 bytes OK
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.582698) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.584328) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.584352) EVENT_LOG_v1 {"time_micros": 1765016536584344, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.584379) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 192318, prev total WAL file size 192318, number of live WAL files 2.
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.585228) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end)
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(122KB)], [51(17MB)]
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016536585283, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 18858516, "oldest_snapshot_seqno": -1}
Dec 06 10:22:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:22:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:22:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:22:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:22:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:22:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:22:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:22:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:22:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:22:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 12940 keys, 17664003 bytes, temperature: kUnknown
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016536686258, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 17664003, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17593559, "index_size": 36999, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32389, "raw_key_size": 350035, "raw_average_key_size": 27, "raw_value_size": 17376636, "raw_average_value_size": 1342, "num_data_blocks": 1359, "num_entries": 12940, "num_filter_entries": 12940, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765016536, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.686587) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 17664003 bytes
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.688587) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 186.6 rd, 174.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 17.9 +0.0 blob) out(16.8 +0.0 blob), read-write-amplify(291.0) write-amplify(140.7) OK, records in: 13455, records dropped: 515 output_compression: NoCompression
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.688610) EVENT_LOG_v1 {"time_micros": 1765016536688599, "job": 30, "event": "compaction_finished", "compaction_time_micros": 101056, "compaction_time_cpu_micros": 48983, "output_level": 6, "num_output_files": 1, "total_output_size": 17664003, "num_input_records": 13455, "num_output_records": 12940, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016536688746, "job": 30, "event": "table_file_deletion", "file_number": 53}
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016536690610, "job": 30, "event": "table_file_deletion", "file_number": 51}
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.585093) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.690634) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.690639) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.690640) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.690642) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:16 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:22:16.690643) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:22:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-bf868e69c05e71759c1e3d850cc649a6050ae655aa7438d5cc3256a2e4a2d10a-merged.mount: Deactivated successfully.
Dec 06 10:22:17 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-98149fe39736f2b06b84af45fc052baa53e53a5b040ce189b48bcd27e51b9b5b-userdata-shm.mount: Deactivated successfully.
Dec 06 10:22:17 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2db212e2be\x2d4a1a\x2d42c0\x2da3cd\x2dd5a930c0c30a.mount: Deactivated successfully.
Dec 06 10:22:17 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "snap_name": "14d68a54-7a75-45f3-abbd-01dbc7ace567", "format": "json"}]: dispatch
Dec 06 10:22:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:22:17 np0005548789.localdomain systemd[1]: tmp-crun.SqqHWT.mount: Deactivated successfully.
Dec 06 10:22:17 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:17Z|00494|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:22:17 np0005548789.localdomain podman[333302]: 2025-12-06 10:22:17.936671287 +0000 UTC m=+0.097657886 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:22:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:17.965 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:17 np0005548789.localdomain podman[333302]: 2025-12-06 10:22:17.999391725 +0000 UTC m=+0.160378344 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Dec 06 10:22:18 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:22:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:18 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:18.357 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:22:18 np0005548789.localdomain ceph-mon[298582]: pgmap v419: 177 pgs: 177 active+clean; 642 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 18 KiB/s rd, 25 MiB/s wr, 36 op/s
Dec 06 10:22:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:19.251 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "731d0819-2292-4c89-bff7-ec72ce366121", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "731d0819-2292-4c89-bff7-ec72ce366121", "format": "json"}]: dispatch
Dec 06 10:22:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:19 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:19Z|00495|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:22:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:19.964 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:20.355 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:20 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e4229277-1c8f-4f16-a3c9-ae90dd92f437", "format": "json"}]: dispatch
Dec 06 10:22:20 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e4229277-1c8f-4f16-a3c9-ae90dd92f437", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:20 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "snap_name": "14d68a54-7a75-45f3-abbd-01dbc7ace567", "target_sub_name": "c4b473ee-dbff-47ba-b3da-5329e0795d44", "format": "json"}]: dispatch
Dec 06 10:22:20 np0005548789.localdomain ceph-mon[298582]: pgmap v420: 177 pgs: 177 active+clean; 746 MiB data, 2.5 GiB used, 39 GiB / 42 GiB avail; 22 KiB/s rd, 30 MiB/s wr, 45 op/s
Dec 06 10:22:20 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c4b473ee-dbff-47ba-b3da-5329e0795d44", "format": "json"}]: dispatch
Dec 06 10:22:20 np0005548789.localdomain dnsmasq[332273]: read /var/lib/neutron/dhcp/23332176-d495-43f0-b960-60f576e19db9/addn_hosts - 0 addresses
Dec 06 10:22:20 np0005548789.localdomain dnsmasq-dhcp[332273]: read /var/lib/neutron/dhcp/23332176-d495-43f0-b960-60f576e19db9/host
Dec 06 10:22:20 np0005548789.localdomain podman[333339]: 2025-12-06 10:22:20.815918631 +0000 UTC m=+0.045893063 container kill 071dbc5c9b86072f2d5bda447e10edc83b5e25677868e8ab3267454575f41827 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-23332176-d495-43f0-b960-60f576e19db9, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:22:20 np0005548789.localdomain dnsmasq-dhcp[332273]: read /var/lib/neutron/dhcp/23332176-d495-43f0-b960-60f576e19db9/opts
Dec 06 10:22:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:22:20 np0005548789.localdomain podman[333352]: 2025-12-06 10:22:20.928561084 +0000 UTC m=+0.090459706 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:22:20 np0005548789.localdomain podman[333352]: 2025-12-06 10:22:20.965334458 +0000 UTC m=+0.127233080 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:22:20 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:20Z|00496|binding|INFO|Releasing lport 72ee7f1b-132c-4087-bcf6-0dc2886ccb24 from this chassis (sb_readonly=0)
Dec 06 10:22:20 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:20Z|00497|binding|INFO|Setting lport 72ee7f1b-132c-4087-bcf6-0dc2886ccb24 down in Southbound
Dec 06 10:22:20 np0005548789.localdomain kernel: device tap72ee7f1b-13 left promiscuous mode
Dec 06 10:22:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:20.978 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:20 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:22:20 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:20.989 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-23332176-d495-43f0-b960-60f576e19db9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23332176-d495-43f0-b960-60f576e19db9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e82deaff368b4feea9fec0f06459a6ca', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8461d03-d92a-4a33-8802-d634072db402, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=72ee7f1b-132c-4087-bcf6-0dc2886ccb24) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:20 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:20.991 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 72ee7f1b-132c-4087-bcf6-0dc2886ccb24 in datapath 23332176-d495-43f0-b960-60f576e19db9 unbound from our chassis
Dec 06 10:22:20 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:20.994 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 23332176-d495-43f0-b960-60f576e19db9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:22:20 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:20.995 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[335dcbf4-a45a-4da7-ae54-a11aaa510a93]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:22:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:21.000 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:21 np0005548789.localdomain dnsmasq[332273]: exiting on receipt of SIGTERM
Dec 06 10:22:21 np0005548789.localdomain podman[333400]: 2025-12-06 10:22:21.413110696 +0000 UTC m=+0.065142772 container kill 071dbc5c9b86072f2d5bda447e10edc83b5e25677868e8ab3267454575f41827 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-23332176-d495-43f0-b960-60f576e19db9, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:22:21 np0005548789.localdomain systemd[1]: libpod-071dbc5c9b86072f2d5bda447e10edc83b5e25677868e8ab3267454575f41827.scope: Deactivated successfully.
Dec 06 10:22:21 np0005548789.localdomain podman[333412]: 2025-12-06 10:22:21.48881479 +0000 UTC m=+0.060234741 container died 071dbc5c9b86072f2d5bda447e10edc83b5e25677868e8ab3267454575f41827 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-23332176-d495-43f0-b960-60f576e19db9, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:22:21 np0005548789.localdomain podman[333412]: 2025-12-06 10:22:21.52022597 +0000 UTC m=+0.091645891 container cleanup 071dbc5c9b86072f2d5bda447e10edc83b5e25677868e8ab3267454575f41827 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-23332176-d495-43f0-b960-60f576e19db9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:22:21 np0005548789.localdomain systemd[1]: libpod-conmon-071dbc5c9b86072f2d5bda447e10edc83b5e25677868e8ab3267454575f41827.scope: Deactivated successfully.
Dec 06 10:22:21 np0005548789.localdomain podman[333414]: 2025-12-06 10:22:21.569288501 +0000 UTC m=+0.126333293 container remove 071dbc5c9b86072f2d5bda447e10edc83b5e25677868e8ab3267454575f41827 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-23332176-d495-43f0-b960-60f576e19db9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:22:21 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:21.653 263652 INFO neutron.agent.dhcp.agent [None req-5da40bd6-5ea1-4d09-9398-a7019386d725 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:22:21 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:21.654 263652 INFO neutron.agent.dhcp.agent [None req-5da40bd6-5ea1-4d09-9398-a7019386d725 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:22:21 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-988891ef40577becefdf38dadb1a51a6fd48e58dae4cb75c05349582a2ed8956-merged.mount: Deactivated successfully.
Dec 06 10:22:21 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-071dbc5c9b86072f2d5bda447e10edc83b5e25677868e8ab3267454575f41827-userdata-shm.mount: Deactivated successfully.
Dec 06 10:22:21 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d23332176\x2dd495\x2d43f0\x2db960\x2d60f576e19db9.mount: Deactivated successfully.
Dec 06 10:22:21 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:21Z|00498|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:22:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:21.910 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:22 np0005548789.localdomain ceph-mon[298582]: pgmap v421: 177 pgs: 177 active+clean; 746 MiB data, 2.5 GiB used, 39 GiB / 42 GiB avail; 14 KiB/s rd, 19 MiB/s wr, 28 op/s
Dec 06 10:22:22 np0005548789.localdomain ceph-mon[298582]: mgrmap e51: np0005548790.kvkfyr(active, since 10m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:22:22 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:22 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:22 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2693859626' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:22 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2693859626' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ea385aa8-7125-4234-a610-ef3ee4890f37", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ea385aa8-7125-4234-a610-ef3ee4890f37", "format": "json"}]: dispatch
Dec 06 10:22:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d56ed464-2af3-4efa-8437-4562ed59da6b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d56ed464-2af3-4efa-8437-4562ed59da6b", "format": "json"}]: dispatch
Dec 06 10:22:23 np0005548789.localdomain ceph-mon[298582]: pgmap v422: 177 pgs: 177 active+clean; 867 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 21 KiB/s rd, 29 MiB/s wr, 44 op/s
Dec 06 10:22:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:22:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:22:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:22:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:22:24 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:22:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19264 "" "Go-http-client/1.1"
Dec 06 10:22:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:24.358 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:25.357 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:22:25 np0005548789.localdomain podman[333441]: 2025-12-06 10:22:25.915188109 +0000 UTC m=+0.077060786 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:22:25 np0005548789.localdomain podman[333441]: 2025-12-06 10:22:25.952479699 +0000 UTC m=+0.114352396 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 06 10:22:25 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:22:26 np0005548789.localdomain ceph-mon[298582]: pgmap v423: 177 pgs: 177 active+clean; 867 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 14 KiB/s rd, 19 MiB/s wr, 30 op/s
Dec 06 10:22:26 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ea385aa8-7125-4234-a610-ef3ee4890f37", "format": "json"}]: dispatch
Dec 06 10:22:26 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ea385aa8-7125-4234-a610-ef3ee4890f37", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:26 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d56ed464-2af3-4efa-8437-4562ed59da6b", "format": "json"}]: dispatch
Dec 06 10:22:26 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d56ed464-2af3-4efa-8437-4562ed59da6b", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:28 np0005548789.localdomain ceph-mon[298582]: pgmap v424: 177 pgs: 177 active+clean; 867 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 14 KiB/s rd, 19 MiB/s wr, 30 op/s
Dec 06 10:22:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:29.384 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:29 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "731d0819-2292-4c89-bff7-ec72ce366121", "format": "json"}]: dispatch
Dec 06 10:22:29 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3455244347' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:30.360 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:30 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "731d0819-2292-4c89-bff7-ec72ce366121", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:30 np0005548789.localdomain ceph-mon[298582]: pgmap v425: 177 pgs: 177 active+clean; 987 MiB data, 3.2 GiB used, 39 GiB / 42 GiB avail; 21 KiB/s rd, 29 MiB/s wr, 45 op/s
Dec 06 10:22:30 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9d3cab8c-98e7-4693-a92b-d356598b900a", "snap_name": "34ca82e6-47fe-4edf-98f0-b49a22c3b971_3308a422-28d2-40bc-9817-d02064ebbe3c", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:30 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9d3cab8c-98e7-4693-a92b-d356598b900a", "snap_name": "34ca82e6-47fe-4edf-98f0-b49a22c3b971", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:30 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e194 e194: 6 total, 6 up, 6 in
Dec 06 10:22:31 np0005548789.localdomain ceph-mon[298582]: osdmap e194: 6 total, 6 up, 6 in
Dec 06 10:22:31 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e195 e195: 6 total, 6 up, 6 in
Dec 06 10:22:31 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Dec 06 10:22:32 np0005548789.localdomain ceph-mon[298582]: pgmap v427: 177 pgs: 177 active+clean; 987 MiB data, 3.2 GiB used, 39 GiB / 42 GiB avail; 16 KiB/s rd, 24 MiB/s wr, 37 op/s
Dec 06 10:22:32 np0005548789.localdomain ceph-mon[298582]: osdmap e195: 6 total, 6 up, 6 in
Dec 06 10:22:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e195 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:33 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "04a68994-1285-4b19-bd78-8daa43192107", "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:33 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9d3cab8c-98e7-4693-a92b-d356598b900a", "format": "json"}]: dispatch
Dec 06 10:22:33 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9d3cab8c-98e7-4693-a92b-d356598b900a", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:34.409 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:35 np0005548789.localdomain ceph-mon[298582]: pgmap v429: 177 pgs: 177 active+clean; 1.1 GiB data, 3.6 GiB used, 38 GiB / 42 GiB avail; 2.6 MiB/s rd, 30 MiB/s wr, 76 op/s
Dec 06 10:22:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:35.362 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c4b473ee-dbff-47ba-b3da-5329e0795d44", "format": "json"}]: dispatch
Dec 06 10:22:36 np0005548789.localdomain ceph-mon[298582]: pgmap v430: 177 pgs: 177 active+clean; 1.1 GiB data, 3.6 GiB used, 38 GiB / 42 GiB avail; 2.6 MiB/s rd, 30 MiB/s wr, 76 op/s
Dec 06 10:22:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c4b473ee-dbff-47ba-b3da-5329e0795d44", "format": "json"}]: dispatch
Dec 06 10:22:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "04a68994-1285-4b19-bd78-8daa43192107", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:22:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:22:36 np0005548789.localdomain podman[333466]: 2025-12-06 10:22:36.93193966 +0000 UTC m=+0.087300299 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:22:36 np0005548789.localdomain podman[333466]: 2025-12-06 10:22:36.970017684 +0000 UTC m=+0.125378293 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 06 10:22:36 np0005548789.localdomain podman[333467]: 2025-12-06 10:22:36.991069828 +0000 UTC m=+0.142652692 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:22:36 np0005548789.localdomain systemd[1]: tmp-crun.cULO71.mount: Deactivated successfully.
Dec 06 10:22:36 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:22:37 np0005548789.localdomain podman[333467]: 2025-12-06 10:22:37.004055075 +0000 UTC m=+0.155637959 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:22:37 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:22:37 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1087206490' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:37.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:37.204 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:22:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:37.205 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:22:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:37.205 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:22:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:37.206 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:22:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:37.206 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:22:37 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e196 e196: 6 total, 6 up, 6 in
Dec 06 10:22:37 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:22:37 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4090728169' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:37.657 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:22:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:37.742 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:22:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:37.743 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:22:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:37.949 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:22:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:37.950 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11175MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:22:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:37.951 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:22:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:37.951 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:22:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:38.018 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:22:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:38.019 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:22:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:38.019 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:22:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:38.069 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:22:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:22:38 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3578013769' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:38.527 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:22:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:38.532 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:22:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:38.563 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:22:38 np0005548789.localdomain ceph-mon[298582]: pgmap v431: 177 pgs: 177 active+clean; 1.1 GiB data, 3.6 GiB used, 38 GiB / 42 GiB avail; 2.6 MiB/s rd, 15 MiB/s wr, 54 op/s
Dec 06 10:22:38 np0005548789.localdomain ceph-mon[298582]: osdmap e196: 6 total, 6 up, 6 in
Dec 06 10:22:38 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/4090728169' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:38 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/749219856' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:38 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/749219856' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:38 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3578013769' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e197 e197: 6 total, 6 up, 6 in
Dec 06 10:22:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:38.595 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:22:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:38.596 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:22:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:22:39 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/819637897' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:22:39 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/819637897' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:39.447 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:39 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c4b473ee-dbff-47ba-b3da-5329e0795d44", "format": "json"}]: dispatch
Dec 06 10:22:39 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c4b473ee-dbff-47ba-b3da-5329e0795d44", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:39 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "df8d11b3-b101-4628-bf7f-13330bfcfc51", "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:39 np0005548789.localdomain ceph-mon[298582]: osdmap e197: 6 total, 6 up, 6 in
Dec 06 10:22:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/457341292' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/457341292' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/819637897' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/819637897' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:40.364 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:40.594 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:40 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e198 e198: 6 total, 6 up, 6 in
Dec 06 10:22:40 np0005548789.localdomain ceph-mon[298582]: pgmap v434: 177 pgs: 177 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 2.8 MiB/s rd, 33 MiB/s wr, 183 op/s
Dec 06 10:22:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:41.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:41.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:22:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:41.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:22:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:41.277 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:22:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:41.278 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:22:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:41.278 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:22:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:41.279 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:22:41 np0005548789.localdomain ceph-mon[298582]: osdmap e198: 6 total, 6 up, 6 in
Dec 06 10:22:41 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2929352609' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:41 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2929352609' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:41.844 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:22:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:41.865 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:22:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:41.865 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:22:42 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e199 e199: 6 total, 6 up, 6 in
Dec 06 10:22:42 np0005548789.localdomain ceph-mon[298582]: pgmap v436: 177 pgs: 177 active+clean; 1.2 GiB data, 3.9 GiB used, 38 GiB / 42 GiB avail; 103 KiB/s rd, 21 MiB/s wr, 158 op/s
Dec 06 10:22:42 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "snap_name": "14d68a54-7a75-45f3-abbd-01dbc7ace567_04b38652-f50e-477c-8a7a-6a8616208060", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:42 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "snap_name": "14d68a54-7a75-45f3-abbd-01dbc7ace567", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:42 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "df8d11b3-b101-4628-bf7f-13330bfcfc51", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2211323013' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2211323013' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1478917471' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1478917471' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:43 np0005548789.localdomain ceph-mon[298582]: osdmap e199: 6 total, 6 up, 6 in
Dec 06 10:22:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:22:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:22:43 np0005548789.localdomain podman[333552]: 2025-12-06 10:22:43.936408732 +0000 UTC m=+0.090681213 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:22:43 np0005548789.localdomain systemd[1]: tmp-crun.iTPYYx.mount: Deactivated successfully.
Dec 06 10:22:43 np0005548789.localdomain podman[333552]: 2025-12-06 10:22:43.979136138 +0000 UTC m=+0.133408589 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Dec 06 10:22:43 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:22:44 np0005548789.localdomain podman[333553]: 2025-12-06 10:22:43.980875461 +0000 UTC m=+0.132478660 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:22:44 np0005548789.localdomain podman[333553]: 2025-12-06 10:22:44.063193967 +0000 UTC m=+0.214797136 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:22:44 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:22:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:44.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:44.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:44.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:44.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:44.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:22:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:44.476 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:44 np0005548789.localdomain ceph-mon[298582]: pgmap v438: 177 pgs: 177 active+clean; 195 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 290 KiB/s rd, 24 MiB/s wr, 436 op/s
Dec 06 10:22:44 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e200 e200: 6 total, 6 up, 6 in
Dec 06 10:22:45 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:45Z|00499|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:22:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:45.064 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:45 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:22:45 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:22:45 np0005548789.localdomain podman[333610]: 2025-12-06 10:22:45.071660845 +0000 UTC m=+0.075128567 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:22:45 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:22:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:45.367 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:45 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:22:45 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1317054491' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:45 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:22:45 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1317054491' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:45 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "format": "json"}]: dispatch
Dec 06 10:22:45 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1d7917f2-7128-46ce-8479-e0741aca1efd", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:45 np0005548789.localdomain ceph-mon[298582]: osdmap e200: 6 total, 6 up, 6 in
Dec 06 10:22:45 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:45 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1317054491' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:45 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1317054491' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:45 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e201 e201: 6 total, 6 up, 6 in
Dec 06 10:22:45 np0005548789.localdomain sshd[333632]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:22:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:46.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:46.184 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:46 np0005548789.localdomain sshd[333632]: Received disconnect from 64.227.102.57 port 55038:11: Bye Bye [preauth]
Dec 06 10:22:46 np0005548789.localdomain sshd[333632]: Disconnected from authenticating user root 64.227.102.57 port 55038 [preauth]
Dec 06 10:22:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:22:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:22:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:22:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:22:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:22:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:22:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:22:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:22:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:22:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:22:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:22:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:22:46 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "64212ab2-3c88-4fa8-92e9-e7786f748419", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:46 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "64212ab2-3c88-4fa8-92e9-e7786f748419", "format": "json"}]: dispatch
Dec 06 10:22:46 np0005548789.localdomain ceph-mon[298582]: pgmap v440: 177 pgs: 177 active+clean; 195 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 171 KiB/s rd, 1.4 MiB/s wr, 254 op/s
Dec 06 10:22:46 np0005548789.localdomain ceph-mon[298582]: osdmap e201: 6 total, 6 up, 6 in
Dec 06 10:22:47 np0005548789.localdomain sshd[333634]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:22:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:47.340 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:22:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:47.341 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:22:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:47.341 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:22:47 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e202 e202: 6 total, 6 up, 6 in
Dec 06 10:22:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:48 np0005548789.localdomain ceph-mon[298582]: pgmap v442: 177 pgs: 177 active+clean; 195 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 171 KiB/s rd, 1.4 MiB/s wr, 254 op/s
Dec 06 10:22:48 np0005548789.localdomain ceph-mon[298582]: osdmap e202: 6 total, 6 up, 6 in
Dec 06 10:22:48 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3787706483' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:48 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/869442583' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:48 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/4009757146' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:48 np0005548789.localdomain sshd[333634]: Received disconnect from 154.113.10.34 port 55934:11: Bye Bye [preauth]
Dec 06 10:22:48 np0005548789.localdomain sshd[333634]: Disconnected from authenticating user root 154.113.10.34 port 55934 [preauth]
Dec 06 10:22:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e203 e203: 6 total, 6 up, 6 in
Dec 06 10:22:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:22:48 np0005548789.localdomain podman[333636]: 2025-12-06 10:22:48.638919112 +0000 UTC m=+0.066974808 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:22:48 np0005548789.localdomain podman[333636]: 2025-12-06 10:22:48.650152455 +0000 UTC m=+0.078208231 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:22:48 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:22:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:49.178 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:49.521 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:49 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "64212ab2-3c88-4fa8-92e9-e7786f748419", "format": "json"}]: dispatch
Dec 06 10:22:49 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "64212ab2-3c88-4fa8-92e9-e7786f748419", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:49 np0005548789.localdomain ceph-mon[298582]: osdmap e203: 6 total, 6 up, 6 in
Dec 06 10:22:49 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1713529106' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:49 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2088044712' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:22:49 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e204 e204: 6 total, 6 up, 6 in
Dec 06 10:22:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:50.369 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:50 np0005548789.localdomain ceph-mon[298582]: pgmap v445: 177 pgs: 177 active+clean; 196 MiB data, 984 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 51 KiB/s wr, 84 op/s
Dec 06 10:22:50 np0005548789.localdomain ceph-mon[298582]: osdmap e204: 6 total, 6 up, 6 in
Dec 06 10:22:50 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e205 e205: 6 total, 6 up, 6 in
Dec 06 10:22:51 np0005548789.localdomain ceph-mon[298582]: osdmap e205: 6 total, 6 up, 6 in
Dec 06 10:22:51 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e206 e206: 6 total, 6 up, 6 in
Dec 06 10:22:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:22:51 np0005548789.localdomain systemd[1]: tmp-crun.zK97n9.mount: Deactivated successfully.
Dec 06 10:22:51 np0005548789.localdomain podman[333656]: 2025-12-06 10:22:51.942682441 +0000 UTC m=+0.106188847 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:22:51 np0005548789.localdomain podman[333656]: 2025-12-06 10:22:51.954256815 +0000 UTC m=+0.117763231 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:22:51 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:22:52 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e207 e207: 6 total, 6 up, 6 in
Dec 06 10:22:52 np0005548789.localdomain ceph-mon[298582]: pgmap v448: 177 pgs: 177 active+clean; 196 MiB data, 984 MiB used, 41 GiB / 42 GiB avail; 62 KiB/s rd, 58 KiB/s wr, 95 op/s
Dec 06 10:22:52 np0005548789.localdomain ceph-mon[298582]: osdmap e206: 6 total, 6 up, 6 in
Dec 06 10:22:52 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:22:52 np0005548789.localdomain ceph-mon[298582]: osdmap e207: 6 total, 6 up, 6 in
Dec 06 10:22:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:53 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ea938ab5-0d5f-47e7-a093-25bbc5841b54", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:22:53 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ea938ab5-0d5f-47e7-a093-25bbc5841b54", "format": "json"}]: dispatch
Dec 06 10:22:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:22:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:22:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:22:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:22:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:22:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19268 "" "Go-http-client/1.1"
Dec 06 10:22:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:54.522 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:54 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e208 e208: 6 total, 6 up, 6 in
Dec 06 10:22:54 np0005548789.localdomain ceph-mon[298582]: pgmap v451: 177 pgs: 177 active+clean; 196 MiB data, 969 MiB used, 41 GiB / 42 GiB avail; 202 KiB/s rd, 56 KiB/s wr, 286 op/s
Dec 06 10:22:54 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2991255880' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:54 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2991255880' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:55.298 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:55.299 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:22:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:55.298 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:55.371 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:55 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:22:55 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3220423124' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:55 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:22:55 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3220423124' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:55.520 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:55Z, description=, device_id=d17ccc6d-49c1-4c04-b4b2-d05430f70927, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcc22b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fcc2730>], id=8f530efb-7550-478d-addb-425f279de982, ip_allocation=immediate, mac_address=fa:16:3e:33:08:6a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3291, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:22:55Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:22:55 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:55.694 263652 INFO neutron.agent.linux.ip_lib [None req-a6857fea-71a4-4313-97cb-6e4af08d500f - - - - - -] Device tapec870270-76 cannot be used as it has no MAC address
Dec 06 10:22:55 np0005548789.localdomain ceph-mon[298582]: osdmap e208: 6 total, 6 up, 6 in
Dec 06 10:22:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3220423124' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3220423124' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1020764155' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1020764155' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:55.720 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:55 np0005548789.localdomain kernel: device tapec870270-76 entered promiscuous mode
Dec 06 10:22:55 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016575.7279] manager: (tapec870270-76): new Generic device (/org/freedesktop/NetworkManager/Devices/80)
Dec 06 10:22:55 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:55Z|00500|binding|INFO|Claiming lport ec870270-76fc-404f-9ac8-aae83a5c5051 for this chassis.
Dec 06 10:22:55 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:55Z|00501|binding|INFO|ec870270-76fc-404f-9ac8-aae83a5c5051: Claiming unknown
Dec 06 10:22:55 np0005548789.localdomain systemd-udevd[333716]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:22:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:55.733 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:55.741 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-04e62072-8d37-46d4-a112-c923d93098a9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04e62072-8d37-46d4-a112-c923d93098a9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3006b6c88845443ab13998bd660d02f7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0dafa4fe-04f5-4502-a649-2a574bf9c45c, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=ec870270-76fc-404f-9ac8-aae83a5c5051) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:55.743 160509 INFO neutron.agent.ovn.metadata.agent [-] Port ec870270-76fc-404f-9ac8-aae83a5c5051 in datapath 04e62072-8d37-46d4-a112-c923d93098a9 bound to our chassis
Dec 06 10:22:55 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:55Z|00502|binding|INFO|Setting lport ec870270-76fc-404f-9ac8-aae83a5c5051 ovn-installed in OVS
Dec 06 10:22:55 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:22:55Z|00503|binding|INFO|Setting lport ec870270-76fc-404f-9ac8-aae83a5c5051 up in Southbound
Dec 06 10:22:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:55.745 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 0f3fb41c-7a0b-40e3-8890-b8a5ead66801 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:22:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:55.745 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 04e62072-8d37-46d4-a112-c923d93098a9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:22:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:22:55.746 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[91d1a665-0acb-4302-ac1a-8ba883c2be87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:22:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:55.747 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:55 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:22:55 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:22:55 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:22:55 np0005548789.localdomain podman[333700]: 2025-12-06 10:22:55.760158514 +0000 UTC m=+0.085030211 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:22:55 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapec870270-76: No such device
Dec 06 10:22:55 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapec870270-76: No such device
Dec 06 10:22:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:55.770 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:55 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapec870270-76: No such device
Dec 06 10:22:55 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapec870270-76: No such device
Dec 06 10:22:55 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapec870270-76: No such device
Dec 06 10:22:55 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapec870270-76: No such device
Dec 06 10:22:55 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapec870270-76: No such device
Dec 06 10:22:55 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tapec870270-76: No such device
Dec 06 10:22:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:55.812 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:55.840 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:56 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:56.070 263652 INFO neutron.agent.dhcp.agent [None req-923278d6-42bc-4efe-acae-e825706e52e8 - - - - - -] DHCP configuration for ports {'8f530efb-7550-478d-addb-425f279de982'} is completed
Dec 06 10:22:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:56.367 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:22:56 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ea938ab5-0d5f-47e7-a093-25bbc5841b54", "format": "json"}]: dispatch
Dec 06 10:22:56 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ea938ab5-0d5f-47e7-a093-25bbc5841b54", "force": true, "format": "json"}]: dispatch
Dec 06 10:22:56 np0005548789.localdomain ceph-mon[298582]: pgmap v453: 177 pgs: 177 active+clean; 196 MiB data, 969 MiB used, 41 GiB / 42 GiB avail; 174 KiB/s rd, 48 KiB/s wr, 246 op/s
Dec 06 10:22:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3555310940' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3555310940' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:56 np0005548789.localdomain podman[333797]: 
Dec 06 10:22:56 np0005548789.localdomain podman[333797]: 2025-12-06 10:22:56.81829248 +0000 UTC m=+0.092649554 container create f0b3da8b2765503728b15ffd83b4b75c223326eadbfa4889f0b7b0340f71baba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-04e62072-8d37-46d4-a112-c923d93098a9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:22:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:22:56 np0005548789.localdomain systemd[1]: Started libpod-conmon-f0b3da8b2765503728b15ffd83b4b75c223326eadbfa4889f0b7b0340f71baba.scope.
Dec 06 10:22:56 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:22:56 np0005548789.localdomain podman[333797]: 2025-12-06 10:22:56.775291005 +0000 UTC m=+0.049648129 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:22:56 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e17231f4b67de947b4d429d81a6f4334b12776ef1dd2b29f4fb039abcc7bc52/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:22:56 np0005548789.localdomain podman[333797]: 2025-12-06 10:22:56.889030642 +0000 UTC m=+0.163387726 container init f0b3da8b2765503728b15ffd83b4b75c223326eadbfa4889f0b7b0340f71baba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-04e62072-8d37-46d4-a112-c923d93098a9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec 06 10:22:56 np0005548789.localdomain dnsmasq[333826]: started, version 2.85 cachesize 150
Dec 06 10:22:56 np0005548789.localdomain dnsmasq[333826]: DNS service limited to local subnets
Dec 06 10:22:56 np0005548789.localdomain dnsmasq[333826]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:22:56 np0005548789.localdomain dnsmasq[333826]: warning: no upstream servers configured
Dec 06 10:22:56 np0005548789.localdomain dnsmasq-dhcp[333826]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:22:56 np0005548789.localdomain dnsmasq[333826]: read /var/lib/neutron/dhcp/04e62072-8d37-46d4-a112-c923d93098a9/addn_hosts - 0 addresses
Dec 06 10:22:56 np0005548789.localdomain dnsmasq-dhcp[333826]: read /var/lib/neutron/dhcp/04e62072-8d37-46d4-a112-c923d93098a9/host
Dec 06 10:22:56 np0005548789.localdomain dnsmasq-dhcp[333826]: read /var/lib/neutron/dhcp/04e62072-8d37-46d4-a112-c923d93098a9/opts
Dec 06 10:22:56 np0005548789.localdomain podman[333797]: 2025-12-06 10:22:56.949498189 +0000 UTC m=+0.223855263 container start f0b3da8b2765503728b15ffd83b4b75c223326eadbfa4889f0b7b0340f71baba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-04e62072-8d37-46d4-a112-c923d93098a9, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:22:56 np0005548789.localdomain podman[333810]: 2025-12-06 10:22:56.966600362 +0000 UTC m=+0.122261427 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2)
Dec 06 10:22:57 np0005548789.localdomain podman[333810]: 2025-12-06 10:22:57.013065923 +0000 UTC m=+0.168727018 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:22:57 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:22:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:57.125 263652 INFO neutron.agent.dhcp.agent [None req-fe6337ef-bb50-4310-a33d-1eb2350e643e - - - - - -] DHCP configuration for ports {'b7836405-7939-4345-bc2d-b9851667edb3'} is completed
Dec 06 10:22:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:57.136 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:56Z, description=, device_id=d17ccc6d-49c1-4c04-b4b2-d05430f70927, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9e8190>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9e86a0>], id=81d0b0ea-67b7-457e-bfbc-24d21f440b5e, ip_allocation=immediate, mac_address=fa:16:3e:7b:5d:08, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:22:53Z, description=, dns_domain=, id=04e62072-8d37-46d4-a112-c923d93098a9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-657417851-network, port_security_enabled=True, project_id=3006b6c88845443ab13998bd660d02f7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9710, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3279, status=ACTIVE, subnets=['d1dd1f6b-ed10-476e-b0e0-6361467c9e10'], tags=[], tenant_id=3006b6c88845443ab13998bd660d02f7, updated_at=2025-12-06T10:22:53Z, vlan_transparent=None, network_id=04e62072-8d37-46d4-a112-c923d93098a9, port_security_enabled=False, project_id=3006b6c88845443ab13998bd660d02f7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3300, status=DOWN, tags=[], tenant_id=3006b6c88845443ab13998bd660d02f7, updated_at=2025-12-06T10:22:56Z on network 04e62072-8d37-46d4-a112-c923d93098a9
Dec 06 10:22:57 np0005548789.localdomain dnsmasq[333826]: read /var/lib/neutron/dhcp/04e62072-8d37-46d4-a112-c923d93098a9/addn_hosts - 1 addresses
Dec 06 10:22:57 np0005548789.localdomain dnsmasq-dhcp[333826]: read /var/lib/neutron/dhcp/04e62072-8d37-46d4-a112-c923d93098a9/host
Dec 06 10:22:57 np0005548789.localdomain dnsmasq-dhcp[333826]: read /var/lib/neutron/dhcp/04e62072-8d37-46d4-a112-c923d93098a9/opts
Dec 06 10:22:57 np0005548789.localdomain podman[333856]: 2025-12-06 10:22:57.363287679 +0000 UTC m=+0.062972916 container kill f0b3da8b2765503728b15ffd83b4b75c223326eadbfa4889f0b7b0340f71baba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-04e62072-8d37-46d4-a112-c923d93098a9, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:22:57 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e209 e209: 6 total, 6 up, 6 in
Dec 06 10:22:57 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:57.724 263652 INFO neutron.agent.dhcp.agent [None req-db279e16-a3b7-40aa-b550-d60523417811 - - - - - -] DHCP configuration for ports {'81d0b0ea-67b7-457e-bfbc-24d21f440b5e'} is completed
Dec 06 10:22:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e209 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:58 np0005548789.localdomain ceph-mon[298582]: pgmap v454: 177 pgs: 177 active+clean; 196 MiB data, 969 MiB used, 41 GiB / 42 GiB avail; 135 KiB/s rd, 37 KiB/s wr, 190 op/s
Dec 06 10:22:58 np0005548789.localdomain ceph-mon[298582]: osdmap e209: 6 total, 6 up, 6 in
Dec 06 10:22:58 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2409640309' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:22:58 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2409640309' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:22:58 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:58.945 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:22:56Z, description=, device_id=d17ccc6d-49c1-4c04-b4b2-d05430f70927, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb47790>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fb47760>], id=81d0b0ea-67b7-457e-bfbc-24d21f440b5e, ip_allocation=immediate, mac_address=fa:16:3e:7b:5d:08, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:22:53Z, description=, dns_domain=, id=04e62072-8d37-46d4-a112-c923d93098a9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-657417851-network, port_security_enabled=True, project_id=3006b6c88845443ab13998bd660d02f7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9710, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3279, status=ACTIVE, subnets=['d1dd1f6b-ed10-476e-b0e0-6361467c9e10'], tags=[], tenant_id=3006b6c88845443ab13998bd660d02f7, updated_at=2025-12-06T10:22:53Z, vlan_transparent=None, network_id=04e62072-8d37-46d4-a112-c923d93098a9, port_security_enabled=False, project_id=3006b6c88845443ab13998bd660d02f7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3300, status=DOWN, tags=[], tenant_id=3006b6c88845443ab13998bd660d02f7, updated_at=2025-12-06T10:22:56Z on network 04e62072-8d37-46d4-a112-c923d93098a9
Dec 06 10:22:59 np0005548789.localdomain dnsmasq[333826]: read /var/lib/neutron/dhcp/04e62072-8d37-46d4-a112-c923d93098a9/addn_hosts - 1 addresses
Dec 06 10:22:59 np0005548789.localdomain dnsmasq-dhcp[333826]: read /var/lib/neutron/dhcp/04e62072-8d37-46d4-a112-c923d93098a9/host
Dec 06 10:22:59 np0005548789.localdomain podman[333895]: 2025-12-06 10:22:59.158635188 +0000 UTC m=+0.067958188 container kill f0b3da8b2765503728b15ffd83b4b75c223326eadbfa4889f0b7b0340f71baba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-04e62072-8d37-46d4-a112-c923d93098a9, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:22:59 np0005548789.localdomain dnsmasq-dhcp[333826]: read /var/lib/neutron/dhcp/04e62072-8d37-46d4-a112-c923d93098a9/opts
Dec 06 10:22:59 np0005548789.localdomain systemd[1]: tmp-crun.pr6b0Q.mount: Deactivated successfully.
Dec 06 10:22:59 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:22:59.383 263652 INFO neutron.agent.dhcp.agent [None req-7394efdf-8f2c-40d1-afbc-c55308d3573f - - - - - -] DHCP configuration for ports {'81d0b0ea-67b7-457e-bfbc-24d21f440b5e'} is completed
Dec 06 10:22:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:22:59.526 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:00.374 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:00 np0005548789.localdomain ceph-mon[298582]: pgmap v456: 177 pgs: 177 active+clean; 196 MiB data, 953 MiB used, 41 GiB / 42 GiB avail; 216 KiB/s rd, 56 KiB/s wr, 301 op/s
Dec 06 10:23:02 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e210 e210: 6 total, 6 up, 6 in
Dec 06 10:23:02 np0005548789.localdomain ceph-mon[298582]: pgmap v457: 177 pgs: 177 active+clean; 196 MiB data, 953 MiB used, 41 GiB / 42 GiB avail; 81 KiB/s rd, 19 KiB/s wr, 110 op/s
Dec 06 10:23:02 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:02 np0005548789.localdomain ceph-mon[298582]: osdmap e210: 6 total, 6 up, 6 in
Dec 06 10:23:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e211 e211: 6 total, 6 up, 6 in
Dec 06 10:23:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "bf11e39e-6643-46ed-983d-322b7205a5ae", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:23:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "bf11e39e-6643-46ed-983d-322b7205a5ae", "format": "json"}]: dispatch
Dec 06 10:23:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:23:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "format": "json"}]: dispatch
Dec 06 10:23:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:04 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:23:04.300 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:23:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:04.529 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:04 np0005548789.localdomain ceph-mon[298582]: pgmap v459: 177 pgs: 177 active+clean; 196 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 84 KiB/s rd, 34 KiB/s wr, 118 op/s
Dec 06 10:23:04 np0005548789.localdomain ceph-mon[298582]: osdmap e211: 6 total, 6 up, 6 in
Dec 06 10:23:04 np0005548789.localdomain sudo[333915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:23:04 np0005548789.localdomain sudo[333915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:23:04 np0005548789.localdomain sudo[333915]: pam_unix(sudo:session): session closed for user root
Dec 06 10:23:04 np0005548789.localdomain sudo[333933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:23:04 np0005548789.localdomain sudo[333933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:23:05 np0005548789.localdomain sudo[333933]: pam_unix(sudo:session): session closed for user root
Dec 06 10:23:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:05.407 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:05 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e212 e212: 6 total, 6 up, 6 in
Dec 06 10:23:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:23:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:23:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:23:05 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:23:05 np0005548789.localdomain sudo[333982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:23:05 np0005548789.localdomain sudo[333982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:23:05 np0005548789.localdomain sudo[333982]: pam_unix(sudo:session): session closed for user root
Dec 06 10:23:06 np0005548789.localdomain ceph-mon[298582]: pgmap v461: 177 pgs: 177 active+clean; 196 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 87 KiB/s rd, 35 KiB/s wr, 123 op/s
Dec 06 10:23:06 np0005548789.localdomain ceph-mon[298582]: osdmap e212: 6 total, 6 up, 6 in
Dec 06 10:23:06 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e213 e213: 6 total, 6 up, 6 in
Dec 06 10:23:07 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "snap_name": "4ddcde0f-43b5-4b51-a615-3fe828471f71", "format": "json"}]: dispatch
Dec 06 10:23:07 np0005548789.localdomain ceph-mon[298582]: osdmap e213: 6 total, 6 up, 6 in
Dec 06 10:23:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:23:07 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e214 e214: 6 total, 6 up, 6 in
Dec 06 10:23:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:23:07 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.917 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.924 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f82f0d4c-e0f9-41f9-85bd-60caefc79060', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:23:07.918876', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '8ef6eaa6-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.168246417, 'message_signature': '09f51530a368551a372290c68c3a874ef5e62b9eb39f75e57300c76a5df3f503'}]}, 'timestamp': '2025-12-06 10:23:07.925824', '_unique_id': '9ffe9a6c20df420aa47bb6dd9712cfa1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.928 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.930 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.930 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3e68bf2-4e8a-4183-a041-c30944a9c07a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:23:07.930455', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '8ef7bb8e-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.168246417, 'message_signature': '838d1aeb92eee4594c569a032bce97b0984877edf0810a616254330c16bcd708'}]}, 'timestamp': '2025-12-06 10:23:07.931023', '_unique_id': 'c12ce357c7a04ec58221e2612b4c633d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.932 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.933 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.933 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '383fd15e-4b38-4892-9cdd-dbe7de181f5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:23:07.933463', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '8ef8308c-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.168246417, 'message_signature': 'f2f8b58ed7393922d611df1e60eca428c0f75c63881293de9a68c02ba7358b8d'}]}, 'timestamp': '2025-12-06 10:23:07.934026', '_unique_id': '978e699060f749baabaa25e167baac68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.935 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.936 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.936 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.937 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c79b969f-1916-46c4-a823-dce5e870cb3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:23:07.937082', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '8ef8c04c-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.168246417, 'message_signature': '672d796e9eec7334ba30755ef2ca2b992bcb9512c52564e67eb851a1c63e89c9'}]}, 'timestamp': '2025-12-06 10:23:07.937682', '_unique_id': '22eb05bdaafc4cfdae3f5ce564d28449'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.938 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.939 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:23:07 np0005548789.localdomain systemd[1]: tmp-crun.yCnqke.mount: Deactivated successfully.
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.969 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.970 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d366e4b-8bfe-40a1-8832-5a9cdc4f359f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:23:07.940129', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8efdb61a-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.189501077, 'message_signature': 'a8414a8b1f7007f4db6c24be89797b7b0a499cb894df1d753a89b2b3b7ccf63d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:23:07.940129', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8efdd104-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.189501077, 'message_signature': '2d7d9d31ea133854bc93efbf6718ea7362f11cdbf8abbd91bc1b68bb470b54c1'}]}, 'timestamp': '2025-12-06 10:23:07.970945', '_unique_id': '063bce6128ab43f68d295d3c38dd91fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.972 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:07.974 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain systemd[1]: tmp-crun.9aeuN5.mount: Deactivated successfully.
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.006 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 18760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '737260c0-bb81-4379-af82-03dda64d20b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18760000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:23:07.974482', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8f03521e-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.255417662, 'message_signature': 'e40e45acc04b2c3333443193188ae8840e109526b1cc0dd61fa6d39ab468539f'}]}, 'timestamp': '2025-12-06 10:23:08.006998', '_unique_id': '2c03ef6504854236b8f3adbe689d57f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.009 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.010 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.010 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.010 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain podman[334000]: 2025-12-06 10:23:08.012469043 +0000 UTC m=+0.162190138 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '576d7336-8d2c-475d-aa51-52bd0d5568a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:23:08.010213', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8f03e6a2-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.189501077, 'message_signature': 'e56893d2e43d29a111482c905e5bb5c0d2ce3bdf2d231a7664f064ab03bca74b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:23:08.010213', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8f03faac-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.189501077, 'message_signature': '964d36e68aa36d69b33e022a3b2392a4ca2a35435d806b2c511ab85125bf10c5'}]}, 'timestamp': '2025-12-06 10:23:08.011199', '_unique_id': '8aca2530269f42a78d5d18db51f6cc95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.012 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.015 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain podman[334000]: 2025-12-06 10:23:08.043009137 +0000 UTC m=+0.192730292 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.046 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.047 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa19168c-7921-4908-91d0-422df314a7ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:23:08.015438', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8f097ee6-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.264808099, 'message_signature': 'c693f3a6f6557c0d715bf27264da16d69848d60c8eb19dbc9f6311e24e1a385c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:23:08.015438', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8f099548-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.264808099, 'message_signature': '36d92b9c0d2929bc3989d037d7ee91ac8b691af9a409f3d303750a871749699c'}]}, 'timestamp': '2025-12-06 10:23:08.047981', '_unique_id': '1dcc22773f204a0b92d0af6e36c25a0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.049 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.051 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.051 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed021326-8368-4bdb-9252-dc2f1d8fe584', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:23:08.051485', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '8f0a3520-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.168246417, 'message_signature': '87eee7626835890fabafbcde5b8e5cef25f1b4a7cb5c8194c5eb872d8b86578e'}]}, 'timestamp': '2025-12-06 10:23:08.052053', '_unique_id': 'bdc13ecd3ac44209a2923315e63842d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.053 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.054 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.055 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5878c55-fd86-4858-bda5-2c9bec5cf421', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:23:08.055156', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '8f0ac5e4-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.168246417, 'message_signature': 'bd1176061e2b57a3b5767340e8e9ab3887d66ec1d72b36efe4ddde6c189e9318'}]}, 'timestamp': '2025-12-06 10:23:08.055903', '_unique_id': '76da1030f7fb49f7874c83c9d19fb505'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.057 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.059 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.059 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1035dcc2-7252-4e26-b44e-4feee5767eed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:23:08.059306', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8f0b6756-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.255417662, 'message_signature': 'e5ff10afebb3d5ececfb6187c63817d5489eccc6cd0e63b00aac46bf669e14be'}]}, 'timestamp': '2025-12-06 10:23:08.060003', '_unique_id': '5603f39792ec49d59d02e927a7953ea0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.061 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.062 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.063 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain podman[334001]: 2025-12-06 10:23:07.963230908 +0000 UTC m=+0.112288363 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49b8350e-cd34-47fa-9326-25649f576eb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:23:08.063077', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '8f0bfb80-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.168246417, 'message_signature': '32c06a3d41abadd87f3264113076ef666872b56f7baea29bf28df99372b5b408'}]}, 'timestamp': '2025-12-06 10:23:08.063720', '_unique_id': '284ee9f6834f49f3a193b4a72ba5ffd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.064 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.066 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.066 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1435c678-3829-4179-a635-b5df441d0e95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:23:08.066918', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '8f0c922a-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.168246417, 'message_signature': '322890b6114a18501c4ecd1a849dc3a8491b7ae29b273733864ef72d3f1ff31b'}]}, 'timestamp': '2025-12-06 10:23:08.067575', '_unique_id': 'fe1f3cc759894826a02eec1a685b4cf6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.068 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.070 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.070 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.070 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02c640c9-b624-40c8-aea2-6bd16cc0b01e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:23:08.070477', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8f0d17d6-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.264808099, 'message_signature': 'c84f0046849d978cec5d161b774e4779b64114eaf6d3a315e6d8ca2a5179408d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:23:08.070477', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8f0d244c-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.264808099, 'message_signature': '3d2587307ce70d1958e353fd5c4608c12e3a9a664ce0581e5ec687ef6c5b5279'}]}, 'timestamp': '2025-12-06 10:23:08.071158', '_unique_id': '157de13fe64941a18a7f8efbce4878f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.072 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.073 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.073 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.074 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5af9a935-ab1c-44b5-894c-1a8405d4c064', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:23:08.073843', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8f0d98fa-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.264808099, 'message_signature': 'aa228fce003a18b90360adb1bd20d45ab3915347387a42c4aa8249a644031d44'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:23:08.073843', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8f0da372-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.264808099, 'message_signature': '9babc7710011ef8ab0f6f7403d94cbbeed1d7fd4792cc243a2853836acf11c4e'}]}, 'timestamp': '2025-12-06 10:23:08.074406', '_unique_id': '6645be711743407a86f25e01b01929f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.075 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.076 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9fc86c6-0b32-49a5-a0b6-fcefcb94808b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:23:08.075854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8f0deb2a-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.264808099, 'message_signature': '8ddc03dfe4b1bc88c817983fc24891d0d7dabe0d38f65de9699855df1914938a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:23:08.075854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8f0df5de-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.264808099, 'message_signature': 'd925d75bd14a139cc0b6f28aba0314a3ae46cc772c7476309917c5abe4551eb7'}]}, 'timestamp': '2025-12-06 10:23:08.076529', '_unique_id': '6a4efd20a0f04a6ea46ef624e7694581'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.077 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.078 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.078 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.078 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '056741f8-e175-4bdd-a505-6bdb7150dce7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:23:08.078279', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8f0e46ce-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.189501077, 'message_signature': '5218dd92a8e0fe453bfd765100d7e8045323fd20dff9d96434757c6ccda9de2e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:23:08.078279', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8f0e513c-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.189501077, 'message_signature': 'a6ddbe7fb5a175918b27797ec35b3f93da3777b52c91d05dac508c3f22a2e20d'}]}, 'timestamp': '2025-12-06 10:23:08.078876', '_unique_id': '1b56811025aa479fb5b2242e1f237311'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.079 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.080 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.080 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.080 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23fbd32b-4367-4a27-b272-ee44da223c4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:23:08.080439', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8f0e9a0c-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.264808099, 'message_signature': '46749e6d8a0977a2e8675da4d937c81d6bfbfff293c2a545d799de16b1711c2e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:23:08.080439', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8f0ea524-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.264808099, 'message_signature': 'ae93e0946201eb989278f60715978475b93abbbd99a70817ca5bf2b07e74ccbf'}]}, 'timestamp': '2025-12-06 10:23:08.081061', '_unique_id': 'd59bb563716e4a7489ce1fad2a105073'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.081 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.082 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.082 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.083 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1d3a79d-d086-4fc1-bc6b-7094b75d06ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:23:08.082841', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8f0ef9de-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.264808099, 'message_signature': 'e5524067f8706396eac904f6fa5fd468c9186693a131c079da27be7959f6f1ea'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:23:08.082841', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8f0f069a-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.264808099, 'message_signature': '9c2be2848569175687788fbd4cbadccc3aff02f1ff55159e72f4b10129ddc6e4'}]}, 'timestamp': '2025-12-06 10:23:08.083508', '_unique_id': 'c180f16e561141008dfe879cc48b1a0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.085 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '226dd84e-683c-4f89-8ba5-8123a5c4c183', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:23:08.085050', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '8f0f505a-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.168246417, 'message_signature': '90bc878095e56e6f3c091e488b7b325b1daef7a2f147ee8e7c8b6e28dc229b85'}]}, 'timestamp': '2025-12-06 10:23:08.085437', '_unique_id': '9cd103c741a14e269ef96def07a6cf15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.087 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5edad3b9-aa59-4615-9420-3e6d44582889', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:23:08.087000', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '8f0f9c04-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13006.168246417, 'message_signature': '34eee62ef5d80e90af4b4a1837023e1a143a983c9dc3aed4e3cc6d75319560b8'}]}, 'timestamp': '2025-12-06 10:23:08.087344', '_unique_id': '2210163cadd748358e21a9b1a953cf65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:23:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:23:08.088 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:23:08 np0005548789.localdomain podman[334001]: 2025-12-06 10:23:08.09742299 +0000 UTC m=+0.246480395 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:23:08 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:23:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:08 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "bf11e39e-6643-46ed-983d-322b7205a5ae", "format": "json"}]: dispatch
Dec 06 10:23:08 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "bf11e39e-6643-46ed-983d-322b7205a5ae", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:08 np0005548789.localdomain ceph-mon[298582]: pgmap v464: 177 pgs: 177 active+clean; 196 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 6.3 KiB/s rd, 25 KiB/s wr, 14 op/s
Dec 06 10:23:08 np0005548789.localdomain ceph-mon[298582]: osdmap e214: 6 total, 6 up, 6 in
Dec 06 10:23:08 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1802975927' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:08 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1802975927' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:08 np0005548789.localdomain dnsmasq[333826]: read /var/lib/neutron/dhcp/04e62072-8d37-46d4-a112-c923d93098a9/addn_hosts - 0 addresses
Dec 06 10:23:08 np0005548789.localdomain podman[334056]: 2025-12-06 10:23:08.845522688 +0000 UTC m=+0.058333834 container kill f0b3da8b2765503728b15ffd83b4b75c223326eadbfa4889f0b7b0340f71baba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-04e62072-8d37-46d4-a112-c923d93098a9, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:23:08 np0005548789.localdomain dnsmasq-dhcp[333826]: read /var/lib/neutron/dhcp/04e62072-8d37-46d4-a112-c923d93098a9/host
Dec 06 10:23:08 np0005548789.localdomain dnsmasq-dhcp[333826]: read /var/lib/neutron/dhcp/04e62072-8d37-46d4-a112-c923d93098a9/opts
Dec 06 10:23:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:09.051 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:09 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:23:09Z|00504|binding|INFO|Releasing lport ec870270-76fc-404f-9ac8-aae83a5c5051 from this chassis (sb_readonly=0)
Dec 06 10:23:09 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:23:09Z|00505|binding|INFO|Setting lport ec870270-76fc-404f-9ac8-aae83a5c5051 down in Southbound
Dec 06 10:23:09 np0005548789.localdomain kernel: device tapec870270-76 left promiscuous mode
Dec 06 10:23:09 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:23:09.063 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-04e62072-8d37-46d4-a112-c923d93098a9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04e62072-8d37-46d4-a112-c923d93098a9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3006b6c88845443ab13998bd660d02f7', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0dafa4fe-04f5-4502-a649-2a574bf9c45c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=ec870270-76fc-404f-9ac8-aae83a5c5051) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:23:09 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:23:09.065 160509 INFO neutron.agent.ovn.metadata.agent [-] Port ec870270-76fc-404f-9ac8-aae83a5c5051 in datapath 04e62072-8d37-46d4-a112-c923d93098a9 unbound from our chassis
Dec 06 10:23:09 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:23:09.068 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 04e62072-8d37-46d4-a112-c923d93098a9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:23:09 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:23:09.069 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[e841b348-bf95-4537-9e1d-6f9462d15d4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:23:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:09.079 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:09.576 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:10 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:23:10 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:23:10 np0005548789.localdomain podman[334095]: 2025-12-06 10:23:10.085735449 +0000 UTC m=+0.061360337 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:23:10 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:23:10 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:23:10Z|00506|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:23:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:10.337 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:10.410 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:10 np0005548789.localdomain ceph-mon[298582]: pgmap v466: 177 pgs: 177 active+clean; 196 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 21 KiB/s wr, 72 op/s
Dec 06 10:23:10 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "snap_name": "2106c631-2fac-4a48-9eb5-105ae638e038", "format": "json"}]: dispatch
Dec 06 10:23:10 np0005548789.localdomain dnsmasq[333826]: exiting on receipt of SIGTERM
Dec 06 10:23:10 np0005548789.localdomain podman[334133]: 2025-12-06 10:23:10.786662925 +0000 UTC m=+0.064145592 container kill f0b3da8b2765503728b15ffd83b4b75c223326eadbfa4889f0b7b0340f71baba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-04e62072-8d37-46d4-a112-c923d93098a9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:23:10 np0005548789.localdomain systemd[1]: libpod-f0b3da8b2765503728b15ffd83b4b75c223326eadbfa4889f0b7b0340f71baba.scope: Deactivated successfully.
Dec 06 10:23:10 np0005548789.localdomain podman[334145]: 2025-12-06 10:23:10.85226813 +0000 UTC m=+0.052728633 container died f0b3da8b2765503728b15ffd83b4b75c223326eadbfa4889f0b7b0340f71baba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-04e62072-8d37-46d4-a112-c923d93098a9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:23:10 np0005548789.localdomain podman[334145]: 2025-12-06 10:23:10.94319645 +0000 UTC m=+0.143656913 container cleanup f0b3da8b2765503728b15ffd83b4b75c223326eadbfa4889f0b7b0340f71baba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-04e62072-8d37-46d4-a112-c923d93098a9, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:23:10 np0005548789.localdomain systemd[1]: libpod-conmon-f0b3da8b2765503728b15ffd83b4b75c223326eadbfa4889f0b7b0340f71baba.scope: Deactivated successfully.
Dec 06 10:23:10 np0005548789.localdomain podman[334152]: 2025-12-06 10:23:10.966670367 +0000 UTC m=+0.155164403 container remove f0b3da8b2765503728b15ffd83b4b75c223326eadbfa4889f0b7b0340f71baba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-04e62072-8d37-46d4-a112-c923d93098a9, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:23:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:23:11.002 263652 INFO neutron.agent.dhcp.agent [None req-454f3b01-48e3-4f7d-ae05-98866e14b70c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:23:11 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:23:11.003 263652 INFO neutron.agent.dhcp.agent [None req-454f3b01-48e3-4f7d-ae05-98866e14b70c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:23:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-3e17231f4b67de947b4d429d81a6f4334b12776ef1dd2b29f4fb039abcc7bc52-merged.mount: Deactivated successfully.
Dec 06 10:23:11 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f0b3da8b2765503728b15ffd83b4b75c223326eadbfa4889f0b7b0340f71baba-userdata-shm.mount: Deactivated successfully.
Dec 06 10:23:11 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d04e62072\x2d8d37\x2d46d4\x2da112\x2dc923d93098a9.mount: Deactivated successfully.
Dec 06 10:23:11 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e215 e215: 6 total, 6 up, 6 in
Dec 06 10:23:12 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e216 e216: 6 total, 6 up, 6 in
Dec 06 10:23:12 np0005548789.localdomain ceph-mon[298582]: pgmap v467: 177 pgs: 177 active+clean; 196 MiB data, 954 MiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 20 KiB/s wr, 68 op/s
Dec 06 10:23:12 np0005548789.localdomain ceph-mon[298582]: osdmap e215: 6 total, 6 up, 6 in
Dec 06 10:23:12 np0005548789.localdomain ceph-mon[298582]: osdmap e216: 6 total, 6 up, 6 in
Dec 06 10:23:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:14 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:23:14.527 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:23:14Z, description=, device_id=2b5489f4-b8c8-4de2-9a14-bc27f555fe1d, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc66040>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc66c70>], id=d9db14b8-6eec-454e-9bf9-05f99df47bdd, ip_allocation=immediate, mac_address=fa:16:3e:c6:34:ab, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3389, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:23:14Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:23:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:14.578 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:14 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "snap_name": "2106c631-2fac-4a48-9eb5-105ae638e038_339c2891-c9dd-4dd5-bc06-64f8eef95887", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:14 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "snap_name": "2106c631-2fac-4a48-9eb5-105ae638e038", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:14 np0005548789.localdomain ceph-mon[298582]: pgmap v470: 177 pgs: 177 active+clean; 196 MiB data, 963 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 44 KiB/s wr, 108 op/s
Dec 06 10:23:14 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:23:14 np0005548789.localdomain podman[334189]: 2025-12-06 10:23:14.775331382 +0000 UTC m=+0.067615029 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:23:14 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:23:14 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:23:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:23:14 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:23:14 np0005548789.localdomain systemd[1]: tmp-crun.zAwduk.mount: Deactivated successfully.
Dec 06 10:23:14 np0005548789.localdomain podman[334202]: 2025-12-06 10:23:14.894554485 +0000 UTC m=+0.083779022 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, name=ubi9-minimal, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc.)
Dec 06 10:23:14 np0005548789.localdomain podman[334202]: 2025-12-06 10:23:14.901042243 +0000 UTC m=+0.090266810 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:23:14 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:23:14 np0005548789.localdomain podman[334203]: 2025-12-06 10:23:14.9401901 +0000 UTC m=+0.122120943 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 10:23:14 np0005548789.localdomain podman[334203]: 2025-12-06 10:23:14.975134579 +0000 UTC m=+0.157065412 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:23:14 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:23:15 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:23:15.026 263652 INFO neutron.agent.dhcp.agent [None req-e80e3ecc-32d2-430e-9ed7-0cedd695fc20 - - - - - -] DHCP configuration for ports {'d9db14b8-6eec-454e-9bf9-05f99df47bdd'} is completed
Dec 06 10:23:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:15.404 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:15.411 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:23:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:23:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:23:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:23:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:23:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:23:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:23:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:23:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:23:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:23:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:23:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:23:16 np0005548789.localdomain ceph-mon[298582]: pgmap v471: 177 pgs: 177 active+clean; 196 MiB data, 963 MiB used, 41 GiB / 42 GiB avail; 61 KiB/s rd, 35 KiB/s wr, 86 op/s
Dec 06 10:23:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:17.639 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:17 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "snap_name": "4ddcde0f-43b5-4b51-a615-3fe828471f71_bd1921c7-7a11-44e6-a718-d94ea8eab798", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:17 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "snap_name": "4ddcde0f-43b5-4b51-a615-3fe828471f71", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:18 np0005548789.localdomain ceph-mon[298582]: pgmap v472: 177 pgs: 177 active+clean; 196 MiB data, 963 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 18 KiB/s wr, 30 op/s
Dec 06 10:23:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:23:18 np0005548789.localdomain systemd[1]: tmp-crun.dStZFj.mount: Deactivated successfully.
Dec 06 10:23:18 np0005548789.localdomain podman[334246]: 2025-12-06 10:23:18.927490277 +0000 UTC m=+0.088511817 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:23:18 np0005548789.localdomain podman[334246]: 2025-12-06 10:23:18.945234189 +0000 UTC m=+0.106255729 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:23:18 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:23:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:19.619 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:19 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e217 e217: 6 total, 6 up, 6 in
Dec 06 10:23:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:20.414 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e218 e218: 6 total, 6 up, 6 in
Dec 06 10:23:20 np0005548789.localdomain ceph-mon[298582]: pgmap v473: 177 pgs: 177 active+clean; 196 MiB data, 981 MiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 41 KiB/s wr, 75 op/s
Dec 06 10:23:20 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "format": "json"}]: dispatch
Dec 06 10:23:20 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e300c2ad-ae7a-425b-b81e-235b17341052", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:20 np0005548789.localdomain ceph-mon[298582]: osdmap e217: 6 total, 6 up, 6 in
Dec 06 10:23:20 np0005548789.localdomain sshd[334265]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:23:21 np0005548789.localdomain ceph-mon[298582]: osdmap e218: 6 total, 6 up, 6 in
Dec 06 10:23:21 np0005548789.localdomain ceph-mon[298582]: pgmap v476: 177 pgs: 177 active+clean; 196 MiB data, 981 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 23 KiB/s wr, 45 op/s
Dec 06 10:23:21 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1707344585' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:21 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1707344585' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:22 np0005548789.localdomain sshd[334265]: Received disconnect from 118.219.234.233 port 37800:11: Bye Bye [preauth]
Dec 06 10:23:22 np0005548789.localdomain sshd[334265]: Disconnected from authenticating user root 118.219.234.233 port 37800 [preauth]
Dec 06 10:23:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:23:22 np0005548789.localdomain podman[334267]: 2025-12-06 10:23:22.428416793 +0000 UTC m=+0.100806853 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:23:22 np0005548789.localdomain podman[334267]: 2025-12-06 10:23:22.441094341 +0000 UTC m=+0.113484411 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:23:22 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:23:22 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4034920819' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:23:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:23:23 np0005548789.localdomain ceph-mon[298582]: pgmap v477: 177 pgs: 177 active+clean; 261 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 108 KiB/s rd, 8.0 MiB/s wr, 153 op/s
Dec 06 10:23:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:23:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:23:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:23:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19269 "" "Go-http-client/1.1"
Dec 06 10:23:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:24.663 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:24 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:23:24 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "format": "json"}]: dispatch
Dec 06 10:23:24 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:25.416 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:25 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4146455680' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:25 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4146455680' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:25 np0005548789.localdomain ceph-mon[298582]: pgmap v478: 177 pgs: 177 active+clean; 261 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 108 KiB/s rd, 8.0 MiB/s wr, 153 op/s
Dec 06 10:23:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e219 e219: 6 total, 6 up, 6 in
Dec 06 10:23:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:23:27 np0005548789.localdomain podman[334291]: 2025-12-06 10:23:27.95592055 +0000 UTC m=+0.107626461 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:23:28 np0005548789.localdomain podman[334291]: 2025-12-06 10:23:28.020906956 +0000 UTC m=+0.172612797 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:23:28 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:23:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:28 np0005548789.localdomain ceph-mon[298582]: pgmap v479: 177 pgs: 177 active+clean; 261 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 75 KiB/s rd, 8.0 MiB/s wr, 108 op/s
Dec 06 10:23:28 np0005548789.localdomain ceph-mon[298582]: osdmap e219: 6 total, 6 up, 6 in
Dec 06 10:23:28 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "f95d5566-befb-4ef3-874d-5a8d6c4a6df5", "format": "json"}]: dispatch
Dec 06 10:23:29 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "265f7762-8f3c-4f93-a2e8-4d31ff999c12", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:23:29 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:29.666 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:30.419 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:30 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "265f7762-8f3c-4f93-a2e8-4d31ff999c12", "format": "json"}]: dispatch
Dec 06 10:23:30 np0005548789.localdomain ceph-mon[298582]: pgmap v481: 177 pgs: 177 active+clean; 641 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 137 KiB/s rd, 53 MiB/s wr, 216 op/s
Dec 06 10:23:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:31.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:31.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:23:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:31.199 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:23:32 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "82ca32b7-c603-4087-ab30-5f8e76d96b29", "format": "json"}]: dispatch
Dec 06 10:23:32 np0005548789.localdomain ceph-mon[298582]: pgmap v482: 177 pgs: 177 active+clean; 641 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 116 KiB/s rd, 44 MiB/s wr, 182 op/s
Dec 06 10:23:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:33 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "265f7762-8f3c-4f93-a2e8-4d31ff999c12", "snap_name": "1386d79b-64b7-4747-8e0b-aeb5f89fda0b", "format": "json"}]: dispatch
Dec 06 10:23:34 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:34.701 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:34 np0005548789.localdomain ceph-mon[298582]: pgmap v483: 177 pgs: 177 active+clean; 989 MiB data, 3.2 GiB used, 39 GiB / 42 GiB avail; 104 KiB/s rd, 73 MiB/s wr, 183 op/s
Dec 06 10:23:34 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:23:34 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/685384975' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:34 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:23:34 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/685384975' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:35.421 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:35 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/685384975' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:35 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/685384975' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:35 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "82ca32b7-c603-4087-ab30-5f8e76d96b29_d91f5f9e-cb0d-49f0-967d-dd0f7dcc891c", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:35 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "82ca32b7-c603-4087-ab30-5f8e76d96b29", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:35 np0005548789.localdomain ceph-mon[298582]: pgmap v484: 177 pgs: 177 active+clean; 989 MiB data, 3.2 GiB used, 39 GiB / 42 GiB avail; 104 KiB/s rd, 73 MiB/s wr, 183 op/s
Dec 06 10:23:35 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "265f7762-8f3c-4f93-a2e8-4d31ff999c12", "snap_name": "1386d79b-64b7-4747-8e0b-aeb5f89fda0b_b3a51089-88de-4e95-b3d9-848c07c2978e", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:35 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "265f7762-8f3c-4f93-a2e8-4d31ff999c12", "snap_name": "1386d79b-64b7-4747-8e0b-aeb5f89fda0b", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:37 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1133027623' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:37 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1133027623' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:38 np0005548789.localdomain ceph-mon[298582]: pgmap v485: 177 pgs: 177 active+clean; 989 MiB data, 3.2 GiB used, 39 GiB / 42 GiB avail; 104 KiB/s rd, 73 MiB/s wr, 183 op/s
Dec 06 10:23:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:23:38 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:23:38 np0005548789.localdomain podman[334315]: 2025-12-06 10:23:38.935605788 +0000 UTC m=+0.093504369 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:23:38 np0005548789.localdomain podman[334316]: 2025-12-06 10:23:38.99357311 +0000 UTC m=+0.145504079 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:23:39 np0005548789.localdomain podman[334316]: 2025-12-06 10:23:39.005284608 +0000 UTC m=+0.157215597 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:23:39 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:23:39 np0005548789.localdomain podman[334315]: 2025-12-06 10:23:39.021908476 +0000 UTC m=+0.179807057 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:23:39 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:23:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:23:39 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3601030272' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:23:39 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3601030272' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:39.200 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:39.221 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:23:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:39.221 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:23:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:39.222 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:23:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:39.222 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:23:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:39.223 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:23:39 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "5aa4551f-daf4-40ec-a318-c443e92f70d8", "format": "json"}]: dispatch
Dec 06 10:23:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3601030272' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3601030272' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e220 e220: 6 total, 6 up, 6 in
Dec 06 10:23:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:23:39 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/860284658' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:39.702 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:39.710 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:23:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:39.779 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:23:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:39.779 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:23:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:39.970 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:23:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:39.972 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11187MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:23:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:39.972 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:23:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:39.972 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:23:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:40.278 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:23:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:40.279 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:23:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:40.279 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:23:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:40.346 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing inventories for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:23:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:40.423 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:40.455 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Updating ProviderTree inventory for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:23:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:40.455 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Updating inventory in ProviderTree for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:23:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:40.486 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing aggregate associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:23:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:40.507 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing trait associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, traits: HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AMD_SVM,HW_CPU_X86_FMA3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:23:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:40.542 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:23:40 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "265f7762-8f3c-4f93-a2e8-4d31ff999c12", "format": "json"}]: dispatch
Dec 06 10:23:40 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "265f7762-8f3c-4f93-a2e8-4d31ff999c12", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:40 np0005548789.localdomain ceph-mon[298582]: pgmap v486: 177 pgs: 177 active+clean; 337 MiB data, 3.8 GiB used, 38 GiB / 42 GiB avail; 155 KiB/s rd, 82 MiB/s wr, 276 op/s
Dec 06 10:23:40 np0005548789.localdomain ceph-mon[298582]: osdmap e220: 6 total, 6 up, 6 in
Dec 06 10:23:40 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/860284658' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:40 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e221 e221: 6 total, 6 up, 6 in
Dec 06 10:23:40 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:23:40 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2359987376' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:41.012 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:23:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:41.018 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:23:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:41.037 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:23:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:41.039 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:23:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:41.040 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.067s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:23:41 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e222 e222: 6 total, 6 up, 6 in
Dec 06 10:23:41 np0005548789.localdomain ceph-mon[298582]: osdmap e221: 6 total, 6 up, 6 in
Dec 06 10:23:41 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2359987376' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:42.016 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:42.017 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:42.017 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:23:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:42.018 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:23:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:42.138 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:23:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:42.139 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:23:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:42.139 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:23:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:42.139 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:23:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:42.557 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:23:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:42.581 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:23:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:42.582 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:23:42 np0005548789.localdomain ceph-mon[298582]: pgmap v489: 177 pgs: 177 active+clean; 337 MiB data, 3.8 GiB used, 38 GiB / 42 GiB avail; 96 KiB/s rd, 29 MiB/s wr, 175 op/s
Dec 06 10:23:42 np0005548789.localdomain ceph-mon[298582]: osdmap e222: 6 total, 6 up, 6 in
Dec 06 10:23:42 np0005548789.localdomain sshd[334400]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:23:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e223 e223: 6 total, 6 up, 6 in
Dec 06 10:23:43 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "5aa4551f-daf4-40ec-a318-c443e92f70d8_a730a057-9533-4839-a43f-0f73eb78bbd2", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:43 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "5aa4551f-daf4-40ec-a318-c443e92f70d8", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:44 np0005548789.localdomain sshd[334400]: Received disconnect from 14.194.101.210 port 50366:11: Bye Bye [preauth]
Dec 06 10:23:44 np0005548789.localdomain sshd[334400]: Disconnected from authenticating user root 14.194.101.210 port 50366 [preauth]
Dec 06 10:23:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:44.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:44.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:44.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:44.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:44.184 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:23:44 np0005548789.localdomain ceph-mon[298582]: pgmap v491: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 154 KiB/s rd, 39 MiB/s wr, 281 op/s
Dec 06 10:23:44 np0005548789.localdomain ceph-mon[298582]: osdmap e223: 6 total, 6 up, 6 in
Dec 06 10:23:44 np0005548789.localdomain ceph-mon[298582]: mgrmap e52: np0005548790.kvkfyr(active, since 12m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:23:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:44.751 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:45.426 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:45 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:23:45.511 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:23:45Z, description=, device_id=c9e741a0-1e78-4ba5-9ba8-789872d3aa4a, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa32eb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa32ee0>], id=08370a59-6bd8-4ee2-99a3-75cdfb1fe917, ip_allocation=immediate, mac_address=fa:16:3e:35:f6:22, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3497, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:23:45Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:23:45 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:23:45 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:23:45 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:23:45 np0005548789.localdomain podman[334418]: 2025-12-06 10:23:45.762928755 +0000 UTC m=+0.061220582 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:23:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:23:45 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:23:45 np0005548789.localdomain podman[334432]: 2025-12-06 10:23:45.883837131 +0000 UTC m=+0.094898242 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Dec 06 10:23:45 np0005548789.localdomain podman[334432]: 2025-12-06 10:23:45.923205505 +0000 UTC m=+0.134266626 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 10:23:45 np0005548789.localdomain systemd[1]: tmp-crun.F2963D.mount: Deactivated successfully.
Dec 06 10:23:45 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:23:45 np0005548789.localdomain podman[334434]: 2025-12-06 10:23:45.947822997 +0000 UTC m=+0.154766891 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 10:23:45 np0005548789.localdomain podman[334434]: 2025-12-06 10:23:45.963208418 +0000 UTC m=+0.170152372 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 10:23:45 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:23:46 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:23:46.127 263652 INFO neutron.agent.dhcp.agent [None req-88eff2e5-fd08-4e50-8cf8-6ae1d862819f - - - - - -] DHCP configuration for ports {'08370a59-6bd8-4ee2-99a3-75cdfb1fe917'} is completed
Dec 06 10:23:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:46.568 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:23:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:23:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:23:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:23:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:23:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:23:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:23:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:23:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:23:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:23:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:23:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:23:47 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e224 e224: 6 total, 6 up, 6 in
Dec 06 10:23:47 np0005548789.localdomain ceph-mon[298582]: pgmap v493: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 64 KiB/s wr, 49 op/s
Dec 06 10:23:47 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "5755ebf1-140d-4767-bf58-cf9be5b35f79", "format": "json"}]: dispatch
Dec 06 10:23:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:47.184 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:23:47.341 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:23:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:23:47.341 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:23:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:23:47.342 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:23:47 np0005548789.localdomain sshd[334476]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:23:47 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e225 e225: 6 total, 6 up, 6 in
Dec 06 10:23:48 np0005548789.localdomain ceph-mon[298582]: osdmap e224: 6 total, 6 up, 6 in
Dec 06 10:23:48 np0005548789.localdomain ceph-mon[298582]: pgmap v495: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 60 KiB/s wr, 47 op/s
Dec 06 10:23:48 np0005548789.localdomain ceph-mon[298582]: osdmap e225: 6 total, 6 up, 6 in
Dec 06 10:23:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:48.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:48.976 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:49 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2848039413' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:49.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:23:49.653 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:23:49 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:23:49.654 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:23:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:49.685 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:49.762 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:23:49 np0005548789.localdomain podman[334477]: 2025-12-06 10:23:49.927812038 +0000 UTC m=+0.091307083 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:23:49 np0005548789.localdomain podman[334477]: 2025-12-06 10:23:49.965746877 +0000 UTC m=+0.129241902 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:23:49 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:23:50 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "5755ebf1-140d-4767-bf58-cf9be5b35f79_20013db0-3906-4db4-b01b-002cf0a0c6bc", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:50 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "5755ebf1-140d-4767-bf58-cf9be5b35f79", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:50 np0005548789.localdomain ceph-mon[298582]: pgmap v497: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 21 KiB/s wr, 35 op/s
Dec 06 10:23:50 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3773085003' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:50 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3773085003' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:50 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1887499854' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:50.428 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:51 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/585236573' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:51 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e226 e226: 6 total, 6 up, 6 in
Dec 06 10:23:52 np0005548789.localdomain ceph-mon[298582]: osdmap e226: 6 total, 6 up, 6 in
Dec 06 10:23:52 np0005548789.localdomain ceph-mon[298582]: pgmap v499: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 21 KiB/s wr, 35 op/s
Dec 06 10:23:52 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2689749222' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:23:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:52.195 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:52.195 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:23:52 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e227 e227: 6 total, 6 up, 6 in
Dec 06 10:23:52 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e228 e228: 6 total, 6 up, 6 in
Dec 06 10:23:52 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:23:52.656 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:23:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:23:52 np0005548789.localdomain podman[334494]: 2025-12-06 10:23:52.912008772 +0000 UTC m=+0.074304902 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:23:52 np0005548789.localdomain podman[334494]: 2025-12-06 10:23:52.925136324 +0000 UTC m=+0.087432474 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:23:52 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:23:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:23:53 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3355742361' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:23:53 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3355742361' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:53 np0005548789.localdomain ceph-mon[298582]: osdmap e227: 6 total, 6 up, 6 in
Dec 06 10:23:53 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "07253e42-ed74-4565-a93c-b3109a8bee58", "format": "json"}]: dispatch
Dec 06 10:23:53 np0005548789.localdomain ceph-mon[298582]: osdmap e228: 6 total, 6 up, 6 in
Dec 06 10:23:53 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3355742361' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:53 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3355742361' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e228 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:23:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:23:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:23:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:23:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:23:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19273 "" "Go-http-client/1.1"
Dec 06 10:23:54 np0005548789.localdomain ceph-mon[298582]: pgmap v502: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 70 KiB/s wr, 179 op/s
Dec 06 10:23:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:54.807 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:55 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e229 e229: 6 total, 6 up, 6 in
Dec 06 10:23:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:55.429 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:23:56 np0005548789.localdomain ceph-mon[298582]: osdmap e229: 6 total, 6 up, 6 in
Dec 06 10:23:56 np0005548789.localdomain ceph-mon[298582]: pgmap v504: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 5.1 MiB/s rd, 66 KiB/s wr, 197 op/s
Dec 06 10:23:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3696695018' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:23:57 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e230 e230: 6 total, 6 up, 6 in
Dec 06 10:23:57 np0005548789.localdomain sshd[334476]: error: kex_exchange_identification: read: Connection timed out
Dec 06 10:23:57 np0005548789.localdomain sshd[334476]: banner exchange: Connection from 123.160.164.187 port 60734: Connection timed out
Dec 06 10:23:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:23:58 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2773610337' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:23:58 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2773610337' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:58 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "07253e42-ed74-4565-a93c-b3109a8bee58_5f296b41-2976-4ca1-90d8-b8619d1ec6e3", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:58 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "07253e42-ed74-4565-a93c-b3109a8bee58", "force": true, "format": "json"}]: dispatch
Dec 06 10:23:58 np0005548789.localdomain ceph-mon[298582]: pgmap v505: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 45 KiB/s wr, 134 op/s
Dec 06 10:23:58 np0005548789.localdomain ceph-mon[298582]: osdmap e230: 6 total, 6 up, 6 in
Dec 06 10:23:58 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2773610337' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:23:58 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2773610337' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:23:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e231 e231: 6 total, 6 up, 6 in
Dec 06 10:23:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:23:58 np0005548789.localdomain systemd[1]: tmp-crun.DZDjcn.mount: Deactivated successfully.
Dec 06 10:23:58 np0005548789.localdomain podman[334516]: 2025-12-06 10:23:58.926954868 +0000 UTC m=+0.092316693 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:23:58 np0005548789.localdomain podman[334516]: 2025-12-06 10:23:58.96462259 +0000 UTC m=+0.129984445 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:23:58 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:23:59 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e232 e232: 6 total, 6 up, 6 in
Dec 06 10:23:59 np0005548789.localdomain ceph-mon[298582]: osdmap e231: 6 total, 6 up, 6 in
Dec 06 10:23:59 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:23:59.809 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:00.432 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:00 np0005548789.localdomain ceph-mon[298582]: pgmap v508: 177 pgs: 177 active+clean; 258 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 4.0 MiB/s wr, 179 op/s
Dec 06 10:24:00 np0005548789.localdomain ceph-mon[298582]: osdmap e232: 6 total, 6 up, 6 in
Dec 06 10:24:01 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:24:01 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/573079347' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:01 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/573079347' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:01 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/402114055' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:24:01 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/402114055' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:24:02 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e233 e233: 6 total, 6 up, 6 in
Dec 06 10:24:02 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "76242542-0436-4e10-a0ca-932a5ea39e05", "format": "json"}]: dispatch
Dec 06 10:24:02 np0005548789.localdomain ceph-mon[298582]: pgmap v510: 177 pgs: 177 active+clean; 258 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 3.4 MiB/s rd, 4.0 MiB/s wr, 179 op/s
Dec 06 10:24:02 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:24:02 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1694106485' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:24:02 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:24:02 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1694106485' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:24:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e233 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e234 e234: 6 total, 6 up, 6 in
Dec 06 10:24:03 np0005548789.localdomain ceph-mon[298582]: osdmap e233: 6 total, 6 up, 6 in
Dec 06 10:24:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1694106485' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:24:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1694106485' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:24:03 np0005548789.localdomain ceph-mon[298582]: osdmap e234: 6 total, 6 up, 6 in
Dec 06 10:24:04 np0005548789.localdomain ceph-mon[298582]: pgmap v512: 177 pgs: 177 active+clean; 337 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 7.7 MiB/s rd, 11 MiB/s wr, 355 op/s
Dec 06 10:24:04 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e235 e235: 6 total, 6 up, 6 in
Dec 06 10:24:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:04.813 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:05 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:24:05Z|00507|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:24:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:05.409 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:05 np0005548789.localdomain systemd[1]: tmp-crun.2xdW9r.mount: Deactivated successfully.
Dec 06 10:24:05 np0005548789.localdomain podman[334557]: 2025-12-06 10:24:05.412193739 +0000 UTC m=+0.075914881 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:24:05 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:24:05 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:24:05 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:24:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:05.434 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:05 np0005548789.localdomain ceph-mon[298582]: osdmap e235: 6 total, 6 up, 6 in
Dec 06 10:24:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1159844726' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:24:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1159844726' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:24:05 np0005548789.localdomain sudo[334579]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:24:05 np0005548789.localdomain sudo[334579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:24:05 np0005548789.localdomain sudo[334579]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:05 np0005548789.localdomain sudo[334597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Dec 06 10:24:05 np0005548789.localdomain sudo[334597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:24:06 np0005548789.localdomain sudo[334597]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:06 np0005548789.localdomain sudo[334635]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:24:06 np0005548789.localdomain sudo[334635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:24:06 np0005548789.localdomain sudo[334635]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:06 np0005548789.localdomain sudo[334653]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:24:06 np0005548789.localdomain sudo[334653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:24:06 np0005548789.localdomain ceph-mon[298582]: pgmap v515: 177 pgs: 177 active+clean; 337 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 4.1 MiB/s rd, 7.1 MiB/s wr, 167 op/s
Dec 06 10:24:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "76242542-0436-4e10-a0ca-932a5ea39e05_4c1bc32d-22e5-4d62-bff7-71f1039d4c95", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "76242542-0436-4e10-a0ca-932a5ea39e05", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:06 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:06 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:07 np0005548789.localdomain sudo[334653]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:07 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e236 e236: 6 total, 6 up, 6 in
Dec 06 10:24:07 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e237 e237: 6 total, 6 up, 6 in
Dec 06 10:24:07 np0005548789.localdomain sudo[334702]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:24:07 np0005548789.localdomain sudo[334702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:24:07 np0005548789.localdomain sudo[334702]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:08 np0005548789.localdomain ceph-mon[298582]: pgmap v516: 177 pgs: 177 active+clean; 337 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 6.7 MiB/s wr, 157 op/s
Dec 06 10:24:08 np0005548789.localdomain ceph-mon[298582]: osdmap e236: 6 total, 6 up, 6 in
Dec 06 10:24:08 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:24:08 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:24:08 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:08 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:24:08 np0005548789.localdomain ceph-mon[298582]: osdmap e237: 6 total, 6 up, 6 in
Dec 06 10:24:09 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e238 e238: 6 total, 6 up, 6 in
Dec 06 10:24:09 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:09.840 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:24:09 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:24:09 np0005548789.localdomain podman[334721]: 2025-12-06 10:24:09.962469162 +0000 UTC m=+0.097875122 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:24:09 np0005548789.localdomain podman[334721]: 2025-12-06 10:24:09.972951313 +0000 UTC m=+0.108357283 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:24:09 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:24:10 np0005548789.localdomain systemd[1]: tmp-crun.ZPGAlS.mount: Deactivated successfully.
Dec 06 10:24:10 np0005548789.localdomain podman[334720]: 2025-12-06 10:24:10.063323545 +0000 UTC m=+0.198630662 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 06 10:24:10 np0005548789.localdomain podman[334720]: 2025-12-06 10:24:10.094826409 +0000 UTC m=+0.230133516 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:24:10 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:24:10 np0005548789.localdomain ceph-mon[298582]: pgmap v519: 177 pgs: 177 active+clean; 244 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 71 KiB/s rd, 32 KiB/s wr, 104 op/s
Dec 06 10:24:10 np0005548789.localdomain ceph-mon[298582]: osdmap e238: 6 total, 6 up, 6 in
Dec 06 10:24:10 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3765798835' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:24:10 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3765798835' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:24:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:10.437 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:11 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:11 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "format": "json"}]: dispatch
Dec 06 10:24:11 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:11 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e239 e239: 6 total, 6 up, 6 in
Dec 06 10:24:12 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "f95d5566-befb-4ef3-874d-5a8d6c4a6df5_77a05e19-399e-4f52-950c-489f333205cf", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:12 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "snap_name": "f95d5566-befb-4ef3-874d-5a8d6c4a6df5", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:12 np0005548789.localdomain ceph-mon[298582]: pgmap v521: 177 pgs: 177 active+clean; 244 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 29 KiB/s wr, 96 op/s
Dec 06 10:24:12 np0005548789.localdomain ceph-mon[298582]: osdmap e239: 6 total, 6 up, 6 in
Dec 06 10:24:12 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:24:12 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:24:12.599 2 INFO neutron.agent.securitygroups_rpc [None req-63e1cfe1-0be8-4c17-9453-907c82bfa210 b40d497af0834616a664e6909c0f6685 b51f704fe6204487b0317c3332364cca - - default default] Security group rule updated ['d407968b-b8de-45cd-a244-3bf62d3c0357']
Dec 06 10:24:12 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e240 e240: 6 total, 6 up, 6 in
Dec 06 10:24:13 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:24:13.350 2 INFO neutron.agent.securitygroups_rpc [None req-9218de2a-a054-45fc-bd21-8f037be37a59 b40d497af0834616a664e6909c0f6685 b51f704fe6204487b0317c3332364cca - - default default] Security group rule updated ['d407968b-b8de-45cd-a244-3bf62d3c0357']
Dec 06 10:24:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:13 np0005548789.localdomain ceph-mon[298582]: osdmap e240: 6 total, 6 up, 6 in
Dec 06 10:24:14 np0005548789.localdomain ceph-mon[298582]: pgmap v524: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 102 KiB/s rd, 75 KiB/s wr, 152 op/s
Dec 06 10:24:14 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "077e73af-9063-4209-9319-e18e1a460598", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:14 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "077e73af-9063-4209-9319-e18e1a460598", "format": "json"}]: dispatch
Dec 06 10:24:14 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:14 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:14.843 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:15.439 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:15 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "format": "json"}]: dispatch
Dec 06 10:24:15 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "09dfdd64-7386-4e00-b67b-f81d142ea562", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:15 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e241 e241: 6 total, 6 up, 6 in
Dec 06 10:24:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:24:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:24:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:24:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:24:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:24:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:24:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:24:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:24:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:24:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:24:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:24:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:24:16 np0005548789.localdomain ceph-mon[298582]: pgmap v525: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 42 KiB/s wr, 47 op/s
Dec 06 10:24:16 np0005548789.localdomain ceph-mon[298582]: osdmap e241: 6 total, 6 up, 6 in
Dec 06 10:24:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:24:16 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:24:16 np0005548789.localdomain podman[334762]: 2025-12-06 10:24:16.922668553 +0000 UTC m=+0.080050058 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:24:16 np0005548789.localdomain podman[334762]: 2025-12-06 10:24:16.96147235 +0000 UTC m=+0.118853895 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:24:16 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:24:16 np0005548789.localdomain podman[334761]: 2025-12-06 10:24:16.979091988 +0000 UTC m=+0.139934189 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container)
Dec 06 10:24:16 np0005548789.localdomain podman[334761]: 2025-12-06 10:24:16.989267589 +0000 UTC m=+0.150109770 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=)
Dec 06 10:24:17 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:24:17 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e242 e242: 6 total, 6 up, 6 in
Dec 06 10:24:17 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "077e73af-9063-4209-9319-e18e1a460598", "new_size": 2147483648, "format": "json"}]: dispatch
Dec 06 10:24:17 np0005548789.localdomain ceph-mon[298582]: osdmap e242: 6 total, 6 up, 6 in
Dec 06 10:24:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:18 np0005548789.localdomain ceph-mon[298582]: pgmap v527: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 42 KiB/s wr, 47 op/s
Dec 06 10:24:18 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1847092686' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:19 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:24:19.082 2 INFO neutron.agent.securitygroups_rpc [req-6cf9c8b9-f82f-4729-827a-87ee94dc739b req-d70b880e-b383-4ad3-91f4-06f3e667f577 b40d497af0834616a664e6909c0f6685 b51f704fe6204487b0317c3332364cca - - default default] Security group member updated ['d407968b-b8de-45cd-a244-3bf62d3c0357']
Dec 06 10:24:19 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:19.890 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:20.441 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:20 np0005548789.localdomain ceph-mon[298582]: pgmap v529: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 88 KiB/s wr, 46 op/s
Dec 06 10:24:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:24:20 np0005548789.localdomain podman[334801]: 2025-12-06 10:24:20.923383958 +0000 UTC m=+0.085733871 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd)
Dec 06 10:24:20 np0005548789.localdomain podman[334801]: 2025-12-06 10:24:20.961139302 +0000 UTC m=+0.123489225 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 06 10:24:20 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:24:21 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "077e73af-9063-4209-9319-e18e1a460598", "format": "json"}]: dispatch
Dec 06 10:24:21 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "077e73af-9063-4209-9319-e18e1a460598", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:21 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3448344129' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:21 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2893884725' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:22 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e243 e243: 6 total, 6 up, 6 in
Dec 06 10:24:22 np0005548789.localdomain ceph-mon[298582]: pgmap v530: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 43 KiB/s wr, 2 op/s
Dec 06 10:24:22 np0005548789.localdomain ceph-mon[298582]: osdmap e243: 6 total, 6 up, 6 in
Dec 06 10:24:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:24:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:24:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:24:23 np0005548789.localdomain podman[334821]: 2025-12-06 10:24:23.920102691 +0000 UTC m=+0.080983407 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:24:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:24:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:24:24 np0005548789.localdomain podman[334821]: 2025-12-06 10:24:24.004199391 +0000 UTC m=+0.165080047 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:24:24 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:24:24 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:24:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19274 "" "Go-http-client/1.1"
Dec 06 10:24:24 np0005548789.localdomain ceph-mon[298582]: pgmap v532: 177 pgs: 177 active+clean; 245 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 2.9 MiB/s wr, 63 op/s
Dec 06 10:24:24 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3e304c5e-12d4-47b8-989b-1fc1253e12f7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:24 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3e304c5e-12d4-47b8-989b-1fc1253e12f7", "format": "json"}]: dispatch
Dec 06 10:24:24 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:24.935 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:25.442 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:25 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:24:25.516 263652 INFO neutron.agent.linux.ip_lib [None req-cb670ead-bb02-4e2a-ad39-dac9c22fbd01 - - - - - -] Device tape3bcd567-2d cannot be used as it has no MAC address
Dec 06 10:24:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:25.543 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:25 np0005548789.localdomain kernel: device tape3bcd567-2d entered promiscuous mode
Dec 06 10:24:25 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016665.5533] manager: (tape3bcd567-2d): new Generic device (/org/freedesktop/NetworkManager/Devices/81)
Dec 06 10:24:25 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:24:25Z|00508|binding|INFO|Claiming lport e3bcd567-2db3-4d72-9cdb-24e14598df57 for this chassis.
Dec 06 10:24:25 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:24:25Z|00509|binding|INFO|e3bcd567-2db3-4d72-9cdb-24e14598df57: Claiming unknown
Dec 06 10:24:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:25.554 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:25 np0005548789.localdomain systemd-udevd[334854]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:24:25 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:24:25.564 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-67f05a6c-bdca-4c59-9049-edf7ed03aad0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-67f05a6c-bdca-4c59-9049-edf7ed03aad0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e98feac0e5947229c2baa6fc34be5fb', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8812a56d-06ed-4c18-98f2-54c75645cf8d, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=e3bcd567-2db3-4d72-9cdb-24e14598df57) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:24:25 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:24:25.566 160509 INFO neutron.agent.ovn.metadata.agent [-] Port e3bcd567-2db3-4d72-9cdb-24e14598df57 in datapath 67f05a6c-bdca-4c59-9049-edf7ed03aad0 bound to our chassis
Dec 06 10:24:25 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:24:25.567 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 7322d78b-d00d-4ce7-9ccc-685a4924e7d6 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:24:25 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:24:25.567 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 67f05a6c-bdca-4c59-9049-edf7ed03aad0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:24:25 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:24:25.568 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[a85cdc77-6d3e-4ce8-bf59-370f44e629c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:24:25 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape3bcd567-2d: No such device
Dec 06 10:24:25 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:24:25Z|00510|binding|INFO|Setting lport e3bcd567-2db3-4d72-9cdb-24e14598df57 ovn-installed in OVS
Dec 06 10:24:25 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:24:25Z|00511|binding|INFO|Setting lport e3bcd567-2db3-4d72-9cdb-24e14598df57 up in Southbound
Dec 06 10:24:25 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape3bcd567-2d: No such device
Dec 06 10:24:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:25.592 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:25 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape3bcd567-2d: No such device
Dec 06 10:24:25 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape3bcd567-2d: No such device
Dec 06 10:24:25 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape3bcd567-2d: No such device
Dec 06 10:24:25 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape3bcd567-2d: No such device
Dec 06 10:24:25 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape3bcd567-2d: No such device
Dec 06 10:24:25 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tape3bcd567-2d: No such device
Dec 06 10:24:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:25.635 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:25.662 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:25 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:25 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "format": "json"}]: dispatch
Dec 06 10:24:25 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:25 np0005548789.localdomain ceph-mon[298582]: pgmap v533: 177 pgs: 177 active+clean; 245 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 2.7 MiB/s wr, 60 op/s
Dec 06 10:24:25 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:24:25.850 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:24:25Z, description=, device_id=7f27da2c-b860-4e59-a4ae-32a003c36a27, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc20f70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc20ee0>], id=e4fc5ebd-720f-4d2a-ba2c-8ce2c6466147, ip_allocation=immediate, mac_address=fa:16:3e:94:89:c4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3554, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:24:25Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:24:26 np0005548789.localdomain podman[334911]: 2025-12-06 10:24:26.09719384 +0000 UTC m=+0.061878752 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:24:26 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:24:26 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:24:26 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:24:26 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:24:26.404 263652 INFO neutron.agent.dhcp.agent [None req-d9acfb43-d0a8-47cf-baea-a62c7bd2849b - - - - - -] DHCP configuration for ports {'e4fc5ebd-720f-4d2a-ba2c-8ce2c6466147'} is completed
Dec 06 10:24:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:26.554 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:26 np0005548789.localdomain podman[334964]: 
Dec 06 10:24:26 np0005548789.localdomain podman[334964]: 2025-12-06 10:24:26.618955069 +0000 UTC m=+0.060293953 container create ae534041aa594f21eccc75296424685ccea09523ed4cad3d4ba6b6620b5f4196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f05a6c-bdca-4c59-9049-edf7ed03aad0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:24:26 np0005548789.localdomain systemd[1]: Started libpod-conmon-ae534041aa594f21eccc75296424685ccea09523ed4cad3d4ba6b6620b5f4196.scope.
Dec 06 10:24:26 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:24:26 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b70f3ccd35a58f837d6cbb67993db9373957054834ea3aae99c93b78afe6bc66/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:24:26 np0005548789.localdomain podman[334964]: 2025-12-06 10:24:26.678446128 +0000 UTC m=+0.119785012 container init ae534041aa594f21eccc75296424685ccea09523ed4cad3d4ba6b6620b5f4196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f05a6c-bdca-4c59-9049-edf7ed03aad0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:24:26 np0005548789.localdomain podman[334964]: 2025-12-06 10:24:26.586269241 +0000 UTC m=+0.027608185 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:24:26 np0005548789.localdomain podman[334964]: 2025-12-06 10:24:26.688830275 +0000 UTC m=+0.130169189 container start ae534041aa594f21eccc75296424685ccea09523ed4cad3d4ba6b6620b5f4196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f05a6c-bdca-4c59-9049-edf7ed03aad0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 06 10:24:26 np0005548789.localdomain dnsmasq[334982]: started, version 2.85 cachesize 150
Dec 06 10:24:26 np0005548789.localdomain dnsmasq[334982]: DNS service limited to local subnets
Dec 06 10:24:26 np0005548789.localdomain dnsmasq[334982]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:24:26 np0005548789.localdomain dnsmasq[334982]: warning: no upstream servers configured
Dec 06 10:24:26 np0005548789.localdomain dnsmasq-dhcp[334982]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:24:26 np0005548789.localdomain dnsmasq[334982]: read /var/lib/neutron/dhcp/67f05a6c-bdca-4c59-9049-edf7ed03aad0/addn_hosts - 0 addresses
Dec 06 10:24:26 np0005548789.localdomain dnsmasq-dhcp[334982]: read /var/lib/neutron/dhcp/67f05a6c-bdca-4c59-9049-edf7ed03aad0/host
Dec 06 10:24:26 np0005548789.localdomain dnsmasq-dhcp[334982]: read /var/lib/neutron/dhcp/67f05a6c-bdca-4c59-9049-edf7ed03aad0/opts
Dec 06 10:24:26 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:24:26.853 263652 INFO neutron.agent.dhcp.agent [None req-e782a239-46cb-4db6-9f7f-8bba64220849 - - - - - -] DHCP configuration for ports {'01afa461-016c-41d6-8fdd-371c7b3fb32f'} is completed
Dec 06 10:24:27 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:24:27.175 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:24:27Z, description=, device_id=7f27da2c-b860-4e59-a4ae-32a003c36a27, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9de130>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fabcc10>], id=40a72c0c-c9b6-4998-aa5c-35369f742816, ip_allocation=immediate, mac_address=fa:16:3e:a2:d6:41, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:24:23Z, description=, dns_domain=, id=67f05a6c-bdca-4c59-9049-edf7ed03aad0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingNegativeTest-437399641-network, port_security_enabled=True, project_id=0e98feac0e5947229c2baa6fc34be5fb, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=24335, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3547, status=ACTIVE, subnets=['e63ebf19-6289-435c-ac51-9391b3e6813a'], tags=[], tenant_id=0e98feac0e5947229c2baa6fc34be5fb, updated_at=2025-12-06T10:24:23Z, vlan_transparent=None, network_id=67f05a6c-bdca-4c59-9049-edf7ed03aad0, port_security_enabled=False, project_id=0e98feac0e5947229c2baa6fc34be5fb, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3555, status=DOWN, tags=[], tenant_id=0e98feac0e5947229c2baa6fc34be5fb, updated_at=2025-12-06T10:24:27Z on network 67f05a6c-bdca-4c59-9049-edf7ed03aad0
Dec 06 10:24:27 np0005548789.localdomain dnsmasq[334982]: read /var/lib/neutron/dhcp/67f05a6c-bdca-4c59-9049-edf7ed03aad0/addn_hosts - 1 addresses
Dec 06 10:24:27 np0005548789.localdomain dnsmasq-dhcp[334982]: read /var/lib/neutron/dhcp/67f05a6c-bdca-4c59-9049-edf7ed03aad0/host
Dec 06 10:24:27 np0005548789.localdomain dnsmasq-dhcp[334982]: read /var/lib/neutron/dhcp/67f05a6c-bdca-4c59-9049-edf7ed03aad0/opts
Dec 06 10:24:27 np0005548789.localdomain podman[335000]: 2025-12-06 10:24:27.39497229 +0000 UTC m=+0.044941384 container kill ae534041aa594f21eccc75296424685ccea09523ed4cad3d4ba6b6620b5f4196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f05a6c-bdca-4c59-9049-edf7ed03aad0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:24:27 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:24:27.610 263652 INFO neutron.agent.dhcp.agent [None req-7af9c5d0-c064-43f9-a4d0-629c62eae4a6 - - - - - -] DHCP configuration for ports {'40a72c0c-c9b6-4998-aa5c-35369f742816'} is completed
Dec 06 10:24:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:24:28.190 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:24:27Z, description=, device_id=7f27da2c-b860-4e59-a4ae-32a003c36a27, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37faaa2b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37faaaa30>], id=40a72c0c-c9b6-4998-aa5c-35369f742816, ip_allocation=immediate, mac_address=fa:16:3e:a2:d6:41, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:24:23Z, description=, dns_domain=, id=67f05a6c-bdca-4c59-9049-edf7ed03aad0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingNegativeTest-437399641-network, port_security_enabled=True, project_id=0e98feac0e5947229c2baa6fc34be5fb, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=24335, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3547, status=ACTIVE, subnets=['e63ebf19-6289-435c-ac51-9391b3e6813a'], tags=[], tenant_id=0e98feac0e5947229c2baa6fc34be5fb, updated_at=2025-12-06T10:24:23Z, vlan_transparent=None, network_id=67f05a6c-bdca-4c59-9049-edf7ed03aad0, port_security_enabled=False, project_id=0e98feac0e5947229c2baa6fc34be5fb, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3555, status=DOWN, tags=[], tenant_id=0e98feac0e5947229c2baa6fc34be5fb, updated_at=2025-12-06T10:24:27Z on network 67f05a6c-bdca-4c59-9049-edf7ed03aad0
Dec 06 10:24:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:28 np0005548789.localdomain dnsmasq[334982]: read /var/lib/neutron/dhcp/67f05a6c-bdca-4c59-9049-edf7ed03aad0/addn_hosts - 1 addresses
Dec 06 10:24:28 np0005548789.localdomain dnsmasq-dhcp[334982]: read /var/lib/neutron/dhcp/67f05a6c-bdca-4c59-9049-edf7ed03aad0/host
Dec 06 10:24:28 np0005548789.localdomain dnsmasq-dhcp[334982]: read /var/lib/neutron/dhcp/67f05a6c-bdca-4c59-9049-edf7ed03aad0/opts
Dec 06 10:24:28 np0005548789.localdomain podman[335037]: 2025-12-06 10:24:28.396315561 +0000 UTC m=+0.062127980 container kill ae534041aa594f21eccc75296424685ccea09523ed4cad3d4ba6b6620b5f4196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f05a6c-bdca-4c59-9049-edf7ed03aad0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:24:28 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "3e304c5e-12d4-47b8-989b-1fc1253e12f7", "new_size": 2147483648, "format": "json"}]: dispatch
Dec 06 10:24:28 np0005548789.localdomain ceph-mon[298582]: pgmap v534: 177 pgs: 177 active+clean; 245 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 2.3 MiB/s wr, 49 op/s
Dec 06 10:24:28 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:28 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:24:28.667 263652 INFO neutron.agent.dhcp.agent [None req-55c32fdd-31e2-4be6-add9-457d8d03fafb - - - - - -] DHCP configuration for ports {'40a72c0c-c9b6-4998-aa5c-35369f742816'} is completed
Dec 06 10:24:29 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "82abd4b2-157a-49c5-b0f6-995ee895ebc0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:29 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "82abd4b2-157a-49c5-b0f6-995ee895ebc0", "format": "json"}]: dispatch
Dec 06 10:24:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:24:29 np0005548789.localdomain podman[335059]: 2025-12-06 10:24:29.924242246 +0000 UTC m=+0.084943267 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:24:29 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:29.973 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:29 np0005548789.localdomain podman[335059]: 2025-12-06 10:24:29.988640634 +0000 UTC m=+0.149341655 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 06 10:24:30 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:24:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:30.444 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: pgmap v535: 177 pgs: 177 active+clean; 245 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 2.2 MiB/s wr, 124 op/s
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.677825) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016670677942, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 2837, "num_deletes": 279, "total_data_size": 5386268, "memory_usage": 5462552, "flush_reason": "Manual Compaction"}
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016670698044, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 3502146, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30608, "largest_seqno": 33440, "table_properties": {"data_size": 3490518, "index_size": 7557, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 26999, "raw_average_key_size": 22, "raw_value_size": 3466567, "raw_average_value_size": 2917, "num_data_blocks": 316, "num_entries": 1188, "num_filter_entries": 1188, "num_deletions": 279, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016537, "oldest_key_time": 1765016537, "file_creation_time": 1765016670, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 20293 microseconds, and 8994 cpu microseconds.
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.698119) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 3502146 bytes OK
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.698161) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.700387) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.700418) EVENT_LOG_v1 {"time_micros": 1765016670700408, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.700448) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 5373078, prev total WAL file size 5373078, number of live WAL files 2.
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.702132) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. '7061786F73003132383032' seq:0, type:0; will stop at (end)
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(3420KB)], [54(16MB)]
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016670702221, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 21166149, "oldest_snapshot_seqno": -1}
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 13564 keys, 19524991 bytes, temperature: kUnknown
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016670810795, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 19524991, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19448197, "index_size": 41813, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33925, "raw_key_size": 364920, "raw_average_key_size": 26, "raw_value_size": 19218181, "raw_average_value_size": 1416, "num_data_blocks": 1551, "num_entries": 13564, "num_filter_entries": 13564, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765016670, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.811171) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 19524991 bytes
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.813253) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 194.8 rd, 179.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.3, 16.8 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(11.6) write-amplify(5.6) OK, records in: 14128, records dropped: 564 output_compression: NoCompression
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.813284) EVENT_LOG_v1 {"time_micros": 1765016670813270, "job": 32, "event": "compaction_finished", "compaction_time_micros": 108670, "compaction_time_cpu_micros": 58244, "output_level": 6, "num_output_files": 1, "total_output_size": 19524991, "num_input_records": 14128, "num_output_records": 13564, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016670813899, "job": 32, "event": "table_file_deletion", "file_number": 56}
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016670817168, "job": 32, "event": "table_file_deletion", "file_number": 54}
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.701885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.817234) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.817241) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.817244) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.817248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:30 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:30.817250) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:30 np0005548789.localdomain dnsmasq[334982]: read /var/lib/neutron/dhcp/67f05a6c-bdca-4c59-9049-edf7ed03aad0/addn_hosts - 0 addresses
Dec 06 10:24:30 np0005548789.localdomain dnsmasq-dhcp[334982]: read /var/lib/neutron/dhcp/67f05a6c-bdca-4c59-9049-edf7ed03aad0/host
Dec 06 10:24:30 np0005548789.localdomain dnsmasq-dhcp[334982]: read /var/lib/neutron/dhcp/67f05a6c-bdca-4c59-9049-edf7ed03aad0/opts
Dec 06 10:24:30 np0005548789.localdomain podman[335099]: 2025-12-06 10:24:30.897838747 +0000 UTC m=+0.063019508 container kill ae534041aa594f21eccc75296424685ccea09523ed4cad3d4ba6b6620b5f4196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f05a6c-bdca-4c59-9049-edf7ed03aad0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:24:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:31.133 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:31 np0005548789.localdomain kernel: device tape3bcd567-2d left promiscuous mode
Dec 06 10:24:31 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:24:31Z|00512|binding|INFO|Releasing lport e3bcd567-2db3-4d72-9cdb-24e14598df57 from this chassis (sb_readonly=0)
Dec 06 10:24:31 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:24:31Z|00513|binding|INFO|Setting lport e3bcd567-2db3-4d72-9cdb-24e14598df57 down in Southbound
Dec 06 10:24:31 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:24:31.149 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-67f05a6c-bdca-4c59-9049-edf7ed03aad0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-67f05a6c-bdca-4c59-9049-edf7ed03aad0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e98feac0e5947229c2baa6fc34be5fb', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8812a56d-06ed-4c18-98f2-54c75645cf8d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=e3bcd567-2db3-4d72-9cdb-24e14598df57) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:24:31 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:24:31.151 160509 INFO neutron.agent.ovn.metadata.agent [-] Port e3bcd567-2db3-4d72-9cdb-24e14598df57 in datapath 67f05a6c-bdca-4c59-9049-edf7ed03aad0 unbound from our chassis
Dec 06 10:24:31 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:24:31.154 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 67f05a6c-bdca-4c59-9049-edf7ed03aad0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:24:31 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:24:31.154 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[c01ceaf8-2f99-4cbf-924c-a652ddb424e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:24:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:31.158 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:31 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3e304c5e-12d4-47b8-989b-1fc1253e12f7", "format": "json"}]: dispatch
Dec 06 10:24:31 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3e304c5e-12d4-47b8-989b-1fc1253e12f7", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Dec 06 10:24:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/82abd4b2-157a-49c5-b0f6-995ee895ebc0/2859c761-f712-4409-870b-5e31b0c42ab8", "osd", "allow rw pool=manila_data namespace=fsvolumens_82abd4b2-157a-49c5-b0f6-995ee895ebc0", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/82abd4b2-157a-49c5-b0f6-995ee895ebc0/2859c761-f712-4409-870b-5e31b0c42ab8", "osd", "allow rw pool=manila_data namespace=fsvolumens_82abd4b2-157a-49c5-b0f6-995ee895ebc0", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/82abd4b2-157a-49c5-b0f6-995ee895ebc0/2859c761-f712-4409-870b-5e31b0c42ab8", "osd", "allow rw pool=manila_data namespace=fsvolumens_82abd4b2-157a-49c5-b0f6-995ee895ebc0", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:24:32 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:24:32 np0005548789.localdomain podman[335137]: 2025-12-06 10:24:32.454613315 +0000 UTC m=+0.066572085 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:24:32 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:24:32 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:24:32 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:24:32Z|00514|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:24:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:32.601 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:32 np0005548789.localdomain ceph-mon[298582]: pgmap v536: 177 pgs: 177 active+clean; 245 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 2.2 MiB/s wr, 124 op/s
Dec 06 10:24:32 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "82abd4b2-157a-49c5-b0f6-995ee895ebc0", "auth_id": "Joe", "tenant_id": "14fcd30962314973b2c11b49f89b4cb4", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:24:33 np0005548789.localdomain podman[335174]: 2025-12-06 10:24:33.108881825 +0000 UTC m=+0.073735935 container kill ae534041aa594f21eccc75296424685ccea09523ed4cad3d4ba6b6620b5f4196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f05a6c-bdca-4c59-9049-edf7ed03aad0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:24:33 np0005548789.localdomain dnsmasq[334982]: exiting on receipt of SIGTERM
Dec 06 10:24:33 np0005548789.localdomain systemd[1]: tmp-crun.xuFWvg.mount: Deactivated successfully.
Dec 06 10:24:33 np0005548789.localdomain systemd[1]: libpod-ae534041aa594f21eccc75296424685ccea09523ed4cad3d4ba6b6620b5f4196.scope: Deactivated successfully.
Dec 06 10:24:33 np0005548789.localdomain podman[335188]: 2025-12-06 10:24:33.186610181 +0000 UTC m=+0.060587383 container died ae534041aa594f21eccc75296424685ccea09523ed4cad3d4ba6b6620b5f4196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f05a6c-bdca-4c59-9049-edf7ed03aad0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:24:33 np0005548789.localdomain podman[335188]: 2025-12-06 10:24:33.220819286 +0000 UTC m=+0.094796438 container cleanup ae534041aa594f21eccc75296424685ccea09523ed4cad3d4ba6b6620b5f4196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f05a6c-bdca-4c59-9049-edf7ed03aad0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:24:33 np0005548789.localdomain systemd[1]: libpod-conmon-ae534041aa594f21eccc75296424685ccea09523ed4cad3d4ba6b6620b5f4196.scope: Deactivated successfully.
Dec 06 10:24:33 np0005548789.localdomain podman[335190]: 2025-12-06 10:24:33.26837762 +0000 UTC m=+0.134513993 container remove ae534041aa594f21eccc75296424685ccea09523ed4cad3d4ba6b6620b5f4196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-67f05a6c-bdca-4c59-9049-edf7ed03aad0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:24:33 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:24:33.297 263652 INFO neutron.agent.dhcp.agent [None req-6072cad9-bf5b-4be9-9cac-a499c2717921 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:24:33 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:24:33.298 263652 INFO neutron.agent.dhcp.agent [None req-6072cad9-bf5b-4be9-9cac-a499c2717921 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:24:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:33 np0005548789.localdomain systemd[1]: tmp-crun.WdbBJC.mount: Deactivated successfully.
Dec 06 10:24:33 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-b70f3ccd35a58f837d6cbb67993db9373957054834ea3aae99c93b78afe6bc66-merged.mount: Deactivated successfully.
Dec 06 10:24:33 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ae534041aa594f21eccc75296424685ccea09523ed4cad3d4ba6b6620b5f4196-userdata-shm.mount: Deactivated successfully.
Dec 06 10:24:33 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d67f05a6c\x2dbdca\x2d4c59\x2d9049\x2dedf7ed03aad0.mount: Deactivated successfully.
Dec 06 10:24:33 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:34 np0005548789.localdomain ceph-mon[298582]: pgmap v537: 177 pgs: 177 active+clean; 246 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 119 op/s
Dec 06 10:24:34 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fb1967bf-2011-4ff5-8e79-30781feb0f35", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:34 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fb1967bf-2011-4ff5-8e79-30781feb0f35", "format": "json"}]: dispatch
Dec 06 10:24:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:35.012 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:35.446 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:35 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:36 np0005548789.localdomain ceph-mon[298582]: pgmap v538: 177 pgs: 177 active+clean; 246 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 62 KiB/s wr, 68 op/s
Dec 06 10:24:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "format": "json"}]: dispatch
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.665445) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016677665486, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 365, "num_deletes": 257, "total_data_size": 141375, "memory_usage": 148168, "flush_reason": "Manual Compaction"}
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016677667873, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 91458, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33445, "largest_seqno": 33805, "table_properties": {"data_size": 89299, "index_size": 270, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 5639, "raw_average_key_size": 18, "raw_value_size": 84804, "raw_average_value_size": 272, "num_data_blocks": 12, "num_entries": 311, "num_filter_entries": 311, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016671, "oldest_key_time": 1765016671, "file_creation_time": 1765016677, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 2452 microseconds, and 729 cpu microseconds.
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.667897) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 91458 bytes OK
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.667918) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.669374) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.669469) EVENT_LOG_v1 {"time_micros": 1765016677669460, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.669498) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 138863, prev total WAL file size 138863, number of live WAL files 2.
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.670048) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323730' seq:72057594037927935, type:22 .. '6C6F676D0034353233' seq:0, type:0; will stop at (end)
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(89KB)], [57(18MB)]
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016677670100, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 19616449, "oldest_snapshot_seqno": -1}
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 13345 keys, 19187539 bytes, temperature: kUnknown
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016677760220, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 19187539, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19112610, "index_size": 40460, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33413, "raw_key_size": 361369, "raw_average_key_size": 27, "raw_value_size": 18886653, "raw_average_value_size": 1415, "num_data_blocks": 1487, "num_entries": 13345, "num_filter_entries": 13345, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765016677, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.760619) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 19187539 bytes
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.763933) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 217.4 rd, 212.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 18.6 +0.0 blob) out(18.3 +0.0 blob), read-write-amplify(424.3) write-amplify(209.8) OK, records in: 13875, records dropped: 530 output_compression: NoCompression
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.763965) EVENT_LOG_v1 {"time_micros": 1765016677763951, "job": 34, "event": "compaction_finished", "compaction_time_micros": 90220, "compaction_time_cpu_micros": 45516, "output_level": 6, "num_output_files": 1, "total_output_size": 19187539, "num_input_records": 13875, "num_output_records": 13345, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016677764137, "job": 34, "event": "table_file_deletion", "file_number": 59}
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016677767736, "job": 34, "event": "table_file_deletion", "file_number": 57}
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.669937) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.767853) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.767862) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.767864) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.767866) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:24:37.767868) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:38 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fb1967bf-2011-4ff5-8e79-30781feb0f35", "format": "json"}]: dispatch
Dec 06 10:24:38 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fb1967bf-2011-4ff5-8e79-30781feb0f35", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:38 np0005548789.localdomain ceph-mon[298582]: pgmap v539: 177 pgs: 177 active+clean; 246 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 62 KiB/s wr, 68 op/s
Dec 06 10:24:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Dec 06 10:24:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:39.206 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:39.234 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:24:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:39.234 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:24:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:39.235 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:24:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:39.235 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:24:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:39.235 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:24:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:24:39 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3756278959' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:39.698 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:24:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:39.773 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:24:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:39.774 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:24:39 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "auth_id": "Joe", "tenant_id": "438e893229f742e78fe8e62ef6ea17d5", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:24:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3543682375' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:24:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3543682375' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:24:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3756278959' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:39.954 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:24:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:39.955 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11135MB free_disk=41.77423095703125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:24:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:39.956 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:24:39 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:39.956 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:24:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:40.017 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:24:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:40.018 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:24:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:40.018 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:24:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:40.046 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:40.064 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:24:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:40.448 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:40 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:24:40 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/701765729' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:40.465 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.401s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:24:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:40.470 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:24:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:40.488 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:24:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:40.510 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:24:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:40.510 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.554s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:24:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:24:40 np0005548789.localdomain ceph-mon[298582]: pgmap v540: 177 pgs: 177 active+clean; 279 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 2.2 MiB/s wr, 133 op/s
Dec 06 10:24:40 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:40 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/701765729' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:40 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:24:40 np0005548789.localdomain systemd[1]: tmp-crun.CwI0vW.mount: Deactivated successfully.
Dec 06 10:24:40 np0005548789.localdomain podman[335262]: 2025-12-06 10:24:40.935896653 +0000 UTC m=+0.095930343 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:24:40 np0005548789.localdomain podman[335262]: 2025-12-06 10:24:40.96720743 +0000 UTC m=+0.127241120 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:24:40 np0005548789.localdomain podman[335263]: 2025-12-06 10:24:40.976318448 +0000 UTC m=+0.131584733 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:24:40 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:24:40 np0005548789.localdomain podman[335263]: 2025-12-06 10:24:40.989218023 +0000 UTC m=+0.144484358 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:24:41 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:24:41 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6a347634-f593-49b5-a8b9-3ebf810ccb41", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:41 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6a347634-f593-49b5-a8b9-3ebf810ccb41", "format": "json"}]: dispatch
Dec 06 10:24:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:42.481 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:42.481 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:42.481 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:24:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:42.481 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:24:42 np0005548789.localdomain ceph-mon[298582]: pgmap v541: 177 pgs: 177 active+clean; 279 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 326 KiB/s rd, 2.2 MiB/s wr, 67 op/s
Dec 06 10:24:42 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "auth_id": "tempest-cephx-id-649576020", "tenant_id": "438e893229f742e78fe8e62ef6ea17d5", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:24:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-649576020", "format": "json"} : dispatch
Dec 06 10:24:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-649576020", "caps": ["mds", "allow rw path=/volumes/_nogroup/4e35c7b0-6333-486a-9deb-d9473aa05e04/5578bf84-b0d0-4fb9-8cfc-a71276656281", "osd", "allow rw pool=manila_data namespace=fsvolumens_4e35c7b0-6333-486a-9deb-d9473aa05e04", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-649576020", "caps": ["mds", "allow rw path=/volumes/_nogroup/4e35c7b0-6333-486a-9deb-d9473aa05e04/5578bf84-b0d0-4fb9-8cfc-a71276656281", "osd", "allow rw pool=manila_data namespace=fsvolumens_4e35c7b0-6333-486a-9deb-d9473aa05e04", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:42 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-649576020", "caps": ["mds", "allow rw path=/volumes/_nogroup/4e35c7b0-6333-486a-9deb-d9473aa05e04/5578bf84-b0d0-4fb9-8cfc-a71276656281", "osd", "allow rw pool=manila_data namespace=fsvolumens_4e35c7b0-6333-486a-9deb-d9473aa05e04", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:24:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:42.874 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:24:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:42.875 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:24:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:42.875 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:24:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:42.876 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:24:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:43.321 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:24:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:43.339 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:24:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:43.339 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:24:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:43 np0005548789.localdomain ceph-mon[298582]: pgmap v542: 177 pgs: 177 active+clean; 279 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 327 KiB/s rd, 2.2 MiB/s wr, 70 op/s
Dec 06 10:24:43 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6a347634-f593-49b5-a8b9-3ebf810ccb41", "format": "json"}]: dispatch
Dec 06 10:24:43 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6a347634-f593-49b5-a8b9-3ebf810ccb41", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:44.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:45.050 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:45.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:45.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:24:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:45.450 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:46.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:46.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:24:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:24:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:24:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:24:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:24:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:24:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:24:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:24:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:24:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:24:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:24:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:24:47 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "auth_id": "Joe", "format": "json"}]: dispatch
Dec 06 10:24:47 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "auth_id": "Joe", "format": "json"}]: dispatch
Dec 06 10:24:47 np0005548789.localdomain ceph-mon[298582]: pgmap v543: 177 pgs: 177 active+clean; 279 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 327 KiB/s rd, 2.2 MiB/s wr, 67 op/s
Dec 06 10:24:47 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3643a05c-023e-4ee5-80c6-9c2596521f46", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:24:47 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3643a05c-023e-4ee5-80c6-9c2596521f46", "format": "json"}]: dispatch
Dec 06 10:24:47 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:47.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:24:47.342 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:24:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:24:47.342 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:24:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:24:47.343 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:24:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:24:47 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:24:47 np0005548789.localdomain podman[335303]: 2025-12-06 10:24:47.763400046 +0000 UTC m=+0.076377195 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, version=9.6, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 06 10:24:47 np0005548789.localdomain podman[335303]: 2025-12-06 10:24:47.806109051 +0000 UTC m=+0.119086160 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, release=1755695350, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, maintainer=Red Hat, Inc.)
Dec 06 10:24:47 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:24:47 np0005548789.localdomain podman[335304]: 2025-12-06 10:24:47.830079715 +0000 UTC m=+0.140215008 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:24:47 np0005548789.localdomain podman[335304]: 2025-12-06 10:24:47.871255754 +0000 UTC m=+0.181391037 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:24:47 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:24:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:48.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:48 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "aeb4e71b-e797-4ddb-8668-e9c13b45f222", "format": "json"}]: dispatch
Dec 06 10:24:48 np0005548789.localdomain ceph-mon[298582]: pgmap v544: 177 pgs: 177 active+clean; 279 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 327 KiB/s rd, 2.2 MiB/s wr, 67 op/s
Dec 06 10:24:49 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "auth_id": "tempest-cephx-id-649576020", "format": "json"}]: dispatch
Dec 06 10:24:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-649576020", "format": "json"} : dispatch
Dec 06 10:24:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-649576020"} : dispatch
Dec 06 10:24:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-649576020"} : dispatch
Dec 06 10:24:49 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-649576020"}]': finished
Dec 06 10:24:49 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "auth_id": "tempest-cephx-id-649576020", "format": "json"}]: dispatch
Dec 06 10:24:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:50.104 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:50.453 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:50 np0005548789.localdomain ceph-mon[298582]: pgmap v545: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 327 KiB/s rd, 2.2 MiB/s wr, 70 op/s
Dec 06 10:24:50 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3643a05c-023e-4ee5-80c6-9c2596521f46", "format": "json"}]: dispatch
Dec 06 10:24:50 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3643a05c-023e-4ee5-80c6-9c2596521f46", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:51 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "aeb4e71b-e797-4ddb-8668-e9c13b45f222_44a54937-b38a-4109-92db-e338a6e6c4a4", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:51 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "aeb4e71b-e797-4ddb-8668-e9c13b45f222", "force": true, "format": "json"}]: dispatch
Dec 06 10:24:51 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2358963560' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:51 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/425768214' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:24:51 np0005548789.localdomain podman[335343]: 2025-12-06 10:24:51.928108884 +0000 UTC m=+0.087171905 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:24:51 np0005548789.localdomain podman[335343]: 2025-12-06 10:24:51.964576219 +0000 UTC m=+0.123639220 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 10:24:51 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:24:52 np0005548789.localdomain ceph-mon[298582]: pgmap v546: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.7 KiB/s rd, 67 KiB/s wr, 5 op/s
Dec 06 10:24:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Dec 06 10:24:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Dec 06 10:24:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Dec 06 10:24:52 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished
Dec 06 10:24:52 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1474548798' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:52 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1025362414' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:24:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:53.176 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:53 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "82abd4b2-157a-49c5-b0f6-995ee895ebc0", "auth_id": "Joe", "format": "json"}]: dispatch
Dec 06 10:24:53 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "82abd4b2-157a-49c5-b0f6-995ee895ebc0", "auth_id": "Joe", "format": "json"}]: dispatch
Dec 06 10:24:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:24:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:24:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:24:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:24:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:24:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19274 "" "Go-http-client/1.1"
Dec 06 10:24:54 np0005548789.localdomain ceph-mon[298582]: pgmap v547: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 2.0 KiB/s rd, 124 KiB/s wr, 11 op/s
Dec 06 10:24:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:24:54 np0005548789.localdomain podman[335362]: 2025-12-06 10:24:54.919650652 +0000 UTC m=+0.076461069 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:24:54 np0005548789.localdomain podman[335362]: 2025-12-06 10:24:54.95327786 +0000 UTC m=+0.110088317 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:24:54 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:24:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:55.155 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:24:55.456 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:24:55 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch
Dec 06 10:24:55 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e244 e244: 6 total, 6 up, 6 in
Dec 06 10:24:56 np0005548789.localdomain ceph-mon[298582]: pgmap v548: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 100 KiB/s wr, 8 op/s
Dec 06 10:24:56 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "auth_id": "admin", "tenant_id": "14fcd30962314973b2c11b49f89b4cb4", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:24:56 np0005548789.localdomain ceph-mon[298582]: osdmap e244: 6 total, 6 up, 6 in
Dec 06 10:24:57 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "cdf82d1a-cb71-4bc8-a483-dd9f0663d1ab", "format": "json"}]: dispatch
Dec 06 10:24:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:58 np0005548789.localdomain ceph-mon[298582]: pgmap v550: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 120 KiB/s wr, 9 op/s
Dec 06 10:24:58 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2198419892' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:24:59 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "auth_id": "david", "tenant_id": "14fcd30962314973b2c11b49f89b4cb4", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:24:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Dec 06 10:24:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/efbd58ea-e56f-4c21-9ed9-3d319ca403b8/413ff259-8ca8-446a-9a48-682d6b0aa9e8", "osd", "allow rw pool=manila_data namespace=fsvolumens_efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/efbd58ea-e56f-4c21-9ed9-3d319ca403b8/413ff259-8ca8-446a-9a48-682d6b0aa9e8", "osd", "allow rw pool=manila_data namespace=fsvolumens_efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:24:59 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/efbd58ea-e56f-4c21-9ed9-3d319ca403b8/413ff259-8ca8-446a-9a48-682d6b0aa9e8", "osd", "allow rw pool=manila_data namespace=fsvolumens_efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:24:59 np0005548789.localdomain ceph-mon[298582]: pgmap v551: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.3 KiB/s rd, 99 KiB/s wr, 8 op/s
Dec 06 10:24:59 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "9ca8d539-a020-426e-a70b-3e1ee1b92584", "format": "json"}]: dispatch
Dec 06 10:25:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:00.203 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:00.457 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:25:00 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:25:00 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1792782973' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:00 np0005548789.localdomain podman[335385]: 2025-12-06 10:25:00.923099266 +0000 UTC m=+0.080651527 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Dec 06 10:25:00 np0005548789.localdomain podman[335385]: 2025-12-06 10:25:00.958158707 +0000 UTC m=+0.115710908 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:25:00 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:25:01 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1792782973' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:02 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e245 e245: 6 total, 6 up, 6 in
Dec 06 10:25:02 np0005548789.localdomain ceph-mon[298582]: pgmap v552: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.3 KiB/s rd, 99 KiB/s wr, 8 op/s
Dec 06 10:25:02 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:25:02 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:02 np0005548789.localdomain sshd[335410]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:25:02 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e246 e246: 6 total, 6 up, 6 in
Dec 06 10:25:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "format": "json"}]: dispatch
Dec 06 10:25:03 np0005548789.localdomain ceph-mon[298582]: osdmap e245: 6 total, 6 up, 6 in
Dec 06 10:25:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6baadd1a-0add-4c99-8c70-7fc88a6a0739", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:25:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6baadd1a-0add-4c99-8c70-7fc88a6a0739", "format": "json"}]: dispatch
Dec 06 10:25:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:03 np0005548789.localdomain ceph-mon[298582]: osdmap e246: 6 total, 6 up, 6 in
Dec 06 10:25:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e246 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:03 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:25:03Z|00515|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 06 10:25:03 np0005548789.localdomain sshd[335410]: Received disconnect from 118.219.234.233 port 39572:11: Bye Bye [preauth]
Dec 06 10:25:03 np0005548789.localdomain sshd[335410]: Disconnected from authenticating user root 118.219.234.233 port 39572 [preauth]
Dec 06 10:25:04 np0005548789.localdomain ceph-mon[298582]: pgmap v555: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 95 KiB/s wr, 34 op/s
Dec 06 10:25:04 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "bd3fee73-c2d0-4d8d-bc68-793e93881208", "format": "json"}]: dispatch
Dec 06 10:25:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4089585165' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:05 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e247 e247: 6 total, 6 up, 6 in
Dec 06 10:25:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:05.236 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:05.459 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:06 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e248 e248: 6 total, 6 up, 6 in
Dec 06 10:25:06 np0005548789.localdomain ceph-mon[298582]: osdmap e247: 6 total, 6 up, 6 in
Dec 06 10:25:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "snap_name": "f5e12b35-c54a-4c08-b88c-95eb83ef6404", "format": "json"}]: dispatch
Dec 06 10:25:06 np0005548789.localdomain ceph-mon[298582]: pgmap v557: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 67 KiB/s wr, 39 op/s
Dec 06 10:25:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6baadd1a-0add-4c99-8c70-7fc88a6a0739", "auth_id": "david", "tenant_id": "438e893229f742e78fe8e62ef6ea17d5", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:25:06 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Dec 06 10:25:07 np0005548789.localdomain ceph-mon[298582]: osdmap e248: 6 total, 6 up, 6 in
Dec 06 10:25:07 np0005548789.localdomain sudo[335412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:25:07 np0005548789.localdomain sudo[335412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:25:07 np0005548789.localdomain sudo[335412]: pam_unix(sudo:session): session closed for user root
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.917 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.922 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:07 np0005548789.localdomain sudo[335430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:25:07 np0005548789.localdomain sudo[335430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b925e12c-8f7a-4a56-b202-66139d9c9b95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:25:07.918206', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'd67d211a-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.167576906, 'message_signature': '897727b654162bd2b81659852b18f9e11f8f9d0ef1a700b96fc3b8b708cd1ea1'}]}, 'timestamp': '2025-12-06 10:25:07.923559', '_unique_id': 'ab2551df3245447dad8c6f21d443b7a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.927 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a92024f9-2596-4895-b026-47086aed0508', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:25:07.927188', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'd67dcc82-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.167576906, 'message_signature': '1d2e115c52e3685e67629a04b03c14f85043a9abc1ca688bd854b033ab874e01'}]}, 'timestamp': '2025-12-06 10:25:07.927966', '_unique_id': 'ce9882cf58314bfc9f177026c7bb9d95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.929 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.931 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '956f3999-f245-444d-9c31-e077fa25b6f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:25:07.931328', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'd67e6dfe-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.167576906, 'message_signature': '05c7562ff6453cb18a40ee8c644f68f9de6e84842bf8f2d59c73d403cc2bd407'}]}, 'timestamp': '2025-12-06 10:25:07.932093', '_unique_id': '0905f535123f433a9e1931319866840d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.933 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.935 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.963 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.963 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '962a4956-cf49-4007-b94a-89c02373e81e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:25:07.935386', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd683411c-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.184789522, 'message_signature': 'b07aefd2e1f06c48472a99bc0ea0b880251a268fd2b879d21b6258618cb11753'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:25:07.935386', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd6835a62-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.184789522, 'message_signature': '110d672a7591052a4e84a62b148871efd73e31b62e83b20da4bcd2ac7675e807'}]}, 'timestamp': '2025-12-06 10:25:07.964247', '_unique_id': '381799248a8546a8b226c75dfdee0b38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.965 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.966 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.966 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9461cfac-bca5-4e78-813e-b57831131898', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:25:07.966838', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'd683db2c-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.167576906, 'message_signature': '7041223351134d255b9b5812a0f0797a0c62fabaf7997dd0d557e4e79a0e0526'}]}, 'timestamp': '2025-12-06 10:25:07.967625', '_unique_id': 'c490c7f2bd594a398d3f9cc9159d8885'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.969 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.970 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.970 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.982 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.982 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0511bbf3-c91f-4b89-9cc9-d175822794df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:25:07.971186', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd68631ba-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.220589686, 'message_signature': '2bdbc7ab541148547c278428772c0483f6de46518313d2d0b30520af5826d6cb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:25:07.971186', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd6864a6a-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.220589686, 'message_signature': '065ddbb9581953031525cb30d51f2eba9adf2152d1a2cc20676c306ef6e5d729'}]}, 'timestamp': '2025-12-06 10:25:07.983512', '_unique_id': 'f3437a49acf84d84850de85656474b0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.984 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.985 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.986 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.986 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01f894a7-8ee5-4f72-8dc3-b6432fe00416', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:25:07.986067', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd686c7c4-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.220589686, 'message_signature': '559ac2cdaa7220becc7499069f16c1faae5cd28b48edda3f930ba764a96c8806'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:25:07.986067', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd686e34e-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.220589686, 'message_signature': 'c9fd59c22d8e3d2eb086d8b7771397def19568487354730ed7b538bf5ab34786'}]}, 'timestamp': '2025-12-06 10:25:07.987456', '_unique_id': '9d23cb45d2e7496ca10220270c08f5e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.988 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:07.990 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.007 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 19410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00bfe4e4-831c-49c9-82df-de08d5791999', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19410000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:25:07.990724', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'd68a0236-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.256543495, 'message_signature': '823d35915aeaec99f753c2cffa3c452c6afff63461af047aaa6e8d37c581df44'}]}, 'timestamp': '2025-12-06 10:25:08.007937', '_unique_id': 'e2868c68993647f58938832042d064db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.010 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.010 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.011 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '448431f2-fc71-4b09-8f86-b1d88ef1783a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:25:08.010334', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd68a7b6c-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.184789522, 'message_signature': '3dbb515674291adae5e870f37ef531d37682d2c838cca2dab54ec54150678c6b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:25:08.010334', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd68a96ec-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.184789522, 'message_signature': 'c7d2518eac638c7bf9f7037cf5d449162c191532809cb5675c9c152866ae8ea6'}]}, 'timestamp': '2025-12-06 10:25:08.011715', '_unique_id': 'c5ad99f035cb48adb0a7ecf1035d5d48'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.014 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.015 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7baed7c7-e8f9-40d8-941d-ffdd8bb1a58c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:25:08.015089', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'd68b35a2-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.256543495, 'message_signature': 'b6efa812562efdec20ba95bd9fac7382422f9791e4ffb763f30c9e7c2f22587b'}]}, 'timestamp': '2025-12-06 10:25:08.015839', '_unique_id': 'f804975d28ab4038bd323ab65c88e7a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.018 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.019 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.019 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.019 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41dcf558-ea0b-48eb-b472-cb11cb368d17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:25:08.019283', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd68bd8c2-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.184789522, 'message_signature': '3afe404702d83d536bdb846683969276e18149b00301cf9202d46c957e8a1368'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:25:08.019283', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd68bf2d0-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.184789522, 'message_signature': '78afbcf313bc15ecd2be6aba4b79f421b5253b4a1000dcc190f9d665c017cc28'}]}, 'timestamp': '2025-12-06 10:25:08.020611', '_unique_id': '2adee4b238364747b6a95b991437c3e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.023 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.023 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.024 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20ed0f5c-631d-4b4a-888e-ad60b3e9e859', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:25:08.023892', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd68c8d1c-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.184789522, 'message_signature': '338bc8139188dfb396e229172a520eb5d5e3c3791ae5d2a8b2b9832df290fda2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:25:08.023892', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd68ca6da-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.184789522, 'message_signature': '48d3f15e8a7854ac7b58de75a0f33867f35bd240eb08d17b2b5b894925c82b76'}]}, 'timestamp': '2025-12-06 10:25:08.025224', '_unique_id': '51c9b0c8d56d4251819eefb3b1697d71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.026 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.028 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.028 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.029 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ccabbba4-2762-4a94-b06e-4927b855240b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:25:08.028901', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd68d5120-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.220589686, 'message_signature': 'e29b9ff363d5c7e34673c643940ae643becbe1249278c5297e8cf8e0ffdd5400'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:25:08.028901', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd68d6aac-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.220589686, 'message_signature': '1b1b854837377b497028b6c709b1802459d322997abbe7a9b583f8a04e2ddb6a'}]}, 'timestamp': '2025-12-06 10:25:08.030240', '_unique_id': '198a5d0389d64b5ebe3317bdf6973e09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.031 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.033 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.033 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '573b8532-3b0b-4f74-b978-f314924e7662', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:25:08.033528', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'd68e0732-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.167576906, 'message_signature': '4b184117ccd42d0e6ca16ec23c0022390f5740d30d8333f448878224ad9682c9'}]}, 'timestamp': '2025-12-06 10:25:08.034282', '_unique_id': 'eab2e86a644a4b3dad2953991a94d497'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.035 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.036 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.037 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.037 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e25496a-3c75-4263-bc95-21e94d63e8ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:25:08.036995', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd68e88b0-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.184789522, 'message_signature': '0576acf62ef3b81c24c06470a1573d8aa32ee840f6b5eda8f38216f70739f3cb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:25:08.036995', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd68e9a8a-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.184789522, 'message_signature': 'f3c46bd24147f2881e719e784f27e5bd44a50ee63c5ef6166e7a4ce6cff78dc2'}]}, 'timestamp': '2025-12-06 10:25:08.037940', '_unique_id': '89891ba3a2ee48c1b218d9934d2e83ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.039 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.040 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.040 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3232cb76-7cac-4865-9620-64e17eb4d1e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:25:08.040300', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'd68f05d8-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.167576906, 'message_signature': '14884e2c26c8e0800bf6a37ea6ee5ffb9d9a0288651a93e5653fb0db214b102c'}]}, 'timestamp': '2025-12-06 10:25:08.040591', '_unique_id': '8e420f47705846a5b76de0884aaa2619'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.041 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.042 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c432a824-7c66-4197-8a5f-eb143cc8e526', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:25:08.041924', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd68f44f8-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.184789522, 'message_signature': '52521593409f06159e553b998095fe6f5d20150a7529e982a4fcb437e08fac0b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:25:08.041924', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd68f4f0c-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.184789522, 'message_signature': '5cf7a8165903342db7e9dfce8e306d5ccd90cc04e46547fd21640f896e43feed'}]}, 'timestamp': '2025-12-06 10:25:08.042446', '_unique_id': '4e931a1669454741933657ef59842b6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.043 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d3bccc0-7dfc-4341-97da-bce53448e9c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:25:08.043883', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'd68f91ba-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.167576906, 'message_signature': '1a56a880af5b85ebb2658cd0db5c205ccc54d9b7559e9bdf315253e05de6d94e'}]}, 'timestamp': '2025-12-06 10:25:08.044171', '_unique_id': 'a0601cc87bd343ef85d2c953be80a36b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.045 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.045 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.045 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61647b61-c0ec-4cd8-9bfb-12e5a620aeb1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:25:08.045663', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'd68fd7ce-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.167576906, 'message_signature': 'fc32cde20ee75b95bce29cff68950305f5917717adde4250deff8e41099aac70'}]}, 'timestamp': '2025-12-06 10:25:08.045964', '_unique_id': '2c3aca3d7e404b9699c911466c8a1d30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.046 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.047 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.047 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.047 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '754947cd-c7b9-425c-9c32-2a678144f30c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:25:08.047393', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'd6901acc-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.167576906, 'message_signature': 'b4730c2cc3f140d283307c79cef4770b53d57fa9e2f3677e577212407c483014'}]}, 'timestamp': '2025-12-06 10:25:08.047682', '_unique_id': 'c95974e073ef4843aa8640130f42c301'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.048 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.049 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd02ee157-99ce-4535-815f-20b204f0eba5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:25:08.049048', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'd6905c08-d28d-11f0-aaf2-fa163e118844', 'monotonic_time': 13126.167576906, 'message_signature': '5abd121c5f59bf75f2da1fbf95afd8a1b1fc8e81e375e745a565c3c63b4e8299'}]}, 'timestamp': '2025-12-06 10:25:08.049372', '_unique_id': '76c349fa9b4a4743875bfcd4932627fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:25:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:25:08.050 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:25:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:08 np0005548789.localdomain sudo[335430]: pam_unix(sudo:session): session closed for user root
Dec 06 10:25:08 np0005548789.localdomain ceph-mon[298582]: pgmap v559: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 28 KiB/s rd, 76 KiB/s wr, 45 op/s
Dec 06 10:25:08 np0005548789.localdomain sudo[335480]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:25:08 np0005548789.localdomain sudo[335480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:25:08 np0005548789.localdomain sudo[335480]: pam_unix(sudo:session): session closed for user root
Dec 06 10:25:09 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e249 e249: 6 total, 6 up, 6 in
Dec 06 10:25:09 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "541781d2-de06-47af-8a8c-92b3f31186b7", "format": "json"}]: dispatch
Dec 06 10:25:09 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "snap_name": "f5e12b35-c54a-4c08-b88c-95eb83ef6404", "target_sub_name": "691a6613-80b6-44f7-899f-c0e9f04c8f64", "format": "json"}]: dispatch
Dec 06 10:25:09 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:25:09 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:25:09 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:25:09 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "691a6613-80b6-44f7-899f-c0e9f04c8f64", "format": "json"}]: dispatch
Dec 06 10:25:09 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:25:09 np0005548789.localdomain ceph-mon[298582]: osdmap e249: 6 total, 6 up, 6 in
Dec 06 10:25:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:10.275 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:10.462 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:10 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6baadd1a-0add-4c99-8c70-7fc88a6a0739", "auth_id": "david", "format": "json"}]: dispatch
Dec 06 10:25:10 np0005548789.localdomain ceph-mon[298582]: pgmap v561: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 34 KiB/s wr, 44 op/s
Dec 06 10:25:10 np0005548789.localdomain ceph-mon[298582]: mgrmap e53: np0005548790.kvkfyr(active, since 13m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:25:11 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 06 10:25:11 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4192589746' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:11 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6baadd1a-0add-4c99-8c70-7fc88a6a0739", "auth_id": "david", "format": "json"}]: dispatch
Dec 06 10:25:11 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/4192589746' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:11 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e250 e250: 6 total, 6 up, 6 in
Dec 06 10:25:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:25:11 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:25:11 np0005548789.localdomain systemd[1]: tmp-crun.cADQ2I.mount: Deactivated successfully.
Dec 06 10:25:11 np0005548789.localdomain podman[335499]: 2025-12-06 10:25:11.944778986 +0000 UTC m=+0.085843206 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:25:11 np0005548789.localdomain podman[335498]: 2025-12-06 10:25:11.9563847 +0000 UTC m=+0.100329248 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Dec 06 10:25:11 np0005548789.localdomain podman[335499]: 2025-12-06 10:25:11.977050763 +0000 UTC m=+0.118115073 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:25:11 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:25:12 np0005548789.localdomain podman[335498]: 2025-12-06 10:25:12.036103408 +0000 UTC m=+0.180047956 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:25:12 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:25:12 np0005548789.localdomain ceph-mon[298582]: pgmap v562: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 33 KiB/s wr, 43 op/s
Dec 06 10:25:12 np0005548789.localdomain ceph-mon[298582]: osdmap e250: 6 total, 6 up, 6 in
Dec 06 10:25:12 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Dec 06 10:25:12 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Dec 06 10:25:12 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Dec 06 10:25:12 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished
Dec 06 10:25:12 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:25:12 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e251 e251: 6 total, 6 up, 6 in
Dec 06 10:25:12 np0005548789.localdomain systemd[1]: tmp-crun.qZbbfg.mount: Deactivated successfully.
Dec 06 10:25:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:13 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "da9cb17d-a418-4d10-ae1e-46dd9b618283", "format": "json"}]: dispatch
Dec 06 10:25:13 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "auth_id": "david", "format": "json"}]: dispatch
Dec 06 10:25:13 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "auth_id": "david", "format": "json"}]: dispatch
Dec 06 10:25:13 np0005548789.localdomain ceph-mon[298582]: osdmap e251: 6 total, 6 up, 6 in
Dec 06 10:25:14 np0005548789.localdomain ceph-mon[298582]: pgmap v565: 177 pgs: 177 active+clean; 282 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 79 KiB/s rd, 156 KiB/s wr, 116 op/s
Dec 06 10:25:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:15.322 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:15.464 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:15 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e252 e252: 6 total, 6 up, 6 in
Dec 06 10:25:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:25:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:25:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:25:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:25:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:25:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:25:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:25:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:25:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:25:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:25:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:25:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:25:16 np0005548789.localdomain ceph-mon[298582]: pgmap v566: 177 pgs: 177 active+clean; 282 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 44 KiB/s rd, 119 KiB/s wr, 69 op/s
Dec 06 10:25:16 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "da9cb17d-a418-4d10-ae1e-46dd9b618283_59a2f9e2-3ade-456e-8a51-63c6c5f92484", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:16 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "da9cb17d-a418-4d10-ae1e-46dd9b618283", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:16 np0005548789.localdomain ceph-mon[298582]: osdmap e252: 6 total, 6 up, 6 in
Dec 06 10:25:16 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6baadd1a-0add-4c99-8c70-7fc88a6a0739", "format": "json"}]: dispatch
Dec 06 10:25:17 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e253 e253: 6 total, 6 up, 6 in
Dec 06 10:25:17 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6baadd1a-0add-4c99-8c70-7fc88a6a0739", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:17 np0005548789.localdomain ceph-mon[298582]: osdmap e253: 6 total, 6 up, 6 in
Dec 06 10:25:18 np0005548789.localdomain sshd[335538]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:25:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e254 e254: 6 total, 6 up, 6 in
Dec 06 10:25:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:25:18 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:25:18 np0005548789.localdomain ceph-mon[298582]: pgmap v568: 177 pgs: 177 active+clean; 282 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 45 KiB/s rd, 122 KiB/s wr, 71 op/s
Dec 06 10:25:18 np0005548789.localdomain ceph-mon[298582]: osdmap e254: 6 total, 6 up, 6 in
Dec 06 10:25:18 np0005548789.localdomain podman[335541]: 2025-12-06 10:25:18.922657185 +0000 UTC m=+0.078701016 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 06 10:25:18 np0005548789.localdomain podman[335541]: 2025-12-06 10:25:18.966233807 +0000 UTC m=+0.122277618 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:25:18 np0005548789.localdomain systemd[1]: tmp-crun.F68GqT.mount: Deactivated successfully.
Dec 06 10:25:18 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:25:18 np0005548789.localdomain podman[335542]: 2025-12-06 10:25:18.992235692 +0000 UTC m=+0.143986903 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:25:19 np0005548789.localdomain podman[335542]: 2025-12-06 10:25:19.007326523 +0000 UTC m=+0.159077714 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:25:19 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:25:19 np0005548789.localdomain sshd[335538]: Received disconnect from 14.194.101.210 port 35778:11: Bye Bye [preauth]
Dec 06 10:25:19 np0005548789.localdomain sshd[335538]: Disconnected from authenticating user root 14.194.101.210 port 35778 [preauth]
Dec 06 10:25:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "541781d2-de06-47af-8a8c-92b3f31186b7_1ac163ca-79ea-43b6-8028-d76d24ca4cd1", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "541781d2-de06-47af-8a8c-92b3f31186b7", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "format": "json"}]: dispatch
Dec 06 10:25:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4e35c7b0-6333-486a-9deb-d9473aa05e04", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:19 np0005548789.localdomain ceph-mon[298582]: pgmap v571: 177 pgs: 2 active+clean+snaptrim, 175 active+clean; 282 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 64 KiB/s wr, 56 op/s
Dec 06 10:25:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:20.356 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:20.466 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:20 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e255 e255: 6 total, 6 up, 6 in
Dec 06 10:25:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:21.846 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:21 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:25:21.846 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:25:21 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:25:21.847 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:25:21 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:25:21.849 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:25:21 np0005548789.localdomain ceph-mon[298582]: osdmap e255: 6 total, 6 up, 6 in
Dec 06 10:25:21 np0005548789.localdomain ceph-mon[298582]: pgmap v573: 177 pgs: 2 active+clean+snaptrim, 175 active+clean; 282 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 70 KiB/s wr, 61 op/s
Dec 06 10:25:22 np0005548789.localdomain neutron_sriov_agent[256690]: 2025-12-06 10:25:22.656 2 INFO neutron.agent.securitygroups_rpc [req-63e33143-79ba-4452-b217-6b4868995963 req-6d925882-c432-4b30-bcfa-4ea2e9401f50 b40d497af0834616a664e6909c0f6685 b51f704fe6204487b0317c3332364cca - - default default] Security group member updated ['d407968b-b8de-45cd-a244-3bf62d3c0357']
Dec 06 10:25:22 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e256 e256: 6 total, 6 up, 6 in
Dec 06 10:25:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:25:22 np0005548789.localdomain podman[335581]: 2025-12-06 10:25:22.953498552 +0000 UTC m=+0.080534363 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:25:22 np0005548789.localdomain podman[335581]: 2025-12-06 10:25:22.96619485 +0000 UTC m=+0.093230651 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:25:22 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:25:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "bd3fee73-c2d0-4d8d-bc68-793e93881208_9652976b-d5e6-4b47-ae83-6f26b1212f0e", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "bd3fee73-c2d0-4d8d-bc68-793e93881208", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "82abd4b2-157a-49c5-b0f6-995ee895ebc0", "format": "json"}]: dispatch
Dec 06 10:25:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "82abd4b2-157a-49c5-b0f6-995ee895ebc0", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:23 np0005548789.localdomain ceph-mon[298582]: osdmap e256: 6 total, 6 up, 6 in
Dec 06 10:25:23 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3694335426' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e257 e257: 6 total, 6 up, 6 in
Dec 06 10:25:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:25:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:25:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:25:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:25:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:25:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19272 "" "Go-http-client/1.1"
Dec 06 10:25:24 np0005548789.localdomain ceph-mon[298582]: pgmap v575: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 84 KiB/s rd, 169 KiB/s wr, 130 op/s
Dec 06 10:25:24 np0005548789.localdomain ceph-mon[298582]: osdmap e257: 6 total, 6 up, 6 in
Dec 06 10:25:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:25.359 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:25.468 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:25 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e258 e258: 6 total, 6 up, 6 in
Dec 06 10:25:25 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:25:25 np0005548789.localdomain podman[335600]: 2025-12-06 10:25:25.925280966 +0000 UTC m=+0.082142453 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:25:25 np0005548789.localdomain podman[335600]: 2025-12-06 10:25:25.933874338 +0000 UTC m=+0.090735845 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:25:25 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:25:26 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "9ca8d539-a020-426e-a70b-3e1ee1b92584_844e7d16-6de7-405a-b7f6-e1408c2dd627", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:26 np0005548789.localdomain ceph-mon[298582]: pgmap v577: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 95 KiB/s wr, 65 op/s
Dec 06 10:25:26 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "9ca8d539-a020-426e-a70b-3e1ee1b92584", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:26 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "auth_id": "admin", "format": "json"}]: dispatch
Dec 06 10:25:26 np0005548789.localdomain ceph-mon[298582]: osdmap e258: 6 total, 6 up, 6 in
Dec 06 10:25:26 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e259 e259: 6 total, 6 up, 6 in
Dec 06 10:25:27 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e260 e260: 6 total, 6 up, 6 in
Dec 06 10:25:27 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "format": "json"}]: dispatch
Dec 06 10:25:27 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "efbd58ea-e56f-4c21-9ed9-3d319ca403b8", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:27 np0005548789.localdomain ceph-mon[298582]: osdmap e259: 6 total, 6 up, 6 in
Dec 06 10:25:27 np0005548789.localdomain ceph-mon[298582]: osdmap e260: 6 total, 6 up, 6 in
Dec 06 10:25:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:28 np0005548789.localdomain ceph-mon[298582]: pgmap v580: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 123 KiB/s wr, 85 op/s
Dec 06 10:25:28 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2541831024' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:25:28 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/2541831024' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:25:29 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "cdf82d1a-cb71-4bc8-a483-dd9f0663d1ab_60347acc-96b8-4ecf-9b3d-01ad73eeabab", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:29 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "snap_name": "cdf82d1a-cb71-4bc8-a483-dd9f0663d1ab", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:30.363 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:30.469 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:30 np0005548789.localdomain ceph-mon[298582]: pgmap v582: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 76 KiB/s wr, 61 op/s
Dec 06 10:25:31 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:25:31Z|00516|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:25:31 np0005548789.localdomain systemd[1]: tmp-crun.JybQrl.mount: Deactivated successfully.
Dec 06 10:25:31 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:25:31 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:25:31 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:25:31 np0005548789.localdomain podman[335640]: 2025-12-06 10:25:31.460907511 +0000 UTC m=+0.074666904 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:25:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:25:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:31.527 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:31 np0005548789.localdomain podman[335653]: 2025-12-06 10:25:31.589694618 +0000 UTC m=+0.098993608 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:25:31 np0005548789.localdomain podman[335653]: 2025-12-06 10:25:31.654383655 +0000 UTC m=+0.163682655 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller)
Dec 06 10:25:31 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:25:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e261 e261: 6 total, 6 up, 6 in
Dec 06 10:25:32 np0005548789.localdomain ceph-mon[298582]: pgmap v583: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 71 KiB/s wr, 57 op/s
Dec 06 10:25:32 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "format": "json"}]: dispatch
Dec 06 10:25:32 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0908fc08-ca54-45dc-a60b-9cb6f31660bf", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:32 np0005548789.localdomain ceph-mon[298582]: osdmap e261: 6 total, 6 up, 6 in
Dec 06 10:25:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e262 e262: 6 total, 6 up, 6 in
Dec 06 10:25:34 np0005548789.localdomain ceph-mon[298582]: pgmap v585: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 123 KiB/s wr, 60 op/s
Dec 06 10:25:34 np0005548789.localdomain ceph-mon[298582]: osdmap e262: 6 total, 6 up, 6 in
Dec 06 10:25:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:35.398 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:35.472 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:35 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Dec 06 10:25:35 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "691a6613-80b6-44f7-899f-c0e9f04c8f64", "format": "json"}]: dispatch
Dec 06 10:25:35 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:36 np0005548789.localdomain ceph-mon[298582]: pgmap v587: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 106 KiB/s wr, 52 op/s
Dec 06 10:25:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "691a6613-80b6-44f7-899f-c0e9f04c8f64", "format": "json"}]: dispatch
Dec 06 10:25:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:36 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e263 e263: 6 total, 6 up, 6 in
Dec 06 10:25:37 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e264 e264: 6 total, 6 up, 6 in
Dec 06 10:25:37 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b90e39e0-9458-4ad1-b3e6-84f20975b6e9", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:25:37 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b90e39e0-9458-4ad1-b3e6-84f20975b6e9", "format": "json"}]: dispatch
Dec 06 10:25:37 np0005548789.localdomain ceph-mon[298582]: osdmap e263: 6 total, 6 up, 6 in
Dec 06 10:25:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:38 np0005548789.localdomain ceph-mon[298582]: pgmap v589: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.7 KiB/s rd, 63 KiB/s wr, 9 op/s
Dec 06 10:25:38 np0005548789.localdomain ceph-mon[298582]: osdmap e264: 6 total, 6 up, 6 in
Dec 06 10:25:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e265 e265: 6 total, 6 up, 6 in
Dec 06 10:25:39 np0005548789.localdomain ceph-mon[298582]: osdmap e265: 6 total, 6 up, 6 in
Dec 06 10:25:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3052221500' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:25:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3052221500' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.213 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.213 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.214 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.214 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.214 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.435 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.474 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:40 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:25:40 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/302515343' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.662 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.731 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.732 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:25:40 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e266 e266: 6 total, 6 up, 6 in
Dec 06 10:25:40 np0005548789.localdomain ceph-mon[298582]: pgmap v592: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 51 KiB/s wr, 36 op/s
Dec 06 10:25:40 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "b90e39e0-9458-4ad1-b3e6-84f20975b6e9", "new_size": 2147483648, "format": "json"}]: dispatch
Dec 06 10:25:40 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/302515343' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.938 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.939 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11138MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.940 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:25:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:40.940 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:25:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:41.015 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:25:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:41.016 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:25:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:41.016 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:25:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:41.064 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:25:41 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:25:41 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2812440448' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:41.467 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.403s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:25:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:41.473 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:25:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:41.495 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:25:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:41.496 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:25:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:41.497 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:25:41 np0005548789.localdomain ceph-mon[298582]: osdmap e266: 6 total, 6 up, 6 in
Dec 06 10:25:41 np0005548789.localdomain ceph-mon[298582]: pgmap v594: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 63 KiB/s wr, 45 op/s
Dec 06 10:25:41 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2812440448' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:41 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e267 e267: 6 total, 6 up, 6 in
Dec 06 10:25:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:42.492 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:42.493 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:42.493 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:25:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:42.493 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:25:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:42.573 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:25:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:42.574 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:25:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:42.574 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:25:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:42.574 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:25:42 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e268 e268: 6 total, 6 up, 6 in
Dec 06 10:25:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:25:42 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:25:42 np0005548789.localdomain ceph-mon[298582]: osdmap e267: 6 total, 6 up, 6 in
Dec 06 10:25:42 np0005548789.localdomain ceph-mon[298582]: osdmap e268: 6 total, 6 up, 6 in
Dec 06 10:25:42 np0005548789.localdomain podman[335729]: 2025-12-06 10:25:42.908382561 +0000 UTC m=+0.068116233 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:25:42 np0005548789.localdomain podman[335730]: 2025-12-06 10:25:42.976449161 +0000 UTC m=+0.129807948 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:25:42 np0005548789.localdomain podman[335730]: 2025-12-06 10:25:42.987349515 +0000 UTC m=+0.140708362 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:25:42 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:25:43 np0005548789.localdomain podman[335729]: 2025-12-06 10:25:43.041909873 +0000 UTC m=+0.201643555 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 10:25:43 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:25:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:43.159 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:25:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:43.174 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:25:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:43.175 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:25:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e269 e269: 6 total, 6 up, 6 in
Dec 06 10:25:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:44.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:44 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b90e39e0-9458-4ad1-b3e6-84f20975b6e9", "format": "json"}]: dispatch
Dec 06 10:25:44 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b90e39e0-9458-4ad1-b3e6-84f20975b6e9", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:44 np0005548789.localdomain ceph-mon[298582]: pgmap v597: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 76 KiB/s rd, 57 KiB/s wr, 105 op/s
Dec 06 10:25:44 np0005548789.localdomain ceph-mon[298582]: osdmap e269: 6 total, 6 up, 6 in
Dec 06 10:25:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:45.461 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:45.475 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:45 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:25:45 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1501997510' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:25:45 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:25:45 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1501997510' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:25:45 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:45 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1501997510' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:25:45 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1501997510' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:25:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:46.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:46.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:25:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:25:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:25:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:25:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:25:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:25:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:25:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:25:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:25:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:25:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:25:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:25:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:25:46 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "82dd5c85-b26d-4a99-a866-de66484ffe5c", "size": 4294967296, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:25:46 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "82dd5c85-b26d-4a99-a866-de66484ffe5c", "format": "json"}]: dispatch
Dec 06 10:25:46 np0005548789.localdomain ceph-mon[298582]: pgmap v599: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 58 KiB/s wr, 106 op/s
Dec 06 10:25:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:47.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:25:47.342 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:25:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:25:47.343 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:25:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:25:47.344 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:25:47 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e270 e270: 6 total, 6 up, 6 in
Dec 06 10:25:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:48.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:48 np0005548789.localdomain ceph-mon[298582]: pgmap v600: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 57 KiB/s rd, 43 KiB/s wr, 78 op/s
Dec 06 10:25:48 np0005548789.localdomain ceph-mon[298582]: osdmap e270: 6 total, 6 up, 6 in
Dec 06 10:25:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:49.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:49.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:49 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a6780a44-878b-4ff8-b9d6-a0cce9b3cee0", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:25:49 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a6780a44-878b-4ff8-b9d6-a0cce9b3cee0", "format": "json"}]: dispatch
Dec 06 10:25:49 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:49 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3f0a96b5-7080-41d0-b8de-4c6a54c262e3", "size": 3221225472, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:25:49 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:25:49 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:25:49 np0005548789.localdomain podman[335769]: 2025-12-06 10:25:49.929566754 +0000 UTC m=+0.084205435 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:25:49 np0005548789.localdomain podman[335769]: 2025-12-06 10:25:49.940945472 +0000 UTC m=+0.095584093 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 10:25:49 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:25:49 np0005548789.localdomain podman[335768]: 2025-12-06 10:25:49.985885995 +0000 UTC m=+0.144688493 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public)
Dec 06 10:25:50 np0005548789.localdomain podman[335768]: 2025-12-06 10:25:50.003200255 +0000 UTC m=+0.162002793 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Dec 06 10:25:50 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:25:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:50.476 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:25:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:50.478 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:25:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:50.478 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:25:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:50.478 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:25:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:50.495 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:50.496 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:25:50 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3f0a96b5-7080-41d0-b8de-4c6a54c262e3", "format": "json"}]: dispatch
Dec 06 10:25:50 np0005548789.localdomain ceph-mon[298582]: pgmap v602: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 72 KiB/s rd, 81 KiB/s wr, 101 op/s
Dec 06 10:25:52 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e271 e271: 6 total, 6 up, 6 in
Dec 06 10:25:52 np0005548789.localdomain ceph-mon[298582]: pgmap v603: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 35 KiB/s wr, 24 op/s
Dec 06 10:25:52 np0005548789.localdomain ceph-mon[298582]: osdmap e271: 6 total, 6 up, 6 in
Dec 06 10:25:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:53 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "a6780a44-878b-4ff8-b9d6-a0cce9b3cee0", "new_size": 1073741824, "no_shrink": true, "format": "json"}]: dispatch
Dec 06 10:25:53 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "82dd5c85-b26d-4a99-a866-de66484ffe5c", "format": "json"}]: dispatch
Dec 06 10:25:53 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "82dd5c85-b26d-4a99-a866-de66484ffe5c", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:53 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/4186351195' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:53 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2061225661' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:53 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3506710697' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:25:53 np0005548789.localdomain podman[335806]: 2025-12-06 10:25:53.920381136 +0000 UTC m=+0.077667556 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:25:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:25:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:25:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:25:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:25:54 np0005548789.localdomain podman[335806]: 2025-12-06 10:25:54.053298308 +0000 UTC m=+0.210584728 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:25:54 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:25:54 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:25:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19279 "" "Go-http-client/1.1"
Dec 06 10:25:54 np0005548789.localdomain ceph-mon[298582]: pgmap v605: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 94 KiB/s wr, 27 op/s
Dec 06 10:25:54 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3655531887' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:25:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:55.497 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:25:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:55.499 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:25:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:55.499 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:25:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:55.500 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:25:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:55.537 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:25:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:25:55.538 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:25:56 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a6780a44-878b-4ff8-b9d6-a0cce9b3cee0", "format": "json"}]: dispatch
Dec 06 10:25:56 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a6780a44-878b-4ff8-b9d6-a0cce9b3cee0", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:56 np0005548789.localdomain ceph-mon[298582]: pgmap v606: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 94 KiB/s wr, 27 op/s
Dec 06 10:25:56 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3f0a96b5-7080-41d0-b8de-4c6a54c262e3", "format": "json"}]: dispatch
Dec 06 10:25:56 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3f0a96b5-7080-41d0-b8de-4c6a54c262e3", "force": true, "format": "json"}]: dispatch
Dec 06 10:25:56 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:25:56 np0005548789.localdomain podman[335824]: 2025-12-06 10:25:56.927622011 +0000 UTC m=+0.085534026 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:25:56 np0005548789.localdomain podman[335824]: 2025-12-06 10:25:56.938317398 +0000 UTC m=+0.096229423 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:25:56 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:25:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:58 np0005548789.localdomain ceph-mon[298582]: pgmap v607: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 78 KiB/s wr, 22 op/s
Dec 06 10:25:58 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:25:59 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a6b976c3-30be-4a3d-bf97-d12650362891", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:25:59 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a6b976c3-30be-4a3d-bf97-d12650362891", "format": "json"}]: dispatch
Dec 06 10:26:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:00.539 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:26:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:00.541 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:26:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:00.541 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:26:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:00.541 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:26:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:00.572 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:00.572 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:26:00 np0005548789.localdomain ceph-mon[298582]: pgmap v608: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 87 KiB/s wr, 4 op/s
Dec 06 10:26:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:26:01 np0005548789.localdomain ceph-mon[298582]: pgmap v609: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 87 KiB/s wr, 4 op/s
Dec 06 10:26:01 np0005548789.localdomain podman[335848]: 2025-12-06 10:26:01.908861217 +0000 UTC m=+0.069776813 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:26:01 np0005548789.localdomain podman[335848]: 2025-12-06 10:26:01.972234475 +0000 UTC m=+0.133150081 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:26:01 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:26:02 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:26:02Z|00517|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Dec 06 10:26:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a6b976c3-30be-4a3d-bf97-d12650362891", "format": "json"}]: dispatch
Dec 06 10:26:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a6b976c3-30be-4a3d-bf97-d12650362891", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:04 np0005548789.localdomain ceph-mon[298582]: pgmap v610: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 386 B/s rd, 117 KiB/s wr, 6 op/s
Dec 06 10:26:04 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7f7d12c1-956a-4451-b950-3662d25c7591", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:26:04 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7f7d12c1-956a-4451-b950-3662d25c7591", "format": "json"}]: dispatch
Dec 06 10:26:04 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:05.573 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:26:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:05.575 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:26:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:05.575 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:26:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:05.575 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:26:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:05.619 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:05.619 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:26:06 np0005548789.localdomain ceph-mon[298582]: pgmap v611: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 64 KiB/s wr, 3 op/s
Dec 06 10:26:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "691a6613-80b6-44f7-899f-c0e9f04c8f64", "format": "json"}]: dispatch
Dec 06 10:26:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "691a6613-80b6-44f7-899f-c0e9f04c8f64", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:07 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:08 np0005548789.localdomain ceph-mon[298582]: pgmap v612: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 64 KiB/s wr, 3 op/s
Dec 06 10:26:08 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "84cf61e2-4e49-410b-a14b-54abbb5a4b95", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:26:08 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "84cf61e2-4e49-410b-a14b-54abbb5a4b95", "format": "json"}]: dispatch
Dec 06 10:26:09 np0005548789.localdomain sudo[335873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:26:09 np0005548789.localdomain sudo[335873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:26:09 np0005548789.localdomain sudo[335873]: pam_unix(sudo:session): session closed for user root
Dec 06 10:26:09 np0005548789.localdomain sudo[335891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:26:09 np0005548789.localdomain sudo[335891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:26:09 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "snap_name": "f5e12b35-c54a-4c08-b88c-95eb83ef6404_42a3d50e-9358-4b7a-9bc8-ccb63c964302", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:09 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "snap_name": "f5e12b35-c54a-4c08-b88c-95eb83ef6404", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:09 np0005548789.localdomain sudo[335891]: pam_unix(sudo:session): session closed for user root
Dec 06 10:26:10 np0005548789.localdomain sudo[335940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:26:10 np0005548789.localdomain sudo[335940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:26:10 np0005548789.localdomain sudo[335940]: pam_unix(sudo:session): session closed for user root
Dec 06 10:26:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:10.621 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:26:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:10.623 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:26:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:10.623 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:26:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:10.623 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:26:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:10.666 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:10.667 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:26:10 np0005548789.localdomain ceph-mon[298582]: pgmap v613: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 103 KiB/s wr, 6 op/s
Dec 06 10:26:10 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:26:10 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:26:10 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:26:10 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:26:11 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:12 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "87f760d1-7d00-4cc5-99b2-a7d946785f7e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:26:12 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "87f760d1-7d00-4cc5-99b2-a7d946785f7e", "format": "json"}]: dispatch
Dec 06 10:26:12 np0005548789.localdomain ceph-mon[298582]: pgmap v614: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 70 KiB/s wr, 4 op/s
Dec 06 10:26:12 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "format": "json"}]: dispatch
Dec 06 10:26:12 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "56f0c12e-9bce-473e-90b3-283dbd57851f", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:12 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:26:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:26:13 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:26:13 np0005548789.localdomain ceph-mon[298582]: pgmap v615: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 113 KiB/s wr, 7 op/s
Dec 06 10:26:13 np0005548789.localdomain systemd[1]: tmp-crun.AyJZvv.mount: Deactivated successfully.
Dec 06 10:26:13 np0005548789.localdomain podman[335959]: 2025-12-06 10:26:13.930165047 +0000 UTC m=+0.089760554 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:26:13 np0005548789.localdomain podman[335958]: 2025-12-06 10:26:13.975569115 +0000 UTC m=+0.135203424 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:26:13 np0005548789.localdomain podman[335958]: 2025-12-06 10:26:13.983119356 +0000 UTC m=+0.142753665 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:26:13 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:26:14 np0005548789.localdomain podman[335959]: 2025-12-06 10:26:14.0267832 +0000 UTC m=+0.186378647 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:26:14 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:26:14 np0005548789.localdomain ceph-mon[298582]: mgrmap e54: np0005548790.kvkfyr(active, since 14m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:26:14 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ab908986-633b-4dd3-a1cc-27834b2adc14", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:26:14 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:15.668 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:26:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:15.670 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:26:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:15.670 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:26:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:15.670 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:26:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:15.682 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:15.683 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:26:15 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e272 e272: 6 total, 6 up, 6 in
Dec 06 10:26:15 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ab908986-633b-4dd3-a1cc-27834b2adc14", "format": "json"}]: dispatch
Dec 06 10:26:15 np0005548789.localdomain ceph-mon[298582]: pgmap v616: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 82 KiB/s wr, 5 op/s
Dec 06 10:26:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:26:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:26:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:26:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:26:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:26:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:26:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:26:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:26:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:26:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:26:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:26:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:26:16 np0005548789.localdomain ceph-mon[298582]: osdmap e272: 6 total, 6 up, 6 in
Dec 06 10:26:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:18 np0005548789.localdomain ceph-mon[298582]: pgmap v618: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 99 KiB/s wr, 6 op/s
Dec 06 10:26:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:20.684 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:26:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:20.686 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:20.686 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:26:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:20.686 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:26:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:20.687 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:26:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:20.689 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:20 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:26:20 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "format": "json"}]: dispatch
Dec 06 10:26:20 np0005548789.localdomain ceph-mon[298582]: pgmap v619: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 78 KiB/s wr, 5 op/s
Dec 06 10:26:20 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ab908986-633b-4dd3-a1cc-27834b2adc14", "format": "json"}]: dispatch
Dec 06 10:26:20 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ab908986-633b-4dd3-a1cc-27834b2adc14", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:26:20 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:26:20 np0005548789.localdomain podman[336001]: 2025-12-06 10:26:20.927645017 +0000 UTC m=+0.082655647 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Dec 06 10:26:20 np0005548789.localdomain podman[336001]: 2025-12-06 10:26:20.937740696 +0000 UTC m=+0.092751346 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, version=9.6, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, container_name=openstack_network_exporter, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Dec 06 10:26:20 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:26:20 np0005548789.localdomain systemd[1]: tmp-crun.HXT9wa.mount: Deactivated successfully.
Dec 06 10:26:20 np0005548789.localdomain podman[336002]: 2025-12-06 10:26:20.996941495 +0000 UTC m=+0.148581032 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:26:21 np0005548789.localdomain podman[336002]: 2025-12-06 10:26:21.011140319 +0000 UTC m=+0.162779856 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:26:21 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:26:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:22.247 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:22 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:22.247 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:26:22 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:22.249 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:26:22 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e273 e273: 6 total, 6 up, 6 in
Dec 06 10:26:22 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:26:22.784 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:26:22Z, description=, device_id=b0de0aa3-0513-45a7-a160-43d6176211a5, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd01730>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc91460>], id=37156bdd-58f7-4be9-babe-eb430466a407, ip_allocation=immediate, mac_address=fa:16:3e:32:c5:fe, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3753, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:26:22Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:26:22 np0005548789.localdomain ceph-mon[298582]: pgmap v620: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 78 KiB/s wr, 5 op/s
Dec 06 10:26:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Dec 06 10:26:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:22 np0005548789.localdomain ceph-mon[298582]: osdmap e273: 6 total, 6 up, 6 in
Dec 06 10:26:23 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:26:23 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:26:23 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:26:23 np0005548789.localdomain podman[336055]: 2025-12-06 10:26:23.0119524 +0000 UTC m=+0.060037336 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:26:23 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:26:23.324 263652 INFO neutron.agent.dhcp.agent [None req-956f96c8-497f-4ab0-b32b-a1d0ba7fc1d8 - - - - - -] DHCP configuration for ports {'37156bdd-58f7-4be9-babe-eb430466a407'} is completed
Dec 06 10:26:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:23.547 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "87f760d1-7d00-4cc5-99b2-a7d946785f7e", "format": "json"}]: dispatch
Dec 06 10:26:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "87f760d1-7d00-4cc5-99b2-a7d946785f7e", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve49", "tenant_id": "e0991b50d433489b9122b5c71fdb2883", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:26:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:26:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:26:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:26:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:26:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:26:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19282 "" "Go-http-client/1.1"
Dec 06 10:26:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:26:24 np0005548789.localdomain ceph-mon[298582]: pgmap v622: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 130 KiB/s wr, 7 op/s
Dec 06 10:26:24 np0005548789.localdomain podman[336076]: 2025-12-06 10:26:24.913673953 +0000 UTC m=+0.078467269 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd)
Dec 06 10:26:24 np0005548789.localdomain podman[336076]: 2025-12-06 10:26:24.924877656 +0000 UTC m=+0.089670992 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec 06 10:26:24 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:26:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:25.728 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:25 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Dec 06 10:26:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:26.684 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:26 np0005548789.localdomain ceph-mon[298582]: pgmap v623: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 435 B/s rd, 111 KiB/s wr, 6 op/s
Dec 06 10:26:26 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve48", "tenant_id": "e0991b50d433489b9122b5c71fdb2883", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:26:26 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:26 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:26 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:26 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "84cf61e2-4e49-410b-a14b-54abbb5a4b95", "format": "json"}]: dispatch
Dec 06 10:26:26 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "84cf61e2-4e49-410b-a14b-54abbb5a4b95", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:26 np0005548789.localdomain sshd[336095]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:26:27 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:26:27 np0005548789.localdomain podman[336097]: 2025-12-06 10:26:27.925452198 +0000 UTC m=+0.084975688 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:26:27 np0005548789.localdomain podman[336097]: 2025-12-06 10:26:27.962340657 +0000 UTC m=+0.121864157 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:26:27 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:26:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:28 np0005548789.localdomain ceph-mon[298582]: pgmap v624: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 104 KiB/s wr, 5 op/s
Dec 06 10:26:28 np0005548789.localdomain sshd[336095]: Connection reset by authenticating user root 91.202.233.33 port 64228 [preauth]
Dec 06 10:26:28 np0005548789.localdomain sshd[336120]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:26:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Dec 06 10:26:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Dec 06 10:26:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Dec 06 10:26:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished
Dec 06 10:26:29 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:26:29.830 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:26:29Z, description=, device_id=a662b0a7-3c9a-4ddd-a647-8d57adfa246a, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc26be0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc26280>], id=7dff28d2-a5fd-4739-bfc4-65f4b8d30daf, ip_allocation=immediate, mac_address=fa:16:3e:6a:c4:3f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3771, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:26:29Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:26:30 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:26:30 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:26:30 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:26:30 np0005548789.localdomain podman[336140]: 2025-12-06 10:26:30.051745915 +0000 UTC m=+0.048277946 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:26:30 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:26:30.313 263652 INFO neutron.agent.dhcp.agent [None req-b56938b1-71a8-44d2-b92d-4085c8a23c16 - - - - - -] DHCP configuration for ports {'7dff28d2-a5fd-4739-bfc4-65f4b8d30daf'} is completed
Dec 06 10:26:30 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:26:30.435 263652 INFO neutron.agent.linux.ip_lib [None req-23c56141-c409-4b4d-b753-26898ad4086b - - - - - -] Device tap68ac2d58-05 cannot be used as it has no MAC address
Dec 06 10:26:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:30.440 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:30.460 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:30 np0005548789.localdomain kernel: device tap68ac2d58-05 entered promiscuous mode
Dec 06 10:26:30 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:26:30Z|00518|binding|INFO|Claiming lport 68ac2d58-053a-483f-a60b-4eaff9e97708 for this chassis.
Dec 06 10:26:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:30.471 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:30 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:26:30Z|00519|binding|INFO|68ac2d58-053a-483f-a60b-4eaff9e97708: Claiming unknown
Dec 06 10:26:30 np0005548789.localdomain systemd-udevd[336171]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:26:30 np0005548789.localdomain NetworkManager[5973]: <info>  [1765016790.4754] manager: (tap68ac2d58-05): new Generic device (/org/freedesktop/NetworkManager/Devices/82)
Dec 06 10:26:30 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:30.485 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-ffefea15-4edb-43a1-a498-6e71b5510aec', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ffefea15-4edb-43a1-a498-6e71b5510aec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0680825af8a248e9b1a46d099dbba654', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a90362af-d474-4111-8af9-afb889638492, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=68ac2d58-053a-483f-a60b-4eaff9e97708) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:26:30 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:30.488 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 68ac2d58-053a-483f-a60b-4eaff9e97708 in datapath ffefea15-4edb-43a1-a498-6e71b5510aec bound to our chassis
Dec 06 10:26:30 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:30.490 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port b858d185-a8c6-4173-bbf1-f74fdc623898 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:26:30 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:30.490 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ffefea15-4edb-43a1-a498-6e71b5510aec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:26:30 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:30.491 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[8f265c37-4e87-416f-8a65-0a9b9673168d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:26:30 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:26:30Z|00520|binding|INFO|Setting lport 68ac2d58-053a-483f-a60b-4eaff9e97708 ovn-installed in OVS
Dec 06 10:26:30 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:26:30Z|00521|binding|INFO|Setting lport 68ac2d58-053a-483f-a60b-4eaff9e97708 up in Southbound
Dec 06 10:26:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:30.517 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:30.554 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:30.620 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:30.726 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:30.732 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:30 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7f7d12c1-956a-4451-b950-3662d25c7591", "format": "json"}]: dispatch
Dec 06 10:26:30 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7f7d12c1-956a-4451-b950-3662d25c7591", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:30 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve48", "format": "json"}]: dispatch
Dec 06 10:26:30 np0005548789.localdomain ceph-mon[298582]: pgmap v625: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 126 KiB/s wr, 7 op/s
Dec 06 10:26:30 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve48", "format": "json"}]: dispatch
Dec 06 10:26:30 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:31 np0005548789.localdomain sshd[336120]: Connection reset by authenticating user root 91.202.233.33 port 64232 [preauth]
Dec 06 10:26:31 np0005548789.localdomain sshd[336205]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:26:31 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:31.252 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:26:31 np0005548789.localdomain podman[336228]: 
Dec 06 10:26:31 np0005548789.localdomain podman[336228]: 2025-12-06 10:26:31.429280554 +0000 UTC m=+0.077726517 container create 47a2771951379052c135febd1bc2cced535b242e476bebe9cf722bd66e2e1a23 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffefea15-4edb-43a1-a498-6e71b5510aec, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:26:31 np0005548789.localdomain podman[336228]: 2025-12-06 10:26:31.381203565 +0000 UTC m=+0.029649568 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:26:31 np0005548789.localdomain systemd[1]: Started libpod-conmon-47a2771951379052c135febd1bc2cced535b242e476bebe9cf722bd66e2e1a23.scope.
Dec 06 10:26:31 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:26:31 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c812ec028dfa41ac696173d138a7fa185692627e0817d322fac67fcf1f33ade3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:26:31 np0005548789.localdomain podman[336228]: 2025-12-06 10:26:31.516266803 +0000 UTC m=+0.164712776 container init 47a2771951379052c135febd1bc2cced535b242e476bebe9cf722bd66e2e1a23 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffefea15-4edb-43a1-a498-6e71b5510aec, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 06 10:26:31 np0005548789.localdomain podman[336228]: 2025-12-06 10:26:31.527388123 +0000 UTC m=+0.175834086 container start 47a2771951379052c135febd1bc2cced535b242e476bebe9cf722bd66e2e1a23 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffefea15-4edb-43a1-a498-6e71b5510aec, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:26:31 np0005548789.localdomain dnsmasq[336247]: started, version 2.85 cachesize 150
Dec 06 10:26:31 np0005548789.localdomain dnsmasq[336247]: DNS service limited to local subnets
Dec 06 10:26:31 np0005548789.localdomain dnsmasq[336247]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:26:31 np0005548789.localdomain dnsmasq[336247]: warning: no upstream servers configured
Dec 06 10:26:31 np0005548789.localdomain dnsmasq-dhcp[336247]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:26:31 np0005548789.localdomain dnsmasq[336247]: read /var/lib/neutron/dhcp/ffefea15-4edb-43a1-a498-6e71b5510aec/addn_hosts - 0 addresses
Dec 06 10:26:31 np0005548789.localdomain dnsmasq-dhcp[336247]: read /var/lib/neutron/dhcp/ffefea15-4edb-43a1-a498-6e71b5510aec/host
Dec 06 10:26:31 np0005548789.localdomain dnsmasq-dhcp[336247]: read /var/lib/neutron/dhcp/ffefea15-4edb-43a1-a498-6e71b5510aec/opts
Dec 06 10:26:31 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:26:31.585 263652 INFO neutron.agent.dhcp.agent [None req-69b7eedd-053f-4c4d-a599-b5b626a2a1c3 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:26:31Z, description=, device_id=a662b0a7-3c9a-4ddd-a647-8d57adfa246a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fd3a9a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc55670>], id=cef4873b-e89f-4614-b320-d769cf02dd7f, ip_allocation=immediate, mac_address=fa:16:3e:35:bb:c4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:26:28Z, description=, dns_domain=, id=ffefea15-4edb-43a1-a498-6e71b5510aec, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-1539709692-network, port_security_enabled=True, project_id=0680825af8a248e9b1a46d099dbba654, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45640, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3765, status=ACTIVE, subnets=['bd32e7e6-ec38-47a8-b41d-1da05e64c200'], tags=[], tenant_id=0680825af8a248e9b1a46d099dbba654, updated_at=2025-12-06T10:26:28Z, vlan_transparent=None, network_id=ffefea15-4edb-43a1-a498-6e71b5510aec, port_security_enabled=False, project_id=0680825af8a248e9b1a46d099dbba654, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3773, status=DOWN, tags=[], tenant_id=0680825af8a248e9b1a46d099dbba654, updated_at=2025-12-06T10:26:31Z on network ffefea15-4edb-43a1-a498-6e71b5510aec
Dec 06 10:26:31 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:26:31.684 263652 INFO neutron.agent.dhcp.agent [None req-ae6b865b-7fbb-4a77-afb3-33f4bb273864 - - - - - -] DHCP configuration for ports {'afad7adb-a4ea-42b9-8e21-3426fb577880'} is completed
Dec 06 10:26:31 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:26:31 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "format": "json"}]: dispatch
Dec 06 10:26:31 np0005548789.localdomain dnsmasq[336247]: read /var/lib/neutron/dhcp/ffefea15-4edb-43a1-a498-6e71b5510aec/addn_hosts - 1 addresses
Dec 06 10:26:31 np0005548789.localdomain podman[336263]: 2025-12-06 10:26:31.849156549 +0000 UTC m=+0.066840164 container kill 47a2771951379052c135febd1bc2cced535b242e476bebe9cf722bd66e2e1a23 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffefea15-4edb-43a1-a498-6e71b5510aec, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:26:31 np0005548789.localdomain dnsmasq-dhcp[336247]: read /var/lib/neutron/dhcp/ffefea15-4edb-43a1-a498-6e71b5510aec/host
Dec 06 10:26:31 np0005548789.localdomain dnsmasq-dhcp[336247]: read /var/lib/neutron/dhcp/ffefea15-4edb-43a1-a498-6e71b5510aec/opts
Dec 06 10:26:32 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:26:32.148 263652 INFO neutron.agent.dhcp.agent [None req-fc5a40ea-4e7e-47c2-8c2a-f83a7f141851 - - - - - -] DHCP configuration for ports {'cef4873b-e89f-4614-b320-d769cf02dd7f'} is completed
Dec 06 10:26:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:26:32 np0005548789.localdomain podman[336285]: 2025-12-06 10:26:32.424948411 +0000 UTC m=+0.088975021 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 10:26:32 np0005548789.localdomain podman[336285]: 2025-12-06 10:26:32.487670718 +0000 UTC m=+0.151697388 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251125)
Dec 06 10:26:32 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:26:32 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:26:32.614 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:26:31Z, description=, device_id=a662b0a7-3c9a-4ddd-a647-8d57adfa246a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa993a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa99ac0>], id=cef4873b-e89f-4614-b320-d769cf02dd7f, ip_allocation=immediate, mac_address=fa:16:3e:35:bb:c4, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:26:28Z, description=, dns_domain=, id=ffefea15-4edb-43a1-a498-6e71b5510aec, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-1539709692-network, port_security_enabled=True, project_id=0680825af8a248e9b1a46d099dbba654, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45640, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3765, status=ACTIVE, subnets=['bd32e7e6-ec38-47a8-b41d-1da05e64c200'], tags=[], tenant_id=0680825af8a248e9b1a46d099dbba654, updated_at=2025-12-06T10:26:28Z, vlan_transparent=None, network_id=ffefea15-4edb-43a1-a498-6e71b5510aec, port_security_enabled=False, project_id=0680825af8a248e9b1a46d099dbba654, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3773, status=DOWN, tags=[], tenant_id=0680825af8a248e9b1a46d099dbba654, updated_at=2025-12-06T10:26:31Z on network ffefea15-4edb-43a1-a498-6e71b5510aec
Dec 06 10:26:32 np0005548789.localdomain systemd[1]: tmp-crun.gIozBd.mount: Deactivated successfully.
Dec 06 10:26:32 np0005548789.localdomain ceph-mon[298582]: pgmap v626: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 126 KiB/s wr, 7 op/s
Dec 06 10:26:32 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Dec 06 10:26:32 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:32 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:32 np0005548789.localdomain dnsmasq[336247]: read /var/lib/neutron/dhcp/ffefea15-4edb-43a1-a498-6e71b5510aec/addn_hosts - 1 addresses
Dec 06 10:26:32 np0005548789.localdomain dnsmasq-dhcp[336247]: read /var/lib/neutron/dhcp/ffefea15-4edb-43a1-a498-6e71b5510aec/host
Dec 06 10:26:32 np0005548789.localdomain dnsmasq-dhcp[336247]: read /var/lib/neutron/dhcp/ffefea15-4edb-43a1-a498-6e71b5510aec/opts
Dec 06 10:26:32 np0005548789.localdomain podman[336326]: 2025-12-06 10:26:32.852484529 +0000 UTC m=+0.071161906 container kill 47a2771951379052c135febd1bc2cced535b242e476bebe9cf722bd66e2e1a23 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffefea15-4edb-43a1-a498-6e71b5510aec, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:26:32 np0005548789.localdomain sshd[336205]: Connection reset by authenticating user root 91.202.233.33 port 64236 [preauth]
Dec 06 10:26:33 np0005548789.localdomain sshd[336348]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:26:33 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:26:33.138 263652 INFO neutron.agent.dhcp.agent [None req-6c7e05cb-c87c-45cf-9829-f53da3377085 - - - - - -] DHCP configuration for ports {'cef4873b-e89f-4614-b320-d769cf02dd7f'} is completed
Dec 06 10:26:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:33 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve47", "tenant_id": "e0991b50d433489b9122b5c71fdb2883", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:26:33 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/601ad3a1-8738-4a72-911d-38595abebd4b/a5c5bd7f-cc74-4f03-8395-ee4b429e02e1", "osd", "allow rw pool=manila_data namespace=fsvolumens_601ad3a1-8738-4a72-911d-38595abebd4b", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:33 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:26:33 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:33 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:33 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:34 np0005548789.localdomain sshd[336348]: Connection reset by authenticating user root 91.202.233.33 port 25314 [preauth]
Dec 06 10:26:34 np0005548789.localdomain sshd[336350]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:26:34 np0005548789.localdomain ceph-mon[298582]: pgmap v627: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 870 B/s rd, 208 KiB/s wr, 13 op/s
Dec 06 10:26:34 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:26:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:35.733 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:26:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:35.735 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:26:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:35.735 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:26:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:35.736 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:26:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:35.774 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:35 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:35.775 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:26:35 np0005548789.localdomain ceph-mon[298582]: pgmap v628: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 119 KiB/s wr, 8 op/s
Dec 06 10:26:35 np0005548789.localdomain sshd[336350]: Connection reset by authenticating user root 91.202.233.33 port 25326 [preauth]
Dec 06 10:26:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve47", "format": "json"}]: dispatch
Dec 06 10:26:36 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Dec 06 10:26:36 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Dec 06 10:26:36 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Dec 06 10:26:36 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished
Dec 06 10:26:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve47", "format": "json"}]: dispatch
Dec 06 10:26:37 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:26:37 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:26:37 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:26:37 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:26:37 np0005548789.localdomain ceph-mon[298582]: pgmap v629: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 119 KiB/s wr, 8 op/s
Dec 06 10:26:37 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:26:37 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:26:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:38 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:26:38Z|00522|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:26:38 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:26:38 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:26:38 np0005548789.localdomain podman[336367]: 2025-12-06 10:26:38.415925204 +0000 UTC m=+0.053280320 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:26:38 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:26:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:38.428 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:26:39 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3136332362' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:26:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:26:39 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3136332362' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:26:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3136332362' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:26:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3136332362' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:26:39 np0005548789.localdomain sshd[336388]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:26:40 np0005548789.localdomain dnsmasq[336247]: read /var/lib/neutron/dhcp/ffefea15-4edb-43a1-a498-6e71b5510aec/addn_hosts - 0 addresses
Dec 06 10:26:40 np0005548789.localdomain dnsmasq-dhcp[336247]: read /var/lib/neutron/dhcp/ffefea15-4edb-43a1-a498-6e71b5510aec/host
Dec 06 10:26:40 np0005548789.localdomain dnsmasq-dhcp[336247]: read /var/lib/neutron/dhcp/ffefea15-4edb-43a1-a498-6e71b5510aec/opts
Dec 06 10:26:40 np0005548789.localdomain podman[336407]: 2025-12-06 10:26:40.141048048 +0000 UTC m=+0.069046852 container kill 47a2771951379052c135febd1bc2cced535b242e476bebe9cf722bd66e2e1a23 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffefea15-4edb-43a1-a498-6e71b5510aec, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:26:40 np0005548789.localdomain systemd[1]: tmp-crun.8pw93Y.mount: Deactivated successfully.
Dec 06 10:26:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:40.334 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:40 np0005548789.localdomain kernel: device tap68ac2d58-05 left promiscuous mode
Dec 06 10:26:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:26:40Z|00523|binding|INFO|Releasing lport 68ac2d58-053a-483f-a60b-4eaff9e97708 from this chassis (sb_readonly=0)
Dec 06 10:26:40 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:26:40Z|00524|binding|INFO|Setting lport 68ac2d58-053a-483f-a60b-4eaff9e97708 down in Southbound
Dec 06 10:26:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:40.345 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-ffefea15-4edb-43a1-a498-6e71b5510aec', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ffefea15-4edb-43a1-a498-6e71b5510aec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0680825af8a248e9b1a46d099dbba654', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a90362af-d474-4111-8af9-afb889638492, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=68ac2d58-053a-483f-a60b-4eaff9e97708) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:26:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:40.347 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 68ac2d58-053a-483f-a60b-4eaff9e97708 in datapath ffefea15-4edb-43a1-a498-6e71b5510aec unbound from our chassis
Dec 06 10:26:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:40.349 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ffefea15-4edb-43a1-a498-6e71b5510aec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:26:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:40.350 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[d59372a6-5103-4169-bb88-82a28f394172]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:26:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:40.353 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:40 np0005548789.localdomain ceph-mon[298582]: pgmap v630: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 183 KiB/s wr, 12 op/s
Dec 06 10:26:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:40.815 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:41 np0005548789.localdomain sshd[336388]: Received disconnect from 118.219.234.233 port 41354:11: Bye Bye [preauth]
Dec 06 10:26:41 np0005548789.localdomain sshd[336388]: Disconnected from authenticating user root 118.219.234.233 port 41354 [preauth]
Dec 06 10:26:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:41.177 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:41.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:41.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:26:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:41.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:26:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:41.386 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:26:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:41.386 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:26:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:41.387 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:26:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:41.387 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:26:41 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve49", "format": "json"}]: dispatch
Dec 06 10:26:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Dec 06 10:26:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Dec 06 10:26:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Dec 06 10:26:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished
Dec 06 10:26:41 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "auth_id": "eve49", "format": "json"}]: dispatch
Dec 06 10:26:41 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a4535dc8-c535-4a4f-a931-b9e1e14f3568", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:26:41 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a4535dc8-c535-4a4f-a931-b9e1e14f3568", "format": "json"}]: dispatch
Dec 06 10:26:41 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:26:41 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:26:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:26:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:41 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:26:41Z|00525|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:26:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:41.561 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:41 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:26:41 np0005548789.localdomain podman[336447]: 2025-12-06 10:26:41.582145451 +0000 UTC m=+0.071760125 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:26:41 np0005548789.localdomain systemd[1]: tmp-crun.VtVGRz.mount: Deactivated successfully.
Dec 06 10:26:41 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:26:41 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:26:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:41.946 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:26:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:41.977 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:26:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:41.978 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:26:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:41.978 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.038 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.038 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.038 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.039 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.039 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:26:42 np0005548789.localdomain dnsmasq[336247]: exiting on receipt of SIGTERM
Dec 06 10:26:42 np0005548789.localdomain podman[336485]: 2025-12-06 10:26:42.152941649 +0000 UTC m=+0.055749405 container kill 47a2771951379052c135febd1bc2cced535b242e476bebe9cf722bd66e2e1a23 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffefea15-4edb-43a1-a498-6e71b5510aec, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:26:42 np0005548789.localdomain systemd[1]: libpod-47a2771951379052c135febd1bc2cced535b242e476bebe9cf722bd66e2e1a23.scope: Deactivated successfully.
Dec 06 10:26:42 np0005548789.localdomain podman[336500]: 2025-12-06 10:26:42.210461978 +0000 UTC m=+0.045639497 container died 47a2771951379052c135febd1bc2cced535b242e476bebe9cf722bd66e2e1a23 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffefea15-4edb-43a1-a498-6e71b5510aec, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:26:42 np0005548789.localdomain podman[336500]: 2025-12-06 10:26:42.289283867 +0000 UTC m=+0.124461356 container cleanup 47a2771951379052c135febd1bc2cced535b242e476bebe9cf722bd66e2e1a23 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffefea15-4edb-43a1-a498-6e71b5510aec, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:26:42 np0005548789.localdomain systemd[1]: libpod-conmon-47a2771951379052c135febd1bc2cced535b242e476bebe9cf722bd66e2e1a23.scope: Deactivated successfully.
Dec 06 10:26:42 np0005548789.localdomain podman[336508]: 2025-12-06 10:26:42.312382583 +0000 UTC m=+0.127462297 container remove 47a2771951379052c135febd1bc2cced535b242e476bebe9cf722bd66e2e1a23 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffefea15-4edb-43a1-a498-6e71b5510aec, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:26:42 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:26:42.350 263652 INFO neutron.agent.dhcp.agent [None req-d0afb8d4-33d0-41be-ad79-ef55f87f4a4d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:26:42 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:26:42.375 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:26:42 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:26:42 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1615917791' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.499 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.571 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:26:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c812ec028dfa41ac696173d138a7fa185692627e0817d322fac67fcf1f33ade3-merged.mount: Deactivated successfully.
Dec 06 10:26:42 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-47a2771951379052c135febd1bc2cced535b242e476bebe9cf722bd66e2e1a23-userdata-shm.mount: Deactivated successfully.
Dec 06 10:26:42 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2dffefea15\x2d4edb\x2d43a1\x2da498\x2d6e71b5510aec.mount: Deactivated successfully.
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.571 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.757 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.759 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11128MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.760 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.760 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.840 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.841 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.841 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:26:42 np0005548789.localdomain ceph-mgr[288591]: client.0 ms_handle_reset on v2:172.18.0.108:6810/3354697053
Dec 06 10:26:42 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "format": "json"}]: dispatch
Dec 06 10:26:42 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "601ad3a1-8738-4a72-911d-38595abebd4b", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:42 np0005548789.localdomain ceph-mon[298582]: pgmap v631: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 142 KiB/s wr, 9 op/s
Dec 06 10:26:42 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1615917791' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:42.886 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:26:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:26:43 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4068335658' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:43.326 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:26:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:43.333 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:26:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:43.394 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:26:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:43.396 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:26:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:43.396 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:26:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/4068335658' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:43 np0005548789.localdomain ceph-mon[298582]: pgmap v632: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 207 KiB/s wr, 14 op/s
Dec 06 10:26:43 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "a4535dc8-c535-4a4f-a931-b9e1e14f3568", "snap_name": "1d0736b7-9f9a-4eab-ab98-cf736d1a3e1a", "format": "json"}]: dispatch
Dec 06 10:26:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:26:44 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:26:44 np0005548789.localdomain systemd[1]: tmp-crun.fLHIIJ.mount: Deactivated successfully.
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:44.928682) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016804928743, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2661, "num_deletes": 266, "total_data_size": 4649519, "memory_usage": 4706728, "flush_reason": "Manual Compaction"}
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Dec 06 10:26:44 np0005548789.localdomain podman[336572]: 2025-12-06 10:26:44.937999903 +0000 UTC m=+0.096139519 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016804946481, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 3017632, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33810, "largest_seqno": 36466, "table_properties": {"data_size": 3007152, "index_size": 6537, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 25415, "raw_average_key_size": 22, "raw_value_size": 2984935, "raw_average_value_size": 2613, "num_data_blocks": 280, "num_entries": 1142, "num_filter_entries": 1142, "num_deletions": 266, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016677, "oldest_key_time": 1765016677, "file_creation_time": 1765016804, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 17864 microseconds, and 7678 cpu microseconds.
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:44.946543) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 3017632 bytes OK
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:44.946573) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:44.948089) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:44.948110) EVENT_LOG_v1 {"time_micros": 1765016804948103, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:44.948136) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 4637022, prev total WAL file size 4637022, number of live WAL files 2.
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:44.949202) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. '7061786F73003133303533' seq:0, type:0; will stop at (end)
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(2946KB)], [60(18MB)]
Dec 06 10:26:44 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016804949290, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 22205171, "oldest_snapshot_seqno": -1}
Dec 06 10:26:44 np0005548789.localdomain podman[336572]: 2025-12-06 10:26:44.976472089 +0000 UTC m=+0.134611665 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 10:26:44 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 13940 keys, 20512656 bytes, temperature: kUnknown
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016805051167, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 20512656, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20432352, "index_size": 44363, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34885, "raw_key_size": 375669, "raw_average_key_size": 26, "raw_value_size": 20194722, "raw_average_value_size": 1448, "num_data_blocks": 1642, "num_entries": 13940, "num_filter_entries": 13940, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765016804, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:45.051647) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 20512656 bytes
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:45.053527) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 217.6 rd, 201.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.9, 18.3 +0.0 blob) out(19.6 +0.0 blob), read-write-amplify(14.2) write-amplify(6.8) OK, records in: 14487, records dropped: 547 output_compression: NoCompression
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:45.053558) EVENT_LOG_v1 {"time_micros": 1765016805053545, "job": 36, "event": "compaction_finished", "compaction_time_micros": 102034, "compaction_time_cpu_micros": 48090, "output_level": 6, "num_output_files": 1, "total_output_size": 20512656, "num_input_records": 14487, "num_output_records": 13940, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016805054164, "job": 36, "event": "table_file_deletion", "file_number": 62}
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016805057050, "job": 36, "event": "table_file_deletion", "file_number": 60}
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:44.949084) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:45.057182) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:45.057192) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:45.057196) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:45.057199) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:45 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:26:45.057210) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:45 np0005548789.localdomain podman[336573]: 2025-12-06 10:26:45.071388741 +0000 UTC m=+0.227877667 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:26:45 np0005548789.localdomain podman[336573]: 2025-12-06 10:26:45.109238227 +0000 UTC m=+0.265727123 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:26:45 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:26:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:45.817 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:46.599 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:26:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:26:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:26:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:26:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:26:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:26:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:26:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:26:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:26:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:26:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:26:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:26:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:47.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:47.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:47.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:26:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:47.343 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:26:47 np0005548789.localdomain ceph-mon[298582]: pgmap v633: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 129 KiB/s wr, 9 op/s
Dec 06 10:26:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:26:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:47.345 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:26:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:26:47.346 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:26:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:48.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:48 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:26:48 np0005548789.localdomain ceph-mon[298582]: pgmap v634: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 129 KiB/s wr, 9 op/s
Dec 06 10:26:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:49.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:49.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:49 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "a4535dc8-c535-4a4f-a931-b9e1e14f3568", "snap_name": "1d0736b7-9f9a-4eab-ab98-cf736d1a3e1a_5759c99a-9a0c-4c9b-8de2-da85e4830d9b", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:49 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "a4535dc8-c535-4a4f-a931-b9e1e14f3568", "snap_name": "1d0736b7-9f9a-4eab-ab98-cf736d1a3e1a", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:50.823 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:26:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:50.825 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:26:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:50.825 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:26:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:50.825 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:26:50 np0005548789.localdomain ceph-mon[298582]: pgmap v635: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 191 KiB/s wr, 13 op/s
Dec 06 10:26:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:26:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:26:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:26:50 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:26:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:50.850 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:50 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:50.850 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:26:51 np0005548789.localdomain sshd[336612]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:26:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:26:51 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:26:51 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:26:51 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:26:51 np0005548789.localdomain systemd[1]: tmp-crun.X2s6LK.mount: Deactivated successfully.
Dec 06 10:26:51 np0005548789.localdomain podman[336614]: 2025-12-06 10:26:51.917370971 +0000 UTC m=+0.078909173 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=edpm)
Dec 06 10:26:51 np0005548789.localdomain podman[336615]: 2025-12-06 10:26:51.96971328 +0000 UTC m=+0.130472639 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true)
Dec 06 10:26:51 np0005548789.localdomain podman[336614]: 2025-12-06 10:26:51.987096272 +0000 UTC m=+0.148634474 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:26:52 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:26:52 np0005548789.localdomain podman[336615]: 2025-12-06 10:26:52.010218779 +0000 UTC m=+0.170978138 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 06 10:26:52 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:26:52 np0005548789.localdomain sshd[336612]: Received disconnect from 14.194.101.210 port 52548:11: Bye Bye [preauth]
Dec 06 10:26:52 np0005548789.localdomain sshd[336612]: Disconnected from authenticating user root 14.194.101.210 port 52548 [preauth]
Dec 06 10:26:52 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a4535dc8-c535-4a4f-a931-b9e1e14f3568", "format": "json"}]: dispatch
Dec 06 10:26:52 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a4535dc8-c535-4a4f-a931-b9e1e14f3568", "force": true, "format": "json"}]: dispatch
Dec 06 10:26:52 np0005548789.localdomain ceph-mon[298582]: pgmap v636: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 126 KiB/s wr, 9 op/s
Dec 06 10:26:52 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1038771449' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:53 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1752695086' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:26:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:26:53 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:26:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:26:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:26:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:26:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:26:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:26:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19281 "" "Go-http-client/1.1"
Dec 06 10:26:54 np0005548789.localdomain ceph-mon[298582]: pgmap v637: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 177 KiB/s wr, 13 op/s
Dec 06 10:26:54 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:26:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:26:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:55.850 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:26:55 np0005548789.localdomain ceph-mon[298582]: pgmap v638: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 112 KiB/s wr, 7 op/s
Dec 06 10:26:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1759336759' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:55 np0005548789.localdomain podman[336653]: 2025-12-06 10:26:55.916514619 +0000 UTC m=+0.081530693 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:26:55 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e274 e274: 6 total, 6 up, 6 in
Dec 06 10:26:55 np0005548789.localdomain podman[336653]: 2025-12-06 10:26:55.933363584 +0000 UTC m=+0.098379668 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:26:55 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:26:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:26:56.177 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:56 np0005548789.localdomain ceph-mon[298582]: osdmap e274: 6 total, 6 up, 6 in
Dec 06 10:26:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/792248445' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:26:57 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:26:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:26:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:26:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:26:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:26:57 np0005548789.localdomain ceph-mon[298582]: pgmap v640: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 134 KiB/s wr, 9 op/s
Dec 06 10:26:57 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:26:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:58 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:26:58 np0005548789.localdomain podman[336672]: 2025-12-06 10:26:58.918896227 +0000 UTC m=+0.081685798 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:26:58 np0005548789.localdomain podman[336672]: 2025-12-06 10:26:58.9281746 +0000 UTC m=+0.090964221 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:26:58 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:26:59 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:00 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1be4d0fc-02f4-4c42-ae8f-f48f51b28e7a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:00 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1be4d0fc-02f4-4c42-ae8f-f48f51b28e7a", "format": "json"}]: dispatch
Dec 06 10:27:00 np0005548789.localdomain ceph-mon[298582]: pgmap v641: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 117 KiB/s wr, 8 op/s
Dec 06 10:27:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:00.853 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:00.855 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:00.855 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:27:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:00.855 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:00.888 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:00 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:00.889 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:01 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:02 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e275 e275: 6 total, 6 up, 6 in
Dec 06 10:27:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:27:02 np0005548789.localdomain ceph-mon[298582]: pgmap v642: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 117 KiB/s wr, 8 op/s
Dec 06 10:27:02 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:02 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:02 np0005548789.localdomain ceph-mon[298582]: osdmap e275: 6 total, 6 up, 6 in
Dec 06 10:27:02 np0005548789.localdomain podman[336695]: 2025-12-06 10:27:02.901046544 +0000 UTC m=+0.068334729 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:27:02 np0005548789.localdomain podman[336695]: 2025-12-06 10:27:02.93723081 +0000 UTC m=+0.104518945 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:27:02 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:27:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:27:03 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Cumulative writes: 4936 writes, 36K keys, 4936 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.06 MB/s
                                                           Cumulative WAL: 4936 writes, 4936 syncs, 1.00 writes per sync, written: 0.06 GB, 0.06 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2414 writes, 12K keys, 2414 commit groups, 1.0 writes per commit group, ingest: 19.94 MB, 0.03 MB/s
                                                           Interval WAL: 2414 writes, 2414 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    150.5      0.31              0.09        18    0.017       0      0       0.0       0.0
                                                             L6      1/0   19.56 MB   0.0      0.3     0.0      0.3       0.3      0.0       0.0   6.6    184.8    170.0      1.79              0.82        17    0.105    220K   8763       0.0       0.0
                                                            Sum      1/0   19.56 MB   0.0      0.3     0.0      0.3       0.3      0.1       0.0   7.6    157.8    167.2      2.10              0.91        35    0.060    220K   8763       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0  12.6    172.7    173.8      1.02              0.47        18    0.057    123K   4774       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.3     0.0      0.3       0.3      0.0       0.0   0.0    184.8    170.0      1.79              0.82        17    0.105    220K   8763       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    151.7      0.30              0.09        17    0.018       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.045, interval 0.014
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.34 GB write, 0.29 MB/s write, 0.32 GB read, 0.28 MB/s read, 2.1 seconds
                                                           Interval compaction: 0.17 GB write, 0.30 MB/s write, 0.17 GB read, 0.29 MB/s read, 1.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x55b608f29350#2 capacity: 304.00 MB usage: 29.19 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000187 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(1507,27.77 MB,9.13637%) FilterBlock(35,637.55 KB,0.204804%) IndexBlock(35,809.30 KB,0.259977%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Dec 06 10:27:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d70c0b2c-e84c-411c-8527-44acb71e029d", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d70c0b2c-e84c-411c-8527-44acb71e029d", "format": "json"}]: dispatch
Dec 06 10:27:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c48080d3-3662-400d-9fd3-efdeb2cd8e3f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c48080d3-3662-400d-9fd3-efdeb2cd8e3f", "format": "json"}]: dispatch
Dec 06 10:27:04 np0005548789.localdomain ceph-mon[298582]: pgmap v644: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 172 KiB/s wr, 11 op/s
Dec 06 10:27:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:27:05 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:05 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:05.889 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:05 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:05.890 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "d70c0b2c-e84c-411c-8527-44acb71e029d", "snap_name": "1034db86-0931-4bff-b9cd-e904ae3ce178", "format": "json"}]: dispatch
Dec 06 10:27:06 np0005548789.localdomain ceph-mon[298582]: pgmap v645: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 325 B/s rd, 146 KiB/s wr, 10 op/s
Dec 06 10:27:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c48080d3-3662-400d-9fd3-efdeb2cd8e3f", "format": "json"}]: dispatch
Dec 06 10:27:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c48080d3-3662-400d-9fd3-efdeb2cd8e3f", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:07 np0005548789.localdomain ceph-mon[298582]: pgmap v646: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 138 KiB/s wr, 9 op/s
Dec 06 10:27:07 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:27:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.917 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.918 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.952 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.952 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85264f60-dfcf-45e9-b02d-6c58e64ed5f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:27:07.918804', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1e082692-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.168172242, 'message_signature': 'a3576550d98e6f807c07cff5eafadf7c7c0cd22818514687b86d7f635300b1dd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:27:07.918804', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1e083e8e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.168172242, 'message_signature': 'a9f7511a3611e3389d35d8f4cb2aa8cef10e32477c9c492065b9025d51eee028'}]}, 'timestamp': '2025-12-06 10:27:07.953341', '_unique_id': '6548ab37caa44896a3e441395b54200d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.956 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.957 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.961 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '867a32c1-230e-4261-b478-8db1aa677516', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:27:07.957850', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '1e099ffe-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.207233986, 'message_signature': 'a997ce121e5f9a4c37ee1c1e08f9f2b1ba0a74d229db1c522cb2c6a2b707c048'}]}, 'timestamp': '2025-12-06 10:27:07.962412', '_unique_id': 'fd6cac175ecf41e8a9cf4b9b18090ead'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.963 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.964 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.981 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13a8426c-9603-45e3-8171-b50d577e2adf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:27:07.965003', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '1e0ca42e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.230851807, 'message_signature': '9ae9493121eb363d7c193745e9be7b1462243af51fdeb7220639d121e39bbdb6'}]}, 'timestamp': '2025-12-06 10:27:07.982169', '_unique_id': '27ed04da12614bfd8726906b5b75d625'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.983 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.984 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.996 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.997 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f68680a-4119-46ff-b712-8d4361e1564d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:27:07.984659', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1e0ef1d4-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.234016774, 'message_signature': 'e49d73d77690333a29182738ccf65be96e5be71646ad7b0e1671650ab08c1c62'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:27:07.984659', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1e0f0304-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.234016774, 'message_signature': 'e277e75aa012cdd8011ecb16f9b22906207857807b2dc8ce793a42b18fd5df94'}]}, 'timestamp': '2025-12-06 10:27:07.997671', '_unique_id': '6c2f61afca1149df81edb49509620390'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:07.998 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.000 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.000 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3753812e-b680-493f-88c6-d943489a863f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:27:08.000209', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '1e0f797e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.207233986, 'message_signature': 'db496a0c6fbf3cba63079e963316c63dda7132712dcb036df0b932fdfdd4ca0b'}]}, 'timestamp': '2025-12-06 10:27:08.000952', '_unique_id': 'a3d80efd36484c88bb4fc367d6d9ee88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.002 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.003 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.003 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '910868b5-f21d-4e75-b267-21964c638e64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:27:08.003358', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '1e0ff336-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.207233986, 'message_signature': 'e7483b56e6dca24d12b2da7bf077bc5d3ce5967ec9118a5c908c785fcc29576f'}]}, 'timestamp': '2025-12-06 10:27:08.004165', '_unique_id': 'e69b39fcce444f92a6bd1336c2eb1879'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.005 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.006 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.006 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7aeb8b29-0a18-4b49-abed-6ae3ded58cc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:27:08.006517', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '1e106f00-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.207233986, 'message_signature': 'bc3b81d29adf8fd47925e97a1bd213a931d19bd0659e10a368015a2624405503'}]}, 'timestamp': '2025-12-06 10:27:08.007025', '_unique_id': 'c067dd640f3d46b3b1b8884a382b6478'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.008 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.009 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.009 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 20050000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '866c67cf-450d-4b1b-95e8-c1ffb327d00e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20050000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:27:08.009567', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '1e10e7aa-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.230851807, 'message_signature': 'e2e43a21e36ab1063ddf8280b0a601985a5b2f21603fdc9764219295ce0045aa'}]}, 'timestamp': '2025-12-06 10:27:08.010141', '_unique_id': 'e047c119de0241449b84e113662bcb97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.011 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.012 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4838331d-19d6-4a72-b451-defe4480fb48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:27:08.012341', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '1e11510e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.207233986, 'message_signature': '03aee9e81ede44c5cde2ecbc3cce0e2a855e4f23b130d21eec8fb85bbe018fc4'}]}, 'timestamp': '2025-12-06 10:27:08.012842', '_unique_id': 'a88a13d5a99f47259ba531a1c67f7cdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.013 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.015 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.015 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.016 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5c576ae-7e0c-438c-88d5-e9c15c47c69e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:27:08.015497', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1e11d2a0-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.168172242, 'message_signature': '95f7487fa3ec6a22adec1dec68e55373c1c51ceeba2dea40e94faf9b0943d5ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:27:08.015497', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1e11e5ba-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.168172242, 'message_signature': 'd6b0043f3a4608db01c34669cb557b9c2b62293cf9d37ac5db0059bec9079fbe'}]}, 'timestamp': '2025-12-06 10:27:08.016590', '_unique_id': 'c71c50b0111d4c6fa0edb7cdfee6fb03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.018 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.019 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.019 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '174d54c8-aa41-4235-996d-e43f0fcb406f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:27:08.019012', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1e1255c2-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.234016774, 'message_signature': 'cafb781cff58f6b87a74d3042878a52a11747febdaea610d76d468382cddfa93'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:27:08.019012', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1e126652-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.234016774, 'message_signature': '27cb764f76dd424e7dd2f9c677d8d243597a3f59e8f2224cd494568f8aa6b5ff'}]}, 'timestamp': '2025-12-06 10:27:08.019985', '_unique_id': 'be896d7ab607426387285bf3348dd51b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.021 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.022 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.022 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.022 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.022 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a52b6b9-8247-4258-9166-0ab31f1721a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:27:08.022492', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1e12ddda-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.234016774, 'message_signature': 'a8500b8466108073ba409556e7f8237af0673a55f5f5eee4a3f02bf6ed30006d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:27:08.022492', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1e12ef8c-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.234016774, 'message_signature': '14d88c1a159f7c4c48155c35dfd522dabfcfe36d642c8ad71be9f06fc3c9da7b'}]}, 'timestamp': '2025-12-06 10:27:08.023388', '_unique_id': '641d44e08bdb43ff99fbe65592443127'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.024 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.025 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.025 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd18f8e21-bf3c-40b4-8507-ea52d6ec0c3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:27:08.025905', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '1e136534-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.207233986, 'message_signature': 'e1b6685b6d05a2123a39168918db1c57ca3868e0cd27d1fbfbf71ced32d2f66d'}]}, 'timestamp': '2025-12-06 10:27:08.026480', '_unique_id': 'b8d509a304aa4bd4a1822409835808db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.027 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.028 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.028 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.029 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6afd2bfc-d1c7-4ffc-9856-8d2fb7b6388f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:27:08.029094', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '1e13df50-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.207233986, 'message_signature': '95f3228ac3bb591ef7ab585010f118e5e1bd04f24c87ca92f217f558d0cd9926'}]}, 'timestamp': '2025-12-06 10:27:08.029581', '_unique_id': '081ae50691d746238765831a19f4a326'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.030 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.031 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.032 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3dfed8df-8184-4ead-8400-0ab18247545d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:27:08.031976', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '1e1450ca-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.207233986, 'message_signature': '1038b0afa147cc79e9d94f039d5b194cbcd2cbfa9a9d145cefa054fc225339e6'}]}, 'timestamp': '2025-12-06 10:27:08.032475', '_unique_id': 'e4bccc1d08d2481bbebcab8a6167b032'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.033 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.034 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.034 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.035 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '613736a5-c6a0-4686-8928-56320f4018d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:27:08.034844', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1e14c08c-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.168172242, 'message_signature': 'e82f08b18017d22e64c691f236e7510fc4a2c9c7cd4574f7c9fb31a51eff7534'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:27:08.034844', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1e14d338-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.168172242, 'message_signature': '544458b8de334c98488eccc64bdf7d12e259889f4e024c29f8bddc8c01bde7d4'}]}, 'timestamp': '2025-12-06 10:27:08.035834', '_unique_id': 'd82aad46034f4e1cbc3b19da414fb196'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.036 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.037 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.037 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a33b9273-b0d8-4ec6-8913-3f96948e6ace', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:27:08.037723', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '1e152e28-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.207233986, 'message_signature': 'bfe418d5f95a64b49f9dae0f9d00c21c34099270736acd538f30e0f296a06adb'}]}, 'timestamp': '2025-12-06 10:27:08.038040', '_unique_id': '43fbfe29dfe148e5a0c3d5737a7f2bb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.038 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.039 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.039 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.039 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.039 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b263b353-7ebe-4ac3-a368-052a2db4e559', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:27:08.039462', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1e1570cc-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.168172242, 'message_signature': '86bb5e8a630acff906c3657f45d24506a671746a9b34815fcdc4bb02e19dc8a7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:27:08.039462', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1e157c8e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.168172242, 'message_signature': '3b53eedb589c77e98199109f9c53607e4719293e04f852dcec34f1e7e8757c89'}]}, 'timestamp': '2025-12-06 10:27:08.040033', '_unique_id': '956eb24934c4432e9f74b8de9522a5dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.040 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.041 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.041 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2fdd5db0-e2e8-45c1-a99e-e531c210f691', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:27:08.041383', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '1e15bd20-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.207233986, 'message_signature': '396b374410854e4c42c9c40eb983ab908ad805a419c3f90d49bc3b500bb3e583'}]}, 'timestamp': '2025-12-06 10:27:08.041724', '_unique_id': '49a4404d795f4032812d4e782b9cebea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.042 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.043 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.043 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.043 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2c0d7a6-b838-42ca-a8ab-b991cca0f829', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:27:08.043188', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1e16029e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.168172242, 'message_signature': '6fcef92ff38e30a0cc29b0215d600838da06d8d1e4762f08eb64650565f54873'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:27:08.043188', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1e160cee-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.168172242, 'message_signature': '35b2b08fbd4c51694302e1d610706aca092a096bde5034efb8093cec5a97aa4f'}]}, 'timestamp': '2025-12-06 10:27:08.043737', '_unique_id': '2bf2305b8892477b9e07462126789069'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.044 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.045 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.045 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfbc9585-1232-4dd8-9e9e-fa6739d1917d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:27:08.045198', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1e16510e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.168172242, 'message_signature': '0acb28d51658a390ca2df6b30b302e3a41ab6c009fe358b097e566c3dbc87641'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:27:08.045198', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1e165b40-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13246.168172242, 'message_signature': '2962a1dd3f638e5fceb838a51689dbbcfac8a4cc91c56f64d24394f42f573c75'}]}, 'timestamp': '2025-12-06 10:27:08.045728', '_unique_id': 'd3d0110f5f91422787d5984cea26ca1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.046 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:27:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:27:08.047 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:27:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:08 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d70c0b2c-e84c-411c-8527-44acb71e029d", "snap_name": "1034db86-0931-4bff-b9cd-e904ae3ce178_8e2229e7-2c10-4e6d-9970-a62905b25ae2", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:08 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d70c0b2c-e84c-411c-8527-44acb71e029d", "snap_name": "1034db86-0931-4bff-b9cd-e904ae3ce178", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:09 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1be4d0fc-02f4-4c42-ae8f-f48f51b28e7a", "format": "json"}]: dispatch
Dec 06 10:27:09 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1be4d0fc-02f4-4c42-ae8f-f48f51b28e7a", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:09 np0005548789.localdomain ceph-mon[298582]: pgmap v647: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 168 KiB/s wr, 9 op/s
Dec 06 10:27:10 np0005548789.localdomain sudo[336721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:27:10 np0005548789.localdomain sudo[336721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:27:10 np0005548789.localdomain sudo[336721]: pam_unix(sudo:session): session closed for user root
Dec 06 10:27:10 np0005548789.localdomain sudo[336739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:27:10 np0005548789.localdomain sudo[336739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:27:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:10.891 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:10.893 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:10.893 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:27:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:10.893 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:10.926 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:10 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:10.928 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:10 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e276 e276: 6 total, 6 up, 6 in
Dec 06 10:27:11 np0005548789.localdomain sudo[336739]: pam_unix(sudo:session): session closed for user root
Dec 06 10:27:11 np0005548789.localdomain sudo[336789]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:27:11 np0005548789.localdomain sudo[336789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:27:11 np0005548789.localdomain sudo[336789]: pam_unix(sudo:session): session closed for user root
Dec 06 10:27:11 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:11 np0005548789.localdomain ceph-mon[298582]: osdmap e276: 6 total, 6 up, 6 in
Dec 06 10:27:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:27:11 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:27:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:27:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:27:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:27:11 np0005548789.localdomain ceph-mon[298582]: pgmap v649: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 239 B/s rd, 197 KiB/s wr, 11 op/s
Dec 06 10:27:11 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d70c0b2c-e84c-411c-8527-44acb71e029d", "format": "json"}]: dispatch
Dec 06 10:27:11 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d70c0b2c-e84c-411c-8527-44acb71e029d", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:12 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:27:12Z|00526|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 06 10:27:13 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:27:13 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a57baf93-1000-4372-9325-859e73a86488", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:13 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a57baf93-1000-4372-9325-859e73a86488", "format": "json"}]: dispatch
Dec 06 10:27:13 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:14 np0005548789.localdomain ceph-mon[298582]: pgmap v650: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 173 KiB/s wr, 10 op/s
Dec 06 10:27:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:27:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:14 np0005548789.localdomain sshd[336807]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:27:14 np0005548789.localdomain sshd[336807]: error: kex_exchange_identification: Connection closed by remote host
Dec 06 10:27:14 np0005548789.localdomain sshd[336807]: Connection closed by 117.50.226.213 port 39820
Dec 06 10:27:15 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:15 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:27:15 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:27:15 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:15.928 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:15 np0005548789.localdomain systemd[1]: tmp-crun.bsbKKc.mount: Deactivated successfully.
Dec 06 10:27:15 np0005548789.localdomain podman[336809]: 2025-12-06 10:27:15.994677981 +0000 UTC m=+0.140060523 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:27:16 np0005548789.localdomain podman[336809]: 2025-12-06 10:27:16.008207254 +0000 UTC m=+0.153589836 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:27:16 np0005548789.localdomain podman[336808]: 2025-12-06 10:27:15.960880837 +0000 UTC m=+0.106777074 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 10:27:16 np0005548789.localdomain podman[336808]: 2025-12-06 10:27:16.044426672 +0000 UTC m=+0.190322909 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:27:16 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:27:16 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:27:16 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d20a1bca-aac4-44af-8a03-caef89ee2c05", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:16 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d20a1bca-aac4-44af-8a03-caef89ee2c05", "format": "json"}]: dispatch
Dec 06 10:27:16 np0005548789.localdomain ceph-mon[298582]: pgmap v651: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 173 KiB/s wr, 10 op/s
Dec 06 10:27:16 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a57baf93-1000-4372-9325-859e73a86488", "format": "json"}]: dispatch
Dec 06 10:27:16 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a57baf93-1000-4372-9325-859e73a86488", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:27:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:27:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:27:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:27:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:27:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:27:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:27:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:27:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:27:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:27:17 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e277 e277: 6 total, 6 up, 6 in
Dec 06 10:27:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:18 np0005548789.localdomain ceph-mon[298582]: pgmap v652: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 173 KiB/s wr, 10 op/s
Dec 06 10:27:18 np0005548789.localdomain ceph-mon[298582]: osdmap e277: 6 total, 6 up, 6 in
Dec 06 10:27:18 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:27:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:27:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:27:18 np0005548789.localdomain sshd[336849]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:27:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:27:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "d20a1bca-aac4-44af-8a03-caef89ee2c05", "snap_name": "7884d7aa-ac96-42e0-bc99-f42bebf4c1fc", "format": "json"}]: dispatch
Dec 06 10:27:20 np0005548789.localdomain ceph-mon[298582]: pgmap v654: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 731 B/s rd, 197 KiB/s wr, 12 op/s
Dec 06 10:27:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:20.931 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:20.933 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:20.933 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:27:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:20.934 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:20.966 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:20.967 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:20 np0005548789.localdomain sshd[336849]: Connection reset by authenticating user root 45.140.17.124 port 51420 [preauth]
Dec 06 10:27:21 np0005548789.localdomain sshd[336851]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:27:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:27:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:27:22 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:27:22 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:27:22 np0005548789.localdomain ceph-mon[298582]: pgmap v655: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 165 KiB/s wr, 10 op/s
Dec 06 10:27:22 np0005548789.localdomain podman[336854]: 2025-12-06 10:27:22.922665938 +0000 UTC m=+0.078946765 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm)
Dec 06 10:27:22 np0005548789.localdomain systemd[1]: tmp-crun.WZ8lCD.mount: Deactivated successfully.
Dec 06 10:27:22 np0005548789.localdomain podman[336854]: 2025-12-06 10:27:22.967171958 +0000 UTC m=+0.123452825 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:27:22 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:27:23 np0005548789.localdomain podman[336853]: 2025-12-06 10:27:22.974521763 +0000 UTC m=+0.134313277 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vcs-type=git, config_id=edpm, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Dec 06 10:27:23 np0005548789.localdomain podman[336853]: 2025-12-06 10:27:23.054666913 +0000 UTC m=+0.214458417 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Dec 06 10:27:23 np0005548789.localdomain sshd[336851]: Connection reset by authenticating user root 45.140.17.124 port 51438 [preauth]
Dec 06 10:27:23 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:27:23 np0005548789.localdomain sshd[336890]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:27:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d20a1bca-aac4-44af-8a03-caef89ee2c05", "snap_name": "7884d7aa-ac96-42e0-bc99-f42bebf4c1fc_a26a2613-79e3-4fad-87ed-7fada9c58a20", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d20a1bca-aac4-44af-8a03-caef89ee2c05", "snap_name": "7884d7aa-ac96-42e0-bc99-f42bebf4c1fc", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:23 np0005548789.localdomain ceph-mon[298582]: pgmap v656: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 157 KiB/s wr, 10 op/s
Dec 06 10:27:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:27:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:27:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:27:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:27:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:27:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19275 "" "Go-http-client/1.1"
Dec 06 10:27:24 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:27:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:27:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:27:24 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:27:25 np0005548789.localdomain sshd[336890]: Connection reset by authenticating user root 45.140.17.124 port 42848 [preauth]
Dec 06 10:27:25 np0005548789.localdomain sshd[336892]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:27:25 np0005548789.localdomain ceph-mon[298582]: pgmap v657: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 157 KiB/s wr, 10 op/s
Dec 06 10:27:25 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d20a1bca-aac4-44af-8a03-caef89ee2c05", "format": "json"}]: dispatch
Dec 06 10:27:25 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d20a1bca-aac4-44af-8a03-caef89ee2c05", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:25 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e278 e278: 6 total, 6 up, 6 in
Dec 06 10:27:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:25.969 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:25.971 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:25.971 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:27:25 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:25.971 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:26.014 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:26.014 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:27:26 np0005548789.localdomain podman[336894]: 2025-12-06 10:27:26.923060494 +0000 UTC m=+0.080321466 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:27:26 np0005548789.localdomain podman[336894]: 2025-12-06 10:27:26.939142545 +0000 UTC m=+0.096403517 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Dec 06 10:27:26 np0005548789.localdomain ceph-mon[298582]: osdmap e278: 6 total, 6 up, 6 in
Dec 06 10:27:26 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:27:27 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:27 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "format": "json"}]: dispatch
Dec 06 10:27:27 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:27 np0005548789.localdomain ceph-mon[298582]: pgmap v659: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 536 B/s rd, 165 KiB/s wr, 10 op/s
Dec 06 10:27:27 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:27 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:27:27 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:27 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:28 np0005548789.localdomain sshd[336892]: Connection reset by authenticating user root 45.140.17.124 port 42860 [preauth]
Dec 06 10:27:28 np0005548789.localdomain sshd[336912]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:27:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:28 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:27:29 np0005548789.localdomain podman[336914]: 2025-12-06 10:27:29.920571993 +0000 UTC m=+0.081152182 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:27:29 np0005548789.localdomain podman[336914]: 2025-12-06 10:27:29.929079043 +0000 UTC m=+0.089659192 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:27:29 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:27:29 np0005548789.localdomain ceph-mon[298582]: pgmap v660: 177 pgs: 177 active+clean; 217 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 146 KiB/s wr, 9 op/s
Dec 06 10:27:30 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:27:30.519 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:27:30 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:30.519 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:30 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:27:30.521 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:27:31 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "39a0fb23-501d-479d-b543-1f708ea4574a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:31 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "39a0fb23-501d-479d-b543-1f708ea4574a", "format": "json"}]: dispatch
Dec 06 10:27:31 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:31.050 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:31.052 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:31 np0005548789.localdomain sshd[336912]: Connection reset by authenticating user root 45.140.17.124 port 42878 [preauth]
Dec 06 10:27:31 np0005548789.localdomain auditd[725]: Audit daemon rotating log files
Dec 06 10:27:32 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:27:32 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:27:32 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:27:32 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:27:32 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:27:32 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:27:32 np0005548789.localdomain ceph-mon[298582]: pgmap v661: 177 pgs: 177 active+clean; 217 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 146 KiB/s wr, 9 op/s
Dec 06 10:27:32 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 e279: 6 total, 6 up, 6 in
Dec 06 10:27:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:27:33 np0005548789.localdomain ceph-mon[298582]: osdmap e279: 6 total, 6 up, 6 in
Dec 06 10:27:33 np0005548789.localdomain podman[336938]: 2025-12-06 10:27:33.930580519 +0000 UTC m=+0.087834825 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:27:33 np0005548789.localdomain podman[336938]: 2025-12-06 10:27:33.966786596 +0000 UTC m=+0.124040852 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 10:27:33 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:27:34 np0005548789.localdomain ceph-mon[298582]: pgmap v663: 177 pgs: 177 active+clean; 218 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 191 KiB/s wr, 11 op/s
Dec 06 10:27:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:27:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/39a0fb23-501d-479d-b543-1f708ea4574a/3e5f0a88-786f-4b8b-a999-6c8148201aa5", "osd", "allow rw pool=manila_data namespace=fsvolumens_39a0fb23-501d-479d-b543-1f708ea4574a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/39a0fb23-501d-479d-b543-1f708ea4574a/3e5f0a88-786f-4b8b-a999-6c8148201aa5", "osd", "allow rw pool=manila_data namespace=fsvolumens_39a0fb23-501d-479d-b543-1f708ea4574a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/39a0fb23-501d-479d-b543-1f708ea4574a/3e5f0a88-786f-4b8b-a999-6c8148201aa5", "osd", "allow rw pool=manila_data namespace=fsvolumens_39a0fb23-501d-479d-b543-1f708ea4574a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:27:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:35 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "39a0fb23-501d-479d-b543-1f708ea4574a", "auth_id": "tempest-cephx-id-956934797", "tenant_id": "407388521bb04f21b3ced239438a361c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:35 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:27:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:36.053 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:36.056 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:36 np0005548789.localdomain ceph-mon[298582]: pgmap v664: 177 pgs: 177 active+clean; 218 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 434 B/s rd, 162 KiB/s wr, 10 op/s
Dec 06 10:27:37 np0005548789.localdomain ceph-mon[298582]: pgmap v665: 177 pgs: 177 active+clean; 218 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 153 KiB/s wr, 9 op/s
Dec 06 10:27:37 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "39a0fb23-501d-479d-b543-1f708ea4574a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:27:37 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:27:37 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:37 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:37 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:27:37 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "39a0fb23-501d-479d-b543-1f708ea4574a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:27:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:38 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:27:38.522 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:27:38 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "39a0fb23-501d-479d-b543-1f708ea4574a", "format": "json"}]: dispatch
Dec 06 10:27:38 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "39a0fb23-501d-479d-b543-1f708ea4574a", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:38 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:27:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:27:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:27:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:27:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:27:38 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:27:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:27:39 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1777930601' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:27:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:27:39 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1777930601' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:27:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1777930601' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:27:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1777930601' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:27:39 np0005548789.localdomain ceph-mon[298582]: pgmap v666: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 184 KiB/s wr, 10 op/s
Dec 06 10:27:40 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:40 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "format": "json"}]: dispatch
Dec 06 10:27:40 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.056 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.058 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.058 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.059 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.095 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.095 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.304 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.305 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.305 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.305 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.734 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.759 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.759 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.760 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.780 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.781 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.781 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.781 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:27:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:41.782 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:27:41 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:41 np0005548789.localdomain ceph-mon[298582]: pgmap v667: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 184 KiB/s wr, 10 op/s
Dec 06 10:27:42 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:27:42 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/585070944' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:42.224 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:27:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:42.300 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:27:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:42.301 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:27:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:42.498 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:27:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:42.499 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11112MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:27:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:42.500 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:27:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:42.500 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:27:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:42.592 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:27:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:42.592 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:27:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:42.593 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:27:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:42.643 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:27:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:27:43 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3710319401' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:43.121 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:27:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:43.127 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:27:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/585070944' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3710319401' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:43.153 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:27:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:43.155 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:27:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:43.156 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:27:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:44 np0005548789.localdomain ceph-mon[298582]: pgmap v668: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 485 B/s rd, 272 KiB/s wr, 16 op/s
Dec 06 10:27:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:27:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:45.152 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:45 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "auth_id": "tempest-cephx-id-956934797", "tenant_id": "407388521bb04f21b3ced239438a361c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/5588df5a-ffa6-47fc-b21c-3740e4fe0c02/c27ff085-a577-4abe-b26a-9dc51132e3c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/5588df5a-ffa6-47fc-b21c-3740e4fe0c02/c27ff085-a577-4abe-b26a-9dc51132e3c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/5588df5a-ffa6-47fc-b21c-3740e4fe0c02/c27ff085-a577-4abe-b26a-9dc51132e3c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:45 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:45 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:27:45 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:46.096 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:46.098 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:46.098 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:27:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:46.099 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:46.137 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:46.138 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:46.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:46 np0005548789.localdomain ceph-mon[298582]: pgmap v669: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 168 KiB/s wr, 9 op/s
Dec 06 10:27:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:27:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:27:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:27:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:27:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:27:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:27:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:27:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:27:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:27:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:27:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:27:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:27:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:27:46 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:27:46 np0005548789.localdomain podman[337008]: 2025-12-06 10:27:46.936427356 +0000 UTC m=+0.089497327 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:27:46 np0005548789.localdomain podman[337008]: 2025-12-06 10:27:46.944221824 +0000 UTC m=+0.097291795 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:27:46 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:27:47 np0005548789.localdomain systemd[1]: tmp-crun.lsqYPN.mount: Deactivated successfully.
Dec 06 10:27:47 np0005548789.localdomain podman[337009]: 2025-12-06 10:27:47.0484498 +0000 UTC m=+0.198185689 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:27:47 np0005548789.localdomain podman[337009]: 2025-12-06 10:27:47.061637943 +0000 UTC m=+0.211373802 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:27:47 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:27:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:27:47.344 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:27:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:27:47.345 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:27:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:27:47.346 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:27:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:27:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:47 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:27:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:48 np0005548789.localdomain ceph-mon[298582]: pgmap v670: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 168 KiB/s wr, 9 op/s
Dec 06 10:27:48 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:27:48 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:27:48 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "format": "json"}]: dispatch
Dec 06 10:27:48 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "5588df5a-ffa6-47fc-b21c-3740e4fe0c02", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:49.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:49.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:49.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:49.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:27:49 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:27:50 np0005548789.localdomain ceph-mon[298582]: pgmap v671: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 249 KiB/s wr, 14 op/s
Dec 06 10:27:50 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:27:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:51.188 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:51.189 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:51.190 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:51.191 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:51.191 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5052 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:27:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:51.191 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:51.192 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:51.194 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:51 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Dec 06 10:27:51 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:27:51 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "format": "json"}]: dispatch
Dec 06 10:27:51 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:27:51 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:51 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:27:51 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:27:52 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:52 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:27:52 np0005548789.localdomain ceph-mon[298582]: pgmap v672: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 166 KiB/s wr, 10 op/s
Dec 06 10:27:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:27:53 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:27:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:27:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:27:53 np0005548789.localdomain ceph-mon[298582]: pgmap v673: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 235 KiB/s wr, 14 op/s
Dec 06 10:27:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:27:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:27:53 np0005548789.localdomain systemd[1]: tmp-crun.0nafcR.mount: Deactivated successfully.
Dec 06 10:27:54 np0005548789.localdomain systemd[1]: tmp-crun.HBBVU2.mount: Deactivated successfully.
Dec 06 10:27:54 np0005548789.localdomain podman[337048]: 2025-12-06 10:27:54.022203466 +0000 UTC m=+0.171150073 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Dec 06 10:27:54 np0005548789.localdomain podman[337048]: 2025-12-06 10:27:54.033543422 +0000 UTC m=+0.182489999 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, vcs-type=git)
Dec 06 10:27:54 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:27:54 np0005548789.localdomain podman[337049]: 2025-12-06 10:27:53.985002429 +0000 UTC m=+0.131372677 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 10:27:54 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:27:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19277 "" "Go-http-client/1.1"
Dec 06 10:27:54 np0005548789.localdomain podman[337049]: 2025-12-06 10:27:54.11719889 +0000 UTC m=+0.263569218 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute)
Dec 06 10:27:54 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:27:54 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "auth_id": "tempest-cephx-id-956934797", "tenant_id": "407388521bb04f21b3ced239438a361c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:27:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/1591869a-dd1e-4430-a3b9-ab5fdaf392f8/647cfa54-e318-4c22-b8b7-7c0ead2d396a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/1591869a-dd1e-4430-a3b9-ab5fdaf392f8/647cfa54-e318-4c22-b8b7-7c0ead2d396a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/1591869a-dd1e-4430-a3b9-ab5fdaf392f8/647cfa54-e318-4c22-b8b7-7c0ead2d396a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:54 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:27:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:27:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:27:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:27:54 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3684322741' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:55 np0005548789.localdomain ceph-mon[298582]: pgmap v674: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 149 KiB/s wr, 9 op/s
Dec 06 10:27:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1957927647' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:56.195 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:56.197 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:27:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:56.198 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:27:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:56.198 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:56.231 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:27:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:27:56.232 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:27:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1864458173' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:27:57 np0005548789.localdomain systemd[1]: tmp-crun.RH67zk.mount: Deactivated successfully.
Dec 06 10:27:57 np0005548789.localdomain podman[337085]: 2025-12-06 10:27:57.925707284 +0000 UTC m=+0.086874157 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 06 10:27:57 np0005548789.localdomain podman[337085]: 2025-12-06 10:27:57.940407115 +0000 UTC m=+0.101574028 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:27:57 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2284770246' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: pgmap v675: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 149 KiB/s wr, 9 op/s
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "format": "json"}]: dispatch
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1591869a-dd1e-4430-a3b9-ab5fdaf392f8", "force": true, "format": "json"}]: dispatch
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:27:57 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:27:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:00 np0005548789.localdomain ceph-mon[298582]: pgmap v676: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 252 KiB/s wr, 15 op/s
Dec 06 10:28:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:28:00 np0005548789.localdomain podman[337105]: 2025-12-06 10:28:00.925582905 +0000 UTC m=+0.082068049 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:28:00 np0005548789.localdomain podman[337105]: 2025-12-06 10:28:00.936348275 +0000 UTC m=+0.092833399 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:28:00 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:28:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:01.233 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:01.234 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:01.234 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:28:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:01.235 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:01.271 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:01.272 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:01 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "tenant_id": "407388521bb04f21b3ced239438a361c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:28:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:02 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:28:02 np0005548789.localdomain ceph-mon[298582]: pgmap v677: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 171 KiB/s wr, 10 op/s
Dec 06 10:28:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:28:04 np0005548789.localdomain ceph-mon[298582]: pgmap v678: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 235 KiB/s wr, 15 op/s
Dec 06 10:28:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:28:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:28:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:28:04 np0005548789.localdomain systemd[1]: tmp-crun.z9fq4k.mount: Deactivated successfully.
Dec 06 10:28:04 np0005548789.localdomain podman[337128]: 2025-12-06 10:28:04.916148941 +0000 UTC m=+0.081854983 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, container_name=ovn_controller)
Dec 06 10:28:04 np0005548789.localdomain podman[337128]: 2025-12-06 10:28:04.958124599 +0000 UTC m=+0.123830691 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:28:04 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:28:05 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:05 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:05 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:28:05 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:28:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:06.272 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:06.274 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:06.274 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:28:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:06.275 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:06.318 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:06.319 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:06 np0005548789.localdomain sshd[337153]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:28:06 np0005548789.localdomain ceph-mon[298582]: pgmap v679: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 167 KiB/s wr, 10 op/s
Dec 06 10:28:07 np0005548789.localdomain ceph-mon[298582]: pgmap v680: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 167 KiB/s wr, 10 op/s
Dec 06 10:28:07 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "tenant_id": "407388521bb04f21b3ced239438a361c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:07 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:08 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:08 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:08 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:08 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:08 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:09 np0005548789.localdomain ceph-mon[298582]: pgmap v681: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 263 KiB/s wr, 17 op/s
Dec 06 10:28:10 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:10 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:10 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:10 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:10 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:28:10 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:11.320 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:11.322 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:11.323 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:28:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:11.323 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:11.487 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:11.488 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:11 np0005548789.localdomain sudo[337154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:28:11 np0005548789.localdomain sudo[337154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:28:11 np0005548789.localdomain sudo[337154]: pam_unix(sudo:session): session closed for user root
Dec 06 10:28:11 np0005548789.localdomain sudo[337172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:28:11 np0005548789.localdomain sudo[337172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:28:11 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:28:11 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:11 np0005548789.localdomain ceph-mon[298582]: pgmap v682: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 160 KiB/s wr, 11 op/s
Dec 06 10:28:12 np0005548789.localdomain sudo[337172]: pam_unix(sudo:session): session closed for user root
Dec 06 10:28:12 np0005548789.localdomain sudo[337221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:28:12 np0005548789.localdomain sudo[337221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:28:12 np0005548789.localdomain sudo[337221]: pam_unix(sudo:session): session closed for user root
Dec 06 10:28:13 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:28:13 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:28:13 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:28:13 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:28:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:14 np0005548789.localdomain sshd[337239]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:28:14 np0005548789.localdomain ceph-mon[298582]: pgmap v683: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 219 KiB/s wr, 14 op/s
Dec 06 10:28:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:15 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "tenant_id": "407388521bb04f21b3ced239438a361c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:15 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:28:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:15 np0005548789.localdomain sshd[337239]: Received disconnect from 118.219.234.233 port 43120:11: Bye Bye [preauth]
Dec 06 10:28:15 np0005548789.localdomain sshd[337239]: Disconnected from authenticating user root 118.219.234.233 port 43120 [preauth]
Dec 06 10:28:16 np0005548789.localdomain ceph-mon[298582]: pgmap v684: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 155 KiB/s wr, 9 op/s
Dec 06 10:28:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:16.489 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:16.491 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:16.492 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:28:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:16.492 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:16.526 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:16.527 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:28:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:28:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:28:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:28:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:28:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:28:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:28:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:28:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:28:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:28:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:28:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:28:16 np0005548789.localdomain sshd[337153]: error: kex_exchange_identification: read: Connection timed out
Dec 06 10:28:16 np0005548789.localdomain sshd[337153]: banner exchange: Connection from 123.160.164.187 port 53886: Connection timed out
Dec 06 10:28:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:28:17 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:28:17 np0005548789.localdomain podman[337241]: 2025-12-06 10:28:17.775254298 +0000 UTC m=+0.102042801 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:28:17 np0005548789.localdomain podman[337241]: 2025-12-06 10:28:17.809259711 +0000 UTC m=+0.136048214 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:28:17 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:28:17 np0005548789.localdomain podman[337242]: 2025-12-06 10:28:17.81899549 +0000 UTC m=+0.145566457 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:28:17 np0005548789.localdomain podman[337242]: 2025-12-06 10:28:17.903242945 +0000 UTC m=+0.229813892 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:28:17 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:28:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:28:18 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:18 np0005548789.localdomain ceph-mon[298582]: pgmap v685: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 155 KiB/s wr, 9 op/s
Dec 06 10:28:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:28:18 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:18 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:28:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:20 np0005548789.localdomain ceph-mon[298582]: pgmap v686: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 254 KiB/s wr, 16 op/s
Dec 06 10:28:21 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "tenant_id": "407388521bb04f21b3ced239438a361c", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-956934797", "caps": ["mds", "allow rw path=/volumes/_nogroup/b113f2fd-9e34-49b1-8d3c-8099c23d423a/eeefc94a-19bd-4e4c-8e5e-d71eabbfaa73", "osd", "allow rw pool=manila_data namespace=fsvolumens_b113f2fd-9e34-49b1-8d3c-8099c23d423a", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:28:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:21 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:21.529 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:21.531 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:21.531 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:28:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:21.531 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:21.564 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:21.565 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:22 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:22 np0005548789.localdomain ceph-mon[298582]: pgmap v687: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 158 KiB/s wr, 10 op/s
Dec 06 10:28:23 np0005548789.localdomain sshd[337282]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:28:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:28:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:28:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:28:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:28:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:28:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19279 "" "Go-http-client/1.1"
Dec 06 10:28:24 np0005548789.localdomain sshd[337282]: Received disconnect from 14.194.101.210 port 49728:11: Bye Bye [preauth]
Dec 06 10:28:24 np0005548789.localdomain sshd[337282]: Disconnected from authenticating user root 14.194.101.210 port 49728 [preauth]
Dec 06 10:28:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:28:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:28:24 np0005548789.localdomain podman[337285]: 2025-12-06 10:28:24.890025021 +0000 UTC m=+0.088179897 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:28:24 np0005548789.localdomain ceph-mon[298582]: pgmap v688: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 217 KiB/s wr, 14 op/s
Dec 06 10:28:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-956934797", "format": "json"} : dispatch
Dec 06 10:28:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"} : dispatch
Dec 06 10:28:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-956934797"}]': finished
Dec 06 10:28:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:28:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:28:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:28:24 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:28:24 np0005548789.localdomain podman[337284]: 2025-12-06 10:28:24.934914478 +0000 UTC m=+0.137195900 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:28:24 np0005548789.localdomain podman[337284]: 2025-12-06 10:28:24.951094044 +0000 UTC m=+0.153375476 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:28:24 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:28:25 np0005548789.localdomain podman[337285]: 2025-12-06 10:28:25.003590115 +0000 UTC m=+0.201744991 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:28:25 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:28:25 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:25 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "auth_id": "tempest-cephx-id-956934797", "format": "json"}]: dispatch
Dec 06 10:28:25 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:28:25 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:28:25 np0005548789.localdomain ceph-mon[298582]: pgmap v689: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 158 KiB/s wr, 11 op/s
Dec 06 10:28:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:26.566 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:26.596 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:26.596 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:28:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:26.596 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:26.598 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:26.598 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:28 np0005548789.localdomain ceph-mon[298582]: pgmap v690: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 158 KiB/s wr, 11 op/s
Dec 06 10:28:28 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:28:28 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:28:28 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:28 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:28 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:28:28 np0005548789.localdomain podman[337324]: 2025-12-06 10:28:28.928695714 +0000 UTC m=+0.085209565 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:28:28 np0005548789.localdomain podman[337324]: 2025-12-06 10:28:28.939293719 +0000 UTC m=+0.095807580 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:28:28 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:28:29 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "format": "json"}]: dispatch
Dec 06 10:28:29 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b113f2fd-9e34-49b1-8d3c-8099c23d423a", "force": true, "format": "json"}]: dispatch
Dec 06 10:28:30 np0005548789.localdomain ceph-mon[298582]: pgmap v691: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 240 KiB/s wr, 16 op/s
Dec 06 10:28:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:28:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:28:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:28:31 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:28:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:31.599 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:31.600 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:31.600 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:28:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:31.600 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:31.627 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:31.628 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:28:31 np0005548789.localdomain podman[337344]: 2025-12-06 10:28:31.931113623 +0000 UTC m=+0.082159022 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:28:31 np0005548789.localdomain podman[337344]: 2025-12-06 10:28:31.944154933 +0000 UTC m=+0.095200322 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:28:31 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:28:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:32.510 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:32 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:28:32.510 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:28:32 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:28:32.512 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:28:32 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:28:32 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:28:32 np0005548789.localdomain ceph-mon[298582]: pgmap v692: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 141 KiB/s wr, 9 op/s
Dec 06 10:28:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:34 np0005548789.localdomain ceph-mon[298582]: pgmap v693: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 168 KiB/s wr, 12 op/s
Dec 06 10:28:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:28:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:34 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:35 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:28:35.515 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:28:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:28:35 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:35 np0005548789.localdomain ceph-mon[298582]: pgmap v694: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 109 KiB/s wr, 7 op/s
Dec 06 10:28:35 np0005548789.localdomain podman[337367]: 2025-12-06 10:28:35.927058536 +0000 UTC m=+0.090697205 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 06 10:28:35 np0005548789.localdomain podman[337367]: 2025-12-06 10:28:35.996337101 +0000 UTC m=+0.159975790 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, container_name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 10:28:36 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:28:36 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:36.666 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:38.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:38.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:28:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:38.208 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:28:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:38 np0005548789.localdomain ceph-mon[298582]: pgmap v695: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 109 KiB/s wr, 7 op/s
Dec 06 10:28:38 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:28:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:28:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:38 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:28:38 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:28:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/12719951' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:28:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/12719951' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:28:40 np0005548789.localdomain ceph-mon[298582]: pgmap v696: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 172 KiB/s wr, 11 op/s
Dec 06 10:28:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:28:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:41 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:41 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:28:41.518 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:28:41Z, description=, device_id=2b4c1404-e334-461c-914c-573c510f7280, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9e8250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9e86d0>], id=1f3176da-af83-4428-90ea-651fd6a69b2f, ip_allocation=immediate, mac_address=fa:16:3e:ad:3a:34, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3885, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:28:41Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:28:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:41.668 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:41.670 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:41.671 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:28:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:41.671 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:41.696 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:41 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:41.697 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:42.209 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:42.234 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:28:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:42.235 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:28:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:42.235 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:28:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:42.235 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:28:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:42.236 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:28:42 np0005548789.localdomain podman[337412]: 2025-12-06 10:28:42.414257493 +0000 UTC m=+0.062791317 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:28:42 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:28:42 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:28:42 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:28:42 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:28:42 np0005548789.localdomain ceph-mon[298582]: pgmap v697: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 90 KiB/s wr, 6 op/s
Dec 06 10:28:42 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:28:42 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1883521674' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:28:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:42.716 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:28:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:42.789 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:28:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:42.790 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:28:42 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:28:42.801 263652 INFO neutron.agent.dhcp.agent [None req-05cfcc95-78bf-45e4-872f-210f4aae8746 - - - - - -] DHCP configuration for ports {'1f3176da-af83-4428-90ea-651fd6a69b2f'} is completed
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.011 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.013 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11098MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.013 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.014 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.091 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.315 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.316 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.316 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.380 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing inventories for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:28:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.472 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Updating ProviderTree inventory for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.472 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Updating inventory in ProviderTree for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.486 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing aggregate associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.506 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Refreshing trait associations for resource provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad, traits: HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SHA,HW_CPU_X86_BMI2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_AVX,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AMD_SVM,HW_CPU_X86_FMA3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NODE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:28:43 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1883521674' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.539 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:28:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:28:43 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3637666271' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.972 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:28:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:43.978 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:28:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:44.012 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:28:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:44.014 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:28:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:44.015 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:28:44 np0005548789.localdomain ceph-mon[298582]: pgmap v698: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 123 KiB/s wr, 8 op/s
Dec 06 10:28:44 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3637666271' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:28:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Dec 06 10:28:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Dec 06 10:28:44 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Dec 06 10:28:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:44.983 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:44.984 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:44.984 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:28:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:44.984 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:28:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:45.081 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:28:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:45.082 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:28:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:45.082 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:28:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:45.083 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:28:45 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:28:45 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice", "format": "json"}]: dispatch
Dec 06 10:28:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:45.620 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:28:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:45.658 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:28:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:45.658 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:28:46 np0005548789.localdomain ceph-mon[298582]: pgmap v699: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 96 KiB/s wr, 5 op/s
Dec 06 10:28:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:46.571 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:28:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:28:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:28:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:28:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:28:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:28:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:28:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:28:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:28:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:28:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:28:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:28:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:46.700 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:46.702 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:47.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:28:47.345 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:28:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:28:47.346 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:28:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:28:47.347 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:28:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:48 np0005548789.localdomain ceph-mon[298582]: pgmap v700: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 96 KiB/s wr, 5 op/s
Dec 06 10:28:48 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:28:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:48 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:28:48 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:28:48 np0005548789.localdomain podman[337476]: 2025-12-06 10:28:48.935207868 +0000 UTC m=+0.095411518 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:28:48 np0005548789.localdomain podman[337476]: 2025-12-06 10:28:48.970339746 +0000 UTC m=+0.130543336 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:28:48 np0005548789.localdomain systemd[1]: tmp-crun.gmgSA1.mount: Deactivated successfully.
Dec 06 10:28:48 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:28:49 np0005548789.localdomain podman[337477]: 2025-12-06 10:28:49.002005587 +0000 UTC m=+0.158958438 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:28:49 np0005548789.localdomain podman[337477]: 2025-12-06 10:28:49.015344437 +0000 UTC m=+0.172297328 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:28:49 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:28:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:49.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:49.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.602485) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016929602559, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2758, "num_deletes": 254, "total_data_size": 3125500, "memory_usage": 3182664, "flush_reason": "Manual Compaction"}
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016929616524, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 2002207, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36472, "largest_seqno": 39224, "table_properties": {"data_size": 1992001, "index_size": 6139, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 26470, "raw_average_key_size": 22, "raw_value_size": 1969480, "raw_average_value_size": 1637, "num_data_blocks": 266, "num_entries": 1203, "num_filter_entries": 1203, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016805, "oldest_key_time": 1765016805, "file_creation_time": 1765016929, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 14151 microseconds, and 6437 cpu microseconds.
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.616643) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 2002207 bytes OK
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.616701) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.618955) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.618977) EVENT_LOG_v1 {"time_micros": 1765016929618970, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.619004) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 3112336, prev total WAL file size 3112336, number of live WAL files 2.
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.620175) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133303532' seq:72057594037927935, type:22 .. '7061786F73003133333034' seq:0, type:0; will stop at (end)
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(1955KB)], [63(19MB)]
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016929620239, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 22514863, "oldest_snapshot_seqno": -1}
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 14612 keys, 20699727 bytes, temperature: kUnknown
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016929723856, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 20699727, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20614414, "index_size": 47708, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36549, "raw_key_size": 391733, "raw_average_key_size": 26, "raw_value_size": 20364613, "raw_average_value_size": 1393, "num_data_blocks": 1774, "num_entries": 14612, "num_filter_entries": 14612, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765016929, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.724170) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 20699727 bytes
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.726043) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 217.1 rd, 199.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 19.6 +0.0 blob) out(19.7 +0.0 blob), read-write-amplify(21.6) write-amplify(10.3) OK, records in: 15143, records dropped: 531 output_compression: NoCompression
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.726072) EVENT_LOG_v1 {"time_micros": 1765016929726058, "job": 38, "event": "compaction_finished", "compaction_time_micros": 103712, "compaction_time_cpu_micros": 51170, "output_level": 6, "num_output_files": 1, "total_output_size": 20699727, "num_input_records": 15143, "num_output_records": 14612, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016929726463, "job": 38, "event": "table_file_deletion", "file_number": 65}
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016929729063, "job": 38, "event": "table_file_deletion", "file_number": 63}
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.620044) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.729192) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.729201) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.729204) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.729207) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:28:49 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:28:49.729210) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:28:50 np0005548789.localdomain ceph-mon[298582]: pgmap v701: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 149 KiB/s wr, 9 op/s
Dec 06 10:28:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:51.197 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:51.197 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:51.198 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:28:51 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:28:51 np0005548789.localdomain podman[337535]: 2025-12-06 10:28:51.563043824 +0000 UTC m=+0.066206643 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 06 10:28:51 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:28:51 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:28:51 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:51 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:51 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:51 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:51 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:28:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:51.718 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:51.720 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:28:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:51.720 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5019 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:28:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:51.721 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:51.721 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:28:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:51.725 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:51 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:28:51Z|00527|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:28:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:51.818 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:52.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:52.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:28:52 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:52 np0005548789.localdomain ceph-mon[298582]: pgmap v702: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 86 KiB/s wr, 5 op/s
Dec 06 10:28:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:53.208 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:53.208 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:53 np0005548789.localdomain ceph-mon[298582]: pgmap v703: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 109 KiB/s wr, 7 op/s
Dec 06 10:28:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:28:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:28:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:28:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:28:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:28:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19280 "" "Go-http-client/1.1"
Dec 06 10:28:54 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:28:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:28:54 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:28:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:55.521 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:28:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:28:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:28:55 np0005548789.localdomain podman[337556]: 2025-12-06 10:28:55.913856365 +0000 UTC m=+0.072346880 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350)
Dec 06 10:28:55 np0005548789.localdomain podman[337556]: 2025-12-06 10:28:55.921424608 +0000 UTC m=+0.079915203 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350)
Dec 06 10:28:55 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:28:55 np0005548789.localdomain ceph-mon[298582]: pgmap v704: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 76 KiB/s wr, 5 op/s
Dec 06 10:28:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1248594145' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:28:55 np0005548789.localdomain podman[337557]: 2025-12-06 10:28:55.982953396 +0000 UTC m=+0.138387278 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true)
Dec 06 10:28:55 np0005548789.localdomain podman[337557]: 2025-12-06 10:28:55.992330153 +0000 UTC m=+0.147764055 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 06 10:28:56 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:28:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:28:56.767 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:28:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/4015030383' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:28:57 np0005548789.localdomain ceph-mon[298582]: pgmap v705: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 76 KiB/s wr, 5 op/s
Dec 06 10:28:57 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 06 10:28:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 06 10:28:57 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 06 10:28:57 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice_bob", "format": "json"}]: dispatch
Dec 06 10:28:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:28:59 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1148555436' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:28:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:28:59 np0005548789.localdomain podman[337596]: 2025-12-06 10:28:59.910381526 +0000 UTC m=+0.081390048 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Dec 06 10:28:59 np0005548789.localdomain podman[337596]: 2025-12-06 10:28:59.925167599 +0000 UTC m=+0.096176121 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3)
Dec 06 10:28:59 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:29:00 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3936924066' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:29:00 np0005548789.localdomain ceph-mon[298582]: pgmap v706: 177 pgs: 177 active+clean; 231 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 128 KiB/s wr, 8 op/s
Dec 06 10:29:01 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:29:01 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:29:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:01.232 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:01.802 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:01.804 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:01.804 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:29:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:01.805 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:01.806 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:01.807 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:02 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:29:02 np0005548789.localdomain ceph-mon[298582]: pgmap v707: 177 pgs: 177 active+clean; 231 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 76 KiB/s wr, 5 op/s
Dec 06 10:29:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:29:02 np0005548789.localdomain podman[337615]: 2025-12-06 10:29:02.927896117 +0000 UTC m=+0.089487127 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:29:02 np0005548789.localdomain podman[337615]: 2025-12-06 10:29:02.935291014 +0000 UTC m=+0.096882044 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:29:02 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:29:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:03 np0005548789.localdomain ceph-mon[298582]: pgmap v708: 177 pgs: 177 active+clean; 231 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 109 KiB/s wr, 7 op/s
Dec 06 10:29:04 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:29:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:29:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:29:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:29:04 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:29:04 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:29:05 np0005548789.localdomain ceph-mon[298582]: pgmap v709: 177 pgs: 177 active+clean; 231 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 86 KiB/s wr, 5 op/s
Dec 06 10:29:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:06.810 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:06.811 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:06.812 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:29:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:06.812 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:06.846 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:06 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:06.846 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:29:06 np0005548789.localdomain podman[337638]: 2025-12-06 10:29:06.946554985 +0000 UTC m=+0.081101899 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_managed=true)
Dec 06 10:29:06 np0005548789.localdomain podman[337638]: 2025-12-06 10:29:06.985430428 +0000 UTC m=+0.119977352 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 06 10:29:07 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.917 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.918 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.941 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.942 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a7a5fcb-b411-45d4-91b4-49bfbf5faf8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:29:07.918623', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '658d1234-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.167983894, 'message_signature': 'b334fe7de1022e4eb9f546f48486192f7d59a8adbb836d7513ab7764a9d0c944'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:29:07.918623', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '658d26ac-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.167983894, 'message_signature': '8054b2c4acd4a5ac1aaccab06758509bbc2044e9de642ac145d5da0cfd77951a'}]}, 'timestamp': '2025-12-06 10:29:07.942616', '_unique_id': '5ac0da0aee134467b98abc350513b062'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.944 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.945 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.945 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.946 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.946 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '087b0cf6-5c20-4819-9fb0-9883416a8388', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:29:07.946203', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '658dc90e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.167983894, 'message_signature': '66b114c322cee4a6ce4b8f805b44a43220ee80c5ca6711039cd3b51c61ed1728'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:29:07.946203', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '658dde4e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.167983894, 'message_signature': '8d22cfd7127d99c29258b77bbc712cccd52759b75af96ecffe5566c69204b449'}]}, 'timestamp': '2025-12-06 10:29:07.947355', '_unique_id': '27726bba2cd24315826d2d17ac1e7198'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.948 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.950 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.950 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.959 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.960 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c9d1ba7-428e-46b8-bbb6-2f1cd9739551', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:29:07.950453', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '658fd230-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.199861453, 'message_signature': '8a0ece50415883060a3668f616a5b25baf7fd3827fc5234fb38ab0a2db405e26'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:29:07.950453', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '658fe022-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.199861453, 'message_signature': 'e7cb433905f8ee3562d53273784a81a66649af638e02768e92849f55b0b6be64'}]}, 'timestamp': '2025-12-06 10:29:07.960421', '_unique_id': '0c9598aa134e4fed932ac7f050e629b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.961 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.965 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2806ffc-5a9d-4bc9-8383-8a8194908ffa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:29:07.962138', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '6590bb8c-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.21149235, 'message_signature': '98e80dc50c5bd045eedd23b5881810a0e4d90a8845de5a94ea54b5b169bae601'}]}, 'timestamp': '2025-12-06 10:29:07.966043', '_unique_id': '12e3a7ad5b6a4544a59eca66393cd688'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.966 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.967 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.983 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e80c309-c564-496a-ab98-fe5f475e8c68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:29:07.967512', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '65937bd8-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.232812474, 'message_signature': 'cec775fc671789b15da071168dd4d6ceb4816d07ff3954fc2fed55d1d0b0397a'}]}, 'timestamp': '2025-12-06 10:29:07.984096', '_unique_id': '4a0837264e53450eb9b21a0a1f55f0b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.985 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.986 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.986 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe9d9cc7-e1ea-4c77-a159-c473e10dab7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:29:07.986279', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '6593e1fe-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.21149235, 'message_signature': 'b3442ed5174017c9b8ebea29b4c36053dfaa3b82bc8b5efe1849259767ed642c'}]}, 'timestamp': '2025-12-06 10:29:07.986700', '_unique_id': '32a447c27fb54a46b5d9ee0f1594638e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.987 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.988 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.988 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fafa9e4f-b83e-435c-ac31-32b42aafc8b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:29:07.988605', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '65943dde-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.21149235, 'message_signature': 'acef3873af564a433ba5fe86097e9e2baf8e038fe05d2a0097f1398b9b4346ff'}]}, 'timestamp': '2025-12-06 10:29:07.989048', '_unique_id': 'b0e83e76bac74a1f9d1f0293b4b0d4e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.989 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.990 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.990 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ca4637d-7b8c-41a9-8b7c-55dabef42938', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:29:07.990930', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '65949784-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.21149235, 'message_signature': '1124270376220935bd081fbd0269ce8c13c364684b07a970bd1173ac82595625'}]}, 'timestamp': '2025-12-06 10:29:07.991343', '_unique_id': '21d331b549864e0ea874c87cddf4931e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.993 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.993 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.993 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ab05cbc-3e97-41e3-b962-7e4e19e75988', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:29:07.993259', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6594f242-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.199861453, 'message_signature': 'a1391f20f6dd30eef93b013558fefe94a529375b769aaad51e7e394af0662148'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:29:07.993259', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '65950160-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.199861453, 'message_signature': '3f5681ae9bae10d2d6c3128d59c94fb073a6c271a690690c2ea31ddc777fcedb'}]}, 'timestamp': '2025-12-06 10:29:07.994030', '_unique_id': '56887e24f51d45039b3b585da41cea85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.994 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.995 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.995 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e11e87d9-c22f-4468-a32f-ab6627414c40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:29:07.995927', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '65955ae8-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.21149235, 'message_signature': 'f9062f3eaaee030ce961c231fee3f944c820afaa1679842491b76cef16ddebd7'}]}, 'timestamp': '2025-12-06 10:29:07.996360', '_unique_id': 'cde9772fcc874d0ba9781a44e1811685'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.998 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.998 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.998 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '342e2349-a7c7-4e4f-a23f-dd9465c07a53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:29:07.998252', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6595b574-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.167983894, 'message_signature': 'c5a940fd7ea5d91f36d271b082fc2943eed641702bf613ed1e6644db5ee4f959'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:29:07.998252', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6595c4a6-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.167983894, 'message_signature': '4c5f8e85a702e42f6ed775a9503ef79fcce057c836cb1e464a48887b3f1326ed'}]}, 'timestamp': '2025-12-06 10:29:07.999021', '_unique_id': '633b4fc87e7f417d956a9bfd07569ad6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:07.999 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.000 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.000 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfcdc643-9e7c-4d98-b125-ec71ea78ae8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:29:08.000959', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '65961f78-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.21149235, 'message_signature': '583f53d6d6aa5290571589b2fb77340ac77799bbd7205918c0762bd2d0038d50'}]}, 'timestamp': '2025-12-06 10:29:08.001377', '_unique_id': 'c6284f5da7b04a59bd973e30ea195c60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.002 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.003 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.003 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 20700000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '071912ff-5efe-4f6d-bdd0-5cc8db7b8d6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20700000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:29:08.003472', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '65968116-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.232812474, 'message_signature': 'ae697f0a731b0a86d8f372b656c6752917cb1cbb04a4315c859c0036e2dce979'}]}, 'timestamp': '2025-12-06 10:29:08.003880', '_unique_id': '6b73474de01a470baa7b0a4805fd96f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.004 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.005 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.005 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.006 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1529245b-ecb1-444d-bd7d-dfec61174941', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:29:08.005750', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6596db84-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.167983894, 'message_signature': '3ce05acad23084007ffb304b0a24f614a07e574989b8809ef247121d980eaea3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:29:08.005750', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6596e9d0-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.167983894, 'message_signature': 'ebaf8c7494a2bb140ddf9b84ccacafca626c74ca08b65fe09a54575f974ffc7c'}]}, 'timestamp': '2025-12-06 10:29:08.006535', '_unique_id': '10e1eee3fbc4467da6e07442933ace69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.007 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.008 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.008 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.008 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a44ab393-6220-49dc-9521-07cf1858d7a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:29:08.008427', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6597434e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.167983894, 'message_signature': 'a5353723a3c168279a561d127c8b5211f2ed6cd76433b4e9c7efe0c10e8924fa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:29:08.008427', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6597538e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.167983894, 'message_signature': '02ef1f5571d70bd64209dc23f42799b39ecf1acd3016221a954607e6a8b0c252'}]}, 'timestamp': '2025-12-06 10:29:08.009237', '_unique_id': '69f9a811c24c4dde810c03aeea1c08f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.010 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.011 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.011 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aae60064-0ad2-4dd9-b4b1-6b7e833b5434', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:29:08.011146', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '6597acf8-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.21149235, 'message_signature': '181771abf698e61d48f498c96bcb77f38135275e8d41080c729deb280b7a8152'}]}, 'timestamp': '2025-12-06 10:29:08.011551', '_unique_id': '5040e0d1932c42cfa5381363b193a7db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.012 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.013 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.013 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.013 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d4025d0-9444-4e1b-8dd1-b89e497aaded', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:29:08.013441', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6598068a-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.167983894, 'message_signature': '62c7cc59650d59fd6b85e20292e956dc3725d3beb09e27535f76eeb2c2a3dd21'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:29:08.013441', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '65981670-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.167983894, 'message_signature': 'b229d03425bc8978e52237887110f08dbf451f03932613888e3ffdc1ba538d0b'}]}, 'timestamp': '2025-12-06 10:29:08.014225', '_unique_id': '1156e8ea2edc44dfbc6275c27c0118c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.015 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.016 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.016 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95be7a3a-c2fd-4342-8663-96a27bb11a53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:29:08.016241', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '659874c6-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.21149235, 'message_signature': '08eccc366aae6257342e55678004b87bc311cbcc1ff80757976c97935e7c10b3'}]}, 'timestamp': '2025-12-06 10:29:08.016662', '_unique_id': '27fe8cf776fd470ba52f0b35fd163364'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.017 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.018 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.018 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.019 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc029b07-0357-4fb1-b138-84358ab0e5b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:29:08.018685', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6598d51a-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.199861453, 'message_signature': '7fc6a8f9b9e2974315bdd546fd107859d88d168f5a4882893407cb1f823da9c4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:29:08.018685', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6598e456-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.199861453, 'message_signature': '28707555586c8f3d0db31eaac4709a1aca714b5d9179bdf8bd62d1f42ce3be0f'}]}, 'timestamp': '2025-12-06 10:29:08.019506', '_unique_id': '7afdf52faeff47d6b678ef994473f2f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.020 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.021 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.021 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.021 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52d1f3cb-3ac5-4917-b260-3718a4fde6ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:29:08.021669', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '659949b4-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.21149235, 'message_signature': '87c5077fec81bff787167f18759787f7b8d74b4551106c20679f2eaa7934b802'}]}, 'timestamp': '2025-12-06 10:29:08.022128', '_unique_id': '8ebcad51a2c94dadbac5a61604ea2aec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.022 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.023 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.024 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb80391e-fa4a-4edf-bdbc-2395414eaf5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:29:08.024006', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': '6599a40e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13366.21149235, 'message_signature': 'cdf1bd4937c990e7db1988fd7f31e44c0cd4d3ee9ce1f778ae4c5deb47cf72c2'}]}, 'timestamp': '2025-12-06 10:29:08.024458', '_unique_id': '23fd8847fe9e4c838f970894cdc9b497'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:29:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:29:08.025 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:29:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:08 np0005548789.localdomain ceph-mon[298582]: pgmap v710: 177 pgs: 177 active+clean; 231 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 86 KiB/s wr, 5 op/s
Dec 06 10:29:08 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "r", "format": "json"}]: dispatch
Dec 06 10:29:08 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:29:08 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:08 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:08 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow r pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:29:10 np0005548789.localdomain ceph-mon[298582]: pgmap v711: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 138 KiB/s wr, 9 op/s
Dec 06 10:29:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:11.060 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:11.149 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Triggering sync for uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 06 10:29:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:11.150 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:29:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:11.150 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:29:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:11.174 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:29:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 06 10:29:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:29:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 06 10:29:11 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 06 10:29:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:11.847 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:11.850 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:11.850 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:29:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:11.850 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:11.895 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:11 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:11.896 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:12 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:29:12 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "alice bob", "format": "json"}]: dispatch
Dec 06 10:29:12 np0005548789.localdomain ceph-mon[298582]: pgmap v712: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 86 KiB/s wr, 5 op/s
Dec 06 10:29:12 np0005548789.localdomain sudo[337662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:29:12 np0005548789.localdomain sudo[337662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:29:12 np0005548789.localdomain sudo[337662]: pam_unix(sudo:session): session closed for user root
Dec 06 10:29:12 np0005548789.localdomain sudo[337680]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:29:12 np0005548789.localdomain sudo[337680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:29:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:13 np0005548789.localdomain sudo[337680]: pam_unix(sudo:session): session closed for user root
Dec 06 10:29:13 np0005548789.localdomain ceph-mon[298582]: pgmap v713: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 109 KiB/s wr, 7 op/s
Dec 06 10:29:13 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:29:13 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:29:13 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:29:13 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:29:13 np0005548789.localdomain sudo[337729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:29:13 np0005548789.localdomain sudo[337729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:29:13 np0005548789.localdomain sudo[337729]: pam_unix(sudo:session): session closed for user root
Dec 06 10:29:14 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:29:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 06 10:29:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"} : dispatch
Dec 06 10:29:14 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723", "mon", "allow r"], "format": "json"}]': finished
Dec 06 10:29:15 np0005548789.localdomain ceph-mon[298582]: pgmap v714: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 76 KiB/s wr, 5 op/s
Dec 06 10:29:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:29:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:29:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:29:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:29:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:29:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:29:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:29:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:29:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:29:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:29:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:29:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:29:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:16.897 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:16.899 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:16.899 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:29:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:16.899 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:16.938 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:16 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:16.938 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:29:18 np0005548789.localdomain ceph-mon[298582]: pgmap v715: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 76 KiB/s wr, 5 op/s
Dec 06 10:29:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ac611655-851d-48c2-9d00-93668f6ff5e1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:29:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ac611655-851d-48c2-9d00-93668f6ff5e1", "format": "json"}]: dispatch
Dec 06 10:29:19 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:29:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:29:19 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:29:19 np0005548789.localdomain podman[337747]: 2025-12-06 10:29:19.950165594 +0000 UTC m=+0.098336638 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent)
Dec 06 10:29:19 np0005548789.localdomain podman[337747]: 2025-12-06 10:29:19.961159231 +0000 UTC m=+0.109330305 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 06 10:29:19 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:29:20 np0005548789.localdomain podman[337748]: 2025-12-06 10:29:20.056718723 +0000 UTC m=+0.202281397 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:29:20 np0005548789.localdomain podman[337748]: 2025-12-06 10:29:20.065461071 +0000 UTC m=+0.211023765 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:29:20 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:29:20 np0005548789.localdomain ceph-mon[298582]: pgmap v716: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 108 KiB/s wr, 7 op/s
Dec 06 10:29:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:21.939 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:21 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:21.941 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:22 np0005548789.localdomain ceph-mon[298582]: pgmap v717: 177 pgs: 177 active+clean; 232 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 55 KiB/s wr, 3 op/s
Dec 06 10:29:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 06 10:29:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b,allow rw path=/volumes/_nogroup/ac611655-851d-48c2-9d00-93668f6ff5e1/35c0f9c9-609d-4a42-9e75-505856ad36fb", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723,allow rw pool=manila_data namespace=fsvolumens_ac611655-851d-48c2-9d00-93668f6ff5e1"]} : dispatch
Dec 06 10:29:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b,allow rw path=/volumes/_nogroup/ac611655-851d-48c2-9d00-93668f6ff5e1/35c0f9c9-609d-4a42-9e75-505856ad36fb", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723,allow rw pool=manila_data namespace=fsvolumens_ac611655-851d-48c2-9d00-93668f6ff5e1"]} : dispatch
Dec 06 10:29:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b,allow rw path=/volumes/_nogroup/ac611655-851d-48c2-9d00-93668f6ff5e1/35c0f9c9-609d-4a42-9e75-505856ad36fb", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723,allow rw pool=manila_data namespace=fsvolumens_ac611655-851d-48c2-9d00-93668f6ff5e1"]}]': finished
Dec 06 10:29:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 06 10:29:22 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:29:22Z|00528|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Dec 06 10:29:22 np0005548789.localdomain sshd[337789]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:29:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:23 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ac611655-851d-48c2-9d00-93668f6ff5e1", "auth_id": "bob", "tenant_id": "8223febae67d4b58a139c7a23382ebf9", "access_level": "rw", "format": "json"}]: dispatch
Dec 06 10:29:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:29:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:29:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:29:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:29:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:29:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19281 "" "Go-http-client/1.1"
Dec 06 10:29:24 np0005548789.localdomain sshd[337789]: Received disconnect from 101.47.49.180 port 50836:11: Bye Bye [preauth]
Dec 06 10:29:24 np0005548789.localdomain sshd[337789]: Disconnected from authenticating user root 101.47.49.180 port 50836 [preauth]
Dec 06 10:29:24 np0005548789.localdomain ceph-mon[298582]: pgmap v718: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 95 KiB/s wr, 5 op/s
Dec 06 10:29:26 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ac611655-851d-48c2-9d00-93668f6ff5e1", "auth_id": "bob", "format": "json"}]: dispatch
Dec 06 10:29:26 np0005548789.localdomain ceph-mon[298582]: pgmap v719: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 71 KiB/s wr, 4 op/s
Dec 06 10:29:26 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 06 10:29:26 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723"]} : dispatch
Dec 06 10:29:26 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723"]} : dispatch
Dec 06 10:29:26 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/6fddad9a-edda-44e9-b738-5688693ea723/6646f798-4ba5-4e2d-868f-aafbfee9e73b", "osd", "allow rw pool=manila_data namespace=fsvolumens_6fddad9a-edda-44e9-b738-5688693ea723"]}]': finished
Dec 06 10:29:26 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ac611655-851d-48c2-9d00-93668f6ff5e1", "auth_id": "bob", "format": "json"}]: dispatch
Dec 06 10:29:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:29:26 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:29:26 np0005548789.localdomain podman[337791]: 2025-12-06 10:29:26.931847004 +0000 UTC m=+0.080933894 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, managed_by=edpm_ansible, distribution-scope=public)
Dec 06 10:29:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:26.942 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:26.944 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:26.944 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:29:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:26.945 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:26.978 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:26 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:26.979 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:27 np0005548789.localdomain podman[337792]: 2025-12-06 10:29:27.018029538 +0000 UTC m=+0.164424265 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:29:27 np0005548789.localdomain podman[337791]: 2025-12-06 10:29:27.028547061 +0000 UTC m=+0.177633961 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Dec 06 10:29:27 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:29:27 np0005548789.localdomain podman[337792]: 2025-12-06 10:29:27.083349543 +0000 UTC m=+0.229744230 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Dec 06 10:29:27 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:29:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:28 np0005548789.localdomain ceph-mon[298582]: pgmap v720: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 71 KiB/s wr, 4 op/s
Dec 06 10:29:29 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "bob", "format": "json"}]: dispatch
Dec 06 10:29:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Dec 06 10:29:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Dec 06 10:29:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Dec 06 10:29:29 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished
Dec 06 10:29:30 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "auth_id": "bob", "format": "json"}]: dispatch
Dec 06 10:29:30 np0005548789.localdomain ceph-mon[298582]: pgmap v721: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 105 KiB/s wr, 6 op/s
Dec 06 10:29:30 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:29:30 np0005548789.localdomain systemd[1]: tmp-crun.x8ViG9.mount: Deactivated successfully.
Dec 06 10:29:30 np0005548789.localdomain podman[337832]: 2025-12-06 10:29:30.952652619 +0000 UTC m=+0.116052622 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 06 10:29:30 np0005548789.localdomain podman[337832]: 2025-12-06 10:29:30.965652417 +0000 UTC m=+0.129052440 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:29:30 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:29:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:31.980 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:31.982 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:31.983 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:29:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:31.983 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:32.013 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:32.013 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:32 np0005548789.localdomain ceph-mon[298582]: pgmap v722: 177 pgs: 177 active+clean; 233 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 73 KiB/s wr, 4 op/s
Dec 06 10:29:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:33 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ac611655-851d-48c2-9d00-93668f6ff5e1", "format": "json"}]: dispatch
Dec 06 10:29:33 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ac611655-851d-48c2-9d00-93668f6ff5e1", "force": true, "format": "json"}]: dispatch
Dec 06 10:29:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:29:33 np0005548789.localdomain podman[337851]: 2025-12-06 10:29:33.926077898 +0000 UTC m=+0.088664541 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:29:33 np0005548789.localdomain podman[337851]: 2025-12-06 10:29:33.940272044 +0000 UTC m=+0.102858657 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:29:33 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:29:34 np0005548789.localdomain ceph-mon[298582]: pgmap v723: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 119 KiB/s wr, 6 op/s
Dec 06 10:29:36 np0005548789.localdomain ceph-mon[298582]: pgmap v724: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 79 KiB/s wr, 4 op/s
Dec 06 10:29:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6fddad9a-edda-44e9-b738-5688693ea723", "format": "json"}]: dispatch
Dec 06 10:29:36 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6fddad9a-edda-44e9-b738-5688693ea723", "force": true, "format": "json"}]: dispatch
Dec 06 10:29:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:37.014 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:37.040 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:37.041 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5027 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:29:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:37.041 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:37.042 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:37.043 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:29:37 np0005548789.localdomain podman[337874]: 2025-12-06 10:29:37.923068511 +0000 UTC m=+0.079180050 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:29:37 np0005548789.localdomain podman[337874]: 2025-12-06 10:29:37.96572278 +0000 UTC m=+0.121834319 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:29:37 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:29:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:38 np0005548789.localdomain ceph-mon[298582]: pgmap v725: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 79 KiB/s wr, 4 op/s
Dec 06 10:29:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3284518139' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:29:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3284518139' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:29:40 np0005548789.localdomain ceph-mon[298582]: pgmap v726: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 103 KiB/s wr, 5 op/s
Dec 06 10:29:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:29:40.689 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:29:40 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:29:40.690 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:29:40 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:40.723 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:42.076 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:42 np0005548789.localdomain ceph-mon[298582]: pgmap v727: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 69 KiB/s wr, 3 op/s
Dec 06 10:29:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:43 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:29:43.692 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:29:43 np0005548789.localdomain ceph-mon[298582]: pgmap v728: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 80 KiB/s wr, 4 op/s
Dec 06 10:29:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:44.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:44.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:44.218 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:29:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:44.218 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:29:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:44.218 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:29:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:44.219 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:29:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:44.220 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:29:44 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:29:44 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1217810012' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:29:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:44.670 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:29:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:44.761 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:29:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:44.762 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:29:44 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1217810012' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:29:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:44.998 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:29:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:45.000 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11098MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:29:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:45.000 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:29:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:45.001 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:29:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:45.126 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:29:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:45.127 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:29:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:45.127 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:29:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:45.166 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:29:45 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:29:45 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3042594742' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:29:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:45.618 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:29:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:45.624 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:29:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:45.650 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:29:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:45.653 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:29:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:45.653 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:29:45 np0005548789.localdomain ceph-mon[298582]: pgmap v729: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 35 KiB/s wr, 2 op/s
Dec 06 10:29:45 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3042594742' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:29:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:29:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:29:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:29:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:29:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:29:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:29:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:29:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:29:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:29:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:29:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:29:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:29:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:46.653 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:46.654 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:29:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:46.655 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:29:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:47.078 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:47.081 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:47.081 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:29:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:47.082 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:47.106 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:47.108 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:47.188 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:29:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:47.189 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:29:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:47.189 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:29:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:47.190 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:29:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:29:47.346 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:29:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:29:47.347 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:29:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:29:47.348 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:29:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:47.712 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:29:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:47.741 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:29:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:47.742 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:29:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:47.742 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:47.941672) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016987941733, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 1148, "num_deletes": 257, "total_data_size": 1153329, "memory_usage": 1177440, "flush_reason": "Manual Compaction"}
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016987950922, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 741105, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39230, "largest_seqno": 40372, "table_properties": {"data_size": 736493, "index_size": 2083, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11423, "raw_average_key_size": 19, "raw_value_size": 726591, "raw_average_value_size": 1268, "num_data_blocks": 92, "num_entries": 573, "num_filter_entries": 573, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016930, "oldest_key_time": 1765016930, "file_creation_time": 1765016987, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 9296 microseconds, and 3253 cpu microseconds.
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:47.950969) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 741105 bytes OK
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:47.950992) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:47.953416) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:47.953439) EVENT_LOG_v1 {"time_micros": 1765016987953432, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:47.953461) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 1147508, prev total WAL file size 1147832, number of live WAL files 2.
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:47.954409) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034353232' seq:72057594037927935, type:22 .. '6C6F676D0034373735' seq:0, type:0; will stop at (end)
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(723KB)], [66(19MB)]
Dec 06 10:29:47 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016987954504, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 21440832, "oldest_snapshot_seqno": -1}
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 14650 keys, 21308274 bytes, temperature: kUnknown
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016988068274, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 21308274, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 21221249, "index_size": 49292, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36677, "raw_key_size": 393797, "raw_average_key_size": 26, "raw_value_size": 20969385, "raw_average_value_size": 1431, "num_data_blocks": 1837, "num_entries": 14650, "num_filter_entries": 14650, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765016987, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:48.068716) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 21308274 bytes
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:48.070655) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 188.2 rd, 187.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 19.7 +0.0 blob) out(20.3 +0.0 blob), read-write-amplify(57.7) write-amplify(28.8) OK, records in: 15185, records dropped: 535 output_compression: NoCompression
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:48.070685) EVENT_LOG_v1 {"time_micros": 1765016988070671, "job": 40, "event": "compaction_finished", "compaction_time_micros": 113906, "compaction_time_cpu_micros": 56197, "output_level": 6, "num_output_files": 1, "total_output_size": 21308274, "num_input_records": 15185, "num_output_records": 14650, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016988071117, "job": 40, "event": "table_file_deletion", "file_number": 68}
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016988074073, "job": 40, "event": "table_file_deletion", "file_number": 66}
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:47.954261) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:48.074145) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:48.074154) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:48.074158) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:48.074163) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:29:48.074167) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:48 np0005548789.localdomain ceph-mon[298582]: pgmap v730: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 35 KiB/s wr, 2 op/s
Dec 06 10:29:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:49.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:50 np0005548789.localdomain ceph-mon[298582]: pgmap v731: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 39 KiB/s wr, 2 op/s
Dec 06 10:29:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:29:50 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:29:50 np0005548789.localdomain podman[337944]: 2025-12-06 10:29:50.916143459 +0000 UTC m=+0.071634920 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:29:50 np0005548789.localdomain podman[337944]: 2025-12-06 10:29:50.928231089 +0000 UTC m=+0.083722550 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:29:50 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:29:51 np0005548789.localdomain podman[337943]: 2025-12-06 10:29:51.011584257 +0000 UTC m=+0.166926653 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:29:51 np0005548789.localdomain podman[337943]: 2025-12-06 10:29:51.016816898 +0000 UTC m=+0.172159294 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:29:51 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:29:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:51.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:52.109 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:52.111 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:52.111 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:29:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:52.111 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:52.141 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:52.142 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:52 np0005548789.localdomain ceph-mon[298582]: pgmap v732: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 15 KiB/s wr, 1 op/s
Dec 06 10:29:53 np0005548789.localdomain sshd[337984]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:29:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:53.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:53.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:53.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:29:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:29:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:29:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:29:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:29:53 np0005548789.localdomain ceph-mon[298582]: pgmap v733: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 15 KiB/s wr, 1 op/s
Dec 06 10:29:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:29:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19271 "" "Go-http-client/1.1"
Dec 06 10:29:54 np0005548789.localdomain sshd[337984]: Received disconnect from 118.219.234.233 port 44884:11: Bye Bye [preauth]
Dec 06 10:29:54 np0005548789.localdomain sshd[337984]: Disconnected from authenticating user root 118.219.234.233 port 44884 [preauth]
Dec 06 10:29:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:55.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:29:56 np0005548789.localdomain ceph-mon[298582]: pgmap v734: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 4.0 KiB/s wr, 0 op/s
Dec 06 10:29:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/522862524' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:29:56 np0005548789.localdomain sshd[337986]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:29:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:57.142 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:57.144 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:29:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:57.145 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:29:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:57.145 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:57.178 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:29:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:29:57.179 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:29:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1038709677' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:29:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:29:57 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:29:57 np0005548789.localdomain podman[337988]: 2025-12-06 10:29:57.940833856 +0000 UTC m=+0.092487638 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Dec 06 10:29:57 np0005548789.localdomain podman[337988]: 2025-12-06 10:29:57.979297646 +0000 UTC m=+0.130951378 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, distribution-scope=public, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41)
Dec 06 10:29:58 np0005548789.localdomain podman[337989]: 2025-12-06 10:29:58.001234629 +0000 UTC m=+0.147815576 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Dec 06 10:29:58 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:29:58 np0005548789.localdomain podman[337989]: 2025-12-06 10:29:58.039285497 +0000 UTC m=+0.185866474 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:29:58 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:29:58 np0005548789.localdomain sshd[337986]: Received disconnect from 14.194.101.210 port 49788:11: Bye Bye [preauth]
Dec 06 10:29:58 np0005548789.localdomain sshd[337986]: Disconnected from authenticating user root 14.194.101.210 port 49788 [preauth]
Dec 06 10:29:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:29:58 np0005548789.localdomain ceph-mon[298582]: pgmap v735: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 4.0 KiB/s wr, 0 op/s
Dec 06 10:29:58 np0005548789.localdomain ceph-mon[298582]: from='client.25237 172.18.0.34:0/2532600599' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 06 10:29:59 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "caa1e0c3-0842-4cff-aec6-04859a719f92", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Dec 06 10:29:59 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "caa1e0c3-0842-4cff-aec6-04859a719f92", "format": "json"}]: dispatch
Dec 06 10:29:59 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/4167972808' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:30:00 np0005548789.localdomain ceph-mon[298582]: pgmap v736: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 13 KiB/s wr, 0 op/s
Dec 06 10:30:00 np0005548789.localdomain ceph-mon[298582]: overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 06 10:30:01 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:30:01 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2145475030' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:30:02 np0005548789.localdomain podman[338028]: 2025-12-06 10:30:02.066892249 +0000 UTC m=+0.078258462 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2)
Dec 06 10:30:02 np0005548789.localdomain podman[338028]: 2025-12-06 10:30:02.077303419 +0000 UTC m=+0.088669612 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:30:02 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:30:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:02.180 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:30:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:02.181 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:30:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:02.181 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:30:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:02.182 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:30:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:02.213 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:02.214 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:30:03 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "caa1e0c3-0842-4cff-aec6-04859a719f92", "snap_name": "651d1145-bde4-4592-9ceb-573b615a4af1", "format": "json"}]: dispatch
Dec 06 10:30:03 np0005548789.localdomain ceph-mon[298582]: pgmap v737: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 8.7 KiB/s wr, 0 op/s
Dec 06 10:30:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:04 np0005548789.localdomain ceph-mon[298582]: pgmap v738: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 18 KiB/s wr, 0 op/s
Dec 06 10:30:04 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:30:04 np0005548789.localdomain podman[338045]: 2025-12-06 10:30:04.899061364 +0000 UTC m=+0.064539571 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:30:04 np0005548789.localdomain podman[338045]: 2025-12-06 10:30:04.935218363 +0000 UTC m=+0.100696480 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:30:04 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:30:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "caa1e0c3-0842-4cff-aec6-04859a719f92", "snap_name": "651d1145-bde4-4592-9ceb-573b615a4af1_41e045a4-96eb-4b50-9798-8ab25b9deb95", "force": true, "format": "json"}]: dispatch
Dec 06 10:30:06 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "caa1e0c3-0842-4cff-aec6-04859a719f92", "snap_name": "651d1145-bde4-4592-9ceb-573b615a4af1", "force": true, "format": "json"}]: dispatch
Dec 06 10:30:06 np0005548789.localdomain ceph-mon[298582]: pgmap v739: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 18 KiB/s wr, 0 op/s
Dec 06 10:30:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:07.215 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:30:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:07.217 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:30:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:07.217 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:30:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:07.217 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:30:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:07.266 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:07.266 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:30:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:08 np0005548789.localdomain ceph-mon[298582]: pgmap v740: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 18 KiB/s wr, 0 op/s
Dec 06 10:30:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:30:08 np0005548789.localdomain podman[338068]: 2025-12-06 10:30:08.895790719 +0000 UTC m=+0.061238869 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:30:08 np0005548789.localdomain podman[338068]: 2025-12-06 10:30:08.931503405 +0000 UTC m=+0.096951565 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec 06 10:30:08 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:30:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:30:09 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 24K writes, 90K keys, 24K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.01 MB/s
                                                          Cumulative WAL: 24K writes, 8386 syncs, 2.88 writes per sync, written: 0.07 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 42K keys, 11K commit groups, 1.0 writes per commit group, ingest: 35.98 MB, 0.06 MB/s
                                                          Interval WAL: 11K writes, 4381 syncs, 2.51 writes per sync, written: 0.04 GB, 0.06 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:30:09 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "caa1e0c3-0842-4cff-aec6-04859a719f92", "format": "json"}]: dispatch
Dec 06 10:30:09 np0005548789.localdomain ceph-mon[298582]: from='client.25237 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "caa1e0c3-0842-4cff-aec6-04859a719f92", "force": true, "format": "json"}]: dispatch
Dec 06 10:30:10 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e280 e280: 6 total, 6 up, 6 in
Dec 06 10:30:10 np0005548789.localdomain ceph-mon[298582]: pgmap v741: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 45 KiB/s wr, 2 op/s
Dec 06 10:30:11 np0005548789.localdomain ceph-mon[298582]: osdmap e280: 6 total, 6 up, 6 in
Dec 06 10:30:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:12.308 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:30:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:12.310 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:30:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:12.310 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:30:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:12.310 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:30:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:12.311 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:12.311 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:30:12 np0005548789.localdomain ceph-mon[298582]: pgmap v743: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 44 KiB/s wr, 3 op/s
Dec 06 10:30:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:30:12 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.2 total, 600.0 interval
                                                          Cumulative writes: 22K writes, 87K keys, 22K commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.01 MB/s
                                                          Cumulative WAL: 22K writes, 8052 syncs, 2.84 writes per sync, written: 0.08 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 12K writes, 48K keys, 12K commit groups, 1.0 writes per commit group, ingest: 44.21 MB, 0.07 MB/s
                                                          Interval WAL: 12K writes, 5039 syncs, 2.50 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:30:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:13 np0005548789.localdomain ceph-mon[298582]: pgmap v744: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 62 KiB/s wr, 3 op/s
Dec 06 10:30:14 np0005548789.localdomain sudo[338094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:30:14 np0005548789.localdomain sudo[338094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:30:14 np0005548789.localdomain sudo[338094]: pam_unix(sudo:session): session closed for user root
Dec 06 10:30:14 np0005548789.localdomain sudo[338112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:30:14 np0005548789.localdomain sudo[338112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:30:14 np0005548789.localdomain sudo[338112]: pam_unix(sudo:session): session closed for user root
Dec 06 10:30:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:30:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:30:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:30:15 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:30:15 np0005548789.localdomain sudo[338161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:30:15 np0005548789.localdomain sudo[338161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:30:15 np0005548789.localdomain sudo[338161]: pam_unix(sudo:session): session closed for user root
Dec 06 10:30:16 np0005548789.localdomain ceph-mon[298582]: pgmap v745: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 62 KiB/s wr, 3 op/s
Dec 06 10:30:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:30:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:30:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:30:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:30:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:30:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:30:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:30:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:30:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:30:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:30:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:30:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:30:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:17.312 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:30:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:17.314 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:30:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:17.314 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:30:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:17.315 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:30:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:17.339 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:17.340 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:30:17 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e281 e281: 6 total, 6 up, 6 in
Dec 06 10:30:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:30:18 np0005548789.localdomain ceph-mon[298582]: pgmap v746: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 62 KiB/s wr, 3 op/s
Dec 06 10:30:18 np0005548789.localdomain ceph-mon[298582]: osdmap e281: 6 total, 6 up, 6 in
Dec 06 10:30:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:20 np0005548789.localdomain ceph-mon[298582]: pgmap v748: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 230 B/s rd, 54 KiB/s wr, 2 op/s
Dec 06 10:30:20 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:20.769 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '72:6c:02', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:a8:2f:0c:cb:a1'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:30:20 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:20.769 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:20 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:20.772 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:30:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:30:21 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:30:21 np0005548789.localdomain podman[338179]: 2025-12-06 10:30:21.935335294 +0000 UTC m=+0.090484188 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:30:21 np0005548789.localdomain podman[338179]: 2025-12-06 10:30:21.94627967 +0000 UTC m=+0.101428564 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Dec 06 10:30:21 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:30:22 np0005548789.localdomain podman[338180]: 2025-12-06 10:30:22.032139154 +0000 UTC m=+0.187114842 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:30:22 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:30:22.038 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:30:21Z, description=, device_id=574c884b-be04-4d2b-b229-29068bb8b5d2, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa5e4f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fa5e8e0>], id=2227d8ce-3e8d-404d-ba2e-378142bee228, ip_allocation=immediate, mac_address=fa:16:3e:cb:28:78, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3949, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:30:21Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:30:22 np0005548789.localdomain podman[338180]: 2025-12-06 10:30:22.043142791 +0000 UTC m=+0.198118479 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:30:22 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:30:22 np0005548789.localdomain podman[338237]: 2025-12-06 10:30:22.232939565 +0000 UTC m=+0.055050301 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 06 10:30:22 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:30:22 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:30:22 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:30:22 np0005548789.localdomain systemd[1]: tmp-crun.rmEZw9.mount: Deactivated successfully.
Dec 06 10:30:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:22.369 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: pgmap v749: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 48 KiB/s wr, 2 op/s
Dec 06 10:30:22 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:30:22.540 263652 INFO neutron.agent.dhcp.agent [None req-4fd6c565-f23c-4500-a7d2-8713d8462566 - - - - - -] DHCP configuration for ports {'2227d8ce-3e8d-404d-ba2e-378142bee228'} is completed
Dec 06 10:30:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:22.865 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:22.976052) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017022976092, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 716, "num_deletes": 251, "total_data_size": 682557, "memory_usage": 695392, "flush_reason": "Manual Compaction"}
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017022980012, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 346964, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 40377, "largest_seqno": 41088, "table_properties": {"data_size": 344004, "index_size": 879, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 8408, "raw_average_key_size": 21, "raw_value_size": 337544, "raw_average_value_size": 843, "num_data_blocks": 39, "num_entries": 400, "num_filter_entries": 400, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016987, "oldest_key_time": 1765016987, "file_creation_time": 1765017022, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 3995 microseconds, and 1250 cpu microseconds.
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:22.980046) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 346964 bytes OK
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:22.980066) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:22.982131) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:22.982143) EVENT_LOG_v1 {"time_micros": 1765017022982139, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:22.982154) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 678674, prev total WAL file size 678998, number of live WAL files 2.
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:22.984287) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034323535' seq:72057594037927935, type:22 .. '6D6772737461740034353036' seq:0, type:0; will stop at (end)
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(338KB)], [69(20MB)]
Dec 06 10:30:22 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017022984330, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 21655238, "oldest_snapshot_seqno": -1}
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 14539 keys, 19629687 bytes, temperature: kUnknown
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017023086356, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 19629687, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19547811, "index_size": 44463, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36357, "raw_key_size": 391770, "raw_average_key_size": 26, "raw_value_size": 19302194, "raw_average_value_size": 1327, "num_data_blocks": 1635, "num_entries": 14539, "num_filter_entries": 14539, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765017022, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:23.086702) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 19629687 bytes
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:23.088362) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 212.0 rd, 192.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 20.3 +0.0 blob) out(18.7 +0.0 blob), read-write-amplify(119.0) write-amplify(56.6) OK, records in: 15050, records dropped: 511 output_compression: NoCompression
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:23.088385) EVENT_LOG_v1 {"time_micros": 1765017023088373, "job": 42, "event": "compaction_finished", "compaction_time_micros": 102144, "compaction_time_cpu_micros": 52111, "output_level": 6, "num_output_files": 1, "total_output_size": 19629687, "num_input_records": 15050, "num_output_records": 14539, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017023088560, "job": 42, "event": "table_file_deletion", "file_number": 71}
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017023090886, "job": 42, "event": "table_file_deletion", "file_number": 69}
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:22.984233) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:23.090938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:23.090945) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:23.090947) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:23.090948) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:30:23.090950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:30:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:30:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:30:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:30:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:30:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19281 "" "Go-http-client/1.1"
Dec 06 10:30:23 np0005548789.localdomain ceph-mon[298582]: pgmap v750: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 18 KiB/s wr, 0 op/s
Dec 06 10:30:26 np0005548789.localdomain ceph-mon[298582]: pgmap v751: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 18 KiB/s wr, 0 op/s
Dec 06 10:30:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:27.428 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:27.947 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:28 np0005548789.localdomain ceph-mon[298582]: pgmap v752: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 18 KiB/s wr, 0 op/s
Dec 06 10:30:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:30:28 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:30:28 np0005548789.localdomain systemd[1]: tmp-crun.PU9Kej.mount: Deactivated successfully.
Dec 06 10:30:28 np0005548789.localdomain podman[338259]: 2025-12-06 10:30:28.925849355 +0000 UTC m=+0.082597935 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec 06 10:30:28 np0005548789.localdomain podman[338259]: 2025-12-06 10:30:28.938149133 +0000 UTC m=+0.094897723 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Dec 06 10:30:28 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:30:29 np0005548789.localdomain podman[338258]: 2025-12-06 10:30:29.032467986 +0000 UTC m=+0.187843054 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, vcs-type=git, architecture=x86_64, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 06 10:30:29 np0005548789.localdomain podman[338258]: 2025-12-06 10:30:29.048170088 +0000 UTC m=+0.203545256 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vendor=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 06 10:30:29 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:30:29 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:29.774 160509 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=b142a5ef-fbed-4e92-aa78-e3ad080c6370, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:30:30 np0005548789.localdomain ceph-mon[298582]: pgmap v753: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 16 KiB/s wr, 0 op/s
Dec 06 10:30:31 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:30:31.963 263652 INFO neutron.agent.linux.ip_lib [None req-8e4c2764-24e6-4cd8-b12e-c26a555e62ed - - - - - -] Device tap1edce767-a0 cannot be used as it has no MAC address
Dec 06 10:30:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:31.989 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:31 np0005548789.localdomain kernel: device tap1edce767-a0 entered promiscuous mode
Dec 06 10:30:31 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:31.998 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:31 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:30:31Z|00529|binding|INFO|Claiming lport 1edce767-a07b-4181-9f27-a5630d291dd5 for this chassis.
Dec 06 10:30:32 np0005548789.localdomain NetworkManager[5973]: <info>  [1765017032.0008] manager: (tap1edce767-a0): new Generic device (/org/freedesktop/NetworkManager/Devices/83)
Dec 06 10:30:32 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:30:31Z|00530|binding|INFO|1edce767-a07b-4181-9f27-a5630d291dd5: Claiming unknown
Dec 06 10:30:32 np0005548789.localdomain systemd-udevd[338307]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 10:30:32 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:32.015 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-3ce5dfa7-fde4-49df-8cf8-aff98cb18adf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ce5dfa7-fde4-49df-8cf8-aff98cb18adf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7d2c3fc1d605488db2b4af2af7696c67', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4dcc78c6-bb1e-494e-a935-28e9e1421181, chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=1edce767-a07b-4181-9f27-a5630d291dd5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:30:32 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:32.016 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 1edce767-a07b-4181-9f27-a5630d291dd5 in datapath 3ce5dfa7-fde4-49df-8cf8-aff98cb18adf bound to our chassis
Dec 06 10:30:32 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:32.020 160509 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6199ee5b-335e-4714-8d26-80e7969ccc10 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 06 10:30:32 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:32.020 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:30:32 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:32.021 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[aa2e3e7f-e601-42da-821c-db3a5e9b86e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:30:32 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1edce767-a0: No such device
Dec 06 10:30:32 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1edce767-a0: No such device
Dec 06 10:30:32 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:30:32Z|00531|binding|INFO|Setting lport 1edce767-a07b-4181-9f27-a5630d291dd5 ovn-installed in OVS
Dec 06 10:30:32 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:30:32Z|00532|binding|INFO|Setting lport 1edce767-a07b-4181-9f27-a5630d291dd5 up in Southbound
Dec 06 10:30:32 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1edce767-a0: No such device
Dec 06 10:30:32 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1edce767-a0: No such device
Dec 06 10:30:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:32.043 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:32 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1edce767-a0: No such device
Dec 06 10:30:32 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1edce767-a0: No such device
Dec 06 10:30:32 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1edce767-a0: No such device
Dec 06 10:30:32 np0005548789.localdomain virtnodedevd[230404]: ethtool ioctl error on tap1edce767-a0: No such device
Dec 06 10:30:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:32.081 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:32.109 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:32 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:30:32.241 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:30:31Z, description=, device_id=994662bc-2fe4-420e-80b2-94ab1d759954, device_owner=network:router_gateway, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc2f760>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc2f400>], id=47c772c4-13e0-4bd1-bfd3-8aae8031c96d, ip_allocation=immediate, mac_address=fa:16:3e:2c:7a:f1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T08:43:55Z, description=, dns_domain=, id=8e238f59-5792-4ff4-95af-f993c8e9e14f, ipv4_address_scope=None, ipv6_address_scope=None, is_default=False, l2_adjacency=True, mtu=1350, name=public, port_security_enabled=True, project_id=3d603431c0bb4967bafc7a0aa6108bfe, provider:network_type=flat, provider:physical_network=datacentre, provider:segmentation_id=None, qos_policy_id=None, revision_number=2, router:external=True, shared=False, standard_attr_id=29, status=ACTIVE, subnets=['be7ff830-a3e2-49cd-a05f-63bc528d6d1c'], tags=[], tenant_id=3d603431c0bb4967bafc7a0aa6108bfe, updated_at=2025-12-06T08:44:01Z, vlan_transparent=None, network_id=8e238f59-5792-4ff4-95af-f993c8e9e14f, port_security_enabled=False, project_id=, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3963, status=DOWN, tags=[], tenant_id=, updated_at=2025-12-06T10:30:31Z on network 8e238f59-5792-4ff4-95af-f993c8e9e14f
Dec 06 10:30:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:32.471 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:32 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 3 addresses
Dec 06 10:30:32 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:30:32 np0005548789.localdomain podman[338365]: 2025-12-06 10:30:32.487349017 +0000 UTC m=+0.091999303 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:30:32 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:30:32 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:30:32 np0005548789.localdomain systemd[1]: tmp-crun.3DMVdk.mount: Deactivated successfully.
Dec 06 10:30:32 np0005548789.localdomain podman[338383]: 2025-12-06 10:30:32.567676722 +0000 UTC m=+0.063797909 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:30:32 np0005548789.localdomain podman[338383]: 2025-12-06 10:30:32.582094354 +0000 UTC m=+0.078215611 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 06 10:30:32 np0005548789.localdomain ceph-mon[298582]: pgmap v754: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:32 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:30:32 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:30:32.750 263652 INFO neutron.agent.dhcp.agent [None req-b4f64022-b44e-4bbb-b798-f5ee31f8a2a6 - - - - - -] DHCP configuration for ports {'47c772c4-13e0-4bd1-bfd3-8aae8031c96d'} is completed
Dec 06 10:30:32 np0005548789.localdomain systemd[1]: tmp-crun.HH2HRV.mount: Deactivated successfully.
Dec 06 10:30:32 np0005548789.localdomain podman[338434]: 
Dec 06 10:30:33 np0005548789.localdomain podman[338434]: 2025-12-06 10:30:33.001519143 +0000 UTC m=+0.088645191 container create d68a3aac5ad9280c33e780fc50141c75b969f9713ad15986c7963925dcbf4e3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:30:33 np0005548789.localdomain systemd[1]: Started libpod-conmon-d68a3aac5ad9280c33e780fc50141c75b969f9713ad15986c7963925dcbf4e3a.scope.
Dec 06 10:30:33 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:30:33 np0005548789.localdomain podman[338434]: 2025-12-06 10:30:32.959635908 +0000 UTC m=+0.046761946 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 06 10:30:33 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ec5a68b272409c0780e9b34c01467356b790bdd39419926a1b87579a37fc65f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 10:30:33 np0005548789.localdomain podman[338434]: 2025-12-06 10:30:33.071602073 +0000 UTC m=+0.158728061 container init d68a3aac5ad9280c33e780fc50141c75b969f9713ad15986c7963925dcbf4e3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:30:33 np0005548789.localdomain podman[338434]: 2025-12-06 10:30:33.084246292 +0000 UTC m=+0.171372280 container start d68a3aac5ad9280c33e780fc50141c75b969f9713ad15986c7963925dcbf4e3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:30:33 np0005548789.localdomain dnsmasq[338452]: started, version 2.85 cachesize 150
Dec 06 10:30:33 np0005548789.localdomain dnsmasq[338452]: DNS service limited to local subnets
Dec 06 10:30:33 np0005548789.localdomain dnsmasq[338452]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 06 10:30:33 np0005548789.localdomain dnsmasq[338452]: warning: no upstream servers configured
Dec 06 10:30:33 np0005548789.localdomain dnsmasq-dhcp[338452]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 06 10:30:33 np0005548789.localdomain dnsmasq[338452]: read /var/lib/neutron/dhcp/3ce5dfa7-fde4-49df-8cf8-aff98cb18adf/addn_hosts - 0 addresses
Dec 06 10:30:33 np0005548789.localdomain dnsmasq-dhcp[338452]: read /var/lib/neutron/dhcp/3ce5dfa7-fde4-49df-8cf8-aff98cb18adf/host
Dec 06 10:30:33 np0005548789.localdomain dnsmasq-dhcp[338452]: read /var/lib/neutron/dhcp/3ce5dfa7-fde4-49df-8cf8-aff98cb18adf/opts
Dec 06 10:30:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:33 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:30:33.513 263652 INFO neutron.agent.dhcp.agent [None req-00b19acc-fd4b-4c0b-95cf-0187f8f6f303 - - - - - -] DHCP configuration for ports {'e6bf6217-acc3-4b45-a780-fd5e44cdc315'} is completed
Dec 06 10:30:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:33.590 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:33 np0005548789.localdomain ceph-mon[298582]: pgmap v755: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:30:35.062 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:30:34Z, description=, device_id=994662bc-2fe4-420e-80b2-94ab1d759954, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe3845a9c10>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37fc17580>], id=902f33d2-09bd-47a8-8ed4-f71d87316d90, ip_allocation=immediate, mac_address=fa:16:3e:f7:32:ca, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:30:28Z, description=, dns_domain=, id=3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-1706947745-network, port_security_enabled=True, project_id=7d2c3fc1d605488db2b4af2af7696c67, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=48346, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3956, status=ACTIVE, subnets=['b1a7cf81-c0c7-4652-b7ea-4819206fe79a'], tags=[], tenant_id=7d2c3fc1d605488db2b4af2af7696c67, updated_at=2025-12-06T10:30:29Z, vlan_transparent=None, network_id=3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, port_security_enabled=False, project_id=7d2c3fc1d605488db2b4af2af7696c67, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3964, status=DOWN, tags=[], tenant_id=7d2c3fc1d605488db2b4af2af7696c67, updated_at=2025-12-06T10:30:34Z on network 3ce5dfa7-fde4-49df-8cf8-aff98cb18adf
Dec 06 10:30:35 np0005548789.localdomain dnsmasq[338452]: read /var/lib/neutron/dhcp/3ce5dfa7-fde4-49df-8cf8-aff98cb18adf/addn_hosts - 1 addresses
Dec 06 10:30:35 np0005548789.localdomain dnsmasq-dhcp[338452]: read /var/lib/neutron/dhcp/3ce5dfa7-fde4-49df-8cf8-aff98cb18adf/host
Dec 06 10:30:35 np0005548789.localdomain podman[338471]: 2025-12-06 10:30:35.27705554 +0000 UTC m=+0.064789629 container kill d68a3aac5ad9280c33e780fc50141c75b969f9713ad15986c7963925dcbf4e3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:30:35 np0005548789.localdomain dnsmasq-dhcp[338452]: read /var/lib/neutron/dhcp/3ce5dfa7-fde4-49df-8cf8-aff98cb18adf/opts
Dec 06 10:30:35 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:30:35 np0005548789.localdomain systemd[1]: tmp-crun.1FuqXw.mount: Deactivated successfully.
Dec 06 10:30:35 np0005548789.localdomain podman[338484]: 2025-12-06 10:30:35.392908625 +0000 UTC m=+0.088468395 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:30:35 np0005548789.localdomain podman[338484]: 2025-12-06 10:30:35.408514244 +0000 UTC m=+0.104074014 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 06 10:30:35 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:30:35 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:30:35.558 263652 INFO neutron.agent.dhcp.agent [None req-7d0c375b-c979-4247-bd8f-b550d5a21af3 - - - - - -] DHCP configuration for ports {'902f33d2-09bd-47a8-8ed4-f71d87316d90'} is completed
Dec 06 10:30:36 np0005548789.localdomain ceph-mon[298582]: pgmap v756: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:36 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:30:36.676 263652 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-06T10:30:34Z, description=, device_id=994662bc-2fe4-420e-80b2-94ab1d759954, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9de190>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7fe37f9de820>], id=902f33d2-09bd-47a8-8ed4-f71d87316d90, ip_allocation=immediate, mac_address=fa:16:3e:f7:32:ca, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-06T10:30:28Z, description=, dns_domain=, id=3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-1706947745-network, port_security_enabled=True, project_id=7d2c3fc1d605488db2b4af2af7696c67, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=48346, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3956, status=ACTIVE, subnets=['b1a7cf81-c0c7-4652-b7ea-4819206fe79a'], tags=[], tenant_id=7d2c3fc1d605488db2b4af2af7696c67, updated_at=2025-12-06T10:30:29Z, vlan_transparent=None, network_id=3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, port_security_enabled=False, project_id=7d2c3fc1d605488db2b4af2af7696c67, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3964, status=DOWN, tags=[], tenant_id=7d2c3fc1d605488db2b4af2af7696c67, updated_at=2025-12-06T10:30:34Z on network 3ce5dfa7-fde4-49df-8cf8-aff98cb18adf
Dec 06 10:30:36 np0005548789.localdomain dnsmasq[338452]: read /var/lib/neutron/dhcp/3ce5dfa7-fde4-49df-8cf8-aff98cb18adf/addn_hosts - 1 addresses
Dec 06 10:30:36 np0005548789.localdomain podman[338531]: 2025-12-06 10:30:36.889880335 +0000 UTC m=+0.058755944 container kill d68a3aac5ad9280c33e780fc50141c75b969f9713ad15986c7963925dcbf4e3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:30:36 np0005548789.localdomain dnsmasq-dhcp[338452]: read /var/lib/neutron/dhcp/3ce5dfa7-fde4-49df-8cf8-aff98cb18adf/host
Dec 06 10:30:36 np0005548789.localdomain dnsmasq-dhcp[338452]: read /var/lib/neutron/dhcp/3ce5dfa7-fde4-49df-8cf8-aff98cb18adf/opts
Dec 06 10:30:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:37.509 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:37 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:30:37.524 263652 INFO neutron.agent.dhcp.agent [None req-a29facaf-6969-4596-802f-917779c911a5 - - - - - -] DHCP configuration for ports {'902f33d2-09bd-47a8-8ed4-f71d87316d90'} is completed
Dec 06 10:30:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:38 np0005548789.localdomain ceph-mon[298582]: pgmap v757: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3426178596' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:30:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3426178596' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:30:39 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:30:39 np0005548789.localdomain podman[338551]: 2025-12-06 10:30:39.926013519 +0000 UTC m=+0.088278470 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:30:40 np0005548789.localdomain podman[338551]: 2025-12-06 10:30:40.01439398 +0000 UTC m=+0.176658921 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:30:40 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:30:40 np0005548789.localdomain ceph-mon[298582]: pgmap v758: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:42.510 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:30:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:42.512 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:30:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:42.513 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:30:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:42.513 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:30:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:42.555 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:42.556 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:30:42 np0005548789.localdomain ceph-mon[298582]: pgmap v759: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:44 np0005548789.localdomain ceph-mon[298582]: pgmap v760: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:44.177 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:45 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e282 e282: 6 total, 6 up, 6 in
Dec 06 10:30:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:45.180 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:45.366 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:30:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:45.367 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:30:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:45.367 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:30:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:45.368 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:30:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:45.368 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:30:45 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:30:45 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/510296205' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:30:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:45.825 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:30:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:45.894 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:30:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:45.895 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:30:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:46.100 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:30:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:46.102 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11094MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:30:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:46.102 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:30:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:46.103 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:30:46 np0005548789.localdomain ceph-mon[298582]: osdmap e282: 6 total, 6 up, 6 in
Dec 06 10:30:46 np0005548789.localdomain ceph-mon[298582]: pgmap v762: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:46 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/510296205' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:30:46 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e283 e283: 6 total, 6 up, 6 in
Dec 06 10:30:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:46.205 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:30:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:46.206 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:30:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:46.206 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:30:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:46.246 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:30:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:30:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:30:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:30:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:30:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:30:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:30:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:30:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:30:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:30:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:30:46 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:30:46 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/495746363' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:30:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:46.672 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:30:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:46.679 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:30:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:46.699 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:30:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:46.700 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:30:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:46.700 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:30:47 np0005548789.localdomain ceph-mon[298582]: osdmap e283: 6 total, 6 up, 6 in
Dec 06 10:30:47 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/495746363' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:30:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:47.347 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:30:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:47.347 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:30:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:47.348 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:30:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:47.587 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:30:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:47.589 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:47.589 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:30:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:47.590 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:30:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:47.591 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:30:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:47.595 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:47.701 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:47.702 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:30:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:47.702 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:30:48 np0005548789.localdomain ceph-mon[298582]: pgmap v764: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:30:48 np0005548789.localdomain sshd[338621]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:30:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:48.517 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:30:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:48.518 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:30:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:48.518 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:30:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:48.519 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:30:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:49.115 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:30:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:49.140 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:30:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:49.140 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:30:49 np0005548789.localdomain sshd[338621]: Received disconnect from 193.46.255.217 port 32158:11:  [preauth]
Dec 06 10:30:49 np0005548789.localdomain sshd[338621]: Disconnected from authenticating user root 193.46.255.217 port 32158 [preauth]
Dec 06 10:30:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:49.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:49.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:50 np0005548789.localdomain ceph-mon[298582]: pgmap v765: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 2.6 MiB/s wr, 55 op/s
Dec 06 10:30:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:52.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:52 np0005548789.localdomain ceph-mon[298582]: pgmap v766: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 2.6 MiB/s wr, 55 op/s
Dec 06 10:30:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:52.596 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:30:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:52.598 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:30:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:52.598 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:30:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:52.598 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:30:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:52.608 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:52.609 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:30:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:30:52 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:30:52 np0005548789.localdomain podman[338623]: 2025-12-06 10:30:52.91263296 +0000 UTC m=+0.076246421 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent)
Dec 06 10:30:52 np0005548789.localdomain podman[338623]: 2025-12-06 10:30:52.921091309 +0000 UTC m=+0.084704760 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 06 10:30:52 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:30:52 np0005548789.localdomain podman[338624]: 2025-12-06 10:30:52.965905545 +0000 UTC m=+0.127611507 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:30:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 e284: 6 total, 6 up, 6 in
Dec 06 10:30:53 np0005548789.localdomain podman[338624]: 2025-12-06 10:30:53.008226083 +0000 UTC m=+0.169932095 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 06 10:30:53 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:30:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:53.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:53 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:30:53Z|00533|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:30:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:53.733 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:53 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 2 addresses
Dec 06 10:30:53 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:30:53 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:30:53 np0005548789.localdomain podman[338682]: 2025-12-06 10:30:53.761859725 +0000 UTC m=+0.084395690 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:30:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:30:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:30:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:30:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157928 "" "Go-http-client/1.1"
Dec 06 10:30:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:30:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19756 "" "Go-http-client/1.1"
Dec 06 10:30:54 np0005548789.localdomain ceph-mon[298582]: osdmap e284: 6 total, 6 up, 6 in
Dec 06 10:30:54 np0005548789.localdomain ceph-mon[298582]: pgmap v768: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 2.6 MiB/s wr, 55 op/s
Dec 06 10:30:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:54.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:54.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:30:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:55.183 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:30:55 np0005548789.localdomain dnsmasq[338452]: read /var/lib/neutron/dhcp/3ce5dfa7-fde4-49df-8cf8-aff98cb18adf/addn_hosts - 0 addresses
Dec 06 10:30:55 np0005548789.localdomain dnsmasq-dhcp[338452]: read /var/lib/neutron/dhcp/3ce5dfa7-fde4-49df-8cf8-aff98cb18adf/host
Dec 06 10:30:55 np0005548789.localdomain dnsmasq-dhcp[338452]: read /var/lib/neutron/dhcp/3ce5dfa7-fde4-49df-8cf8-aff98cb18adf/opts
Dec 06 10:30:55 np0005548789.localdomain podman[338722]: 2025-12-06 10:30:55.686848327 +0000 UTC m=+0.042017381 container kill d68a3aac5ad9280c33e780fc50141c75b969f9713ad15986c7963925dcbf4e3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:30:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:55.867 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:55 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:30:55Z|00534|binding|INFO|Releasing lport 1edce767-a07b-4181-9f27-a5630d291dd5 from this chassis (sb_readonly=0)
Dec 06 10:30:55 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:30:55Z|00535|binding|INFO|Setting lport 1edce767-a07b-4181-9f27-a5630d291dd5 down in Southbound
Dec 06 10:30:55 np0005548789.localdomain kernel: device tap1edce767-a0 left promiscuous mode
Dec 06 10:30:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:55.878 160509 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005548789.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpc2183f4b-9cdb-5853-8c79-53b0285a0de7-3ce5dfa7-fde4-49df-8cf8-aff98cb18adf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ce5dfa7-fde4-49df-8cf8-aff98cb18adf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7d2c3fc1d605488db2b4af2af7696c67', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005548789.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4dcc78c6-bb1e-494e-a935-28e9e1421181, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>], logical_port=1edce767-a07b-4181-9f27-a5630d291dd5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbac218c490>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:30:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:55.880 160509 INFO neutron.agent.ovn.metadata.agent [-] Port 1edce767-a07b-4181-9f27-a5630d291dd5 in datapath 3ce5dfa7-fde4-49df-8cf8-aff98cb18adf unbound from our chassis
Dec 06 10:30:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:55.882 160509 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 06 10:30:55 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:30:55.882 160674 DEBUG oslo.privsep.daemon [-] privsep: reply[ac9c1cf7-5e51-47a1-a94d-91553f4c24e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 10:30:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:55.892 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:56 np0005548789.localdomain ceph-mon[298582]: pgmap v769: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 2.2 MiB/s wr, 47 op/s
Dec 06 10:30:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/11701804' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:30:57 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:30:57Z|00536|binding|INFO|Releasing lport 4fb81ffd-e198-4628-9bd0-0c0f0c89c33a from this chassis (sb_readonly=0)
Dec 06 10:30:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:57.535 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:57 np0005548789.localdomain dnsmasq[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/addn_hosts - 1 addresses
Dec 06 10:30:57 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/host
Dec 06 10:30:57 np0005548789.localdomain podman[338760]: 2025-12-06 10:30:57.550832616 +0000 UTC m=+0.047209260 container kill dd70518caed84fde347015325d6b35481123abc4a39df99f8bc9422c2468322f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8e238f59-5792-4ff4-95af-f993c8e9e14f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:30:57 np0005548789.localdomain dnsmasq-dhcp[314636]: read /var/lib/neutron/dhcp/8e238f59-5792-4ff4-95af-f993c8e9e14f/opts
Dec 06 10:30:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/4191358629' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:30:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:30:57.610 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:30:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:30:58 np0005548789.localdomain ceph-mon[298582]: pgmap v770: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 2.0 MiB/s wr, 44 op/s
Dec 06 10:30:59 np0005548789.localdomain dnsmasq[338452]: exiting on receipt of SIGTERM
Dec 06 10:30:59 np0005548789.localdomain podman[338796]: 2025-12-06 10:30:59.463564992 +0000 UTC m=+0.060951112 container kill d68a3aac5ad9280c33e780fc50141c75b969f9713ad15986c7963925dcbf4e3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:30:59 np0005548789.localdomain systemd[1]: libpod-d68a3aac5ad9280c33e780fc50141c75b969f9713ad15986c7963925dcbf4e3a.scope: Deactivated successfully.
Dec 06 10:30:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:30:59 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:30:59 np0005548789.localdomain podman[338811]: 2025-12-06 10:30:59.512304537 +0000 UTC m=+0.040880065 container died d68a3aac5ad9280c33e780fc50141c75b969f9713ad15986c7963925dcbf4e3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:30:59 np0005548789.localdomain systemd[1]: tmp-crun.1GOr4b.mount: Deactivated successfully.
Dec 06 10:30:59 np0005548789.localdomain podman[338828]: 2025-12-06 10:30:59.560902188 +0000 UTC m=+0.068818363 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 10:30:59 np0005548789.localdomain podman[338828]: 2025-12-06 10:30:59.571219704 +0000 UTC m=+0.079135899 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:30:59 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:30:59 np0005548789.localdomain podman[338811]: 2025-12-06 10:30:59.598482641 +0000 UTC m=+0.127058129 container cleanup d68a3aac5ad9280c33e780fc50141c75b969f9713ad15986c7963925dcbf4e3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:30:59 np0005548789.localdomain systemd[1]: libpod-conmon-d68a3aac5ad9280c33e780fc50141c75b969f9713ad15986c7963925dcbf4e3a.scope: Deactivated successfully.
Dec 06 10:30:59 np0005548789.localdomain podman[338818]: 2025-12-06 10:30:59.661282678 +0000 UTC m=+0.176835647 container remove d68a3aac5ad9280c33e780fc50141c75b969f9713ad15986c7963925dcbf4e3a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ce5dfa7-fde4-49df-8cf8-aff98cb18adf, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:30:59 np0005548789.localdomain podman[338819]: 2025-12-06 10:30:59.729637435 +0000 UTC m=+0.237026244 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Dec 06 10:30:59 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:30:59.744 263652 INFO neutron.agent.dhcp.agent [None req-0ba61d5b-4921-4b96-a54b-dc004b1b0cee - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:30:59 np0005548789.localdomain podman[338819]: 2025-12-06 10:30:59.751260639 +0000 UTC m=+0.258649438 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, maintainer=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 06 10:30:59 np0005548789.localdomain neutron_dhcp_agent[263648]: 2025-12-06 10:30:59.760 263652 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Dec 06 10:30:59 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:31:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-7ec5a68b272409c0780e9b34c01467356b790bdd39419926a1b87579a37fc65f-merged.mount: Deactivated successfully.
Dec 06 10:31:00 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d68a3aac5ad9280c33e780fc50141c75b969f9713ad15986c7963925dcbf4e3a-userdata-shm.mount: Deactivated successfully.
Dec 06 10:31:00 np0005548789.localdomain systemd[1]: run-netns-qdhcp\x2d3ce5dfa7\x2dfde4\x2d49df\x2d8cf8\x2daff98cb18adf.mount: Deactivated successfully.
Dec 06 10:31:00 np0005548789.localdomain ceph-mon[298582]: pgmap v771: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:00 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3107954852' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:01 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:01.177 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:01 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3944337570' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:02.613 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:02.615 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:02.615 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:31:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:02.616 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:02 np0005548789.localdomain ceph-mon[298582]: pgmap v772: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:02.656 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:02 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:02.658 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:31:02 np0005548789.localdomain podman[338877]: 2025-12-06 10:31:02.918507335 +0000 UTC m=+0.082022977 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:31:02 np0005548789.localdomain podman[338877]: 2025-12-06 10:31:02.934050262 +0000 UTC m=+0.097565844 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:31:02 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:31:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:04 np0005548789.localdomain ceph-mon[298582]: pgmap v773: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:31:05 np0005548789.localdomain podman[338897]: 2025-12-06 10:31:05.924919378 +0000 UTC m=+0.085719152 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:31:05 np0005548789.localdomain podman[338897]: 2025-12-06 10:31:05.933717458 +0000 UTC m=+0.094517272 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:31:05 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:31:06 np0005548789.localdomain ceph-mon[298582]: pgmap v774: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:07 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:07.659 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.916 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.924 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.924 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '775f0c41-2cc3-4cdb-a824-8d04d062e89d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:31:07.917121', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ad10f3d2-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.166449224, 'message_signature': '4a2f7a942ed623cdf75db3c2fd61369b2ae835cdf3e50f5f94f06ad13db67764'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:31:07.917121', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ad10ffa8-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.166449224, 'message_signature': '9e245cde37c4cc0fe03e2a0a1f4aa247497e91bcddf4d44a21618b2b5cf39d19'}]}, 'timestamp': '2025-12-06 10:31:07.924806', '_unique_id': '03f9762598d44edbb0f6558e9a72baff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '560337ff-c0de-4966-b5da-08cf7248ca7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:31:07.926507', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ad11aebc-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.175849412, 'message_signature': 'f802640ec9976e1be59a82994db3b72676983db61dbab4c028e85bae5654b3ae'}]}, 'timestamp': '2025-12-06 10:31:07.929276', '_unique_id': 'a523e61d035e4eb6935ae68c8a15e620'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.929 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98409ea8-9698-43e0-8116-fa953a9370e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:31:07.930359', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ad11e1ca-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.175849412, 'message_signature': '6b09e2340de317ccf32f72013fa52a78b6cf81f03f714da5ddbfc55f029f54ec'}]}, 'timestamp': '2025-12-06 10:31:07.930570', '_unique_id': 'e2d6432194c149f68c4472cf1fe3bf86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.930 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.931 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09f7a8bf-fccc-410e-82f6-58ad00fe69ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:31:07.931554', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ad121078-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.175849412, 'message_signature': 'eab5c8ce225f0c8ee2c577810aff05269a32c19cd9a776734debeec2dded61a2'}]}, 'timestamp': '2025-12-06 10:31:07.931786', '_unique_id': 'f7321eab08314cfb971a82c84c4f9555'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.932 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f96c3131-9897-4b68-918e-490016eaf7c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:31:07.932822', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ad124214-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.175849412, 'message_signature': '52fcb853fd3ecc5d41f85ff03caa13410c3dd55b196985e50c385e32623e1523'}]}, 'timestamp': '2025-12-06 10:31:07.933038', '_unique_id': '003e551d0dd04fb89557ba53410f8c51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.933 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8503cda-83c1-4529-9ce9-ade6b682c23c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:31:07.934013', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ad127086-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.175849412, 'message_signature': '2da75099d3340401c766a38484881ebe79a2f8ed8d0f76919353395c22491636'}]}, 'timestamp': '2025-12-06 10:31:07.934251', '_unique_id': '177a191b6f164a88b53ebd5b6c5df656'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.934 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.935 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.954 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.955 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bafb5fa-a862-4fe2-a9a7-0ff7c4e74a97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:31:07.935214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ad159d7e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.184549049, 'message_signature': 'b4e34862be15baeb471edc43e474f55a7813785999c9e868ac9d6c78dd24b6a3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:31:07.935214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ad15a7a6-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.184549049, 'message_signature': '34a22aeb3263196390e206e9e020b06f9dfe5680a6ea2c97d56ae5c80f88ce82'}]}, 'timestamp': '2025-12-06 10:31:07.955301', '_unique_id': '568c04d7587144948e96d852d14cfcbc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.956 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.957 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.957 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.957 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.957 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b82d3ff0-cde2-4054-880f-5a1f38888134', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:31:07.957179', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ad15f986-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.166449224, 'message_signature': 'a78e772cf5fa37dccbc771c60973b3c0fc65658b67db31aa6163539119412739'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:31:07.957179', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ad16012e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.166449224, 'message_signature': '07b5212ab75e826d956b818d14422060752650946c07b820144425369b3f0e1f'}]}, 'timestamp': '2025-12-06 10:31:07.957576', '_unique_id': '6a2571aa804a4c848a44c60dc325237c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.958 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd516ff0d-fc9c-4320-91aa-4603792aa29f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:31:07.958624', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ad16322a-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.175849412, 'message_signature': '8e1f9dcb5733f4e11832ddda1362653999bfb3d63f71b776dee05ca39c93392b'}]}, 'timestamp': '2025-12-06 10:31:07.958860', '_unique_id': '152a4ea525dc4da0bcace3060a73b78a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.959 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.970 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 21340000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6560707c-b25d-4a0a-9b63-f9586c00cc47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21340000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:31:07.959833', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'ad180c44-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.219738069, 'message_signature': 'd3a9c44c4bfced9ae22049d2e1c0db3842cc83257dd0cf134303ff55c33d2da6'}]}, 'timestamp': '2025-12-06 10:31:07.971143', '_unique_id': '7701cd13a78b42cfa261842e8644de6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.972 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.973 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.973 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c0ee22f-9807-49cd-b78d-b6a1cbd2d69a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:31:07.973619', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ad1881ec-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.175849412, 'message_signature': '81aae15694f8dd36807140bccf4f9f484b6be743597bfaa396235eb87f27bda6'}]}, 'timestamp': '2025-12-06 10:31:07.974124', '_unique_id': 'db3396b47f4b4177a94cd4a90989b559'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.976 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.976 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65ed5a95-3678-4d98-a268-2b2bfdcd33fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:31:07.976373', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ad18ee8e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.175849412, 'message_signature': 'c12ce71cec64f2cc688ea1625deba56ccda66a92ba96f4f2ef222d9540dd996d'}]}, 'timestamp': '2025-12-06 10:31:07.976952', '_unique_id': 'aad346e3f153439a9ab96c94de14fb01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.977 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.979 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.979 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '834a7dbc-2251-4cb5-b700-6413d6a5f914', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:31:07.979432', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ad196440-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.175849412, 'message_signature': 'e84ffeebf070043a762a7a14525ea7b314260fe2993593b54c30d80a9591387f'}]}, 'timestamp': '2025-12-06 10:31:07.979960', '_unique_id': '9a3294d9e59248c9a49bcb6fb27e560d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.981 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.982 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.982 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.982 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77d360b0-e76f-45ee-a2e1-bf0d2fd9d4ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:31:07.982504', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ad19db78-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.184549049, 'message_signature': '88752c2cbacfc900ff45a5238397248be4f2e8cc320b1bff3914a33ecb9cf4a8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:31:07.982504', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ad19ed3e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.184549049, 'message_signature': '7a8daa934729653b8ebaa7355700cb7691728b6a0029521da6938601ace1b2fb'}]}, 'timestamp': '2025-12-06 10:31:07.983410', '_unique_id': '04b41d4edd9143a6b232cce47674ab0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.984 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.985 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.985 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.986 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '800f285e-8c47-474e-ad42-9c446b5795c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:31:07.985658', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ad1a5792-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.184549049, 'message_signature': '9fb376f7d7c6514497f7a8b16642e553e9aa7c103d6a3906d3d4f848093e9e70'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:31:07.985658', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ad1a69a8-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.184549049, 'message_signature': '360ec3341ee197bca6dc928a7e5173e4087c429fd89b2c1a391c024735e2a949'}]}, 'timestamp': '2025-12-06 10:31:07.986576', '_unique_id': '90661cd375d043e78bb137a28739254b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.987 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.988 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.988 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.989 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74061940-5d7a-46f4-a732-f1cd1a24b58b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:31:07.988797', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ad1ad122-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.184549049, 'message_signature': '2e5028532a5ea915b0885b6e17411a5b4e65e9b67c719b04e63e15b5f176db35'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:31:07.988797', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ad1ae0f4-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.184549049, 'message_signature': 'e0f49567bc45426630b3b91b5ccfac970de1511017f4c262871d548d5d86c15f'}]}, 'timestamp': '2025-12-06 10:31:07.989660', '_unique_id': 'd3cc75a9dad8444789c0f6fe28850d52'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.991 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.991 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.992 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '913ec8f8-d78d-4b42-9d4f-5859a2be69bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:31:07.992062', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'ad1b5084-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.175849412, 'message_signature': '3a4b90f6df8b649ea6f51f121badd7ca8660e5e7b1c7ca3c146ab46bd55b55a1'}]}, 'timestamp': '2025-12-06 10:31:07.992543', '_unique_id': '342d8da976354a3f9ab60fe265bd70ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.994 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.994 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.994 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8906853-560d-405a-b492-ea5c8cbee71d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:31:07.994141', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ad1b9ec2-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.184549049, 'message_signature': 'e33f624d0764844c1081e0453b363408b8f8845708d008281faf306ca18b2b40'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:31:07.994141', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ad1ba962-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.184549049, 'message_signature': '2c8813068f80d363ae88c3b1724e02ba9c08dfd214afa8ee91dff01cb2bd0839'}]}, 'timestamp': '2025-12-06 10:31:07.994692', '_unique_id': '87a989f534f94596bdee918bb10fa6cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.995 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.996 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.996 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.996 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d721893-7405-48c5-8f3e-9bc5f85d52cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:31:07.996132', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ad1bec7e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.184549049, 'message_signature': '2b6472bbab95cf07745e2628f724758a96e65abdec8a81607ede934f08bedb20'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:31:07.996132', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ad1bf7a0-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.184549049, 'message_signature': '5b359e0a731d09a126d82fc5edfdab8076fababb920bb7f3dfa33e7d4324c0cd'}]}, 'timestamp': '2025-12-06 10:31:07.996694', '_unique_id': 'cb6044a385e14408ac61c4d02f1ac1f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.997 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.998 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.998 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.998 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c19f284-44e6-43ed-912a-ed0c6dac6b19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:31:07.998173', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ad1c3c60-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.166449224, 'message_signature': 'fdb3f0e55fffad2ef706b97334e4b263e91b0b13799adf32cfe1d5c8c049febf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:31:07.998173', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ad1c46ec-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.166449224, 'message_signature': '599781b5c2daf5ad1b64665fe7479bced5e286260736c54c757ceb042c732f3e'}]}, 'timestamp': '2025-12-06 10:31:07.998739', '_unique_id': '0e46dcb292514842a95662aa4ee0b7b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:07.999 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.000 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.000 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.000 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '139f1451-f2c3-4024-8e60-ed2d8d93dc6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:31:08.000448', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'ad1c9836-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13486.219738069, 'message_signature': '604c18ee5858edf9bf6324a73166860d0faf061c24bb0e36f3607aea3f0734d1'}]}, 'timestamp': '2025-12-06 10:31:08.000841', '_unique_id': 'ec2ff55762174195a96ba994f57a89b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:31:08 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:31:08.001 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:31:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:08 np0005548789.localdomain ceph-mon[298582]: pgmap v775: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:10 np0005548789.localdomain ceph-mon[298582]: pgmap v776: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:10 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:31:10 np0005548789.localdomain podman[338920]: 2025-12-06 10:31:10.918217479 +0000 UTC m=+0.079019485 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 06 10:31:10 np0005548789.localdomain podman[338920]: 2025-12-06 10:31:10.981925264 +0000 UTC m=+0.142727230 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 06 10:31:10 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:31:12 np0005548789.localdomain ceph-mon[298582]: pgmap v777: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:12.662 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:12.664 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:12.664 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:31:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:12.664 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:12.695 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:12 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:12.696 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:14 np0005548789.localdomain ceph-mon[298582]: pgmap v778: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 06 10:31:15 np0005548789.localdomain ceph-mon[298582]: mgrmap e55: np0005548790.kvkfyr(active, since 19m), standbys: np0005548785.vhqlsq, np0005548788.yvwbqq, np0005548789.mzhmje
Dec 06 10:31:15 np0005548789.localdomain sudo[338946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:31:15 np0005548789.localdomain sudo[338946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:31:15 np0005548789.localdomain sudo[338946]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:15 np0005548789.localdomain sudo[338964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:31:15 np0005548789.localdomain sudo[338964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:31:16 np0005548789.localdomain ceph-mon[298582]: pgmap v779: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 06 10:31:16 np0005548789.localdomain sudo[338964]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:16 np0005548789.localdomain sudo[339013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:31:16 np0005548789.localdomain sudo[339013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:31:16 np0005548789.localdomain sudo[339013]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:31:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:31:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:31:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:31:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:31:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:31:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:31:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:31:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:31:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:31:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:31:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:31:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:31:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:31:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:31:17 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:31:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:17.697 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:17.735 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:17.736 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5039 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:31:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:17.736 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:17.737 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:17 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:17.738 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:31:18 np0005548789.localdomain ceph-mon[298582]: pgmap v780: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 06 10:31:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:20 np0005548789.localdomain ceph-mon[298582]: pgmap v781: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 06 10:31:22 np0005548789.localdomain ceph-mon[298582]: pgmap v782: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 06 10:31:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:22.739 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:22.741 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:22.741 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:31:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:22.741 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:22.770 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:22 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:22.771 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:31:23 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:31:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:31:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:31:23 np0005548789.localdomain systemd[1]: tmp-crun.zR8VFh.mount: Deactivated successfully.
Dec 06 10:31:23 np0005548789.localdomain podman[339031]: 2025-12-06 10:31:23.925546898 +0000 UTC m=+0.080017805 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec 06 10:31:24 np0005548789.localdomain podman[339031]: 2025-12-06 10:31:24.011158956 +0000 UTC m=+0.165629843 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true)
Dec 06 10:31:24 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:31:24 np0005548789.localdomain podman[339032]: 2025-12-06 10:31:23.980135224 +0000 UTC m=+0.134306462 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:31:24 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:31:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:31:24 np0005548789.localdomain ceph-mon[298582]: pgmap v783: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Dec 06 10:31:24 np0005548789.localdomain podman[339032]: 2025-12-06 10:31:24.114677392 +0000 UTC m=+0.268848680 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:31:24 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:31:24 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19269 "" "Go-http-client/1.1"
Dec 06 10:31:24 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:31:26 np0005548789.localdomain ceph-mon[298582]: pgmap v784: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Dec 06 10:31:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:27.772 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:27.774 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:27.774 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:31:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:27.775 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:27.823 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:27 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:27.824 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:28 np0005548789.localdomain ceph-mon[298582]: pgmap v785: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Dec 06 10:31:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:31:29 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:31:29 np0005548789.localdomain podman[339072]: 2025-12-06 10:31:29.917624036 +0000 UTC m=+0.076910960 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:31:29 np0005548789.localdomain podman[339072]: 2025-12-06 10:31:29.927284953 +0000 UTC m=+0.086571877 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:31:29 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:31:29 np0005548789.localdomain ovn_controller[154851]: 2025-12-06T10:31:29Z|00537|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Dec 06 10:31:29 np0005548789.localdomain podman[339071]: 2025-12-06 10:31:29.968485847 +0000 UTC m=+0.129906047 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm)
Dec 06 10:31:29 np0005548789.localdomain podman[339071]: 2025-12-06 10:31:29.981237108 +0000 UTC m=+0.142657338 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, release=1755695350, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 06 10:31:29 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:31:30 np0005548789.localdomain ceph-mon[298582]: pgmap v786: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Dec 06 10:31:32 np0005548789.localdomain ceph-mon[298582]: pgmap v787: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:32.825 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:32.827 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:32.827 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:31:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:32.827 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:32.862 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:32 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:32.863 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:33 np0005548789.localdomain sshd[339111]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:33 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:31:33 np0005548789.localdomain sshd[339111]: Accepted publickey for zuul from 38.102.83.114 port 43768 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:33 np0005548789.localdomain systemd-logind[766]: New session 79 of user zuul.
Dec 06 10:31:33 np0005548789.localdomain systemd[1]: Started Session 79 of User zuul.
Dec 06 10:31:33 np0005548789.localdomain sshd[339111]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:33 np0005548789.localdomain podman[339113]: 2025-12-06 10:31:33.607956649 +0000 UTC m=+0.092836239 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 06 10:31:33 np0005548789.localdomain podman[339113]: 2025-12-06 10:31:33.619796763 +0000 UTC m=+0.104676393 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:31:33 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:31:33 np0005548789.localdomain sudo[339151]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-draqflreqpgzjgowrxqripwezkdnghwm ; /usr/bin/python3
Dec 06 10:31:33 np0005548789.localdomain sudo[339151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:33 np0005548789.localdomain python3[339153]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister
                                                           _uses_shell=True zuul_log_id=fa163ef9-e89a-7de2-0762-00000000000c-1-overcloudnovacompute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 10:31:34 np0005548789.localdomain ceph-mon[298582]: pgmap v788: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:34 np0005548789.localdomain sudo[339151]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:34 np0005548789.localdomain sshd[339156]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:34 np0005548789.localdomain sshd[339158]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:36 np0005548789.localdomain sshd[339158]: Received disconnect from 118.219.234.233 port 46656:11: Bye Bye [preauth]
Dec 06 10:31:36 np0005548789.localdomain sshd[339158]: Disconnected from authenticating user root 118.219.234.233 port 46656 [preauth]
Dec 06 10:31:36 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:31:36 np0005548789.localdomain podman[339160]: 2025-12-06 10:31:36.284856282 +0000 UTC m=+0.087471915 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:31:36 np0005548789.localdomain podman[339160]: 2025-12-06 10:31:36.298309085 +0000 UTC m=+0.100924718 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:31:36 np0005548789.localdomain sshd[339156]: Received disconnect from 14.194.101.210 port 45080:11: Bye Bye [preauth]
Dec 06 10:31:36 np0005548789.localdomain sshd[339156]: Disconnected from authenticating user root 14.194.101.210 port 45080 [preauth]
Dec 06 10:31:36 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:31:36 np0005548789.localdomain ceph-mon[298582]: pgmap v789: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.581944) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017097581988, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1135, "num_deletes": 252, "total_data_size": 1870586, "memory_usage": 2068016, "flush_reason": "Manual Compaction"}
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017097591866, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 1214849, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41093, "largest_seqno": 42223, "table_properties": {"data_size": 1210299, "index_size": 2149, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10698, "raw_average_key_size": 20, "raw_value_size": 1200818, "raw_average_value_size": 2291, "num_data_blocks": 96, "num_entries": 524, "num_filter_entries": 524, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765017022, "oldest_key_time": 1765017022, "file_creation_time": 1765017097, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 9974 microseconds, and 4412 cpu microseconds.
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.591917) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 1214849 bytes OK
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.591946) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.594161) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.594185) EVENT_LOG_v1 {"time_micros": 1765017097594178, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.594207) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 1865029, prev total WAL file size 1865029, number of live WAL files 2.
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.595118) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133333033' seq:72057594037927935, type:22 .. '7061786F73003133353535' seq:0, type:0; will stop at (end)
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(1186KB)], [72(18MB)]
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017097595176, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 20844536, "oldest_snapshot_seqno": -1}
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 14535 keys, 19226756 bytes, temperature: kUnknown
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017097697591, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 19226756, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19145609, "index_size": 43734, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36357, "raw_key_size": 392178, "raw_average_key_size": 26, "raw_value_size": 18900741, "raw_average_value_size": 1300, "num_data_blocks": 1601, "num_entries": 14535, "num_filter_entries": 14535, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015623, "oldest_key_time": 0, "file_creation_time": 1765017097, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1ec2a3a6-c7d0-4f1e-9923-5a3b782725ae", "db_session_id": "JMBO5KX1IJCJ8FWC64EX", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.698021) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 19226756 bytes
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.699748) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 203.3 rd, 187.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 18.7 +0.0 blob) out(18.3 +0.0 blob), read-write-amplify(33.0) write-amplify(15.8) OK, records in: 15063, records dropped: 528 output_compression: NoCompression
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.699814) EVENT_LOG_v1 {"time_micros": 1765017097699797, "job": 44, "event": "compaction_finished", "compaction_time_micros": 102507, "compaction_time_cpu_micros": 51879, "output_level": 6, "num_output_files": 1, "total_output_size": 19226756, "num_input_records": 15063, "num_output_records": 14535, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017097700197, "job": 44, "event": "table_file_deletion", "file_number": 74}
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005548789/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765017097702951, "job": 44, "event": "table_file_deletion", "file_number": 72}
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.595004) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.702986) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.702993) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.702996) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.702999) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:31:37 np0005548789.localdomain ceph-mon[298582]: rocksdb: (Original Log Time 2025/12/06-10:31:37.703002) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:31:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:37.864 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:37.901 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:37.902 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:31:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:37.902 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:37.903 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:37 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:37.903 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:38 np0005548789.localdomain ceph-mon[298582]: pgmap v790: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:31:39 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1455173782' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:31:39 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:31:39 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1455173782' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:31:39 np0005548789.localdomain sshd[339111]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:39 np0005548789.localdomain systemd[1]: session-79.scope: Deactivated successfully.
Dec 06 10:31:39 np0005548789.localdomain systemd-logind[766]: Session 79 logged out. Waiting for processes to exit.
Dec 06 10:31:39 np0005548789.localdomain systemd-logind[766]: Removed session 79.
Dec 06 10:31:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1455173782' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:31:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/1455173782' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:31:40 np0005548789.localdomain ceph-mon[298582]: pgmap v791: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:41 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:31:41 np0005548789.localdomain systemd[1]: tmp-crun.6YDqm6.mount: Deactivated successfully.
Dec 06 10:31:41 np0005548789.localdomain podman[339184]: 2025-12-06 10:31:41.93118899 +0000 UTC m=+0.096171321 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:31:41 np0005548789.localdomain podman[339184]: 2025-12-06 10:31:41.965224915 +0000 UTC m=+0.130207306 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:31:41 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:31:42 np0005548789.localdomain ceph-mon[298582]: pgmap v792: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:42.904 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:42.906 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:42.906 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:31:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:42.906 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:42.925 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:42 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:42.926 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:44 np0005548789.localdomain ceph-mon[298582]: pgmap v793: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:44 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:44.216 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:45.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:45.222 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:31:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:45.223 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:31:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:45.223 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:31:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:45.223 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:31:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:45.224 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:31:45 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:31:45 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1627710630' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:45.644 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:31:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:45.726 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:31:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:45.726 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:31:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:45.922 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:31:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:45.924 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11097MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:31:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:45.924 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:31:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:45.924 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:31:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:46.028 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:31:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:46.029 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:31:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:46.029 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:31:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:46.087 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:31:46 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:31:46 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3484471563' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:46.551 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:31:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:46.555 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:31:46 np0005548789.localdomain ceph-mon[298582]: pgmap v794: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:46 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1627710630' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:46 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3484471563' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:46.574 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:31:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:46.575 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:31:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:46.575 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.651s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:31:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:31:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:31:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:31:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:31:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:31:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:31:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:31:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:31:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:31:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:31:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:31:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:31:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:31:47.347 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:31:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:31:47.348 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:31:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:31:47.348 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:31:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:47.927 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:47.930 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:47.930 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:31:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:47.930 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:47.943 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:47.944 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:48.575 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:48.576 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:31:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:48.576 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:31:48 np0005548789.localdomain ceph-mon[298582]: pgmap v795: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:48.885 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:31:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:48.886 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:31:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:48.886 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:31:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:48.887 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:31:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:49.326 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:31:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:49.343 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:31:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:49.344 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:31:50 np0005548789.localdomain ceph-mon[298582]: pgmap v796: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:51.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:51 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:51.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:52 np0005548789.localdomain sshd[339253]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:52 np0005548789.localdomain sshd[339253]: Accepted publickey for zuul from 38.102.83.114 port 48858 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:52 np0005548789.localdomain systemd-logind[766]: New session 80 of user zuul.
Dec 06 10:31:52 np0005548789.localdomain systemd[1]: Started Session 80 of User zuul.
Dec 06 10:31:52 np0005548789.localdomain sshd[339253]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:52 np0005548789.localdomain ceph-mon[298582]: pgmap v797: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:52 np0005548789.localdomain sudo[339257]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /var/log
Dec 06 10:31:52 np0005548789.localdomain sudo[339257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:52.945 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:52.947 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:52.947 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:31:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:52.947 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:52.992 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:52.992 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:31:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:31:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:31:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:31:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:31:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19278 "" "Go-http-client/1.1"
Dec 06 10:31:54 np0005548789.localdomain sudo[339257]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:54 np0005548789.localdomain sshd[339256]: Received disconnect from 38.102.83.114 port 48858:11: disconnected by user
Dec 06 10:31:54 np0005548789.localdomain sshd[339256]: Disconnected from user zuul 38.102.83.114 port 48858
Dec 06 10:31:54 np0005548789.localdomain sshd[339253]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:54 np0005548789.localdomain ceph-mon[298582]: pgmap v798: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:54 np0005548789.localdomain systemd-logind[766]: Session 80 logged out. Waiting for processes to exit.
Dec 06 10:31:54 np0005548789.localdomain systemd[1]: session-80.scope: Deactivated successfully.
Dec 06 10:31:54 np0005548789.localdomain systemd[1]: session-80.scope: Consumed 1.025s CPU time.
Dec 06 10:31:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:31:54 np0005548789.localdomain systemd-logind[766]: Removed session 80.
Dec 06 10:31:54 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:31:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:54.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:54.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:54.181 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:31:54 np0005548789.localdomain podman[339281]: 2025-12-06 10:31:54.282983413 +0000 UTC m=+0.127249236 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:31:54 np0005548789.localdomain systemd[1]: tmp-crun.ZVj9TE.mount: Deactivated successfully.
Dec 06 10:31:54 np0005548789.localdomain podman[339275]: 2025-12-06 10:31:54.444600921 +0000 UTC m=+0.299574233 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 06 10:31:54 np0005548789.localdomain podman[339281]: 2025-12-06 10:31:54.510264756 +0000 UTC m=+0.354530529 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:31:54 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:31:54 np0005548789.localdomain podman[339275]: 2025-12-06 10:31:54.665124987 +0000 UTC m=+0.520098309 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 10:31:54 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:31:54 np0005548789.localdomain sshd[339317]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:54 np0005548789.localdomain sshd[339317]: Accepted publickey for zuul from 38.102.83.114 port 48874 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:54 np0005548789.localdomain systemd-logind[766]: New session 81 of user zuul.
Dec 06 10:31:54 np0005548789.localdomain systemd[1]: Started Session 81 of User zuul.
Dec 06 10:31:54 np0005548789.localdomain sshd[339317]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:54 np0005548789.localdomain sudo[339321]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/networks
Dec 06 10:31:54 np0005548789.localdomain sudo[339321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:55 np0005548789.localdomain sudo[339321]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:55 np0005548789.localdomain sshd[339320]: Received disconnect from 38.102.83.114 port 48874:11: disconnected by user
Dec 06 10:31:55 np0005548789.localdomain sshd[339320]: Disconnected from user zuul 38.102.83.114 port 48874
Dec 06 10:31:55 np0005548789.localdomain sshd[339317]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:55 np0005548789.localdomain systemd[1]: session-81.scope: Deactivated successfully.
Dec 06 10:31:55 np0005548789.localdomain systemd-logind[766]: Session 81 logged out. Waiting for processes to exit.
Dec 06 10:31:55 np0005548789.localdomain systemd-logind[766]: Removed session 81.
Dec 06 10:31:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:55.184 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:55 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:55.185 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:31:55 np0005548789.localdomain sshd[339339]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:55 np0005548789.localdomain sshd[339339]: Accepted publickey for zuul from 38.102.83.114 port 48884 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:55 np0005548789.localdomain systemd-logind[766]: New session 82 of user zuul.
Dec 06 10:31:55 np0005548789.localdomain systemd[1]: Started Session 82 of User zuul.
Dec 06 10:31:55 np0005548789.localdomain sshd[339339]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:55 np0005548789.localdomain sudo[339343]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/containers.conf
Dec 06 10:31:55 np0005548789.localdomain sudo[339343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:55 np0005548789.localdomain sudo[339343]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:55 np0005548789.localdomain sshd[339342]: Received disconnect from 38.102.83.114 port 48884:11: disconnected by user
Dec 06 10:31:55 np0005548789.localdomain sshd[339342]: Disconnected from user zuul 38.102.83.114 port 48884
Dec 06 10:31:55 np0005548789.localdomain sshd[339339]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:55 np0005548789.localdomain systemd[1]: session-82.scope: Deactivated successfully.
Dec 06 10:31:55 np0005548789.localdomain systemd-logind[766]: Session 82 logged out. Waiting for processes to exit.
Dec 06 10:31:55 np0005548789.localdomain systemd-logind[766]: Removed session 82.
Dec 06 10:31:56 np0005548789.localdomain sshd[339361]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:56 np0005548789.localdomain sshd[339361]: Accepted publickey for zuul from 38.102.83.114 port 39878 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:56 np0005548789.localdomain systemd-logind[766]: New session 83 of user zuul.
Dec 06 10:31:56 np0005548789.localdomain systemd[1]: Started Session 83 of User zuul.
Dec 06 10:31:56 np0005548789.localdomain sshd[339361]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:56 np0005548789.localdomain sudo[339365]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ceph
Dec 06 10:31:56 np0005548789.localdomain sudo[339365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:56 np0005548789.localdomain sudo[339365]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:56 np0005548789.localdomain sshd[339364]: Received disconnect from 38.102.83.114 port 39878:11: disconnected by user
Dec 06 10:31:56 np0005548789.localdomain sshd[339364]: Disconnected from user zuul 38.102.83.114 port 39878
Dec 06 10:31:56 np0005548789.localdomain sshd[339361]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:56 np0005548789.localdomain systemd[1]: session-83.scope: Deactivated successfully.
Dec 06 10:31:56 np0005548789.localdomain systemd-logind[766]: Session 83 logged out. Waiting for processes to exit.
Dec 06 10:31:56 np0005548789.localdomain systemd-logind[766]: Removed session 83.
Dec 06 10:31:56 np0005548789.localdomain ceph-mon[298582]: pgmap v799: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:56 np0005548789.localdomain sshd[339383]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:56 np0005548789.localdomain sshd[339383]: Accepted publickey for zuul from 38.102.83.114 port 39886 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:56 np0005548789.localdomain systemd-logind[766]: New session 84 of user zuul.
Dec 06 10:31:56 np0005548789.localdomain systemd[1]: Started Session 84 of User zuul.
Dec 06 10:31:56 np0005548789.localdomain sshd[339383]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:56 np0005548789.localdomain sudo[339387]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ci
Dec 06 10:31:56 np0005548789.localdomain sudo[339387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:56 np0005548789.localdomain sudo[339387]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:56 np0005548789.localdomain sshd[339386]: Received disconnect from 38.102.83.114 port 39886:11: disconnected by user
Dec 06 10:31:56 np0005548789.localdomain sshd[339386]: Disconnected from user zuul 38.102.83.114 port 39886
Dec 06 10:31:56 np0005548789.localdomain sshd[339383]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:56 np0005548789.localdomain systemd[1]: session-84.scope: Deactivated successfully.
Dec 06 10:31:56 np0005548789.localdomain systemd-logind[766]: Session 84 logged out. Waiting for processes to exit.
Dec 06 10:31:56 np0005548789.localdomain systemd-logind[766]: Removed session 84.
Dec 06 10:31:57 np0005548789.localdomain sshd[339405]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:57 np0005548789.localdomain sshd[339405]: Accepted publickey for zuul from 38.102.83.114 port 39900 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:57 np0005548789.localdomain systemd-logind[766]: New session 85 of user zuul.
Dec 06 10:31:57 np0005548789.localdomain systemd[1]: Started Session 85 of User zuul.
Dec 06 10:31:57 np0005548789.localdomain sshd[339405]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3668081092' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1158284102' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:31:57 np0005548789.localdomain sudo[339409]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.conf
Dec 06 10:31:57 np0005548789.localdomain sudo[339409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:57 np0005548789.localdomain sudo[339409]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:57 np0005548789.localdomain sshd[339408]: Received disconnect from 38.102.83.114 port 39900:11: disconnected by user
Dec 06 10:31:57 np0005548789.localdomain sshd[339408]: Disconnected from user zuul 38.102.83.114 port 39900
Dec 06 10:31:57 np0005548789.localdomain sshd[339405]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:57 np0005548789.localdomain systemd[1]: session-85.scope: Deactivated successfully.
Dec 06 10:31:57 np0005548789.localdomain systemd-logind[766]: Session 85 logged out. Waiting for processes to exit.
Dec 06 10:31:57 np0005548789.localdomain systemd-logind[766]: Removed session 85.
Dec 06 10:31:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:57.994 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:57.997 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:31:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:57.997 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:31:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:57.998 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:58.030 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:31:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:31:58.031 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:31:58 np0005548789.localdomain sshd[339427]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:58 np0005548789.localdomain sshd[339427]: Accepted publickey for zuul from 38.102.83.114 port 39902 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:58 np0005548789.localdomain systemd-logind[766]: New session 86 of user zuul.
Dec 06 10:31:58 np0005548789.localdomain systemd[1]: Started Session 86 of User zuul.
Dec 06 10:31:58 np0005548789.localdomain sshd[339427]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:58 np0005548789.localdomain sudo[339431]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.repos.d
Dec 06 10:31:58 np0005548789.localdomain sudo[339431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:58 np0005548789.localdomain sudo[339431]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:58 np0005548789.localdomain sshd[339430]: Received disconnect from 38.102.83.114 port 39902:11: disconnected by user
Dec 06 10:31:58 np0005548789.localdomain sshd[339430]: Disconnected from user zuul 38.102.83.114 port 39902
Dec 06 10:31:58 np0005548789.localdomain sshd[339427]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:58 np0005548789.localdomain systemd[1]: session-86.scope: Deactivated successfully.
Dec 06 10:31:58 np0005548789.localdomain systemd-logind[766]: Session 86 logged out. Waiting for processes to exit.
Dec 06 10:31:58 np0005548789.localdomain systemd-logind[766]: Removed session 86.
Dec 06 10:31:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:31:58 np0005548789.localdomain ceph-mon[298582]: pgmap v800: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:31:58 np0005548789.localdomain sshd[339449]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:58 np0005548789.localdomain sshd[339449]: Accepted publickey for zuul from 38.102.83.114 port 39918 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:58 np0005548789.localdomain systemd-logind[766]: New session 87 of user zuul.
Dec 06 10:31:58 np0005548789.localdomain systemd[1]: Started Session 87 of User zuul.
Dec 06 10:31:58 np0005548789.localdomain sshd[339449]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:58 np0005548789.localdomain sudo[339453]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/os-net-config
Dec 06 10:31:58 np0005548789.localdomain sudo[339453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:58 np0005548789.localdomain sudo[339453]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:58 np0005548789.localdomain sshd[339452]: Received disconnect from 38.102.83.114 port 39918:11: disconnected by user
Dec 06 10:31:58 np0005548789.localdomain sshd[339452]: Disconnected from user zuul 38.102.83.114 port 39918
Dec 06 10:31:58 np0005548789.localdomain sshd[339449]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:58 np0005548789.localdomain systemd[1]: session-87.scope: Deactivated successfully.
Dec 06 10:31:59 np0005548789.localdomain systemd-logind[766]: Session 87 logged out. Waiting for processes to exit.
Dec 06 10:31:59 np0005548789.localdomain systemd-logind[766]: Removed session 87.
Dec 06 10:31:59 np0005548789.localdomain sshd[339471]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:31:59 np0005548789.localdomain sshd[339471]: Accepted publickey for zuul from 38.102.83.114 port 39932 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:31:59 np0005548789.localdomain systemd-logind[766]: New session 88 of user zuul.
Dec 06 10:31:59 np0005548789.localdomain systemd[1]: Started Session 88 of User zuul.
Dec 06 10:31:59 np0005548789.localdomain sshd[339471]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:31:59 np0005548789.localdomain sudo[339475]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /home/zuul/ansible_hostname
Dec 06 10:31:59 np0005548789.localdomain sudo[339475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:31:59 np0005548789.localdomain sudo[339475]: pam_unix(sudo:session): session closed for user root
Dec 06 10:31:59 np0005548789.localdomain sshd[339474]: Received disconnect from 38.102.83.114 port 39932:11: disconnected by user
Dec 06 10:31:59 np0005548789.localdomain sshd[339474]: Disconnected from user zuul 38.102.83.114 port 39932
Dec 06 10:31:59 np0005548789.localdomain sshd[339471]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:31:59 np0005548789.localdomain systemd[1]: session-88.scope: Deactivated successfully.
Dec 06 10:31:59 np0005548789.localdomain systemd-logind[766]: Session 88 logged out. Waiting for processes to exit.
Dec 06 10:31:59 np0005548789.localdomain systemd-logind[766]: Removed session 88.
Dec 06 10:32:00 np0005548789.localdomain ceph-mon[298582]: pgmap v801: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:32:00 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:32:00 np0005548789.localdomain podman[339494]: 2025-12-06 10:32:00.918788888 +0000 UTC m=+0.078237001 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:32:00 np0005548789.localdomain podman[339494]: 2025-12-06 10:32:00.927247648 +0000 UTC m=+0.086695771 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 06 10:32:00 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:32:01 np0005548789.localdomain podman[339493]: 2025-12-06 10:32:01.013127583 +0000 UTC m=+0.172054550 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, release=1755695350, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Dec 06 10:32:01 np0005548789.localdomain podman[339493]: 2025-12-06 10:32:01.027999469 +0000 UTC m=+0.186926406 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Dec 06 10:32:01 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:32:01 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/70586077' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:02 np0005548789.localdomain ceph-mon[298582]: pgmap v802: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:02 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1583459833' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:03.033 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:03 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:32:03 np0005548789.localdomain systemd[1]: tmp-crun.DTD92Z.mount: Deactivated successfully.
Dec 06 10:32:03 np0005548789.localdomain podman[339533]: 2025-12-06 10:32:03.936248411 +0000 UTC m=+0.093033967 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.vendor=CentOS)
Dec 06 10:32:03 np0005548789.localdomain podman[339533]: 2025-12-06 10:32:03.944737111 +0000 UTC m=+0.101522667 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:32:03 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:32:04 np0005548789.localdomain ceph-mon[298582]: pgmap v803: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:06 np0005548789.localdomain ceph-mon[298582]: pgmap v804: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:06 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:32:06 np0005548789.localdomain podman[339551]: 2025-12-06 10:32:06.92394506 +0000 UTC m=+0.083764972 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:32:06 np0005548789.localdomain podman[339551]: 2025-12-06 10:32:06.933946537 +0000 UTC m=+0.093766449 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:32:06 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:32:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:08.036 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:08.057 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:08.057 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5023 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:32:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:08.057 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:08.059 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:08.060 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:08 np0005548789.localdomain ceph-mon[298582]: pgmap v805: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:10 np0005548789.localdomain ceph-mon[298582]: pgmap v806: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:12 np0005548789.localdomain ceph-mon[298582]: pgmap v807: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:12 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:32:12 np0005548789.localdomain podman[339574]: 2025-12-06 10:32:12.924220598 +0000 UTC m=+0.084698429 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 10:32:13 np0005548789.localdomain podman[339574]: 2025-12-06 10:32:13.019244654 +0000 UTC m=+0.179722525 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:32:13 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:32:13 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:13.061 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:13 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:14 np0005548789.localdomain ceph-mon[298582]: pgmap v808: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:14 np0005548789.localdomain sshd[339600]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:32:16 np0005548789.localdomain sudo[339602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:32:16 np0005548789.localdomain sudo[339602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:32:16 np0005548789.localdomain ceph-mon[298582]: pgmap v809: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:16 np0005548789.localdomain sudo[339602]: pam_unix(sudo:session): session closed for user root
Dec 06 10:32:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:32:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:32:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:32:16 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:32:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:32:16 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:32:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:32:16 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:32:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:32:16 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:32:16 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:32:16 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:32:16 np0005548789.localdomain sudo[339620]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Dec 06 10:32:16 np0005548789.localdomain sudo[339620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:32:17 np0005548789.localdomain sshd[339600]: Received disconnect from 101.47.49.180 port 42428:11: Bye Bye [preauth]
Dec 06 10:32:17 np0005548789.localdomain sshd[339600]: Disconnected from authenticating user root 101.47.49.180 port 42428 [preauth]
Dec 06 10:32:17 np0005548789.localdomain podman[339710]: 2025-12-06 10:32:17.467908846 +0000 UTC m=+0.094460760 container exec ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-type=git, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.buildah.version=1.41.4, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:32:17 np0005548789.localdomain podman[339710]: 2025-12-06 10:32:17.573212226 +0000 UTC m=+0.199764150 container exec_died ecefb99ff092b207f760d60e8fd97152c624df0e1016549024435ae641ee2e06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-1939e851-b10c-5c3b-9bb7-8e7f380233e8-crash-np0005548789, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.buildah.version=1.41.4, ceph=True, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=rhceph-container, vcs-type=git, version=7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 06 10:32:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:18.064 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:18.067 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:18.067 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:32:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:18.067 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:18.102 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:18 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:18.103 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:18 np0005548789.localdomain sudo[339620]: pam_unix(sudo:session): session closed for user root
Dec 06 10:32:18 np0005548789.localdomain sudo[339832]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:32:18 np0005548789.localdomain sudo[339832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:32:18 np0005548789.localdomain sudo[339832]: pam_unix(sudo:session): session closed for user root
Dec 06 10:32:18 np0005548789.localdomain sudo[339850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Dec 06 10:32:18 np0005548789.localdomain sudo[339850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:32:18 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:18 np0005548789.localdomain ceph-mon[298582]: pgmap v810: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:18 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:19 np0005548789.localdomain sudo[339850]: pam_unix(sudo:session): session closed for user root
Dec 06 10:32:19 np0005548789.localdomain sudo[339899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:32:19 np0005548789.localdomain sudo[339899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:32:19 np0005548789.localdomain sudo[339899]: pam_unix(sudo:session): session closed for user root
Dec 06 10:32:19 np0005548789.localdomain sudo[339917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/1939e851-b10c-5c3b-9bb7-8e7f380233e8/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 1939e851-b10c-5c3b-9bb7-8e7f380233e8 -- inventory --format=json-pretty --filter-for-batch
Dec 06 10:32:19 np0005548789.localdomain sudo[339917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:32:19 np0005548789.localdomain podman[339976]: 
Dec 06 10:32:19 np0005548789.localdomain podman[339976]: 2025-12-06 10:32:19.90046392 +0000 UTC m=+0.064120608 container create 7de12f4b8e5351ecb3e23cf67fe4ac0e5f55d06c61702aa10a535f3f43a85162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_stonebraker, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_BRANCH=main, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, release=1763362218, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True)
Dec 06 10:32:19 np0005548789.localdomain systemd[1]: Started libpod-conmon-7de12f4b8e5351ecb3e23cf67fe4ac0e5f55d06c61702aa10a535f3f43a85162.scope.
Dec 06 10:32:19 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:32:19 np0005548789.localdomain podman[339976]: 2025-12-06 10:32:19.964797274 +0000 UTC m=+0.128453952 container init 7de12f4b8e5351ecb3e23cf67fe4ac0e5f55d06c61702aa10a535f3f43a85162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_stonebraker, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, distribution-scope=public, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.41.4, ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, CEPH_POINT_RELEASE=, architecture=x86_64)
Dec 06 10:32:19 np0005548789.localdomain podman[339976]: 2025-12-06 10:32:19.972701767 +0000 UTC m=+0.136358485 container start 7de12f4b8e5351ecb3e23cf67fe4ac0e5f55d06c61702aa10a535f3f43a85162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_stonebraker, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, release=1763362218, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7)
Dec 06 10:32:19 np0005548789.localdomain podman[339976]: 2025-12-06 10:32:19.973060748 +0000 UTC m=+0.136717426 container attach 7de12f4b8e5351ecb3e23cf67fe4ac0e5f55d06c61702aa10a535f3f43a85162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_stonebraker, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, distribution-scope=public, com.redhat.component=rhceph-container)
Dec 06 10:32:19 np0005548789.localdomain laughing_stonebraker[339992]: 167 167
Dec 06 10:32:19 np0005548789.localdomain systemd[1]: libpod-7de12f4b8e5351ecb3e23cf67fe4ac0e5f55d06c61702aa10a535f3f43a85162.scope: Deactivated successfully.
Dec 06 10:32:19 np0005548789.localdomain podman[339976]: 2025-12-06 10:32:19.978836055 +0000 UTC m=+0.142492743 container died 7de12f4b8e5351ecb3e23cf67fe4ac0e5f55d06c61702aa10a535f3f43a85162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_stonebraker, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., RELEASE=main, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_BRANCH=main)
Dec 06 10:32:19 np0005548789.localdomain podman[339976]: 2025-12-06 10:32:19.881269271 +0000 UTC m=+0.044925969 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:32:20 np0005548789.localdomain podman[339997]: 2025-12-06 10:32:20.079029429 +0000 UTC m=+0.087011280 container remove 7de12f4b8e5351ecb3e23cf67fe4ac0e5f55d06c61702aa10a535f3f43a85162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_stonebraker, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, build-date=2025-11-26T19:44:28Z, RELEASE=main)
Dec 06 10:32:20 np0005548789.localdomain systemd[1]: libpod-conmon-7de12f4b8e5351ecb3e23cf67fe4ac0e5f55d06c61702aa10a535f3f43a85162.scope: Deactivated successfully.
Dec 06 10:32:20 np0005548789.localdomain podman[340019]: 
Dec 06 10:32:20 np0005548789.localdomain podman[340019]: 2025-12-06 10:32:20.305494368 +0000 UTC m=+0.078606463 container create 2e67d0adbc9ce78467f67e359357e3813aea1ac4b6774a4d393f5b635f7ec508 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_keller, name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, RELEASE=main, architecture=x86_64, com.redhat.component=rhceph-container, ceph=True)
Dec 06 10:32:20 np0005548789.localdomain systemd[1]: Started libpod-conmon-2e67d0adbc9ce78467f67e359357e3813aea1ac4b6774a4d393f5b635f7ec508.scope.
Dec 06 10:32:20 np0005548789.localdomain systemd[1]: Started libcrun container.
Dec 06 10:32:20 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5ec19e81d307a0477d38c55830f6955127450d3e8f2632ae4c747f6e89f5d2b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 10:32:20 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5ec19e81d307a0477d38c55830f6955127450d3e8f2632ae4c747f6e89f5d2b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:32:20 np0005548789.localdomain podman[340019]: 2025-12-06 10:32:20.272707211 +0000 UTC m=+0.045819346 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 06 10:32:20 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5ec19e81d307a0477d38c55830f6955127450d3e8f2632ae4c747f6e89f5d2b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:32:20 np0005548789.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5ec19e81d307a0477d38c55830f6955127450d3e8f2632ae4c747f6e89f5d2b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 10:32:20 np0005548789.localdomain podman[340019]: 2025-12-06 10:32:20.380263211 +0000 UTC m=+0.153375296 container init 2e67d0adbc9ce78467f67e359357e3813aea1ac4b6774a4d393f5b635f7ec508 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_keller, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, release=1763362218, build-date=2025-11-26T19:44:28Z, name=rhceph, ceph=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, architecture=x86_64, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph)
Dec 06 10:32:20 np0005548789.localdomain podman[340019]: 2025-12-06 10:32:20.387863424 +0000 UTC m=+0.160975519 container start 2e67d0adbc9ce78467f67e359357e3813aea1ac4b6774a4d393f5b635f7ec508 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_keller, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, release=1763362218, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc.)
Dec 06 10:32:20 np0005548789.localdomain podman[340019]: 2025-12-06 10:32:20.388182164 +0000 UTC m=+0.161294279 container attach 2e67d0adbc9ce78467f67e359357e3813aea1ac4b6774a4d393f5b635f7ec508 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_keller, com.redhat.component=rhceph-container, release=1763362218, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, CEPH_POINT_RELEASE=)
Dec 06 10:32:20 np0005548789.localdomain ceph-mon[298582]: pgmap v811: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:20 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-c943bee9cdab6729faccb8c09bd133ac2bcd8952e0ae93bde6028bd7291b0243-merged.mount: Deactivated successfully.
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]: [
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:     {
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:         "available": false,
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:         "ceph_device": false,
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:         "lsm_data": {},
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:         "lvs": [],
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:         "path": "/dev/sr0",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:         "rejected_reasons": [
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "Has a FileSystem",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "Insufficient space (<5GB)"
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:         ],
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:         "sys_api": {
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "actuators": null,
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "device_nodes": "sr0",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "human_readable_size": "482.00 KB",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "id_bus": "ata",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "model": "QEMU DVD-ROM",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "nr_requests": "2",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "partitions": {},
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "path": "/dev/sr0",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "removable": "1",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "rev": "2.5+",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "ro": "0",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "rotational": "1",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "sas_address": "",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "sas_device_handle": "",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "scheduler_mode": "mq-deadline",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "sectors": 0,
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "sectorsize": "2048",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "size": 493568.0,
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "support_discard": "0",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "type": "disk",
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:             "vendor": "QEMU"
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:         }
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]:     }
Dec 06 10:32:21 np0005548789.localdomain tender_keller[340034]: ]
Dec 06 10:32:21 np0005548789.localdomain systemd[1]: libpod-2e67d0adbc9ce78467f67e359357e3813aea1ac4b6774a4d393f5b635f7ec508.scope: Deactivated successfully.
Dec 06 10:32:21 np0005548789.localdomain systemd[1]: libpod-2e67d0adbc9ce78467f67e359357e3813aea1ac4b6774a4d393f5b635f7ec508.scope: Consumed 1.018s CPU time.
Dec 06 10:32:21 np0005548789.localdomain podman[340019]: 2025-12-06 10:32:21.420055134 +0000 UTC m=+1.193167269 container died 2e67d0adbc9ce78467f67e359357e3813aea1ac4b6774a4d393f5b635f7ec508 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_keller, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, RELEASE=main, io.buildah.version=1.41.4, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, architecture=x86_64, description=Red Hat Ceph Storage 7, release=1763362218, ceph=True, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 06 10:32:21 np0005548789.localdomain systemd[1]: tmp-crun.qffW6I.mount: Deactivated successfully.
Dec 06 10:32:21 np0005548789.localdomain systemd[1]: var-lib-containers-storage-overlay-e5ec19e81d307a0477d38c55830f6955127450d3e8f2632ae4c747f6e89f5d2b-merged.mount: Deactivated successfully.
Dec 06 10:32:21 np0005548789.localdomain podman[342139]: 2025-12-06 10:32:21.504862756 +0000 UTC m=+0.073192247 container remove 2e67d0adbc9ce78467f67e359357e3813aea1ac4b6774a4d393f5b635f7ec508 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_keller, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, release=1763362218, version=7, distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, ceph=True)
Dec 06 10:32:21 np0005548789.localdomain systemd[1]: libpod-conmon-2e67d0adbc9ce78467f67e359357e3813aea1ac4b6774a4d393f5b635f7ec508.scope: Deactivated successfully.
Dec 06 10:32:21 np0005548789.localdomain sudo[339917]: pam_unix(sudo:session): session closed for user root
Dec 06 10:32:22 np0005548789.localdomain sudo[342153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:32:22 np0005548789.localdomain sudo[342153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Dec 06 10:32:22 np0005548789.localdomain sudo[342153]: pam_unix(sudo:session): session closed for user root
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: pgmap v812: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548789.localdomain to 836.6M
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548789.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548790.localdomain to 836.6M
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548790.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: Adjusting osd_memory_target on np0005548788.localdomain to 836.6M
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: Unable to set osd_memory_target on np0005548788.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 172.18.0.108:0/2122066654' entity='mgr.np0005548790.kvkfyr' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 06 10:32:22 np0005548789.localdomain ceph-mon[298582]: from='mgr.27020 ' entity='mgr.np0005548790.kvkfyr' 
Dec 06 10:32:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:23.104 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:23.107 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:23.107 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:32:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:23.107 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:23.149 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:23 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:23.150 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:23 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:23 np0005548789.localdomain podman[241090]: time="2025-12-06T10:32:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:32:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:32:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:32:23 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:32:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19272 "" "Go-http-client/1.1"
Dec 06 10:32:24 np0005548789.localdomain ceph-mon[298582]: pgmap v813: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:32:24 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:32:24 np0005548789.localdomain podman[342172]: 2025-12-06 10:32:24.948102178 +0000 UTC m=+0.100084402 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 06 10:32:24 np0005548789.localdomain podman[342172]: 2025-12-06 10:32:24.98727188 +0000 UTC m=+0.139254114 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 06 10:32:25 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:32:25 np0005548789.localdomain podman[342171]: 2025-12-06 10:32:25.040162253 +0000 UTC m=+0.195955023 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 06 10:32:25 np0005548789.localdomain podman[342171]: 2025-12-06 10:32:25.07526401 +0000 UTC m=+0.231056760 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, tcib_managed=true)
Dec 06 10:32:25 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:32:26 np0005548789.localdomain ceph-mon[298582]: pgmap v814: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:28.151 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:28.180 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:28.180 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:32:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:28.181 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:28.182 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:28 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:28.183 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:28 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:28 np0005548789.localdomain ceph-mon[298582]: pgmap v815: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:30 np0005548789.localdomain ceph-mon[298582]: pgmap v816: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:30 np0005548789.localdomain sshd[342214]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:32:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:32:31 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:32:31 np0005548789.localdomain podman[342215]: 2025-12-06 10:32:31.912730434 +0000 UTC m=+0.070219355 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350)
Dec 06 10:32:31 np0005548789.localdomain podman[342215]: 2025-12-06 10:32:31.92661522 +0000 UTC m=+0.084104091 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git)
Dec 06 10:32:31 np0005548789.localdomain systemd[1]: tmp-crun.KBycUn.mount: Deactivated successfully.
Dec 06 10:32:31 np0005548789.localdomain podman[342216]: 2025-12-06 10:32:31.945915302 +0000 UTC m=+0.096623615 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:32:31 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:32:31 np0005548789.localdomain podman[342216]: 2025-12-06 10:32:31.957230429 +0000 UTC m=+0.107938802 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 06 10:32:31 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:32:32 np0005548789.localdomain ceph-mon[298582]: pgmap v817: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:33.183 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:33.186 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:33.186 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:32:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:33.186 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:33.217 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:33 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:33.218 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:33 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:34 np0005548789.localdomain ceph-mon[298582]: pgmap v818: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:34 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:32:34 np0005548789.localdomain podman[342255]: 2025-12-06 10:32:34.918268529 +0000 UTC m=+0.078106837 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0)
Dec 06 10:32:34 np0005548789.localdomain podman[342255]: 2025-12-06 10:32:34.933055483 +0000 UTC m=+0.092893761 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:32:34 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:32:36 np0005548789.localdomain ceph-mon[298582]: pgmap v819: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:37 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:32:37 np0005548789.localdomain systemd[1]: tmp-crun.e46oTA.mount: Deactivated successfully.
Dec 06 10:32:37 np0005548789.localdomain podman[342275]: 2025-12-06 10:32:37.922325497 +0000 UTC m=+0.082764809 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:32:37 np0005548789.localdomain podman[342275]: 2025-12-06 10:32:37.929299372 +0000 UTC m=+0.089738714 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:32:37 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:32:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:38.219 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:38.220 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:38.221 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:32:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:38.221 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:38.245 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:38 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:38.246 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:38 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:38 np0005548789.localdomain ceph-mon[298582]: pgmap v820: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3405830284' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 06 10:32:39 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.32:0/3405830284' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 06 10:32:40 np0005548789.localdomain ceph-mon[298582]: pgmap v821: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:41 np0005548789.localdomain sshd[342214]: error: kex_exchange_identification: read: Connection timed out
Dec 06 10:32:41 np0005548789.localdomain sshd[342214]: banner exchange: Connection from 123.160.164.187 port 47516: Connection timed out
Dec 06 10:32:42 np0005548789.localdomain ceph-mon[298582]: pgmap v822: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:43.247 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:43.249 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:43.249 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:32:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:43.250 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:43.283 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:43 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:43.283 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:43 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:43 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.
Dec 06 10:32:43 np0005548789.localdomain podman[342299]: 2025-12-06 10:32:43.91918826 +0000 UTC m=+0.080932704 container health_status 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 10:32:43 np0005548789.localdomain podman[342299]: 2025-12-06 10:32:43.986178496 +0000 UTC m=+0.147922970 container exec_died 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Dec 06 10:32:44 np0005548789.localdomain systemd[1]: 0ecfa85b1bac9a037a1ec468c5f8efd50bfc5b1966b6d96b971519ffca6a66f5.service: Deactivated successfully.
Dec 06 10:32:44 np0005548789.localdomain ceph-mon[298582]: pgmap v823: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:44 np0005548789.localdomain sshd[342324]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:32:45 np0005548789.localdomain sshd[342324]: Accepted publickey for zuul from 192.168.122.10 port 34882 ssh2: RSA SHA256:4DkDDFgR7XMiN5v7tZau5snVtHmG3ulto18u1RwA0us
Dec 06 10:32:45 np0005548789.localdomain systemd-logind[766]: New session 89 of user zuul.
Dec 06 10:32:45 np0005548789.localdomain systemd[1]: Started Session 89 of User zuul.
Dec 06 10:32:45 np0005548789.localdomain sshd[342324]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Dec 06 10:32:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:45.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:45.203 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:32:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:45.204 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:32:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:45.204 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:32:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:45.204 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Auditing locally available compute resources for np0005548789.localdomain (node: np0005548789.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:32:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:45.205 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:32:45 np0005548789.localdomain sudo[342328]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt
Dec 06 10:32:45 np0005548789.localdomain sudo[342328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Dec 06 10:32:45 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:32:45 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1680123915' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:45.671 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:32:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:45.729 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:32:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:45.729 282197 DEBUG nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 06 10:32:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:45.967 282197 WARNING nova.virt.libvirt.driver [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:32:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:45.968 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Hypervisor/Node resource view: name=np0005548789.localdomain free_ram=11061MB free_disk=41.83699035644531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:32:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:45.969 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:32:45 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:45.969 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:32:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:46.037 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Instance b7ed0a2e-9350-4933-9334-4e5e08d3e6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 06 10:32:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:46.038 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:32:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:46.039 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Final resource view: name=np0005548789.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:32:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:46.092 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:32:46 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:32:46 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2760621750' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:46.547 282197 DEBUG oslo_concurrency.processutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:32:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:46.555 282197 DEBUG nova.compute.provider_tree [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed in ProviderTree for provider: 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:32:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:46.581 282197 DEBUG nova.scheduler.client.report [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Inventory has not changed for provider 0d33e88e-6335-4a94-8f21-32ba5b8bb7ad based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:32:46 np0005548789.localdomain ceph-mon[298582]: pgmap v824: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:46 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1680123915' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:46 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2760621750' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:46.584 282197 DEBUG nova.compute.resource_tracker [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Compute_service record updated for np0005548789.localdomain:np0005548789.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:32:46 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:46.585 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:32:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:32:46 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 06 10:32:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:32:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:32:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:32:46 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 06 10:32:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:32:46 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 06 10:32:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:32:46 np0005548789.localdomain openstack_network_exporter[243110]: ERROR   10:32:46 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 06 10:32:46 np0005548789.localdomain openstack_network_exporter[243110]: 
Dec 06 10:32:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:32:47.348 160509 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:32:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:32:47.348 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:32:47 np0005548789.localdomain ovn_metadata_agent[160504]: 2025-12-06 10:32:47.350 160509 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:32:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:47.581 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:47.581 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:47.582 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:32:47 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:47.582 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:32:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:48.284 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:48.286 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:48.286 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:32:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:48.286 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:48.324 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:48.325 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:48.531 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquiring lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:32:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:48.531 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Acquired lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:32:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:48.532 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 06 10:32:48 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:48.532 282197 DEBUG nova.objects.instance [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7ed0a2e-9350-4933-9334-4e5e08d3e6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 06 10:32:48 np0005548789.localdomain ceph-mon[298582]: pgmap v825: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:48 np0005548789.localdomain ceph-mon[298582]: from='client.49656 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:48 np0005548789.localdomain ceph-mon[298582]: from='client.59071 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:48 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "status"} v 0)
Dec 06 10:32:48 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4047748327' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 06 10:32:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:49.111 282197 DEBUG nova.network.neutron [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updating instance_info_cache with network_info: [{"id": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "address": "fa:16:3e:64:77:f3", "network": {"id": "652b6bdc-40ce-45b7-8aa5-3bca79987993", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "3d603431c0bb4967bafc7a0aa6108bfe", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86fc0b7a-fb", "ovs_interfaceid": "86fc0b7a-fbc5-4d40-a0d4-e1a3d455af5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 06 10:32:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:49.127 282197 DEBUG oslo_concurrency.lockutils [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Releasing lock "refresh_cache-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:32:49 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:49.127 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] [instance: b7ed0a2e-9350-4933-9334-4e5e08d3e6aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 06 10:32:49 np0005548789.localdomain ceph-mon[298582]: from='client.69257 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:49 np0005548789.localdomain ceph-mon[298582]: from='client.49665 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:49 np0005548789.localdomain ceph-mon[298582]: from='client.59077 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:49 np0005548789.localdomain ceph-mon[298582]: from='client.69266 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:49 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1062379559' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 06 10:32:49 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/4047748327' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 06 10:32:49 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3542862083' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 06 10:32:50 np0005548789.localdomain ceph-mon[298582]: pgmap v826: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:51 np0005548789.localdomain ovs-vsctl[342622]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 06 10:32:52 np0005548789.localdomain virtqemud[203911]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 06 10:32:52 np0005548789.localdomain virtqemud[203911]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 06 10:32:52 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:52.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:52 np0005548789.localdomain virtqemud[203911]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 06 10:32:52 np0005548789.localdomain systemd[1]: efi.automount: Got automount request for /efi, triggered by 342775 (lsinitrd)
Dec 06 10:32:52 np0005548789.localdomain systemd[1]: Mounting EFI System Partition Automount...
Dec 06 10:32:52 np0005548789.localdomain systemd[1]: Mounted EFI System Partition Automount.
Dec 06 10:32:52 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq asok_command: cache status {prefix=cache status} (starting...)
Dec 06 10:32:52 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Can't run that command on an inactive MDS!
Dec 06 10:32:52 np0005548789.localdomain ceph-mon[298582]: pgmap v827: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:52 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq asok_command: client ls {prefix=client ls} (starting...)
Dec 06 10:32:52 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Can't run that command on an inactive MDS!
Dec 06 10:32:52 np0005548789.localdomain lvm[342865]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 10:32:52 np0005548789.localdomain lvm[342865]: VG ceph_vg0 finished
Dec 06 10:32:52 np0005548789.localdomain lvm[342869]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 06 10:32:52 np0005548789.localdomain lvm[342869]: VG ceph_vg1 finished
Dec 06 10:32:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:53.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:53.326 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:53.327 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:53.327 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:32:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:53.327 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:53.369 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:53 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:53.370 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:53 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq asok_command: damage ls {prefix=damage ls} (starting...)
Dec 06 10:32:53 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Can't run that command on an inactive MDS!
Dec 06 10:32:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:53 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq asok_command: dump loads {prefix=dump loads} (starting...)
Dec 06 10:32:53 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Can't run that command on an inactive MDS!
Dec 06 10:32:53 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec 06 10:32:53 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Can't run that command on an inactive MDS!
Dec 06 10:32:53 np0005548789.localdomain ceph-mon[298582]: from='client.49686 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:53 np0005548789.localdomain ceph-mon[298582]: from='client.59095 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:53 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec 06 10:32:53 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Can't run that command on an inactive MDS!
Dec 06 10:32:53 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "report"} v 0)
Dec 06 10:32:53 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/595936019' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 06 10:32:53 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec 06 10:32:53 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Can't run that command on an inactive MDS!
Dec 06 10:32:53 np0005548789.localdomain podman[241090]: time="2025-12-06T10:32:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 06 10:32:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:32:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156104 "" "Go-http-client/1.1"
Dec 06 10:32:53 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec 06 10:32:53 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Can't run that command on an inactive MDS!
Dec 06 10:32:53 np0005548789.localdomain podman[241090]: @ - - [06/Dec/2025:10:32:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19280 "" "Go-http-client/1.1"
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2116789574' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:54.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:54.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:54 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:54.182 282197 DEBUG nova.compute.manager [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:32:54 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec 06 10:32:54 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Can't run that command on an inactive MDS!
Dec 06 10:32:54 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec 06 10:32:54 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Can't run that command on an inactive MDS!
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "config log"} v 0)
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1477964806' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq asok_command: ops {prefix=ops} (starting...)
Dec 06 10:32:54 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Can't run that command on an inactive MDS!
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3501209297' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.49692 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.69284 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.69287 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: pgmap v828: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.69299 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2201466626' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/595936019' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/4106160759' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2581391178' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2116789574' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3723108171' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1477964806' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3383880708' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1949286175' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3501209297' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2031437587' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 06 10:32:54 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/26526715' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq asok_command: session ls {prefix=session ls} (starting...)
Dec 06 10:32:55 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq Can't run that command on an inactive MDS!
Dec 06 10:32:55 np0005548789.localdomain ceph-mds[287313]: mds.mds.np0005548789.vxwwsq asok_command: status {prefix=status} (starting...)
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/512599547' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: from='client.49716 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: from='client.59125 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: from='client.69323 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1390228284' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/4083339819' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2031437587' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3256193873' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/26526715' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1960715726' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3484329055' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/439920032' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3151208344' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/512599547' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2354599686' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.
Dec 06 10:32:55 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 06 10:32:55 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1988228710' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:55 np0005548789.localdomain podman[343287]: 2025-12-06 10:32:55.925235134 +0000 UTC m=+0.079901983 container health_status b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 06 10:32:56 np0005548789.localdomain podman[343287]: 2025-12-06 10:32:56.025356705 +0000 UTC m=+0.180023584 container exec_died b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 06 10:32:56 np0005548789.localdomain systemd[1]: b72b7f6308415cf6475230cb9b0cbf85f87f74051929730ede4e25c3d87f5941.service: Deactivated successfully.
Dec 06 10:32:56 np0005548789.localdomain podman[343286]: 2025-12-06 10:32:56.026856851 +0000 UTC m=+0.181121847 container health_status 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 06 10:32:56 np0005548789.localdomain podman[343286]: 2025-12-06 10:32:56.105821894 +0000 UTC m=+0.260086860 container exec_died 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:32:56 np0005548789.localdomain systemd[1]: 5a95a31a3d61d2f948a3383baa17e8fe9e0336cbbe8cd1193992f8830dfe5999.service: Deactivated successfully.
Dec 06 10:32:56 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:56.182 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3394250906' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2522593498' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4026296822' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.59170 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.49767 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: pgmap v829: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.59182 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.69380 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2396717601' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.49785 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1988228710' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/923202201' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3829669704' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3306622931' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3917801459' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3394250906' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1911255415' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/608376543' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2522593498' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1497560563' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2669572884' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 06 10:32:56 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/558313208' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:57 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:57.181 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:32:57 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 06 10:32:57 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2399987487' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:32:57 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec 06 10:32:57 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1461476266' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 06 10:32:57 np0005548789.localdomain ceph-mon[298582]: from='client.69395 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/4026296822' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 06 10:32:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/192676122' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 06 10:32:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3282262568' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 06 10:32:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1118821392' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:32:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2399987487' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:32:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2365653258' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:32:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1461476266' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 06 10:32:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1880444688' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 06 10:32:57 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1069869138' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1654648070' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 06 10:32:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:58.371 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:58.373 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:32:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:58.373 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:32:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:58.373 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:58.416 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:32:58 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:32:58.417 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:17.022740+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93413376 unmapped: 1441792 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:18.022928+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93413376 unmapped: 1441792 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:19.023069+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93413376 unmapped: 1441792 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:20.023244+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93413376 unmapped: 1441792 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:21.023384+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93413376 unmapped: 1441792 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:22.023550+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93413376 unmapped: 1441792 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:23.023710+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93413376 unmapped: 1441792 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:24.024361+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93413376 unmapped: 1441792 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:25.024504+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93413376 unmapped: 1441792 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:26.024670+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93413376 unmapped: 1441792 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:27.024834+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93413376 unmapped: 1441792 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:28.025034+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:29.025183+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:30.025477+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:31.025690+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:32.025829+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:33.025983+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:34.026221+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:35.026390+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:36.026568+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:37.026698+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:38.026882+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:39.027002+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:40.027155+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:41.027366+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:42.027496+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:43.027641+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:44.027852+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:45.028016+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:46.028210+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:47.028382+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:48.028558+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:49.028691+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:50.028950+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:51.029098+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:52.029260+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:53.029449+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:54.029572+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:55.029722+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:56.029878+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:57.030006+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:58.030218+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:59.030363+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:00.030527+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:01.030661+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:02.030862+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:03.031038+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:04.031235+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:05.031391+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:06.031533+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:07.031704+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:08.031876+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:09.032049+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:10.032257+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:11.032415+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:12.032580+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:13.032802+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:14.032945+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:15.033090+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:16.033292+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:17.033437+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:18.033620+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:19.033810+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:20.034042+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:21.034224+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:22.034367+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:23.034566+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:24.034772+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:25.034919+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:26.035861+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:27.036074+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:28.036245+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:29.036411+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:30.036636+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:31.036831+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:32.036973+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:33.037134+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:34.037344+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:35.037515+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:36.037704+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:37.037859+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9948000/0x0/0x1bfc00000, data 0x20ba68d/0x2146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 874032 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:38.038019+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:39.038139+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:40.038329+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:41.038474+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 93429760 unmapped: 1425408 heap: 94855168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148261000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 89.576988220s of 89.654701233s, submitted: 17
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 heartbeat osd_stat(store_statfs(0x1b9947000/0x0/0x1bfc00000, data 0x20ba69d/0x2147000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:42.038622+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 94609408 unmapped: 15982592 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 932783 data_alloc: 285212672 data_used: 11661312
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:43.038837+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Got map version 46
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 96 ms_handle_reset con 0x55e148261000 session 0x55e14b95a1e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 94896128 unmapped: 15695872 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:44.038959+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 94896128 unmapped: 15695872 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 96 heartbeat osd_stat(store_statfs(0x1b9141000/0x0/0x1bfc00000, data 0x28bcd0d/0x294c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:45.039090+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 96 handle_osd_map epochs [96,97], i have 96, src has [1,97]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 97 ms_handle_reset con 0x55e148da2000 session 0x55e14b353860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95019008 unmapped: 15572992 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:46.039167+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95019008 unmapped: 15572992 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:47.039335+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 973967 data_alloc: 285212672 data_used: 11685888
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:48.039533+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:49.039663+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:50.040029+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b8ccc000/0x0/0x1bfc00000, data 0x2d2f263/0x2dc1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:51.040173+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b8ccc000/0x0/0x1bfc00000, data 0x2d2f263/0x2dc1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:52.040335+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 973967 data_alloc: 285212672 data_used: 11685888
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:53.040463+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:54.040620+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b8ccc000/0x0/0x1bfc00000, data 0x2d2f263/0x2dc1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:55.040800+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:56.040965+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b8ccc000/0x0/0x1bfc00000, data 0x2d2f263/0x2dc1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:57.041119+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 973967 data_alloc: 285212672 data_used: 11685888
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:58.041270+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b8ccc000/0x0/0x1bfc00000, data 0x2d2f263/0x2dc1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:59.041432+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:00.041615+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:01.041780+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:02.041980+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b8ccc000/0x0/0x1bfc00000, data 0x2d2f263/0x2dc1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 973967 data_alloc: 285212672 data_used: 11685888
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:03.042096+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:04.042255+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:05.042442+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b8ccc000/0x0/0x1bfc00000, data 0x2d2f263/0x2dc1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:06.042585+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b8ccc000/0x0/0x1bfc00000, data 0x2d2f263/0x2dc1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 97 heartbeat osd_stat(store_statfs(0x1b8ccc000/0x0/0x1bfc00000, data 0x2d2f263/0x2dc1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:07.042827+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 973967 data_alloc: 285212672 data_used: 11685888
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:08.043026+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:09.043179+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 27.778034210s of 27.920854568s, submitted: 23
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95027200 unmapped: 15564800 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:10.043337+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 98 ms_handle_reset con 0x55e14a5df400 session 0x55e14a8e32c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95117312 unmapped: 15474688 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:11.043510+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95117312 unmapped: 15474688 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:12.043659+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8de000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95158272 unmapped: 15433728 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 98 heartbeat osd_stat(store_statfs(0x1b8cc5000/0x0/0x1bfc00000, data 0x2d31f74/0x2dc8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 981366 data_alloc: 285212672 data_used: 11698176
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:13.043832+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95297536 unmapped: 15294464 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 99 ms_handle_reset con 0x55e14a8de000 session 0x55e14b95a960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:14.044000+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95330304 unmapped: 15261696 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:15.044168+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95330304 unmapped: 15261696 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:16.044294+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95330304 unmapped: 15261696 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:17.044470+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95330304 unmapped: 15261696 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 982481 data_alloc: 285212672 data_used: 11710464
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 99 heartbeat osd_stat(store_statfs(0x1b8cc4000/0x0/0x1bfc00000, data 0x2d33d1d/0x2dc9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:18.044691+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95330304 unmapped: 15261696 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:19.044855+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95330304 unmapped: 15261696 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:20.045065+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 99 heartbeat osd_stat(store_statfs(0x1b8cc4000/0x0/0x1bfc00000, data 0x2d33d1d/0x2dc9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95330304 unmapped: 15261696 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8e0800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 11.066386223s of 11.211336136s, submitted: 40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:21.045225+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 99 handle_osd_map epochs [100,100], i have 99, src has [1,100]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 100 ms_handle_reset con 0x55e14a8e0800 session 0x55e149db6780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95444992 unmapped: 15147008 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:22.045372+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95444992 unmapped: 15147008 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 991859 data_alloc: 285212672 data_used: 11722752
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:23.045542+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95510528 unmapped: 15081472 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:24.045739+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95510528 unmapped: 15081472 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 101 heartbeat osd_stat(store_statfs(0x1b8cba000/0x0/0x1bfc00000, data 0x2d386b8/0x2dd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:25.046875+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95510528 unmapped: 15081472 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:26.047019+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95510528 unmapped: 15081472 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:27.047178+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95535104 unmapped: 15056896 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 991387 data_alloc: 285212672 data_used: 11726848
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:28.047341+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 102 ms_handle_reset con 0x55e14a8dec00 session 0x55e14b95b0e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95322112 unmapped: 15269888 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:29.047496+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95322112 unmapped: 15269888 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 102 heartbeat osd_stat(store_statfs(0x1b8cb6000/0x0/0x1bfc00000, data 0x2d3ac0c/0x2dd5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:30.047639+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95322112 unmapped: 15269888 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:31.047834+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95322112 unmapped: 15269888 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:32.047969+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95322112 unmapped: 15269888 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 993948 data_alloc: 285212672 data_used: 11735040
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:33.048165+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 102 heartbeat osd_stat(store_statfs(0x1b8cb6000/0x0/0x1bfc00000, data 0x2d3ac0c/0x2dd5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95322112 unmapped: 15269888 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:34.048301+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95322112 unmapped: 15269888 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:35.048470+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95322112 unmapped: 15269888 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 102 heartbeat osd_stat(store_statfs(0x1b8cb6000/0x0/0x1bfc00000, data 0x2d3ac0c/0x2dd5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:36.048618+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95322112 unmapped: 15269888 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:37.048792+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 16.529624939s of 16.683536530s, submitted: 53
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95354880 unmapped: 15237120 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 996758 data_alloc: 285212672 data_used: 11735040
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:38.048950+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95354880 unmapped: 15237120 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:39.049126+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95354880 unmapped: 15237120 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:40.049332+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95354880 unmapped: 15237120 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:41.049468+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 heartbeat osd_stat(store_statfs(0x1b8cb4000/0x0/0x1bfc00000, data 0x2d3d025/0x2dd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95240192 unmapped: 15351808 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:42.049629+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 heartbeat osd_stat(store_statfs(0x1b8cb4000/0x0/0x1bfc00000, data 0x2d3d025/0x2dd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95240192 unmapped: 15351808 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 996758 data_alloc: 285212672 data_used: 11735040
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:43.049797+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95240192 unmapped: 15351808 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:44.049955+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 heartbeat osd_stat(store_statfs(0x1b8cb4000/0x0/0x1bfc00000, data 0x2d3d025/0x2dd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95240192 unmapped: 15351808 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:45.050093+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95248384 unmapped: 15343616 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 ms_handle_reset con 0x55e148da2000 session 0x55e14a8e32c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:46.050240+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 ms_handle_reset con 0x55e14a5df400 session 0x55e14a8e3c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 95248384 unmapped: 15343616 heap: 110592000 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8de000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:47.050374+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 ms_handle_reset con 0x55e14a8de000 session 0x55e14a3543c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 96026624 unmapped: 22962176 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1109152 data_alloc: 285212672 data_used: 11743232
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 ms_handle_reset con 0x55e14a8dec00 session 0x55e14a2552c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:48.050590+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8e0800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 ms_handle_reset con 0x55e14a8e0800 session 0x55e14a07c3c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 96026624 unmapped: 22962176 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:49.050853+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 ms_handle_reset con 0x55e148da2000 session 0x55e14a07d2c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 11.979315758s of 12.154012680s, submitted: 41
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 ms_handle_reset con 0x55e14a5df400 session 0x55e14a24da40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 96059392 unmapped: 22929408 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8de000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 heartbeat osd_stat(store_statfs(0x1b7f33000/0x0/0x1bfc00000, data 0x3abe048/0x3b5b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:50.051073+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 96133120 unmapped: 22855680 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:51.051202+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 98426880 unmapped: 20561920 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:52.051353+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 101711872 unmapped: 17276928 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1156101 data_alloc: 301989888 data_used: 18100224
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:53.051535+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 102572032 unmapped: 16416768 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:54.051700+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 102572032 unmapped: 16416768 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:55.052060+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 heartbeat osd_stat(store_statfs(0x1b7f33000/0x0/0x1bfc00000, data 0x3abe048/0x3b5b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 102621184 unmapped: 16367616 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:56.052277+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 102621184 unmapped: 16367616 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:57.052436+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 102621184 unmapped: 16367616 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1156101 data_alloc: 301989888 data_used: 18100224
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:58.052587+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 102621184 unmapped: 16367616 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:59.052821+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 102621184 unmapped: 16367616 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:00.053003+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 102621184 unmapped: 16367616 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:01.053177+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 11.725117683s of 11.745197296s, submitted: 5
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 heartbeat osd_stat(store_statfs(0x1b7f33000/0x0/0x1bfc00000, data 0x3abe048/0x3b5b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 106823680 unmapped: 12165120 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:02.053360+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 105668608 unmapped: 13320192 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1293209 data_alloc: 301989888 data_used: 18452480
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:03.053557+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 105840640 unmapped: 13148160 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:04.053701+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 108503040 unmapped: 10485760 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:05.053854+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 108503040 unmapped: 10485760 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:06.054026+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 ms_handle_reset con 0x55e14a8dec00 session 0x55e14b6d52c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8e0800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 ms_handle_reset con 0x55e14a8de000 session 0x55e14a24c3c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 108535808 unmapped: 10452992 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 ms_handle_reset con 0x55e14a8e0800 session 0x55e14b6d5680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:07.054339+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 heartbeat osd_stat(store_statfs(0x1b6d0e000/0x0/0x1bfc00000, data 0x4cdd048/0x4d7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 heartbeat osd_stat(store_statfs(0x1b6d0e000/0x0/0x1bfc00000, data 0x4cdd048/0x4d7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [0,0,1,0,6,0,1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 110223360 unmapped: 8765440 heap: 118988800 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 ms_handle_reset con 0x55e148da2000 session 0x55e149096000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1364898 data_alloc: 301989888 data_used: 18415616
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:08.054478+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 ms_handle_reset con 0x55e14a5df400 session 0x55e14a0cf4a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 109895680 unmapped: 13295616 heap: 123191296 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 103 handle_osd_map epochs [103,104], i have 103, src has [1,104]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8de000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:09.054619+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 104 ms_handle_reset con 0x55e14a8de000 session 0x55e149db63c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 104 ms_handle_reset con 0x55e14a8dec00 session 0x55e149db65a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a95f800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119676928 unmapped: 3514368 heap: 123191296 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:10.054792+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 104 ms_handle_reset con 0x55e14a95f800 session 0x55e149db74a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123453440 unmapped: 24510464 heap: 147963904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:11.054926+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 104 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 8.963225365s of 10.001587868s, submitted: 249
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 105 ms_handle_reset con 0x55e14a5df400 session 0x55e14b6d5860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 105 ms_handle_reset con 0x55e148da2000 session 0x55e149ddd860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8de000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123199488 unmapped: 24764416 heap: 147963904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:12.055125+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123207680 unmapped: 24756224 heap: 147963904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 106 heartbeat osd_stat(store_statfs(0x1b4109000/0x0/0x1bfc00000, data 0x78dfba7/0x7984000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1690124 data_alloc: 301989888 data_used: 25976832
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:13.055257+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 106 heartbeat osd_stat(store_statfs(0x1b4104000/0x0/0x1bfc00000, data 0x78e20f6/0x7988000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a95f800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123224064 unmapped: 24739840 heap: 147963904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:14.055387+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 106 ms_handle_reset con 0x55e14a95f800 session 0x55e149224780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 108044288 unmapped: 39919616 heap: 147963904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:15.055556+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 108044288 unmapped: 39919616 heap: 147963904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:16.055709+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 106 heartbeat osd_stat(store_statfs(0x1b60a6000/0x0/0x1bfc00000, data 0x59420d3/0x59e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 107978752 unmapped: 39985152 heap: 147963904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 106 handle_osd_map epochs [106,107], i have 106, src has [1,107]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:17.055844+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 107479040 unmapped: 40484864 heap: 147963904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 107 heartbeat osd_stat(store_statfs(0x1b60a2000/0x0/0x1bfc00000, data 0x59444ec/0x59eb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1389750 data_alloc: 285212672 data_used: 14528512
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:18.056031+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 108691456 unmapped: 39272448 heap: 147963904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14badb800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 107 ms_handle_reset con 0x55e14badb800 session 0x55e149097a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148c8c800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:19.056231+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 107 ms_handle_reset con 0x55e148c8c800 session 0x55e14a0cb860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 107 ms_handle_reset con 0x55e148da2000 session 0x55e1490de1e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 107 ms_handle_reset con 0x55e14a5df400 session 0x55e1490df0e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a95f800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 107 ms_handle_reset con 0x55e14a95f800 session 0x55e149ddcd20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14badb800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116056064 unmapped: 31907840 heap: 147963904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:20.056417+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 107 ms_handle_reset con 0x55e14badb800 session 0x55e149ddd0e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124895232 unmapped: 30769152 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14bc34000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 107 ms_handle_reset con 0x55e14bc34000 session 0x55e149ddcf00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 107 heartbeat osd_stat(store_statfs(0x1b45c9000/0x0/0x1bfc00000, data 0x741e4ec/0x74c5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:21.056594+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 107 ms_handle_reset con 0x55e148da2000 session 0x55e149ddcb40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124895232 unmapped: 30769152 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 107 ms_handle_reset con 0x55e14a5df400 session 0x55e149ddc000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a95f800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.403194427s of 10.813942909s, submitted: 83
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 107 ms_handle_reset con 0x55e14a95f800 session 0x55e149ddc780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:22.056791+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14badb800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124919808 unmapped: 30744576 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1660751 data_alloc: 301989888 data_used: 28323840
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:23.056925+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 107 handle_osd_map epochs [107,108], i have 107, src has [1,108]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115007488 unmapped: 40656896 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 108 ms_handle_reset con 0x55e14772d400 session 0x55e14b351680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:24.057072+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115105792 unmapped: 40558592 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 108 heartbeat osd_stat(store_statfs(0x1b690a000/0x0/0x1bfc00000, data 0x50daa86/0x5183000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:25.057230+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115105792 unmapped: 40558592 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:26.057339+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115105792 unmapped: 40558592 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:27.057473+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 110551040 unmapped: 45113344 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1407120 data_alloc: 301989888 data_used: 14585856
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:28.057606+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 108 heartbeat osd_stat(store_statfs(0x1b6436000/0x0/0x1bfc00000, data 0x55afa86/0x5658000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 109158400 unmapped: 46505984 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 108 heartbeat osd_stat(store_statfs(0x1b63d4000/0x0/0x1bfc00000, data 0x5611a86/0x56ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [0,0,0,1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 108 ms_handle_reset con 0x55e14a8de000 session 0x55e1490963c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 108 ms_handle_reset con 0x55e14a8dec00 session 0x55e14a8e3a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:29.057716+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 109330432 unmapped: 46333952 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:30.107481+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 110002176 unmapped: 45662208 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:31.107612+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 110002176 unmapped: 45662208 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.693490982s of 10.092030525s, submitted: 105
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:32.107725+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 110034944 unmapped: 45629440 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1422384 data_alloc: 301989888 data_used: 14688256
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:33.107976+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 109 heartbeat osd_stat(store_statfs(0x1b63a0000/0x0/0x1bfc00000, data 0x5642e9f/0x56ed000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 110100480 unmapped: 45563904 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:34.108101+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 110657536 unmapped: 45006848 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:35.108226+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 110944256 unmapped: 44720128 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:36.108381+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 110944256 unmapped: 44720128 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:37.108610+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 110944256 unmapped: 44720128 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 109 ms_handle_reset con 0x55e14badb800 session 0x55e14b3525a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1432472 data_alloc: 301989888 data_used: 16293888
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:38.108840+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 109 heartbeat osd_stat(store_statfs(0x1b639e000/0x0/0x1bfc00000, data 0x5645e9f/0x56f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 110960640 unmapped: 44703744 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 109 ms_handle_reset con 0x55e14772d400 session 0x55e14c7eb2c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 110 ms_handle_reset con 0x55e148da2000 session 0x55e14a33e000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:39.108967+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a95f800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 110 ms_handle_reset con 0x55e14a95f800 session 0x55e14a366000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 110 ms_handle_reset con 0x55e14a5df400 session 0x55e149097a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118366208 unmapped: 37298176 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:40.109118+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 110 ms_handle_reset con 0x55e14772d400 session 0x55e1490961e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 110 heartbeat osd_stat(store_statfs(0x1b4ea9000/0x0/0x1bfc00000, data 0x6b38434/0x6be5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115736576 unmapped: 39927808 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 110 handle_osd_map epochs [110,111], i have 110, src has [1,111]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 111 ms_handle_reset con 0x55e148da2000 session 0x55e14a07dc20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:41.109258+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115752960 unmapped: 39911424 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 111 handle_osd_map epochs [111,112], i have 111, src has [1,112]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.537727356s of 10.074085236s, submitted: 130
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:42.109416+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 112 handle_osd_map epochs [112,112], i have 112, src has [1,112]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 112 ms_handle_reset con 0x55e14a8dec00 session 0x55e14a0cbe00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115761152 unmapped: 39903232 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1632647 data_alloc: 301989888 data_used: 20926464
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:43.109613+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14badb800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 112 heartbeat osd_stat(store_statfs(0x1b4e9e000/0x0/0x1bfc00000, data 0x6b3cf6c/0x6bee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 111362048 unmapped: 44302336 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 112 ms_handle_reset con 0x55e14badb800 session 0x55e1490def00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:44.109877+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 112 ms_handle_reset con 0x55e14772d400 session 0x55e14a8e3c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 110919680 unmapped: 44744704 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets getting new tickets!
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:45.110116+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _finish_auth 0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:45.112095+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 110919680 unmapped: 44744704 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:46.110328+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 112 heartbeat osd_stat(store_statfs(0x1b63a2000/0x0/0x1bfc00000, data 0x563af49/0x56eb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 110919680 unmapped: 44744704 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:47.110471+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 112 handle_osd_map epochs [113,113], i have 112, src has [1,113]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 113 ms_handle_reset con 0x55e148da2000 session 0x55e14a8e25a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 113 ms_handle_reset con 0x55e14a5df400 session 0x55e14a07c5a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113754112 unmapped: 41910272 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1519631 data_alloc: 301989888 data_used: 21831680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:48.111625+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a72d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 113 ms_handle_reset con 0x55e14a72d800 session 0x55e14a2e23c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a02f400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 113 ms_handle_reset con 0x55e14a02f400 session 0x55e14a2e3860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 113 ms_handle_reset con 0x55e14a8dec00 session 0x55e148c25e00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 113 ms_handle_reset con 0x55e14772d400 session 0x55e149096780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122519552 unmapped: 33144832 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:49.111812+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 113 ms_handle_reset con 0x55e14a5df400 session 0x55e14826c3c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a72d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 113 ms_handle_reset con 0x55e14a72d800 session 0x55e14826c780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122519552 unmapped: 33144832 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a02e800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 113 ms_handle_reset con 0x55e14a02e800 session 0x55e14826da40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:50.111975+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 113 ms_handle_reset con 0x55e14772d400 session 0x55e14a0cf860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122535936 unmapped: 33128448 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:51.112114+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a72d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 113 handle_osd_map epochs [114,114], i have 113, src has [1,114]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14a5df400 session 0x55e149ddc780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 heartbeat osd_stat(store_statfs(0x1b53e1000/0x0/0x1bfc00000, data 0x65fa333/0x66ad000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116236288 unmapped: 39428096 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14a8dec00 session 0x55e14b3514a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a785400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14a785400 session 0x55e14a366000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a72d000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14a72d000 session 0x55e14a33e000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14772d400 session 0x55e14a33fc20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:52.112242+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.555500984s of 10.126294136s, submitted: 145
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14a5df400 session 0x55e14b3525a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a785400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14a785400 session 0x55e14a3672c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14a8dec00 session 0x55e14a8c6b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8e0c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14a72d800 session 0x55e14c7ea780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113049600 unmapped: 42614784 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14a8e0c00 session 0x55e149db6960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a72d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14a72d800 session 0x55e14a8e3a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14772d800 session 0x55e149ddcb40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14a168000 session 0x55e14c7eb0e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:53.112376+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1502646 data_alloc: 285212672 data_used: 14004224
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14772d400 session 0x55e14a8e3c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 111001600 unmapped: 44662784 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14a5df400 session 0x55e148c241e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:54.112473+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 heartbeat osd_stat(store_statfs(0x1b77e3000/0x0/0x1bfc00000, data 0x3c007d3/0x3cb2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 111001600 unmapped: 44662784 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14772d800 session 0x55e14a07dc20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:55.112624+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 heartbeat osd_stat(store_statfs(0x1b77e3000/0x0/0x1bfc00000, data 0x3c007d3/0x3cb2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 111001600 unmapped: 44662784 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:56.112801+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 ms_handle_reset con 0x55e14a168000 session 0x55e14a07c5a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 111001600 unmapped: 44662784 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a72d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 114 handle_osd_map epochs [115,115], i have 114, src has [1,115]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 ms_handle_reset con 0x55e14a72d800 session 0x55e14a0cbe00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:57.112936+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8e0c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 ms_handle_reset con 0x55e14a8e0c00 session 0x55e14c7eb2c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 111099904 unmapped: 44564480 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:58.113087+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1263246 data_alloc: 285212672 data_used: 11280384
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 111853568 unmapped: 43810816 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:59.113211+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112508928 unmapped: 43155456 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:00.113408+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b7dac000/0x0/0x1bfc00000, data 0x3c2cbfc/0x3ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112508928 unmapped: 43155456 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:01.113604+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112508928 unmapped: 43155456 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:02.113824+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112508928 unmapped: 43155456 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:03.114016+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1285326 data_alloc: 301989888 data_used: 14336000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b7dac000/0x0/0x1bfc00000, data 0x3c2cbfc/0x3ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112508928 unmapped: 43155456 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:04.114122+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112508928 unmapped: 43155456 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:05.114263+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a72d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112508928 unmapped: 43155456 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:06.114410+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 ms_handle_reset con 0x55e14a5df400 session 0x55e1490df4a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 ms_handle_reset con 0x55e148da2000 session 0x55e149ddcf00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b7dac000/0x0/0x1bfc00000, data 0x3c2cbfc/0x3ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b7dac000/0x0/0x1bfc00000, data 0x3c2cbfc/0x3ce1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112508928 unmapped: 43155456 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:07.114545+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112508928 unmapped: 43155456 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:08.114661+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1285486 data_alloc: 301989888 data_used: 14340096
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112508928 unmapped: 43155456 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 16.439128876s of 16.763751984s, submitted: 93
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:09.114791+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a785400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113786880 unmapped: 41877504 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:10.114934+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114860032 unmapped: 40804352 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:11.115048+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114917376 unmapped: 40747008 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b6c3a000/0x0/0x1bfc00000, data 0x4d91bfc/0x4e46000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [0,0,1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:12.115208+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114745344 unmapped: 40919040 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:13.115398+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1428198 data_alloc: 301989888 data_used: 14364672
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114745344 unmapped: 40919040 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:14.115561+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114745344 unmapped: 40919040 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:15.115717+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b6c0e000/0x0/0x1bfc00000, data 0x4dcbbfc/0x4e80000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114745344 unmapped: 40919040 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:16.115920+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b6c0e000/0x0/0x1bfc00000, data 0x4dcbbfc/0x4e80000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114761728 unmapped: 40902656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:17.116091+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 ms_handle_reset con 0x55e14772d800 session 0x55e1490df680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 ms_handle_reset con 0x55e14a168000 session 0x55e148d383c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b6c0e000/0x0/0x1bfc00000, data 0x4dcbbfc/0x4e80000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114761728 unmapped: 40902656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:18.116273+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1357430 data_alloc: 301989888 data_used: 14311424
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 ms_handle_reset con 0x55e14a785400 session 0x55e14a0cf680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 ms_handle_reset con 0x55e14a72d800 session 0x55e148d38f00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 ms_handle_reset con 0x55e14a8dec00 session 0x55e14a8e3860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114761728 unmapped: 40902656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.389623642s of 10.030131340s, submitted: 174
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:19.116402+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 ms_handle_reset con 0x55e14772d800 session 0x55e1490de5a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:20.116544+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:21.116752+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:22.116929+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:23.117036+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1151214 data_alloc: 285212672 data_used: 10108928
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:24.117169+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:25.117344+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:26.117521+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:27.117697+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:28.117822+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1151214 data_alloc: 285212672 data_used: 10108928
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:29.117933+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:30.118080+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:31.118225+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:32.118407+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:33.118538+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1151214 data_alloc: 285212672 data_used: 10108928
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:34.118714+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:35.118821+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:36.118991+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:37.119134+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:38.119274+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1151214 data_alloc: 285212672 data_used: 10108928
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:39.119479+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:40.119666+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:41.119843+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:42.119975+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:43.120107+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1151214 data_alloc: 285212672 data_used: 10108928
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:44.120246+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:45.120407+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:46.120589+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:47.120743+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:48.120941+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1151214 data_alloc: 285212672 data_used: 10108928
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:49.121090+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:50.121271+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:51.121412+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:52.121577+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:53.121728+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1151214 data_alloc: 285212672 data_used: 10108928
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:54.121841+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:55.122061+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:56.122259+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:57.122406+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:58.124402+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1151214 data_alloc: 285212672 data_used: 10108928
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:59.124589+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:00.124818+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:01.124977+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:02.125124+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:03.125289+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1151214 data_alloc: 285212672 data_used: 10108928
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:04.125499+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112189440 unmapped: 43474944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:05.125745+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112197632 unmapped: 43466752 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:06.125977+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112197632 unmapped: 43466752 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:07.126196+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112197632 unmapped: 43466752 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:08.126397+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112197632 unmapped: 43466752 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1151214 data_alloc: 285212672 data_used: 10108928
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:09.126592+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112205824 unmapped: 43458560 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:10.126806+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112205824 unmapped: 43458560 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:11.126997+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112205824 unmapped: 43458560 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:12.127145+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112205824 unmapped: 43458560 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:13.127326+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112205824 unmapped: 43458560 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1151214 data_alloc: 285212672 data_used: 10108928
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:14.127894+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112205824 unmapped: 43458560 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:15.128084+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112214016 unmapped: 43450368 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:16.128285+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112214016 unmapped: 43450368 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:17.128508+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112222208 unmapped: 43442176 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:18.128890+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112222208 unmapped: 43442176 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1151214 data_alloc: 285212672 data_used: 10108928
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 heartbeat osd_stat(store_statfs(0x1b8c84000/0x0/0x1bfc00000, data 0x2d58ba9/0x2e09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:19.129086+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112222208 unmapped: 43442176 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:20.129303+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112222208 unmapped: 43442176 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:21.129515+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 112222208 unmapped: 43442176 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 62.633754730s of 62.701347351s, submitted: 22
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:22.129951+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113278976 unmapped: 42385408 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 115 handle_osd_map epochs [116,116], i have 115, src has [1,116]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 116 ms_handle_reset con 0x55e148da2000 session 0x55e14b6d4f00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:23.130141+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 116 heartbeat osd_stat(store_statfs(0x1b8c7d000/0x0/0x1bfc00000, data 0x2d5b8a6/0x2e10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113344512 unmapped: 42319872 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 116 handle_osd_map epochs [117,117], i have 116, src has [1,117]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1167502 data_alloc: 285212672 data_used: 10121216
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 117 ms_handle_reset con 0x55e14a5df400 session 0x55e1490dda40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 117 ms_handle_reset con 0x55e14a168000 session 0x55e14b6d41e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:24.130319+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113418240 unmapped: 42246144 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 117 handle_osd_map epochs [118,118], i have 117, src has [1,118]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 118 ms_handle_reset con 0x55e14a5df400 session 0x55e14b6d52c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 118 heartbeat osd_stat(store_statfs(0x1b8c70000/0x0/0x1bfc00000, data 0x2d60b3f/0x2e1b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:25.130477+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113410048 unmapped: 42254336 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 118 handle_osd_map epochs [119,119], i have 118, src has [1,119]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 119 ms_handle_reset con 0x55e14772d800 session 0x55e14b6d5680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 119 heartbeat osd_stat(store_statfs(0x1b8c6c000/0x0/0x1bfc00000, data 0x2d63810/0x2e21000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:26.130635+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113459200 unmapped: 42205184 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 119 handle_osd_map epochs [120,120], i have 119, src has [1,120]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 120 ms_handle_reset con 0x55e148da2000 session 0x55e14a07c000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:27.130908+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113508352 unmapped: 42156032 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 120 handle_osd_map epochs [121,121], i have 120, src has [1,121]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:28.131048+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113573888 unmapped: 42090496 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1190812 data_alloc: 285212672 data_used: 10137600
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 121 handle_osd_map epochs [120,121], i have 121, src has [1,121]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a72d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 121 heartbeat osd_stat(store_statfs(0x1b8863000/0x0/0x1bfc00000, data 0x2d67aae/0x2e28000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:29.131233+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113598464 unmapped: 42065920 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 122 ms_handle_reset con 0x55e14a72d800 session 0x55e14a8c7e00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:30.131461+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113606656 unmapped: 42057728 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 122 heartbeat osd_stat(store_statfs(0x1b885e000/0x0/0x1bfc00000, data 0x2d6ac74/0x2e2f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:31.131676+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113606656 unmapped: 42057728 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:32.131864+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 122 handle_osd_map epochs [123,123], i have 122, src has [1,123]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.090685844s of 10.435892105s, submitted: 92
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113623040 unmapped: 42041344 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 123 handle_osd_map epochs [123,123], i have 123, src has [1,123]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 123 ms_handle_reset con 0x55e14772d800 session 0x55e14b353c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:33.132040+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 123 handle_osd_map epochs [123,123], i have 123, src has [1,123]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113639424 unmapped: 42024960 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1197605 data_alloc: 285212672 data_used: 10153984
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 123 handle_osd_map epochs [123,124], i have 123, src has [1,124]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:34.132190+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113729536 unmapped: 41934848 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 124 ms_handle_reset con 0x55e148da2000 session 0x55e14b3521e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 124 heartbeat osd_stat(store_statfs(0x1b8859000/0x0/0x1bfc00000, data 0x2d6eb4e/0x2e35000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:35.132369+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113729536 unmapped: 41934848 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:36.132523+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 124 handle_osd_map epochs [125,125], i have 124, src has [1,125]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 125 ms_handle_reset con 0x55e14a5df400 session 0x55e14b3514a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113745920 unmapped: 41918464 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 125 heartbeat osd_stat(store_statfs(0x1b8859000/0x0/0x1bfc00000, data 0x2d6eb4e/0x2e35000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a72d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 125 ms_handle_reset con 0x55e14a72d800 session 0x55e14a0cf680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 125 ms_handle_reset con 0x55e14a168000 session 0x55e14a2e2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 125 ms_handle_reset con 0x55e14772d800 session 0x55e1490e1c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 125 handle_osd_map epochs [125,126], i have 125, src has [1,126]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 126 handle_osd_map epochs [126,126], i have 126, src has [1,126]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:37.132682+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113893376 unmapped: 41771008 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:38.132834+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 126 handle_osd_map epochs [127,127], i have 126, src has [1,127]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 126 handle_osd_map epochs [127,127], i have 127, src has [1,127]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113909760 unmapped: 41754624 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 127 ms_handle_reset con 0x55e148da2000 session 0x55e14a2e30e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1209733 data_alloc: 285212672 data_used: 10182656
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a72d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 127 ms_handle_reset con 0x55e14a5df400 session 0x55e14a0cf0e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:39.132979+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 113958912 unmapped: 41705472 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 127 handle_osd_map epochs [127,128], i have 127, src has [1,128]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 128 heartbeat osd_stat(store_statfs(0x1b884e000/0x0/0x1bfc00000, data 0x2d75521/0x2e3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 128 ms_handle_reset con 0x55e14a8dec00 session 0x55e149ddcf00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 128 ms_handle_reset con 0x55e14a72d800 session 0x55e14a31a780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:40.133162+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114098176 unmapped: 41566208 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 128 handle_osd_map epochs [128,129], i have 128, src has [1,129]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 129 ms_handle_reset con 0x55e14772d800 session 0x55e148c25e00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 129 heartbeat osd_stat(store_statfs(0x1b884a000/0x0/0x1bfc00000, data 0x2d7a079/0x2e43000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:41.133330+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114163712 unmapped: 41500672 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 129 heartbeat osd_stat(store_statfs(0x1b884a000/0x0/0x1bfc00000, data 0x2d7a079/0x2e43000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 129 heartbeat osd_stat(store_statfs(0x1b884b000/0x0/0x1bfc00000, data 0x2d798bf/0x2e41000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 129 handle_osd_map epochs [129,130], i have 129, src has [1,130]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:42.133509+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114237440 unmapped: 41426944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.046469688s of 10.244983673s, submitted: 326
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:43.133648+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114253824 unmapped: 41410560 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1215438 data_alloc: 285212672 data_used: 10182656
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:44.133839+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114253824 unmapped: 41410560 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 130 heartbeat osd_stat(store_statfs(0x1b8846000/0x0/0x1bfc00000, data 0x2d7bd34/0x2e45000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:45.133985+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114253824 unmapped: 41410560 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:46.134190+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114253824 unmapped: 41410560 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 130 handle_osd_map epochs [130,131], i have 130, src has [1,131]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:47.134410+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 131 heartbeat osd_stat(store_statfs(0x1b8846000/0x0/0x1bfc00000, data 0x2d7bd34/0x2e45000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114294784 unmapped: 41369600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:48.134582+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114294784 unmapped: 41369600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1217592 data_alloc: 285212672 data_used: 10182656
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:49.134741+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114294784 unmapped: 41369600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:50.134968+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114294784 unmapped: 41369600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:51.135116+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114294784 unmapped: 41369600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:52.135237+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114294784 unmapped: 41369600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 131 heartbeat osd_stat(store_statfs(0x1b8844000/0x0/0x1bfc00000, data 0x2d7e18d/0x2e49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:53.135398+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114294784 unmapped: 41369600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1217592 data_alloc: 285212672 data_used: 10182656
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:54.135557+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114294784 unmapped: 41369600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:55.135739+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114294784 unmapped: 41369600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 131 heartbeat osd_stat(store_statfs(0x1b8844000/0x0/0x1bfc00000, data 0x2d7e18d/0x2e49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:56.135924+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114294784 unmapped: 41369600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:57.136114+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114302976 unmapped: 41361408 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:58.136240+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114311168 unmapped: 41353216 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1217592 data_alloc: 285212672 data_used: 10182656
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:59.136376+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114311168 unmapped: 41353216 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:00.136591+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114311168 unmapped: 41353216 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:01.136740+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 131 heartbeat osd_stat(store_statfs(0x1b8844000/0x0/0x1bfc00000, data 0x2d7e18d/0x2e49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114311168 unmapped: 41353216 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:02.136914+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114311168 unmapped: 41353216 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 131 heartbeat osd_stat(store_statfs(0x1b8844000/0x0/0x1bfc00000, data 0x2d7e18d/0x2e49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:03.137046+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114311168 unmapped: 41353216 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1217592 data_alloc: 285212672 data_used: 10182656
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:04.137192+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114311168 unmapped: 41353216 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:05.137342+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114311168 unmapped: 41353216 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:06.137463+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 131 handle_osd_map epochs [132,132], i have 131, src has [1,132]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 23.843885422s of 23.893196106s, submitted: 23
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114311168 unmapped: 41353216 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:07.137592+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 132 handle_osd_map epochs [133,133], i have 132, src has [1,133]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114360320 unmapped: 41304064 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 133 handle_osd_map epochs [132,133], i have 133, src has [1,133]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 133 handle_osd_map epochs [133,133], i have 133, src has [1,133]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 133 ms_handle_reset con 0x55e148da2000 session 0x55e14b95a000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 133 heartbeat osd_stat(store_statfs(0x1b883a000/0x0/0x1bfc00000, data 0x2d82ccc/0x2e54000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:08.137742+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114368512 unmapped: 41295872 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1228157 data_alloc: 285212672 data_used: 10182656
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 133 heartbeat osd_stat(store_statfs(0x1b883a000/0x0/0x1bfc00000, data 0x2d82ccc/0x2e54000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:09.137951+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 133 handle_osd_map epochs [134,134], i have 133, src has [1,134]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 134 ms_handle_reset con 0x55e14a5df400 session 0x55e14b95b680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114376704 unmapped: 41287680 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:10.138173+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 114376704 unmapped: 41287680 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 134 ms_handle_reset con 0x55e14a8dec00 session 0x55e14a355860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a784800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:11.138314+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115073024 unmapped: 40591360 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 135 ms_handle_reset con 0x55e14a784800 session 0x55e14a3552c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:12.138452+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 135 handle_osd_map epochs [136,136], i have 135, src has [1,136]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 136 ms_handle_reset con 0x55e14772d800 session 0x55e149ddc000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115154944 unmapped: 40509440 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 136 handle_osd_map epochs [135,136], i have 136, src has [1,136]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 136 handle_osd_map epochs [135,136], i have 136, src has [1,136]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 136 ms_handle_reset con 0x55e148da2000 session 0x55e149ddda40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:13.138598+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115130368 unmapped: 40534016 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 136 heartbeat osd_stat(store_statfs(0x1b882c000/0x0/0x1bfc00000, data 0x2d89be7/0x2e60000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1238929 data_alloc: 285212672 data_used: 10211328
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:14.138740+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115130368 unmapped: 40534016 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:15.138933+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115130368 unmapped: 40534016 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:16.139094+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115130368 unmapped: 40534016 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:17.139317+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 136 handle_osd_map epochs [137,137], i have 136, src has [1,137]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.635580063s of 10.870277405s, submitted: 77
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115146752 unmapped: 40517632 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:18.139540+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 137 heartbeat osd_stat(store_statfs(0x1b882b000/0x0/0x1bfc00000, data 0x2d8bf8e/0x2e62000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115146752 unmapped: 40517632 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1241259 data_alloc: 285212672 data_used: 10211328
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:19.139717+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115146752 unmapped: 40517632 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 137 heartbeat osd_stat(store_statfs(0x1b882b000/0x0/0x1bfc00000, data 0x2d8bf8e/0x2e62000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:20.139955+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 137 heartbeat osd_stat(store_statfs(0x1b882b000/0x0/0x1bfc00000, data 0x2d8bf8e/0x2e62000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115146752 unmapped: 40517632 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:21.140116+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115146752 unmapped: 40517632 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 137 ms_handle_reset con 0x55e14a5df400 session 0x55e14a2e23c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:22.140289+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115163136 unmapped: 40501248 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:23.140445+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 138 handle_osd_map epochs [138,138], i have 138, src has [1,138]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 138 ms_handle_reset con 0x55e14a8dec00 session 0x55e14a0cb0e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115179520 unmapped: 40484864 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1249830 data_alloc: 285212672 data_used: 10227712
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 138 heartbeat osd_stat(store_statfs(0x1b8824000/0x0/0x1bfc00000, data 0x2d8e595/0x2e69000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:24.140607+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115179520 unmapped: 40484864 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14badac00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8e1000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 138 ms_handle_reset con 0x55e14a8e1000 session 0x55e14a8c61e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 138 ms_handle_reset con 0x55e14badac00 session 0x55e1490dde00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8e1000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 138 ms_handle_reset con 0x55e14a8e1000 session 0x55e1492241e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:25.140779+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115179520 unmapped: 40484864 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:26.140952+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 138 ms_handle_reset con 0x55e14772d800 session 0x55e14a312b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 138 ms_handle_reset con 0x55e148da2000 session 0x55e14a0cf860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115204096 unmapped: 40460288 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:27.141148+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 138 ms_handle_reset con 0x55e14a5df400 session 0x55e149097a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.139903069s of 10.289731979s, submitted: 49
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115212288 unmapped: 40452096 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:28.141323+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 139 ms_handle_reset con 0x55e14772d800 session 0x55e1490961e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115228672 unmapped: 40435712 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 139 ms_handle_reset con 0x55e148da2000 session 0x55e149096b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1257370 data_alloc: 285212672 data_used: 10240000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:29.141471+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115245056 unmapped: 40419328 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 139 heartbeat osd_stat(store_statfs(0x1b8820000/0x0/0x1bfc00000, data 0x2d90b2c/0x2e6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 139 ms_handle_reset con 0x55e14a5df400 session 0x55e149db7680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8e1000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:30.141648+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115261440 unmapped: 40402944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 139 ms_handle_reset con 0x55e14a8e1000 session 0x55e149db6960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:31.141864+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14badac00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 139 ms_handle_reset con 0x55e14badac00 session 0x55e149db63c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 139 heartbeat osd_stat(store_statfs(0x1b8821000/0x0/0x1bfc00000, data 0x2d90b1c/0x2e6d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115286016 unmapped: 40378368 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 139 ms_handle_reset con 0x55e14772d800 session 0x55e14b95a000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:32.142089+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115294208 unmapped: 40370176 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 139 ms_handle_reset con 0x55e148da2000 session 0x55e14c7ea780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:33.142250+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115294208 unmapped: 40370176 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1255384 data_alloc: 285212672 data_used: 10240000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:34.142393+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 139 ms_handle_reset con 0x55e14a5df400 session 0x55e149ddcf00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115310592 unmapped: 40353792 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:35.142536+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115310592 unmapped: 40353792 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:36.142736+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115310592 unmapped: 40353792 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:37.142935+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 140 heartbeat osd_stat(store_statfs(0x1b8821000/0x0/0x1bfc00000, data 0x2d90b1d/0x2e6d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.278952599s of 10.002747536s, submitted: 82
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115376128 unmapped: 40288256 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:38.143098+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115376128 unmapped: 40288256 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1262670 data_alloc: 285212672 data_used: 10252288
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:39.143244+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115376128 unmapped: 40288256 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:40.143472+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8e1000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 140 ms_handle_reset con 0x55e14a8dec00 session 0x55e14a8c6d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115384320 unmapped: 40280064 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/880484158' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 140 heartbeat osd_stat(store_statfs(0x1b881c000/0x0/0x1bfc00000, data 0x2d92f36/0x2e71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:41.143631+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 140 handle_osd_map epochs [141,141], i have 140, src has [1,141]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 141 ms_handle_reset con 0x55e14a8e1000 session 0x55e14a0cf0e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115392512 unmapped: 40271872 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 141 ms_handle_reset con 0x55e148da2000 session 0x55e14a8c63c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 141 ms_handle_reset con 0x55e14a5df400 session 0x55e14a8c6b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:42.143832+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 142 ms_handle_reset con 0x55e14772d800 session 0x55e14a3554a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115466240 unmapped: 40198144 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 142 ms_handle_reset con 0x55e14a8dec00 session 0x55e14a07c5a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:43.143981+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 142 ms_handle_reset con 0x55e148da2c00 session 0x55e14b95b860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115482624 unmapped: 40181760 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1269285 data_alloc: 285212672 data_used: 10268672
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:44.144175+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 142 ms_handle_reset con 0x55e148da2000 session 0x55e14a07cf00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 143 handle_osd_map epochs [143,143], i have 143, src has [1,143]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 143 ms_handle_reset con 0x55e14772d800 session 0x55e14a2e2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 143 ms_handle_reset con 0x55e14a5df400 session 0x55e1490e1c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 143 handle_osd_map epochs [143,143], i have 143, src has [1,143]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dec00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115548160 unmapped: 40116224 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 143 ms_handle_reset con 0x55e148da2c00 session 0x55e14c7eaf00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:45.144535+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 144 ms_handle_reset con 0x55e14a8dec00 session 0x55e14b3521e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 144 heartbeat osd_stat(store_statfs(0x1b880b000/0x0/0x1bfc00000, data 0x2d9c4aa/0x2e81000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 144 ms_handle_reset con 0x55e14772d800 session 0x55e14b353c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115556352 unmapped: 40108032 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 144 ms_handle_reset con 0x55e148da2c00 session 0x55e14c7eb0e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:46.144701+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 144 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 144 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 ms_handle_reset con 0x55e148da2000 session 0x55e14a07c000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 ms_handle_reset con 0x55e14a5df400 session 0x55e14c7ea000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115580928 unmapped: 40083456 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a2b2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 ms_handle_reset con 0x55e14a2b2000 session 0x55e14b6d5680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 ms_handle_reset con 0x55e148da2000 session 0x55e148d383c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:47.144821+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 ms_handle_reset con 0x55e14772d800 session 0x55e14b6d52c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 ms_handle_reset con 0x55e148da2c00 session 0x55e148d38f00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115613696 unmapped: 40050688 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 ms_handle_reset con 0x55e14a5df400 session 0x55e14b6d41e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:48.144973+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8dfc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.354576111s of 10.805983543s, submitted: 123
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 ms_handle_reset con 0x55e14a8dfc00 session 0x55e1490dda40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115613696 unmapped: 40050688 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1281833 data_alloc: 285212672 data_used: 10285056
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:49.145167+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 ms_handle_reset con 0x55e14772d800 session 0x55e1490de5a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 115613696 unmapped: 40050688 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 ms_handle_reset con 0x55e148da2c00 session 0x55e14c1d3c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 heartbeat osd_stat(store_statfs(0x1b880a000/0x0/0x1bfc00000, data 0x2d9e944/0x2e84000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a2b3000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5de800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 ms_handle_reset con 0x55e14a5de800 session 0x55e14b95a000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 ms_handle_reset con 0x55e148da2000 session 0x55e14b95af00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:50.145373+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 ms_handle_reset con 0x55e14a5df400 session 0x55e14c1d2d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116129792 unmapped: 39534592 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 ms_handle_reset con 0x55e14a5df400 session 0x55e14b95b680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:51.145535+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 145 handle_osd_map epochs [146,146], i have 145, src has [1,146]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 146 ms_handle_reset con 0x55e14772d800 session 0x55e149db63c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116006912 unmapped: 39657472 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:52.145850+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 146 handle_osd_map epochs [146,147], i have 146, src has [1,147]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 147 ms_handle_reset con 0x55e148da2000 session 0x55e149db6960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 147 heartbeat osd_stat(store_statfs(0x1b5800000/0x0/0x1bfc00000, data 0x5da32fd/0x5e8c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116039680 unmapped: 39624704 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5de800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 147 ms_handle_reset con 0x55e14a5de800 session 0x55e14c1d3680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5dfc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 147 heartbeat osd_stat(store_statfs(0x1b5800000/0x0/0x1bfc00000, data 0x5da32fd/0x5e8c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [0,0,0,0,0,0,0,0,1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:53.146014+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 147 ms_handle_reset con 0x55e148da2c00 session 0x55e149db7680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 147 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 148 ms_handle_reset con 0x55e14a5dfc00 session 0x55e14c1d2f00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 148 ms_handle_reset con 0x55e14a2b3000 session 0x55e14b95a5a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116088832 unmapped: 39575552 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1732057 data_alloc: 285212672 data_used: 10309632
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 148 ms_handle_reset con 0x55e14772d800 session 0x55e1490963c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:54.146151+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 148 ms_handle_reset con 0x55e148da2c00 session 0x55e149096b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116097024 unmapped: 39567360 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5de800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 149 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 149 handle_osd_map epochs [148,149], i have 149, src has [1,149]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 149 ms_handle_reset con 0x55e14a5de800 session 0x55e149097a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 149 heartbeat osd_stat(store_statfs(0x1b47fd000/0x0/0x1bfc00000, data 0x6da5974/0x6e8f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 149 ms_handle_reset con 0x55e148da2000 session 0x55e14c1d3e00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:55.146299+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116162560 unmapped: 39501824 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 150 ms_handle_reset con 0x55e14772d800 session 0x55e14a3132c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:56.146461+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 151 handle_osd_map epochs [151,151], i have 151, src has [1,151]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 151 ms_handle_reset con 0x55e148da2c00 session 0x55e14c1d23c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116211712 unmapped: 39452672 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a2b3000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 151 ms_handle_reset con 0x55e14a2b3000 session 0x55e1492241e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5dfc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:57.146637+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 152 ms_handle_reset con 0x55e14a5dfc00 session 0x55e1490dde00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116244480 unmapped: 39419904 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 152 heartbeat osd_stat(store_statfs(0x1b87f2000/0x0/0x1bfc00000, data 0x2dacab8/0x2e9b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:58.146879+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 152 ms_handle_reset con 0x55e14772d800 session 0x55e14a0cbe00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 152 heartbeat osd_stat(store_statfs(0x1b87f2000/0x0/0x1bfc00000, data 0x2dacab8/0x2e9b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.225473404s of 10.010781288s, submitted: 158
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 152 ms_handle_reset con 0x55e148da2000 session 0x55e14a0cb860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116285440 unmapped: 39378944 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1316054 data_alloc: 285212672 data_used: 10330112
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:59.147070+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116326400 unmapped: 39337984 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:00.147274+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116326400 unmapped: 39337984 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:01.147485+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116326400 unmapped: 39337984 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:02.147647+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116342784 unmapped: 39321600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:03.147866+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 153 heartbeat osd_stat(store_statfs(0x1b87eb000/0x0/0x1bfc00000, data 0x2db143a/0x2ea2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116342784 unmapped: 39321600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1318384 data_alloc: 285212672 data_used: 10330112
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:04.148040+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116342784 unmapped: 39321600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:05.148184+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116342784 unmapped: 39321600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:06.148333+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 153 ms_handle_reset con 0x55e148da2c00 session 0x55e14a2e3860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116342784 unmapped: 39321600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:07.148488+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a2b3000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116350976 unmapped: 39313408 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:08.148651+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 154 heartbeat osd_stat(store_statfs(0x1b87eb000/0x0/0x1bfc00000, data 0x2db149c/0x2ea3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.840661049s of 10.000985146s, submitted: 63
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 154 ms_handle_reset con 0x55e14a2b3000 session 0x55e14a3552c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116367360 unmapped: 39297024 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1328126 data_alloc: 285212672 data_used: 10342400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 154 ms_handle_reset con 0x55e14a5df400 session 0x55e14a355860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:09.148993+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116350976 unmapped: 39313408 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 154 ms_handle_reset con 0x55e14772d800 session 0x55e1490df860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:10.149212+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116367360 unmapped: 39297024 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 154 handle_osd_map epochs [154,155], i have 154, src has [1,155]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 155 handle_osd_map epochs [155,155], i have 155, src has [1,155]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 155 ms_handle_reset con 0x55e148da2000 session 0x55e1490dfe00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 155 ms_handle_reset con 0x55e148da2c00 session 0x55e14c1d34a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:11.149367+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a2b3000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 155 ms_handle_reset con 0x55e14a2b3000 session 0x55e14a366000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a784000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116457472 unmapped: 39206912 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 155 handle_osd_map epochs [155,156], i have 155, src has [1,156]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 156 ms_handle_reset con 0x55e14a784000 session 0x55e14b95ad20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 156 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 156 ms_handle_reset con 0x55e14772d800 session 0x55e14a0cf4a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:12.149527+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 156 heartbeat osd_stat(store_statfs(0x1b87e1000/0x0/0x1bfc00000, data 0x2db5f66/0x2eac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 156 ms_handle_reset con 0x55e148da2000 session 0x55e14a0ceb40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 156 ms_handle_reset con 0x55e148da2c00 session 0x55e1490e03c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a2b3000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116490240 unmapped: 39174144 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 156 ms_handle_reset con 0x55e14a2b3000 session 0x55e14b3532c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:13.149641+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116490240 unmapped: 39174144 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1332513 data_alloc: 285212672 data_used: 10354688
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:14.149887+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 156 heartbeat osd_stat(store_statfs(0x1b87de000/0x0/0x1bfc00000, data 0x2db848b/0x2eae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116490240 unmapped: 39174144 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:15.150071+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116490240 unmapped: 39174144 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:16.150259+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116498432 unmapped: 39165952 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:17.150457+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 156 handle_osd_map epochs [157,157], i have 156, src has [1,157]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116498432 unmapped: 39165952 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63c000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e14a63c000 session 0x55e14a354000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:18.150633+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 116498432 unmapped: 39165952 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1336317 data_alloc: 285212672 data_used: 10354688
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.235181808s of 10.692186356s, submitted: 145
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e14772d800 session 0x55e149ddd860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:19.150786+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Got map version 47
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 heartbeat osd_stat(store_statfs(0x1b87db000/0x0/0x1bfc00000, data 0x2dba8c4/0x2eb2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e148da2000 session 0x55e1490e7c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 117874688 unmapped: 37789696 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:20.151037+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 117874688 unmapped: 37789696 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:21.156088+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 117874688 unmapped: 37789696 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:22.156242+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 117874688 unmapped: 37789696 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:23.156462+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 117874688 unmapped: 37789696 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1335780 data_alloc: 285212672 data_used: 10354688
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:24.156645+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 heartbeat osd_stat(store_statfs(0x1b87d6000/0x0/0x1bfc00000, data 0x2dc08c4/0x2eb8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 117899264 unmapped: 37765120 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:25.156838+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Got map version 48
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 heartbeat osd_stat(store_statfs(0x1b87d1000/0x0/0x1bfc00000, data 0x2dc5f0d/0x2ebd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118177792 unmapped: 37486592 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:26.157005+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e148da2c00 session 0x55e1490963c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a2b3000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14bada400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e14bada400 session 0x55e14a0cf0e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e14a2b3000 session 0x55e149097a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118226944 unmapped: 37437440 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e14a168c00 session 0x55e14a8c61e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:27.157156+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 heartbeat osd_stat(store_statfs(0x1b87cf000/0x0/0x1bfc00000, data 0x2dc7c68/0x2ebf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118251520 unmapped: 37412864 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:28.157304+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118251520 unmapped: 37412864 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1335508 data_alloc: 285212672 data_used: 10350592
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:29.181036+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118251520 unmapped: 37412864 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:30.181240+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 11.276064873s of 11.507046700s, submitted: 56
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e14772d800 session 0x55e149db7860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e148da2000 session 0x55e149ddc780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118259712 unmapped: 37404672 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:31.181397+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118259712 unmapped: 37404672 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:32.181553+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e148da2c00 session 0x55e149db7e00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14bada400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118267904 unmapped: 37396480 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e14a169c00 session 0x55e149ddda40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:33.181691+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e14bada400 session 0x55e149db6000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 heartbeat osd_stat(store_statfs(0x1b87cb000/0x0/0x1bfc00000, data 0x2dc9375/0x2ec3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [0,1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e14a169c00 session 0x55e149ddc000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118415360 unmapped: 37249024 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e14772d800 session 0x55e14a07d2c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1436358 data_alloc: 285212672 data_used: 10350592
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:34.181816+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e148da2000 session 0x55e14c7eb860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e148da2c00 session 0x55e149db7680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e14772d800 session 0x55e14a33e780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 ms_handle_reset con 0x55e148da2000 session 0x55e14a31a1e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118464512 unmapped: 37199872 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:35.182013+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 heartbeat osd_stat(store_statfs(0x1b73de000/0x0/0x1bfc00000, data 0x41b82b1/0x42b0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 157 handle_osd_map epochs [158,158], i have 157, src has [1,158]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118431744 unmapped: 37232640 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 158 ms_handle_reset con 0x55e14a169c00 session 0x55e14a31a3c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14bada400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 158 ms_handle_reset con 0x55e14bada400 session 0x55e14c1d34a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:36.182179+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 158 ms_handle_reset con 0x55e14a168c00 session 0x55e14c1d3c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 158 heartbeat osd_stat(store_statfs(0x1b7bba000/0x0/0x1bfc00000, data 0x39d8c56/0x3ad3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118423552 unmapped: 37240832 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:37.182337+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 158 ms_handle_reset con 0x55e14772d800 session 0x55e14b95bc20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118423552 unmapped: 37240832 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:38.182499+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118423552 unmapped: 37240832 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1443501 data_alloc: 285212672 data_used: 10362880
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 158 ms_handle_reset con 0x55e148da2000 session 0x55e14b95b680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:39.182700+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14bada400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 158 ms_handle_reset con 0x55e14a169c00 session 0x55e1490df0e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 158 handle_osd_map epochs [159,159], i have 158, src has [1,159]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 159 ms_handle_reset con 0x55e14bada400 session 0x55e14a07c960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118464512 unmapped: 37199872 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:40.182986+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8e1400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 159 ms_handle_reset con 0x55e14a63d800 session 0x55e148d383c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.326197624s of 10.040191650s, submitted: 144
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 159 ms_handle_reset con 0x55e14a8e1400 session 0x55e14a07c5a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 159 ms_handle_reset con 0x55e14772d800 session 0x55e14a8e3a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118513664 unmapped: 37150720 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:41.183108+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 159 ms_handle_reset con 0x55e148da2000 session 0x55e1490e0d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14bada400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 159 ms_handle_reset con 0x55e14a169c00 session 0x55e149db7e00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 118513664 unmapped: 37150720 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:42.183254+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 159 heartbeat osd_stat(store_statfs(0x1b7bb6000/0x0/0x1bfc00000, data 0x39db199/0x3ad8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 159 handle_osd_map epochs [159,160], i have 159, src has [1,160]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 160 ms_handle_reset con 0x55e14bada400 session 0x55e14a07cf00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119562240 unmapped: 36102144 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14bada400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:43.183358+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 160 handle_osd_map epochs [161,161], i have 160, src has [1,161]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119570432 unmapped: 36093952 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1460862 data_alloc: 285212672 data_used: 10387456
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:44.183510+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e14772d800 session 0x55e14a07c000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e148da2000 session 0x55e149db7680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e14bada400 session 0x55e149db6000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e14a169c00 session 0x55e14b3530e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119578624 unmapped: 36085760 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:45.183645+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8e1400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e14a8e1400 session 0x55e14b6d5860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e14772d800 session 0x55e14c7eb860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119570432 unmapped: 36093952 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 heartbeat osd_stat(store_statfs(0x1b7bac000/0x0/0x1bfc00000, data 0x39dfce1/0x3ae2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:46.183851+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e148da2000 session 0x55e14a0cf4a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e14a169c00 session 0x55e14a0ceb40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 heartbeat osd_stat(store_statfs(0x1b7bad000/0x0/0x1bfc00000, data 0x39dfcd1/0x3ae1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119578624 unmapped: 36085760 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:47.184000+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119586816 unmapped: 36077568 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14bada400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e14bada400 session 0x55e14a0cf860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:48.184162+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 heartbeat osd_stat(store_statfs(0x1b7bab000/0x0/0x1bfc00000, data 0x39dfd44/0x3ae3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119586816 unmapped: 36077568 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1462559 data_alloc: 285212672 data_used: 10391552
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:49.184286+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14b648400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e14b648400 session 0x55e14a312b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119603200 unmapped: 36061184 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:50.184448+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e14772d800 session 0x55e149097a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.861013412s of 10.135841370s, submitted: 55
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e148da2000 session 0x55e149096960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119603200 unmapped: 36061184 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:51.184585+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e14a169c00 session 0x55e14b95a780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14b648400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e14b648400 session 0x55e14c1d34a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14bada400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119611392 unmapped: 36052992 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 ms_handle_reset con 0x55e14bada400 session 0x55e14a31a1e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:52.184832+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 161 handle_osd_map epochs [161,162], i have 161, src has [1,162]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 162 ms_handle_reset con 0x55e14772d800 session 0x55e14c7ea780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119611392 unmapped: 36052992 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 162 heartbeat osd_stat(store_statfs(0x1b77a7000/0x0/0x1bfc00000, data 0x39e218c/0x3ae6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 162 ms_handle_reset con 0x55e148da2000 session 0x55e14a31a3c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:53.184968+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119627776 unmapped: 36036608 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 162 ms_handle_reset con 0x55e14a169c00 session 0x55e14a8c61e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1468645 data_alloc: 285212672 data_used: 10403840
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:54.185134+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 162 heartbeat osd_stat(store_statfs(0x1b77a8000/0x0/0x1bfc00000, data 0x39e214c/0x3ae6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14b648400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119627776 unmapped: 36036608 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:55.185259+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 162 handle_osd_map epochs [163,163], i have 162, src has [1,163]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14b648400 session 0x55e1490e7c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119644160 unmapped: 36020224 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:56.185389+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14bada400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14bada400 session 0x55e14a354000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119644160 unmapped: 36020224 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:57.185527+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14772d800 session 0x55e148d38d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e148da2000 session 0x55e14a366780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119644160 unmapped: 36020224 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:58.185688+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b77a1000/0x0/0x1bfc00000, data 0x39e46f1/0x3aec000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14a169c00 session 0x55e14a8e23c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14b648400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119652352 unmapped: 36012032 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14b648400 session 0x55e14a8c7c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1472715 data_alloc: 285212672 data_used: 10416128
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:59.185847+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da3800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e148da3800 session 0x55e14a8c6f00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14772d800 session 0x55e14b350960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119668736 unmapped: 35995648 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:00.186046+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119668736 unmapped: 35995648 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:01.186186+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119668736 unmapped: 35995648 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 11.145006180s of 11.488328934s, submitted: 89
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e148da2000 session 0x55e14a8e25a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:02.186389+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14a169c00 session 0x55e14a31a5a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119693312 unmapped: 35971072 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:03.186535+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b779f000/0x0/0x1bfc00000, data 0x39e99ad/0x3aef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14b648400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14b648400 session 0x55e149db6d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119717888 unmapped: 35946496 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1477539 data_alloc: 285212672 data_used: 10420224
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:04.186692+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119726080 unmapped: 35938304 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:05.186831+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8e0400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14a8e0400 session 0x55e14a08f860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 119775232 unmapped: 35889152 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14772d800 session 0x55e1490e0000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:06.186969+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e148da2000 session 0x55e14a3125a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120627200 unmapped: 35037184 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14a169c00 session 0x55e14b95a1e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:07.187126+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b6ef5000/0x0/0x1bfc00000, data 0x4293a6c/0x4399000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120610816 unmapped: 35053568 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:08.187277+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8e0400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14a8e0400 session 0x55e14b350000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120684544 unmapped: 34979840 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1551084 data_alloc: 285212672 data_used: 10420224
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:09.187404+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b6ef4000/0x0/0x1bfc00000, data 0x4293a7c/0x439a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120684544 unmapped: 34979840 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14b648400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:10.187583+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14b648400 session 0x55e14a8e3c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14772d800 session 0x55e14a07c3c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120717312 unmapped: 34947072 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:11.187781+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b6eee000/0x0/0x1bfc00000, data 0x4299a83/0x439f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e148da2000 session 0x55e14b352960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120840192 unmapped: 34824192 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 ms_handle_reset con 0x55e14a169c00 session 0x55e14a2543c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:12.187949+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120840192 unmapped: 34824192 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:13.188137+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b7780000/0x0/0x1bfc00000, data 0x3a08d1d/0x3b0e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 11.165637016s of 11.785794258s, submitted: 132
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120840192 unmapped: 34824192 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1485784 data_alloc: 285212672 data_used: 10420224
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:14.188288+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b7780000/0x0/0x1bfc00000, data 0x3a08d1d/0x3b0e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120856576 unmapped: 34807808 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:15.188431+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120897536 unmapped: 34766848 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:16.188607+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120897536 unmapped: 34766848 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:17.188819+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120897536 unmapped: 34766848 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:18.188938+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b7778000/0x0/0x1bfc00000, data 0x3a11392/0x3b16000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120897536 unmapped: 34766848 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1483444 data_alloc: 285212672 data_used: 10420224
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:19.189058+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120897536 unmapped: 34766848 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:20.189259+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:21.189462+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:22.189651+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:23.189838+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b7776000/0x0/0x1bfc00000, data 0x3a127ca/0x3b18000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:24.190023+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1483892 data_alloc: 285212672 data_used: 10420224
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:25.190171+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 11.475461006s of 11.552788734s, submitted: 18
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:26.190368+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:27.190561+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b7775000/0x0/0x1bfc00000, data 0x3a1382b/0x3b19000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:28.190741+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:29.190961+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1486662 data_alloc: 285212672 data_used: 10420224
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:30.191197+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:31.191401+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b776f000/0x0/0x1bfc00000, data 0x3a18bf4/0x3b1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:32.191617+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b776f000/0x0/0x1bfc00000, data 0x3a18bf4/0x3b1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:33.191813+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b776e000/0x0/0x1bfc00000, data 0x3a19a6f/0x3b20000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:34.192008+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1485974 data_alloc: 285212672 data_used: 10420224
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:35.192188+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b776e000/0x0/0x1bfc00000, data 0x3a19a6f/0x3b20000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.942619324s of 10.003501892s, submitted: 15
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120905728 unmapped: 34758656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:36.192391+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8e0400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b776c000/0x0/0x1bfc00000, data 0x3a1c382/0x3b22000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120922112 unmapped: 34742272 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:37.192544+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120922112 unmapped: 34742272 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:38.192712+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120930304 unmapped: 34734080 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:39.192889+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1487490 data_alloc: 285212672 data_used: 10420224
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 heartbeat osd_stat(store_statfs(0x1b7764000/0x0/0x1bfc00000, data 0x3a23da4/0x3b2a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120987648 unmapped: 34676736 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:40.193096+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 163 handle_osd_map epochs [163,164], i have 163, src has [1,164]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 164 handle_osd_map epochs [164,164], i have 164, src has [1,164]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 164 handle_osd_map epochs [164,164], i have 164, src has [1,164]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 164 heartbeat osd_stat(store_statfs(0x1b775b000/0x0/0x1bfc00000, data 0x3a2cab5/0x3b33000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 120995840 unmapped: 34668544 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:41.194448+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 164 ms_handle_reset con 0x55e14c17d800 session 0x55e14b3521e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122085376 unmapped: 33579008 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:42.194605+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8de400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.2 total, 600.0 interval
                                                          Cumulative writes: 10K writes, 39K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s
                                                          Cumulative WAL: 10K writes, 3013 syncs, 3.41 writes per sync, written: 0.03 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 5151 writes, 16K keys, 5151 commit groups, 1.0 writes per commit group, ingest: 14.16 MB, 0.02 MB/s
                                                          Interval WAL: 5151 writes, 2234 syncs, 2.31 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122118144 unmapped: 33546240 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:43.194818+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 164 handle_osd_map epochs [165,165], i have 164, src has [1,165]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 164 heartbeat osd_stat(store_statfs(0x1b7748000/0x0/0x1bfc00000, data 0x3a3af15/0x3b46000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [0,0,1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 164 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 164 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 164 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 164 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 165 ms_handle_reset con 0x55e14a8de400 session 0x55e14b353c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 165 heartbeat osd_stat(store_statfs(0x1b7742000/0x0/0x1bfc00000, data 0x3a3d4c6/0x3b4b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122159104 unmapped: 33505280 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:44.195012+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1507141 data_alloc: 285212672 data_used: 10444800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122159104 unmapped: 33505280 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:45.195192+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8de400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.661272049s of 10.000679970s, submitted: 95
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 165 handle_osd_map epochs [166,166], i have 165, src has [1,166]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 166 ms_handle_reset con 0x55e14a8de400 session 0x55e14b351860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122167296 unmapped: 33497088 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:46.195360+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 166 ms_handle_reset con 0x55e14772d800 session 0x55e14b350f00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122167296 unmapped: 33497088 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:47.195585+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 166 handle_osd_map epochs [166,167], i have 166, src has [1,167]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 167 heartbeat osd_stat(store_statfs(0x1b7730000/0x0/0x1bfc00000, data 0x3a4cba3/0x3b5d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 167 ms_handle_reset con 0x55e148da2000 session 0x55e1490e1c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122183680 unmapped: 33480704 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:48.195741+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122183680 unmapped: 33480704 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:49.195940+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1510827 data_alloc: 285212672 data_used: 10457088
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 167 handle_osd_map epochs [168,168], i have 167, src has [1,168]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 168 ms_handle_reset con 0x55e14a169c00 session 0x55e14a2e2960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 168 ms_handle_reset con 0x55e14c17d800 session 0x55e14a2e3860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122208256 unmapped: 33456128 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:50.196151+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 168 handle_osd_map epochs [169,169], i have 168, src has [1,169]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 168 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 169 ms_handle_reset con 0x55e14c17d800 session 0x55e14a2e2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122290176 unmapped: 33374208 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:51.196297+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 169 handle_osd_map epochs [170,170], i have 169, src has [1,170]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 170 heartbeat osd_stat(store_statfs(0x1b7709000/0x0/0x1bfc00000, data 0x3a6c2f1/0x3b83000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 170 ms_handle_reset con 0x55e14772d800 session 0x55e14a3552c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122273792 unmapped: 33390592 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:52.196463+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 170 handle_osd_map epochs [171,171], i have 170, src has [1,171]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 171 ms_handle_reset con 0x55e148da2000 session 0x55e14a3554a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 171 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 171 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122388480 unmapped: 33275904 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:53.196594+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 171 handle_osd_map epochs [172,172], i have 171, src has [1,172]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 172 ms_handle_reset con 0x55e14a169c00 session 0x55e14a8c7680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122421248 unmapped: 33243136 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:54.196723+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1525171 data_alloc: 285212672 data_used: 10457088
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122421248 unmapped: 33243136 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:55.196821+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8de400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 172 handle_osd_map epochs [173,173], i have 172, src has [1,173]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 173 handle_osd_map epochs [172,173], i have 173, src has [1,173]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.469422340s of 10.001818657s, submitted: 145
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 173 ms_handle_reset con 0x55e14a8de400 session 0x55e14a8c63c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 173 heartbeat osd_stat(store_statfs(0x1b76f7000/0x0/0x1bfc00000, data 0x3a79c4d/0x3b95000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:56.196971+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122421248 unmapped: 33243136 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8de400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 173 ms_handle_reset con 0x55e14a8de400 session 0x55e149db6d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:57.197197+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122437632 unmapped: 33226752 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 173 handle_osd_map epochs [174,174], i have 173, src has [1,174]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 174 ms_handle_reset con 0x55e14772d800 session 0x55e149db7e00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:58.197347+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122445824 unmapped: 33218560 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 174 heartbeat osd_stat(store_statfs(0x1b76de000/0x0/0x1bfc00000, data 0x3a923d1/0x3baf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 174 ms_handle_reset con 0x55e148da2000 session 0x55e14a07c960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:59.197564+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122454016 unmapped: 33210368 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1537681 data_alloc: 285212672 data_used: 10469376
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 174 handle_osd_map epochs [174,175], i have 174, src has [1,175]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 175 ms_handle_reset con 0x55e14a169c00 session 0x55e14a07c5a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:00.197795+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122437632 unmapped: 33226752 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 175 ms_handle_reset con 0x55e14c17d800 session 0x55e14a07c000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 175 ms_handle_reset con 0x55e14c17d800 session 0x55e1490e0000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:01.197937+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122445824 unmapped: 33218560 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:02.198074+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122445824 unmapped: 33218560 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 175 handle_osd_map epochs [176,176], i have 175, src has [1,176]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 176 heartbeat osd_stat(store_statfs(0x1b76d0000/0x0/0x1bfc00000, data 0x3aa1e89/0x3bbd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:03.198262+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122462208 unmapped: 33202176 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:04.198440+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123510784 unmapped: 32153600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1545094 data_alloc: 285212672 data_used: 10469376
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 176 heartbeat osd_stat(store_statfs(0x1b76b3000/0x0/0x1bfc00000, data 0x3abc699/0x3bda000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:05.198615+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123510784 unmapped: 32153600 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.556632996s of 10.004342079s, submitted: 135
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Got map version 49
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:06.198793+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122953728 unmapped: 32710656 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 176 ms_handle_reset con 0x55e14772d800 session 0x55e1490e0d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:07.198960+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122904576 unmapped: 32759808 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 176 handle_osd_map epochs [177,177], i have 176, src has [1,177]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 177 heartbeat osd_stat(store_statfs(0x1b768e000/0x0/0x1bfc00000, data 0x3ae1bdf/0x3c00000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:08.199145+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 122896384 unmapped: 32768000 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 177 ms_handle_reset con 0x55e148da2000 session 0x55e14a8e25a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 177 ms_handle_reset con 0x55e14a169c00 session 0x55e14a8e3c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:09.199297+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123019264 unmapped: 32645120 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1547058 data_alloc: 285212672 data_used: 10481664
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:10.199521+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123117568 unmapped: 32546816 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 177 heartbeat osd_stat(store_statfs(0x1b767e000/0x0/0x1bfc00000, data 0x3af1817/0x3c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 177 handle_osd_map epochs [178,178], i have 177, src has [1,178]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:11.199660+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123150336 unmapped: 32514048 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:12.199846+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123224064 unmapped: 32440320 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 178 heartbeat osd_stat(store_statfs(0x1b7671000/0x0/0x1bfc00000, data 0x3afc006/0x3c1c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:13.200027+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123281408 unmapped: 32382976 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:14.200170+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123281408 unmapped: 32382976 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1552474 data_alloc: 285212672 data_used: 10493952
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:15.200317+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123289600 unmapped: 32374784 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:16.200526+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.402454376s of 10.740973473s, submitted: 108
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123297792 unmapped: 32366592 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8de400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 178 ms_handle_reset con 0x55e14a8de400 session 0x55e148d38d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:17.200679+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123297792 unmapped: 32366592 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 178 handle_osd_map epochs [179,179], i have 178, src has [1,179]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 heartbeat osd_stat(store_statfs(0x1b765a000/0x0/0x1bfc00000, data 0x3b13adb/0x3c34000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8de400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14a8de400 session 0x55e148d383c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:18.200842+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14772d800 session 0x55e14a2e2960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123330560 unmapped: 32333824 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:19.200997+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123330560 unmapped: 32333824 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1559708 data_alloc: 285212672 data_used: 10506240
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:20.201268+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123330560 unmapped: 32333824 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:21.201389+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123338752 unmapped: 32325632 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:22.201562+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123338752 unmapped: 32325632 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:23.201811+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123338752 unmapped: 32325632 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 heartbeat osd_stat(store_statfs(0x1b764b000/0x0/0x1bfc00000, data 0x3b22100/0x3c43000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:24.202006+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123363328 unmapped: 32301056 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1559022 data_alloc: 285212672 data_used: 10506240
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:25.202147+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123363328 unmapped: 32301056 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 heartbeat osd_stat(store_statfs(0x1b763c000/0x0/0x1bfc00000, data 0x3b31691/0x3c52000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:26.202294+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123363328 unmapped: 32301056 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.248447418s of 10.407351494s, submitted: 54
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e148da2000 session 0x55e14b350960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 heartbeat osd_stat(store_statfs(0x1b763d000/0x0/0x1bfc00000, data 0x3b315f6/0x3c51000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:27.202500+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123371520 unmapped: 32292864 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14a169c00 session 0x55e14b350000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:28.202666+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123379712 unmapped: 32284672 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14c17d800 session 0x55e14b350f00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:29.202819+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123379712 unmapped: 32284672 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1558577 data_alloc: 285212672 data_used: 10506240
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:30.203046+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 heartbeat osd_stat(store_statfs(0x1b763c000/0x0/0x1bfc00000, data 0x3b315f6/0x3c51000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123396096 unmapped: 32268288 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:31.203273+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123396096 unmapped: 32268288 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14772d800 session 0x55e14a8c7c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:32.203449+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e148da2000 session 0x55e14a366780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123404288 unmapped: 32260096 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14a169c00 session 0x55e14a354000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:33.203609+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123420672 unmapped: 32243712 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8de400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14a8de400 session 0x55e1490e7c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e148ba2400 session 0x55e14a24cb40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:34.203805+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123428864 unmapped: 32235520 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1570490 data_alloc: 285212672 data_used: 10506240
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 heartbeat osd_stat(store_statfs(0x1b7637000/0x0/0x1bfc00000, data 0x3b31b98/0x3c57000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:35.203957+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123428864 unmapped: 32235520 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e148ba2400 session 0x55e14a24c000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:36.204131+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123428864 unmapped: 32235520 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.989691734s of 10.091907501s, submitted: 26
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14772d800 session 0x55e14a24d0e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14a169c00 session 0x55e14a8c61e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8de400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e148da2000 session 0x55e14a24c3c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:37.204295+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14a8de400 session 0x55e149224960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123502592 unmapped: 32161792 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14772d800 session 0x55e14909c000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e148ba2400 session 0x55e14909de00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e148da2000 session 0x55e14a8c70e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14a169c00 session 0x55e14909da40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14bc34c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a2b3800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:38.204430+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123535360 unmapped: 32129024 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14a2b3800 session 0x55e14b6d45a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14bc34c00 session 0x55e14909cb40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14772d800 session 0x55e14c26f2c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:39.204576+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 heartbeat osd_stat(store_statfs(0x1b7633000/0x0/0x1bfc00000, data 0x3b392be/0x3c5b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e148ba2400 session 0x55e14c26e960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123535360 unmapped: 32129024 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1569010 data_alloc: 285212672 data_used: 10506240
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:40.204838+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123535360 unmapped: 32129024 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:41.204992+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123346944 unmapped: 32317440 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:42.205191+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123346944 unmapped: 32317440 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:43.205379+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 heartbeat osd_stat(store_statfs(0x1b7630000/0x0/0x1bfc00000, data 0x3b3d2f7/0x3c5e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123346944 unmapped: 32317440 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:44.205561+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 123346944 unmapped: 32317440 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1568406 data_alloc: 285212672 data_used: 10506240
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:45.205751+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124395520 unmapped: 31268864 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:46.205951+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124444672 unmapped: 31219712 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e148da2000 session 0x55e14c26f4a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:47.206138+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124444672 unmapped: 31219712 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.726561546s of 11.258630753s, submitted: 134
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14a169c00 session 0x55e14c26e780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:48.206306+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e14772d800 session 0x55e14a3552c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124469248 unmapped: 31195136 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 heartbeat osd_stat(store_statfs(0x1b7621000/0x0/0x1bfc00000, data 0x3b4d3cc/0x3c6d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:49.206463+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124567552 unmapped: 31096832 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1569976 data_alloc: 285212672 data_used: 10506240
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:50.206624+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 ms_handle_reset con 0x55e148ba2400 session 0x55e14a3554a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124575744 unmapped: 31088640 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 179 handle_osd_map epochs [180,180], i have 179, src has [1,180]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:51.206853+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124633088 unmapped: 31031296 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:52.207048+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124633088 unmapped: 31031296 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 180 ms_handle_reset con 0x55e148da2000 session 0x55e14b350960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14bc34c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 180 ms_handle_reset con 0x55e14bc34c00 session 0x55e14b350000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:53.207243+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 31014912 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 180 ms_handle_reset con 0x55e14a5df800 session 0x55e14a2e3860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 180 heartbeat osd_stat(store_statfs(0x1b760b000/0x0/0x1bfc00000, data 0x3b60263/0x3c82000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:54.207398+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124649472 unmapped: 31014912 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1578616 data_alloc: 285212672 data_used: 10518528
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:55.207561+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 180 ms_handle_reset con 0x55e14a5df800 session 0x55e14a2e2960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124690432 unmapped: 30973952 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 180 ms_handle_reset con 0x55e14772d800 session 0x55e148d38f00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:56.207719+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124698624 unmapped: 30965760 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:57.207894+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 180 handle_osd_map epochs [181,181], i have 180, src has [1,181]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124698624 unmapped: 30965760 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.765831947s of 10.120155334s, submitted: 123
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 181 ms_handle_reset con 0x55e148ba2400 session 0x55e148d383c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:58.208091+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124715008 unmapped: 30949376 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:59.208239+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 181 heartbeat osd_stat(store_statfs(0x1b75f5000/0x0/0x1bfc00000, data 0x3b72f3f/0x3c98000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124715008 unmapped: 30949376 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1584976 data_alloc: 285212672 data_used: 10530816
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 181 ms_handle_reset con 0x55e148da2000 session 0x55e14a8e25a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14bc34c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:00.208440+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124723200 unmapped: 30941184 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 181 ms_handle_reset con 0x55e14bc34c00 session 0x55e14a8e3c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 181 heartbeat osd_stat(store_statfs(0x1b75f4000/0x0/0x1bfc00000, data 0x3b75cd1/0x3c9a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:01.208622+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14bc34c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 181 ms_handle_reset con 0x55e14bc34c00 session 0x55e1490e0d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124723200 unmapped: 30941184 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 181 ms_handle_reset con 0x55e14772d800 session 0x55e14a07c5a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:02.208873+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124747776 unmapped: 30916608 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 181 ms_handle_reset con 0x55e148ba2400 session 0x55e14a07c000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:03.209056+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 181 ms_handle_reset con 0x55e148da2000 session 0x55e149db6d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124772352 unmapped: 30892032 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:04.209212+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124772352 unmapped: 30892032 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1585935 data_alloc: 285212672 data_used: 10530816
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8df000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:05.209374+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 181 handle_osd_map epochs [182,182], i have 181, src has [1,182]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 181 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124788736 unmapped: 30875648 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:06.209533+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 182 heartbeat osd_stat(store_statfs(0x1b6438000/0x0/0x1bfc00000, data 0x3b8ef41/0x3cb5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 124829696 unmapped: 30834688 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 182 handle_osd_map epochs [183,183], i have 182, src has [1,183]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 183 ms_handle_reset con 0x55e14a8df000 session 0x55e14b352d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:07.209667+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 183 handle_osd_map epochs [183,184], i have 183, src has [1,184]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 125894656 unmapped: 29769728 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 184 ms_handle_reset con 0x55e148ba2400 session 0x55e14b350f00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:08.209901+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 184 handle_osd_map epochs [184,185], i have 184, src has [1,185]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.203419685s of 10.435039520s, submitted: 73
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 125911040 unmapped: 29753344 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 185 ms_handle_reset con 0x55e14772d800 session 0x55e14b6d4960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 185 ms_handle_reset con 0x55e14a5df800 session 0x55e14a313e00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:09.210498+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 125911040 unmapped: 29753344 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1609022 data_alloc: 285212672 data_used: 10555392
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:10.210726+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 125919232 unmapped: 29745152 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 185 handle_osd_map epochs [186,186], i have 185, src has [1,186]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 186 ms_handle_reset con 0x55e148da2000 session 0x55e14909d4a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:11.210859+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 186 heartbeat osd_stat(store_statfs(0x1b6405000/0x0/0x1bfc00000, data 0x3bb889c/0x3ce8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 125984768 unmapped: 29679616 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8df000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:12.211010+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 186 handle_osd_map epochs [187,187], i have 186, src has [1,187]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 187 ms_handle_reset con 0x55e14a8df000 session 0x55e1490dfe00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8df000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 126787584 unmapped: 28876800 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 187 ms_handle_reset con 0x55e14a8df000 session 0x55e1490df680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:13.212284+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 126787584 unmapped: 28876800 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Got map version 50
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:14.212448+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 187 heartbeat osd_stat(store_statfs(0x1b63f6000/0x0/0x1bfc00000, data 0x3bc9042/0x3cf7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 127000576 unmapped: 28663808 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1615754 data_alloc: 285212672 data_used: 10579968
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:15.212607+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 128049152 unmapped: 27615232 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:16.212847+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 128049152 unmapped: 27615232 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:17.213055+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 187 handle_osd_map epochs [188,188], i have 187, src has [1,188]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 128049152 unmapped: 27615232 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:18.213295+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 188 handle_osd_map epochs [188,189], i have 188, src has [1,189]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 189 heartbeat osd_stat(store_statfs(0x1b63d4000/0x0/0x1bfc00000, data 0x3be9213/0x3d19000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 128049152 unmapped: 27615232 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.061913490s of 10.517796516s, submitted: 444
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:19.213501+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 128057344 unmapped: 27607040 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1621508 data_alloc: 285212672 data_used: 10608640
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:20.213682+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 128057344 unmapped: 27607040 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 189 ms_handle_reset con 0x55e14772d800 session 0x55e149097c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 189 ms_handle_reset con 0x55e148ba2400 session 0x55e14b95af00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:21.213938+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 189 heartbeat osd_stat(store_statfs(0x1b5fce000/0x0/0x1bfc00000, data 0x3beed7f/0x3d1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 128090112 unmapped: 27574272 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:22.214063+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 189 handle_osd_map epochs [189,190], i have 189, src has [1,190]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 129155072 unmapped: 26509312 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 190 ms_handle_reset con 0x55e148da2000 session 0x55e14a0cef00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:23.214278+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 129155072 unmapped: 26509312 heap: 155664384 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 190 ms_handle_reset con 0x55e14a5df800 session 0x55e14c7eab40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 190 ms_handle_reset con 0x55e14a5df800 session 0x55e14a07c960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:24.214414+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 190 ms_handle_reset con 0x55e14772d800 session 0x55e14c1d3680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 190 heartbeat osd_stat(store_statfs(0x1b5fb9000/0x0/0x1bfc00000, data 0x3bfcbd6/0x3d34000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 129163264 unmapped: 34897920 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1692805 data_alloc: 285212672 data_used: 10629120
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:25.214571+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 190 ms_handle_reset con 0x55e148da2000 session 0x55e14a8c63c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17d400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 190 ms_handle_reset con 0x55e14c17d400 session 0x55e1490e7c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 129187840 unmapped: 34873344 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:26.214694+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 129335296 unmapped: 34725888 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:27.214838+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 190 handle_osd_map epochs [191,191], i have 190, src has [1,191]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 145096704 unmapped: 18964480 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:28.214967+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 191 heartbeat osd_stat(store_statfs(0x1b479b000/0x0/0x1bfc00000, data 0x541991c/0x5552000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,5] op hist [0,0,0,0,0,0,0,0,1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137846784 unmapped: 26214400 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.519798279s of 10.133808136s, submitted: 119
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:29.215098+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2036775 data_alloc: 285212672 data_used: 10645504
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 129515520 unmapped: 34545664 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:30.215322+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 129540096 unmapped: 34521088 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 191 handle_osd_map epochs [191,192], i have 191, src has [1,192]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 192 handle_osd_map epochs [192,192], i have 192, src has [1,192]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 192 handle_osd_map epochs [192,192], i have 192, src has [1,192]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:31.215503+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 192 heartbeat osd_stat(store_statfs(0x1b1771000/0x0/0x1bfc00000, data 0x844169f/0x857c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138059776 unmapped: 26001408 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:32.215668+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 129736704 unmapped: 34324480 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:33.215860+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138272768 unmapped: 25788416 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:34.215971+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2368851 data_alloc: 285212672 data_used: 10657792
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 129974272 unmapped: 34086912 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:35.216106+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 192 heartbeat osd_stat(store_statfs(0x1af747000/0x0/0x1bfc00000, data 0xa46c499/0xa5a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 139583488 unmapped: 24477696 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:36.216226+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 139837440 unmapped: 24223744 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:37.216347+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 192 handle_osd_map epochs [193,193], i have 192, src has [1,193]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain systemd-journald[47810]: Data hash table of /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal has a fill level at 75.0 (53723 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Dec 06 10:32:58 np0005548789.localdomain systemd-journald[47810]: /run/log/journal/4b30904fc4748c16d0c72dbebcabab49/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 131497984 unmapped: 32563200 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:38.216481+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 131514368 unmapped: 32546816 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:39.216685+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.045259476s of 10.059768677s, submitted: 118
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2594437 data_alloc: 285212672 data_used: 10670080
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 131571712 unmapped: 32489472 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 193 heartbeat osd_stat(store_statfs(0x1ad703000/0x0/0x1bfc00000, data 0xc4b11b8/0xc5eb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:40.217091+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 140075008 unmapped: 23986176 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:41.217245+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 140156928 unmapped: 23904256 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:42.217410+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 140263424 unmapped: 23797760 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:43.217540+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 193 heartbeat osd_stat(store_statfs(0x1aaee9000/0x0/0x1bfc00000, data 0xeccad9c/0xee05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 131997696 unmapped: 32063488 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:44.217726+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 193 heartbeat osd_stat(store_statfs(0x1a9ee9000/0x0/0x1bfc00000, data 0xfccad9c/0xfe05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3034533 data_alloc: 285212672 data_used: 10670080
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 140509184 unmapped: 23552000 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:45.217886+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 132169728 unmapped: 31891456 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:46.218025+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 132235264 unmapped: 31825920 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:47.218185+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 140689408 unmapped: 23371776 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:48.218339+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 132489216 unmapped: 31571968 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:49.218524+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.399555206s of 10.005765915s, submitted: 54
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3318111 data_alloc: 285212672 data_used: 10670080
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 141123584 unmapped: 22937600 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 193 heartbeat osd_stat(store_statfs(0x1a7e7c000/0x0/0x1bfc00000, data 0x12d1553e/0x12e52000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:50.218727+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 133947392 unmapped: 30113792 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:51.218891+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Got map version 51
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 134373376 unmapped: 29687808 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:52.219035+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 134529024 unmapped: 29532160 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 193 heartbeat osd_stat(store_statfs(0x1a564e000/0x0/0x1bfc00000, data 0x1553e938/0x15680000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:53.219238+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 142966784 unmapped: 21094400 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:54.219387+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3761059 data_alloc: 285212672 data_used: 10670080
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 134676480 unmapped: 29384704 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:55.219536+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 193 heartbeat osd_stat(store_statfs(0x1a362c000/0x0/0x1bfc00000, data 0x17562962/0x176a2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 143147008 unmapped: 20914176 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:56.219815+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 134823936 unmapped: 29237248 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:57.220075+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 193 heartbeat osd_stat(store_statfs(0x1a2e1c000/0x0/0x1bfc00000, data 0x17d731a1/0x17eb2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14b648c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 193 ms_handle_reset con 0x55e14b648c00 session 0x55e14b6d45a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 135053312 unmapped: 29007872 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:58.220242+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 134979584 unmapped: 29081600 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:59.220363+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.228947639s of 10.077434540s, submitted: 76
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 193 ms_handle_reset con 0x55e14772d800 session 0x55e14b6d4960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 193 heartbeat osd_stat(store_statfs(0x1a15ff000/0x0/0x1bfc00000, data 0x195909f4/0x196cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4042973 data_alloc: 285212672 data_used: 10670080
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 143458304 unmapped: 20602880 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17d400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 193 ms_handle_reset con 0x55e14c17d400 session 0x55e14b350f00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:00.220542+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 193 ms_handle_reset con 0x55e14a5df800 session 0x55e14c26ed20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 193 handle_osd_map epochs [193,194], i have 193, src has [1,194]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a02fc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e146a2bc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 143761408 unmapped: 20299776 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 194 ms_handle_reset con 0x55e146a2bc00 session 0x55e149db6d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:01.220690+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63d000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 194 ms_handle_reset con 0x55e14a63d000 session 0x55e148d383c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 194 handle_osd_map epochs [195,195], i have 194, src has [1,195]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 135446528 unmapped: 28614656 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 ms_handle_reset con 0x55e14a02fc00 session 0x55e149096000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:02.220819+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 ms_handle_reset con 0x55e148da2000 session 0x55e14b350960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 heartbeat osd_stat(store_statfs(0x19dc40000/0x0/0x1bfc00000, data 0x1bda83e9/0x1beec000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137052160 unmapped: 27009024 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:03.220964+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63d000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 ms_handle_reset con 0x55e14a63d000 session 0x55e14a2e3860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e146a2bc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137117696 unmapped: 26943488 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:04.221103+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 ms_handle_reset con 0x55e146a2bc00 session 0x55e14a2e2960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4436401 data_alloc: 285212672 data_used: 10682368
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137125888 unmapped: 26935296 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:05.221252+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 ms_handle_reset con 0x55e14a5df800 session 0x55e14a24de00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 ms_handle_reset con 0x55e14772d800 session 0x55e14909cf00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e146a2bc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137314304 unmapped: 26746880 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 ms_handle_reset con 0x55e146a2bc00 session 0x55e14a3552c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:06.221455+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 ms_handle_reset con 0x55e148da2000 session 0x55e14a8c61e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a02fc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 heartbeat osd_stat(store_statfs(0x19bb1e000/0x0/0x1bfc00000, data 0x1decafb4/0x1e010000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 ms_handle_reset con 0x55e14a02fc00 session 0x55e14a0cb0e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63d000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137445376 unmapped: 26615808 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17d400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 ms_handle_reset con 0x55e14c17d400 session 0x55e14a3663c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:07.221588+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17d400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e146a2bc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 ms_handle_reset con 0x55e14c17d400 session 0x55e14c7ead20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 195 handle_osd_map epochs [196,196], i have 195, src has [1,196]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 196 ms_handle_reset con 0x55e14a5df800 session 0x55e14a8e21e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 196 ms_handle_reset con 0x55e14772d800 session 0x55e1490de1e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137691136 unmapped: 26370048 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:08.221734+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 196 ms_handle_reset con 0x55e148da2000 session 0x55e14a354d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 196 handle_osd_map epochs [196,197], i have 196, src has [1,197]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 197 ms_handle_reset con 0x55e146a2bc00 session 0x55e14a8e2b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 197 ms_handle_reset con 0x55e148ba2400 session 0x55e1490dde00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 197 ms_handle_reset con 0x55e14772d800 session 0x55e14c26f860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 197 ms_handle_reset con 0x55e14a63d000 session 0x55e14c1d25a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137797632 unmapped: 26263552 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 197 heartbeat osd_stat(store_statfs(0x19b405000/0x0/0x1bfc00000, data 0x1e5df6f4/0x1e725000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:09.221876+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 197 ms_handle_reset con 0x55e148da2000 session 0x55e14b353a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 8.629086494s of 10.206071854s, submitted: 251
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4622226 data_alloc: 285212672 data_used: 10698752
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137797632 unmapped: 26263552 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:10.222069+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17d400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a02fc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 197 ms_handle_reset con 0x55e14a02fc00 session 0x55e14a8e2b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 197 handle_osd_map epochs [198,198], i have 197, src has [1,198]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 198 ms_handle_reset con 0x55e14c17d400 session 0x55e14c1d30e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 198 heartbeat osd_stat(store_statfs(0x19b402000/0x0/0x1bfc00000, data 0x1e5e22a9/0x1e72b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [0,0,2])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137609216 unmapped: 26451968 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:11.222221+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 198 heartbeat osd_stat(store_statfs(0x19b402000/0x0/0x1bfc00000, data 0x1e5e22a9/0x1e72b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [1,0,0,1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 198 ms_handle_reset con 0x55e14772d800 session 0x55e14a4f5680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 198 ms_handle_reset con 0x55e148ba2400 session 0x55e14909d4a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148da2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 198 ms_handle_reset con 0x55e14a5df800 session 0x55e14a3130e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63d000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c314c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 198 ms_handle_reset con 0x55e148da2000 session 0x55e14a93ef00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 198 ms_handle_reset con 0x55e14c314c00 session 0x55e14b6d4960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 136355840 unmapped: 27705344 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 198 ms_handle_reset con 0x55e14772d800 session 0x55e14c1d2f00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:12.222421+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 198 handle_osd_map epochs [198,199], i have 198, src has [1,199]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 136314880 unmapped: 27746304 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 199 ms_handle_reset con 0x55e14a63d000 session 0x55e14a07dc20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:13.222714+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 136314880 unmapped: 27746304 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:14.222972+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 199 handle_osd_map epochs [200,200], i have 199, src has [1,200]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 200 ms_handle_reset con 0x55e148ba2400 session 0x55e14a8c6780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1790710 data_alloc: 285212672 data_used: 10719232
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 136388608 unmapped: 27672576 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:15.223142+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 200 ms_handle_reset con 0x55e14a5df800 session 0x55e14a8c65a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 200 handle_osd_map epochs [201,201], i have 200, src has [1,201]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137461760 unmapped: 26599424 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:16.223323+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 201 ms_handle_reset con 0x55e14a5df800 session 0x55e14b6d4d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 201 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 201 ms_handle_reset con 0x55e14772d800 session 0x55e14a8c6b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 201 heartbeat osd_stat(store_statfs(0x1b5bbc000/0x0/0x1bfc00000, data 0x3e2a787/0x3f72000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137469952 unmapped: 26591232 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:17.223539+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 201 handle_osd_map epochs [202,202], i have 201, src has [1,202]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137502720 unmapped: 26558464 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63d000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c314c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:18.223689+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 202 ms_handle_reset con 0x55e14c314c00 session 0x55e148d383c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 202 handle_osd_map epochs [203,203], i have 202, src has [1,203]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 202 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 203 ms_handle_reset con 0x55e148ba2400 session 0x55e14a8e30e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17d400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8d5000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 203 ms_handle_reset con 0x55e14c17d400 session 0x55e14a4f4000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138715136 unmapped: 25346048 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:19.223867+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 203 ms_handle_reset con 0x55e14772d800 session 0x55e149ddc780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 203 handle_osd_map epochs [204,204], i have 203, src has [1,204]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 8.866982460s of 10.045821190s, submitted: 354
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 204 ms_handle_reset con 0x55e14a8d5000 session 0x55e14a07c960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 204 ms_handle_reset con 0x55e14a63d000 session 0x55e14a8e3c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1821711 data_alloc: 285212672 data_used: 10747904
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 204 ms_handle_reset con 0x55e148ba2400 session 0x55e149ddcd20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138772480 unmapped: 25288704 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:20.224098+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c314c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 204 ms_handle_reset con 0x55e14c314c00 session 0x55e14b95af00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 204 handle_osd_map epochs [204,205], i have 204, src has [1,205]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 205 ms_handle_reset con 0x55e14a5df800 session 0x55e14c1dcd20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138805248 unmapped: 25255936 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:21.224351+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 205 handle_osd_map epochs [206,206], i have 205, src has [1,206]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63d000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 206 ms_handle_reset con 0x55e14a63d000 session 0x55e14a07cf00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 206 heartbeat osd_stat(store_statfs(0x1b5b81000/0x0/0x1bfc00000, data 0x3e56a12/0x3fac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138846208 unmapped: 25214976 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 206 ms_handle_reset con 0x55e14772d800 session 0x55e14a2543c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:22.224578+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 206 handle_osd_map epochs [206,207], i have 206, src has [1,207]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 207 ms_handle_reset con 0x55e148ba2400 session 0x55e14a24de00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138878976 unmapped: 25182208 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:23.224848+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8d5000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 207 ms_handle_reset con 0x55e14a8d5000 session 0x55e14a254780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8d5000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 207 ms_handle_reset con 0x55e14a8d5000 session 0x55e14a2552c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138895360 unmapped: 25165824 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:24.225012+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 207 handle_osd_map epochs [208,208], i have 207, src has [1,208]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 208 ms_handle_reset con 0x55e148ba2400 session 0x55e14a254f00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 208 ms_handle_reset con 0x55e14772d800 session 0x55e14a24dc20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63d000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1834621 data_alloc: 285212672 data_used: 10764288
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138944512 unmapped: 25116672 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 208 ms_handle_reset con 0x55e14a5df800 session 0x55e14a2554a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 208 ms_handle_reset con 0x55e14a63d000 session 0x55e14a354d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:25.225177+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 208 heartbeat osd_stat(store_statfs(0x1b5b77000/0x0/0x1bfc00000, data 0x3e5f0af/0x3fb6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 208 ms_handle_reset con 0x55e14772d800 session 0x55e14a0bbe00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138977280 unmapped: 25083904 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:26.225318+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 208 ms_handle_reset con 0x55e148ba2400 session 0x55e14a0ba960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138977280 unmapped: 25083904 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 208 ms_handle_reset con 0x55e14a5df800 session 0x55e14a0bb860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8d5000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:27.225450+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 208 ms_handle_reset con 0x55e14a8d5000 session 0x55e14a0bbc20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 208 handle_osd_map epochs [208,209], i have 208, src has [1,209]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 209 heartbeat osd_stat(store_statfs(0x1b5b5a000/0x0/0x1bfc00000, data 0x3e7e41f/0x3fd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 139993088 unmapped: 24068096 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:28.225676+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138944512 unmapped: 25116672 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:29.225888+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1836274 data_alloc: 285212672 data_used: 10772480
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138944512 unmapped: 25116672 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:30.226081+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.996289253s of 10.892097473s, submitted: 272
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:31.226339+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 209 heartbeat osd_stat(store_statfs(0x1b5b53000/0x0/0x1bfc00000, data 0x3e86168/0x3fdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138952704 unmapped: 25108480 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c3a9800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 209 ms_handle_reset con 0x55e14c3a9800 session 0x55e14a0bba40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:32.226488+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138952704 unmapped: 25108480 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 209 handle_osd_map epochs [209,210], i have 209, src has [1,210]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:33.226663+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138969088 unmapped: 25092096 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 210 handle_osd_map epochs [210,211], i have 210, src has [1,211]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 211 ms_handle_reset con 0x55e14772d800 session 0x55e14a4f4780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:34.226834+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138338304 unmapped: 25722880 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1847043 data_alloc: 285212672 data_used: 10788864
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:35.227033+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138338304 unmapped: 25722880 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 211 heartbeat osd_stat(store_statfs(0x1b5b35000/0x0/0x1bfc00000, data 0x3e9e934/0x3ff8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 211 handle_osd_map epochs [212,212], i have 211, src has [1,212]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 212 ms_handle_reset con 0x55e148ba2400 session 0x55e14a24dc20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 212 ms_handle_reset con 0x55e14a5df800 session 0x55e14a2552c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:36.227220+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8d5000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138362880 unmapped: 25698304 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 212 handle_osd_map epochs [212,213], i have 212, src has [1,213]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 213 ms_handle_reset con 0x55e14a8d5000 session 0x55e14a254780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a784c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 213 ms_handle_reset con 0x55e14a784c00 session 0x55e14a2543c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:37.227414+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137969664 unmapped: 26091520 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 213 handle_osd_map epochs [214,214], i have 213, src has [1,214]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 214 ms_handle_reset con 0x55e14772d800 session 0x55e14a07cf00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 214 ms_handle_reset con 0x55e148ba2400 session 0x55e14b95af00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 214 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 214 handle_osd_map epochs [213,214], i have 214, src has [1,214]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:38.227540+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 214 heartbeat osd_stat(store_statfs(0x1b570f000/0x0/0x1bfc00000, data 0x3ebdcee/0x401d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 214 ms_handle_reset con 0x55e14a5df800 session 0x55e149ddcd20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137994240 unmapped: 26066944 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:39.227689+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137994240 unmapped: 26066944 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8d5000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 214 ms_handle_reset con 0x55e14a8d5000 session 0x55e14a4f4000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1856120 data_alloc: 285212672 data_used: 10784768
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:40.227871+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 137994240 unmapped: 26066944 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.601160049s of 10.000592232s, submitted: 102
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17dc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:41.228045+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 138002432 unmapped: 26058752 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 214 handle_osd_map epochs [214,215], i have 214, src has [1,215]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 215 ms_handle_reset con 0x55e14c17dc00 session 0x55e14a8e30e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:42.228212+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 139059200 unmapped: 25001984 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 215 heartbeat osd_stat(store_statfs(0x1b56fe000/0x0/0x1bfc00000, data 0x3ed0d5d/0x402f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 215 handle_osd_map epochs [216,216], i have 215, src has [1,216]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 215 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 215 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:43.228373+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17dc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 139075584 unmapped: 24985600 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 216 ms_handle_reset con 0x55e14772d800 session 0x55e14c1d2960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 216 ms_handle_reset con 0x55e14c17dc00 session 0x55e14a8c6b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:44.228578+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 216 ms_handle_reset con 0x55e148ba2400 session 0x55e14b6d4d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 139075584 unmapped: 24985600 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 216 ms_handle_reset con 0x55e14a5df800 session 0x55e14a8c65a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8d5000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 216 heartbeat osd_stat(store_statfs(0x1b56e6000/0x0/0x1bfc00000, data 0x3ee5838/0x4047000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 216 ms_handle_reset con 0x55e14a8d5000 session 0x55e14a07dc20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1869747 data_alloc: 285212672 data_used: 10797056
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:45.228737+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 139075584 unmapped: 24985600 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:46.228934+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 139182080 unmapped: 24879104 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:47.229140+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 139182080 unmapped: 24879104 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 216 heartbeat osd_stat(store_statfs(0x1b56c3000/0x0/0x1bfc00000, data 0x3f0863d/0x406b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:48.229304+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 139182080 unmapped: 24879104 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8d5000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 216 ms_handle_reset con 0x55e14a8d5000 session 0x55e14c1d2f00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:49.229666+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 216 heartbeat osd_stat(store_statfs(0x1b56c3000/0x0/0x1bfc00000, data 0x3f08604/0x406b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 139313152 unmapped: 24748032 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 216 heartbeat osd_stat(store_statfs(0x1b56c3000/0x0/0x1bfc00000, data 0x3f08604/0x406b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 216 handle_osd_map epochs [216,217], i have 216, src has [1,217]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 216 ms_handle_reset con 0x55e148ba2400 session 0x55e14a366d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 217 ms_handle_reset con 0x55e14772d800 session 0x55e14a93ef00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1876144 data_alloc: 285212672 data_used: 10809344
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:50.230017+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 139468800 unmapped: 24592384 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.770746231s of 10.000588417s, submitted: 63
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17dc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 217 handle_osd_map epochs [218,218], i have 217, src has [1,218]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63cc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 218 ms_handle_reset con 0x55e14a5df800 session 0x55e14909cd20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 218 handle_osd_map epochs [218,218], i have 218, src has [1,218]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63b800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 218 ms_handle_reset con 0x55e14a63cc00 session 0x55e14909cf00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 218 ms_handle_reset con 0x55e14c17dc00 session 0x55e14909d4a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 218 ms_handle_reset con 0x55e14a63b800 session 0x55e14c1d3a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:51.230164+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 139493376 unmapped: 24567808 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 218 ms_handle_reset con 0x55e14772d800 session 0x55e14c1d30e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:52.230327+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 148955136 unmapped: 15106048 heap: 164061184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:53.230546+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8d5000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 154599424 unmapped: 22069248 heap: 176668672 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 218 ms_handle_reset con 0x55e14a8d5000 session 0x55e14a07c960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14e73dc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:54.230726+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 146481152 unmapped: 30187520 heap: 176668672 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 218 ms_handle_reset con 0x55e14e73dc00 session 0x55e14a8e3c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2811163 data_alloc: 285212672 data_used: 10813440
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:55.230894+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 151814144 unmapped: 24854528 heap: 176668672 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 218 heartbeat osd_stat(store_statfs(0x1ad28e000/0x0/0x1bfc00000, data 0xc337253/0xc4a0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [0,0,0,0,0,0,0,1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:56.231068+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 147652608 unmapped: 29016064 heap: 176668672 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:57.231249+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 143515648 unmapped: 33153024 heap: 176668672 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 218 handle_osd_map epochs [218,219], i have 218, src has [1,219]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:58.231366+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 147759104 unmapped: 28909568 heap: 176668672 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 heartbeat osd_stat(store_statfs(0x1a5265000/0x0/0x1bfc00000, data 0x1435cd47/0x144c8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [0,0,0,0,0,0,0,0,2])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:59.231528+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 147800064 unmapped: 28868608 heap: 176668672 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 heartbeat osd_stat(store_statfs(0x1a5265000/0x0/0x1bfc00000, data 0x1435cd47/0x144c8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4083343 data_alloc: 285212672 data_used: 10825728
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:00.231710+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 144695296 unmapped: 36175872 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 6.328874588s of 10.004235268s, submitted: 419
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:01.231879+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 ms_handle_reset con 0x55e14772d800 session 0x55e14a4f5a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 148922368 unmapped: 31948800 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 heartbeat osd_stat(store_statfs(0x19fe50000/0x0/0x1bfc00000, data 0x19773605/0x198de000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [0,0,0,0,0,1,0,1,1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:02.232068+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 149094400 unmapped: 31776768 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63b800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 ms_handle_reset con 0x55e14a63b800 session 0x55e14c1d3680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8d5000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:03.232201+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 145145856 unmapped: 35725312 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:04.232383+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 146374656 unmapped: 34496512 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 ms_handle_reset con 0x55e14a8d5000 session 0x55e14c1d2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5430131 data_alloc: 285212672 data_used: 10825728
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:05.232507+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17dc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 160399360 unmapped: 20471808 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 ms_handle_reset con 0x55e14c17dc00 session 0x55e14b350960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:06.232628+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 ms_handle_reset con 0x55e14a5df800 session 0x55e14b3525a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 148209664 unmapped: 32661504 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 ms_handle_reset con 0x55e148ba2400 session 0x55e14a8e2b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 ms_handle_reset con 0x55e14772d800 session 0x55e148c250e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63b800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 ms_handle_reset con 0x55e14a63b800 session 0x55e14a08fe00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8d5000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 ms_handle_reset con 0x55e14a5df800 session 0x55e14909d4a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 ms_handle_reset con 0x55e14a8d5000 session 0x55e14c7eb680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:07.232828+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 heartbeat osd_stat(store_statfs(0x192e29000/0x0/0x1bfc00000, data 0x26799a12/0x26905000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 148226048 unmapped: 32645120 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 heartbeat osd_stat(store_statfs(0x192e29000/0x0/0x1bfc00000, data 0x26799a12/0x26905000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [0,0,1,3,2])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 ms_handle_reset con 0x55e14772d800 session 0x55e14909cf00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:08.232976+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 149569536 unmapped: 31301632 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 ms_handle_reset con 0x55e148ba2400 session 0x55e14a8c6b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:09.233131+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 149569536 unmapped: 31301632 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 219 handle_osd_map epochs [219,220], i have 219, src has [1,220]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 220 ms_handle_reset con 0x55e14a5df800 session 0x55e14c1d2960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2056539 data_alloc: 285212672 data_used: 10842112
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:10.233351+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 6.904662609s of 10.003268242s, submitted: 423
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 148840448 unmapped: 32030720 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 220 heartbeat osd_stat(store_statfs(0x1b5603000/0x0/0x1bfc00000, data 0x3fc27dc/0x412a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 220 handle_osd_map epochs [221,221], i have 220, src has [1,221]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 220 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63b800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:11.233588+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 148848640 unmapped: 32022528 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 221 handle_osd_map epochs [221,222], i have 221, src has [1,222]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 222 ms_handle_reset con 0x55e14a63b800 session 0x55e14a4f4000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:12.233808+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 149274624 unmapped: 31596544 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a8d5000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 222 ms_handle_reset con 0x55e14a8d5000 session 0x55e14a07cf00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:13.234030+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 149250048 unmapped: 31621120 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Got map version 52
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 222 handle_osd_map epochs [223,223], i have 222, src has [1,223]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 223 ms_handle_reset con 0x55e14772d800 session 0x55e14a2543c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:14.234216+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 223 ms_handle_reset con 0x55e148ba2400 session 0x55e14a2552c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 149299200 unmapped: 31571968 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:15.234411+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2069645 data_alloc: 285212672 data_used: 10854400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 149381120 unmapped: 31490048 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 223 heartbeat osd_stat(store_statfs(0x1b55c9000/0x0/0x1bfc00000, data 0x3ff5eac/0x4164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:16.234714+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 149438464 unmapped: 31432704 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 223 handle_osd_map epochs [224,224], i have 223, src has [1,224]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 224 ms_handle_reset con 0x55e14a5df800 session 0x55e14a4f4780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:17.234937+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 149438464 unmapped: 31432704 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 224 handle_osd_map epochs [224,225], i have 224, src has [1,225]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63b800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17dc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:18.235086+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 225 ms_handle_reset con 0x55e14c17dc00 session 0x55e14a93f2c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 225 ms_handle_reset con 0x55e14a63b800 session 0x55e14a0bba40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63b800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 225 ms_handle_reset con 0x55e14a63b800 session 0x55e14a0ba960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 149479424 unmapped: 31391744 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 225 heartbeat osd_stat(store_statfs(0x1b55a3000/0x0/0x1bfc00000, data 0x401577a/0x4189000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 225 ms_handle_reset con 0x55e14772d800 session 0x55e14a354d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:19.235198+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 225 ms_handle_reset con 0x55e148ba2400 session 0x55e14a24c000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 150626304 unmapped: 30244864 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 225 ms_handle_reset con 0x55e14a5df800 session 0x55e14a4f5860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17dc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:20.235442+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 225 ms_handle_reset con 0x55e14c17dc00 session 0x55e14b351a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2086511 data_alloc: 285212672 data_used: 10870784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17dc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.308485031s of 10.009181976s, submitted: 480
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 150462464 unmapped: 30408704 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 225 ms_handle_reset con 0x55e14c17dc00 session 0x55e149225c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:21.235972+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 225 ms_handle_reset con 0x55e14a168800 session 0x55e14909c960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 225 handle_osd_map epochs [226,226], i have 225, src has [1,226]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 150462464 unmapped: 30408704 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 226 ms_handle_reset con 0x55e14772d800 session 0x55e1490e1c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:22.236135+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 226 handle_osd_map epochs [227,227], i have 226, src has [1,227]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 227 handle_osd_map epochs [226,227], i have 227, src has [1,227]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 227 ms_handle_reset con 0x55e148ba2400 session 0x55e14a0ce1e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 150732800 unmapped: 30138368 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 227 handle_osd_map epochs [228,228], i have 227, src has [1,228]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 228 handle_osd_map epochs [226,228], i have 228, src has [1,228]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 228 ms_handle_reset con 0x55e14a5df800 session 0x55e14b6d5c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:23.236340+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 228 heartbeat osd_stat(store_statfs(0x1b555e000/0x0/0x1bfc00000, data 0x405704c/0x41ce000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 150732800 unmapped: 30138368 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 228 ms_handle_reset con 0x55e14a5df800 session 0x55e1490e0d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 228 ms_handle_reset con 0x55e14772d800 session 0x55e14909d860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:24.236477+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 228 ms_handle_reset con 0x55e148ba2400 session 0x55e148d38d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 150732800 unmapped: 30138368 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c17dc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 228 ms_handle_reset con 0x55e14a168800 session 0x55e1490e6000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:25.236681+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 228 handle_osd_map epochs [228,229], i have 228, src has [1,229]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2200215 data_alloc: 285212672 data_used: 10907648
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63b800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 229 ms_handle_reset con 0x55e14c17dc00 session 0x55e149ddcd20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 229 ms_handle_reset con 0x55e14a63b800 session 0x55e1490dc3c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 150142976 unmapped: 30728192 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:26.236940+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 229 ms_handle_reset con 0x55e148ba2400 session 0x55e14909dc20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 229 ms_handle_reset con 0x55e14772d800 session 0x55e1490e0d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 150167552 unmapped: 30703616 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 229 heartbeat osd_stat(store_statfs(0x1b484f000/0x0/0x1bfc00000, data 0x4d6560b/0x4ede000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:27.237147+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 229 ms_handle_reset con 0x55e14a5df800 session 0x55e14c7eba40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14e73cc00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 229 handle_osd_map epochs [229,230], i have 229, src has [1,230]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 150331392 unmapped: 30539776 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14d1d1c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 230 ms_handle_reset con 0x55e14e73cc00 session 0x55e14a2554a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 230 ms_handle_reset con 0x55e14d1d1c00 session 0x55e14a3663c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:28.237315+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 150667264 unmapped: 30203904 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 230 ms_handle_reset con 0x55e148ba2400 session 0x55e14a31b2c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 230 handle_osd_map epochs [231,231], i have 230, src has [1,231]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a5df800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 231 ms_handle_reset con 0x55e14772d800 session 0x55e14b350780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 231 ms_handle_reset con 0x55e14a168800 session 0x55e14a0ce1e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:29.237602+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 150691840 unmapped: 30179328 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 231 handle_osd_map epochs [232,232], i have 231, src has [1,232]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 232 ms_handle_reset con 0x55e14a5df800 session 0x55e149097c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 232 ms_handle_reset con 0x55e14772d800 session 0x55e14a0ba960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:30.237900+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2336995 data_alloc: 285212672 data_used: 10919936
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 232 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 8.624765396s of 10.019362450s, submitted: 293
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 232 heartbeat osd_stat(store_statfs(0x1b3538000/0x0/0x1bfc00000, data 0x5c7517d/0x5df4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 150708224 unmapped: 30162944 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 232 ms_handle_reset con 0x55e148ba2400 session 0x55e14a0bba40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 232 ms_handle_reset con 0x55e14a168800 session 0x55e14a93f2c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:31.238090+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14d1d1c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 150724608 unmapped: 30146560 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a63b800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c315000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 232 ms_handle_reset con 0x55e14a169400 session 0x55e14c7eb680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148d22000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:32.238281+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 232 ms_handle_reset con 0x55e14a63b800 session 0x55e14a31a960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 152199168 unmapped: 28672000 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 232 ms_handle_reset con 0x55e148d22000 session 0x55e14909d4a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 232 handle_osd_map epochs [232,233], i have 232, src has [1,233]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 233 ms_handle_reset con 0x55e14c315000 session 0x55e149096d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 233 ms_handle_reset con 0x55e14d1d1c00 session 0x55e14a2552c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:33.238539+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 152215552 unmapped: 28655616 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 233 handle_osd_map epochs [234,234], i have 233, src has [1,234]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:34.238712+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 152084480 unmapped: 28786688 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 234 handle_osd_map epochs [235,235], i have 234, src has [1,235]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 235 ms_handle_reset con 0x55e14772d800 session 0x55e14a8e2b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 235 handle_osd_map epochs [234,235], i have 235, src has [1,235]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 235 handle_osd_map epochs [234,235], i have 235, src has [1,235]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 235 handle_osd_map epochs [234,235], i have 235, src has [1,235]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:35.238869+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 235 ms_handle_reset con 0x55e148ba2400 session 0x55e14b3525a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2350249 data_alloc: 285212672 data_used: 10932224
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 152199168 unmapped: 28672000 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 235 heartbeat osd_stat(store_statfs(0x1b3514000/0x0/0x1bfc00000, data 0x5c9530a/0x5e18000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:36.239048+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 152199168 unmapped: 28672000 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 235 heartbeat osd_stat(store_statfs(0x1b34ef000/0x0/0x1bfc00000, data 0x5cb880a/0x5e3d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:37.239217+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 235 handle_osd_map epochs [236,236], i have 235, src has [1,236]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 236 handle_osd_map epochs [236,236], i have 236, src has [1,236]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 236 handle_osd_map epochs [236,237], i have 236, src has [1,237]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 152387584 unmapped: 28483584 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 237 handle_osd_map epochs [236,237], i have 237, src has [1,237]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 237 ms_handle_reset con 0x55e148ba2400 session 0x55e14c1d2000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:38.239437+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 152453120 unmapped: 28418048 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:39.239620+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 237 handle_osd_map epochs [238,238], i have 237, src has [1,238]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 238 ms_handle_reset con 0x55e14772d800 session 0x55e14a4f5a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148d22000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 152518656 unmapped: 28352512 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 238 ms_handle_reset con 0x55e148d22000 session 0x55e14a354d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:40.239859+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2158421 data_alloc: 285212672 data_used: 10956800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 152616960 unmapped: 28254208 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c315000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.457736969s of 10.424562454s, submitted: 286
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 238 ms_handle_reset con 0x55e14c315000 session 0x55e148c24780
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 238 heartbeat osd_stat(store_statfs(0x1b50cb000/0x0/0x1bfc00000, data 0x40dd688/0x4262000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:41.240122+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 238 handle_osd_map epochs [238,239], i have 238, src has [1,239]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 152674304 unmapped: 28196864 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:42.240323+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 239 handle_osd_map epochs [239,240], i have 239, src has [1,240]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 152682496 unmapped: 28188672 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 240 heartbeat osd_stat(store_statfs(0x1b50a7000/0x0/0x1bfc00000, data 0x40fef1e/0x4285000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:43.240537+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 240 heartbeat osd_stat(store_statfs(0x1b50a7000/0x0/0x1bfc00000, data 0x40fef1e/0x4285000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 153731072 unmapped: 27140096 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:44.240690+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 153731072 unmapped: 27140096 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 240 heartbeat osd_stat(store_statfs(0x1b5092000/0x0/0x1bfc00000, data 0x4113e99/0x429c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:45.240885+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2171315 data_alloc: 285212672 data_used: 10969088
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 240 heartbeat osd_stat(store_statfs(0x1b507a000/0x0/0x1bfc00000, data 0x412c6c0/0x42b4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 153763840 unmapped: 27107328 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 240 heartbeat osd_stat(store_statfs(0x1b507a000/0x0/0x1bfc00000, data 0x412c6c0/0x42b4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 240 handle_osd_map epochs [241,241], i have 240, src has [1,241]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 240 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 240 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:46.241042+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 153305088 unmapped: 27566080 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:47.241360+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 153387008 unmapped: 27484160 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 241 handle_osd_map epochs [242,242], i have 241, src has [1,242]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14d1d1c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 ms_handle_reset con 0x55e14d1d1c00 session 0x55e149db6d20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 ms_handle_reset con 0x55e14772d800 session 0x55e14c1dc5a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 ms_handle_reset con 0x55e148ba2400 session 0x55e14a0bb860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148d22000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 ms_handle_reset con 0x55e148d22000 session 0x55e14c7ebc20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:48.241508+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 heartbeat osd_stat(store_statfs(0x1b5049000/0x0/0x1bfc00000, data 0x4157b36/0x42e3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c315000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 159326208 unmapped: 21544960 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 heartbeat osd_stat(store_statfs(0x1b5049000/0x0/0x1bfc00000, data 0x4157b36/0x42e3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 ms_handle_reset con 0x55e14c315000 session 0x55e14a3552c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 ms_handle_reset con 0x55e14a168800 session 0x55e14c7eb860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 ms_handle_reset con 0x55e14a168800 session 0x55e149db7c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 ms_handle_reset con 0x55e14772d800 session 0x55e14b351a40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 ms_handle_reset con 0x55e148ba2400 session 0x55e14a254000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:49.241952+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 155197440 unmapped: 25673728 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:50.242156+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2258368 data_alloc: 285212672 data_used: 10989568
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 heartbeat osd_stat(store_statfs(0x1b46fa000/0x0/0x1bfc00000, data 0x4aa6ba9/0x4c34000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 155049984 unmapped: 25821184 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 heartbeat osd_stat(store_statfs(0x1b46fa000/0x0/0x1bfc00000, data 0x4aa6ba9/0x4c34000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:51.242391+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 155049984 unmapped: 25821184 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148d22000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.511168480s of 11.158887863s, submitted: 225
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 ms_handle_reset con 0x55e148d22000 session 0x55e14a93ef00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:52.242596+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c315000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 155049984 unmapped: 25821184 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 242 handle_osd_map epochs [243,243], i have 242, src has [1,243]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:53.242864+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 handle_osd_map epochs [243,243], i have 243, src has [1,243]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 155156480 unmapped: 25714688 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:54.243046+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 155418624 unmapped: 25452544 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 heartbeat osd_stat(store_statfs(0x1b46b1000/0x0/0x1bfc00000, data 0x4aeb96e/0x4c7c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:55.243218+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2291409 data_alloc: 285212672 data_used: 14184448
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 157196288 unmapped: 23674880 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:56.243418+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 157196288 unmapped: 23674880 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:57.243617+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 157319168 unmapped: 23552000 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 heartbeat osd_stat(store_statfs(0x1b4671000/0x0/0x1bfc00000, data 0x4b2bffa/0x4cbd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:58.243846+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 157728768 unmapped: 23142400 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:59.244014+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 157728768 unmapped: 23142400 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:00.244256+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2303349 data_alloc: 301989888 data_used: 15208448
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 157917184 unmapped: 22953984 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:01.244450+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 158154752 unmapped: 22716416 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 heartbeat osd_stat(store_statfs(0x1b4632000/0x0/0x1bfc00000, data 0x4b6a413/0x4cfc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.803024292s of 10.055063248s, submitted: 63
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:02.244670+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 158236672 unmapped: 22634496 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:03.244839+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 158359552 unmapped: 22511616 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:04.244989+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 158556160 unmapped: 22315008 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:05.245152+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2310345 data_alloc: 301989888 data_used: 15208448
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 158720000 unmapped: 22151168 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:06.245307+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 164192256 unmapped: 16678912 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:07.245493+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 163217408 unmapped: 17653760 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 heartbeat osd_stat(store_statfs(0x1b3bb4000/0x0/0x1bfc00000, data 0x55e74ad/0x577a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:08.245641+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 163241984 unmapped: 17629184 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:09.245807+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 163241984 unmapped: 17629184 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:10.246003+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2395961 data_alloc: 301989888 data_used: 15212544
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 163274752 unmapped: 17596416 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:11.246236+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 163274752 unmapped: 17596416 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:12.246416+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 heartbeat osd_stat(store_statfs(0x1b3ac8000/0x0/0x1bfc00000, data 0x56d31e0/0x5866000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 163405824 unmapped: 17465344 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.174256325s of 11.011490822s, submitted: 163
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:13.246554+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 163749888 unmapped: 17121280 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:14.246736+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 163749888 unmapped: 17121280 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:15.246950+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2404809 data_alloc: 301989888 data_used: 15216640
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 164913152 unmapped: 15958016 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 heartbeat osd_stat(store_statfs(0x1b3a8c000/0x0/0x1bfc00000, data 0x570ee60/0x58a2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:16.247198+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165101568 unmapped: 15769600 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:17.247385+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 164044800 unmapped: 16826368 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 heartbeat osd_stat(store_statfs(0x1b3a4a000/0x0/0x1bfc00000, data 0x5750469/0x58e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:18.247599+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 164044800 unmapped: 16826368 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 heartbeat osd_stat(store_statfs(0x1b3a4a000/0x0/0x1bfc00000, data 0x5750469/0x58e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:19.247752+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 163848192 unmapped: 17022976 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:20.248042+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2416951 data_alloc: 301989888 data_used: 15216640
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 163962880 unmapped: 16908288 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:21.248244+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 163962880 unmapped: 16908288 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 heartbeat osd_stat(store_statfs(0x1b39d7000/0x0/0x1bfc00000, data 0x57c19f9/0x5957000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:22.248404+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 163954688 unmapped: 16916480 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:23.248605+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 163954688 unmapped: 16916480 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:24.248850+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 heartbeat osd_stat(store_statfs(0x1b39a7000/0x0/0x1bfc00000, data 0x57f2de7/0x5987000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 163954688 unmapped: 16916480 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:25.248993+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2419079 data_alloc: 301989888 data_used: 15216640
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 11.968339920s of 12.320811272s, submitted: 76
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165322752 unmapped: 15548416 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 heartbeat osd_stat(store_statfs(0x1b39a7000/0x0/0x1bfc00000, data 0x57f2de7/0x5987000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 handle_osd_map epochs [244,244], i have 243, src has [1,244]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 243 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:26.249206+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165322752 unmapped: 15548416 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:27.249369+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165322752 unmapped: 15548416 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:28.249585+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165453824 unmapped: 15417344 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:29.249817+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165462016 unmapped: 15409152 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 244 heartbeat osd_stat(store_statfs(0x1b3929000/0x0/0x1bfc00000, data 0x586e818/0x5a05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:30.250070+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2429809 data_alloc: 301989888 data_used: 15228928
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165462016 unmapped: 15409152 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14e512c00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:31.250204+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165584896 unmapped: 15286272 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 244 handle_osd_map epochs [245,245], i have 244, src has [1,245]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba3400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:32.250401+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 245 ms_handle_reset con 0x55e148ba3400 session 0x55e148d39e00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 245 handle_osd_map epochs [245,245], i have 245, src has [1,245]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 245 handle_osd_map epochs [245,245], i have 245, src has [1,245]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 245 handle_osd_map epochs [245,246], i have 245, src has [1,246]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba3400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 246 ms_handle_reset con 0x55e148ba3400 session 0x55e149db7860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165666816 unmapped: 15204352 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 246 ms_handle_reset con 0x55e14e512c00 session 0x55e14a8c6f00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:33.250580+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165675008 unmapped: 15196160 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 246 heartbeat osd_stat(store_statfs(0x1b274d000/0x0/0x1bfc00000, data 0x58a170d/0x5a3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:34.250830+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165675008 unmapped: 15196160 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 246 handle_osd_map epochs [247,247], i have 246, src has [1,247]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:35.251005+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 247 ms_handle_reset con 0x55e148ba2400 session 0x55e14a31a3c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.500386238s of 10.003231049s, submitted: 127
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2449900 data_alloc: 301989888 data_used: 15253504
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165888000 unmapped: 14983168 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 247 handle_osd_map epochs [248,248], i have 247, src has [1,248]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 248 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:36.251199+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 248 ms_handle_reset con 0x55e14772d800 session 0x55e14b352000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165888000 unmapped: 14983168 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:37.251389+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 166019072 unmapped: 14852096 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 248 heartbeat osd_stat(store_statfs(0x1b2714000/0x0/0x1bfc00000, data 0x58d6ec2/0x5a78000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:38.251565+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148d22000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167116800 unmapped: 13754368 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 248 handle_osd_map epochs [249,249], i have 248, src has [1,249]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:39.251689+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167124992 unmapped: 13746176 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Got map version 53
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:40.251871+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2460802 data_alloc: 301989888 data_used: 15265792
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 166535168 unmapped: 14336000 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:41.252030+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 249 handle_osd_map epochs [249,250], i have 249, src has [1,250]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148d22800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 166535168 unmapped: 14336000 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 250 ms_handle_reset con 0x55e148d22800 session 0x55e14b95b860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:42.252190+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 250 handle_osd_map epochs [251,251], i have 250, src has [1,251]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 166813696 unmapped: 14057472 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 251 ms_handle_reset con 0x55e14a168800 session 0x55e149224960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:43.252414+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 251 heartbeat osd_stat(store_statfs(0x1b265f000/0x0/0x1bfc00000, data 0x5980746/0x5b2c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167124992 unmapped: 13746176 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:44.252614+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165879808 unmapped: 14991360 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 251 heartbeat osd_stat(store_statfs(0x1b265f000/0x0/0x1bfc00000, data 0x5980634/0x5b2b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:45.252816+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.475018501s of 10.001503944s, submitted: 136
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2487050 data_alloc: 301989888 data_used: 15282176
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 251 handle_osd_map epochs [252,252], i have 251, src has [1,252]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 252 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 252 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167043072 unmapped: 13828096 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:46.253077+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 252 heartbeat osd_stat(store_statfs(0x1b2609000/0x0/0x1bfc00000, data 0x59d7b18/0x5b84000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167231488 unmapped: 13639680 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:47.253322+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 252 handle_osd_map epochs [253,253], i have 252, src has [1,253]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167256064 unmapped: 13615104 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 253 heartbeat osd_stat(store_statfs(0x1b2608000/0x0/0x1bfc00000, data 0x59db556/0x5b86000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,2,3,5] op hist [1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 253 ms_handle_reset con 0x55e14772d800 session 0x55e14a8e23c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:48.253511+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 253 handle_osd_map epochs [254,254], i have 253, src has [1,254]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167305216 unmapped: 13565952 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:49.253676+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 254 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 254 ms_handle_reset con 0x55e14a169400 session 0x55e14c26e5a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167305216 unmapped: 13565952 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 254 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:50.253925+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 254 ms_handle_reset con 0x55e14c315000 session 0x55e14a8c6960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2495042 data_alloc: 301989888 data_used: 15314944
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 254 handle_osd_map epochs [255,255], i have 254, src has [1,255]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165978112 unmapped: 14893056 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:51.254134+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 255 ms_handle_reset con 0x55e148ba2400 session 0x55e14c26fe00
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 255 heartbeat osd_stat(store_statfs(0x1b39be000/0x0/0x1bfc00000, data 0x462128a/0x47c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 165945344 unmapped: 14925824 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:52.254315+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 255 handle_osd_map epochs [255,256], i have 255, src has [1,256]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 166035456 unmapped: 14835712 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:53.254491+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 256 heartbeat osd_stat(store_statfs(0x1b3989000/0x0/0x1bfc00000, data 0x46591b0/0x4804000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 256 handle_osd_map epochs [257,257], i have 256, src has [1,257]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 256 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167084032 unmapped: 13787136 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:54.254657+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167084032 unmapped: 13787136 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:55.254840+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.108334541s of 10.006304741s, submitted: 314
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2335254 data_alloc: 285212672 data_used: 11141120
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 257 handle_osd_map epochs [257,258], i have 257, src has [1,258]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 258 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 166780928 unmapped: 14090240 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:56.255029+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 258 heartbeat osd_stat(store_statfs(0x1b3924000/0x0/0x1bfc00000, data 0x46bcb58/0x4869000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 258 handle_osd_map epochs [259,259], i have 258, src has [1,259]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 259 handle_osd_map epochs [259,259], i have 259, src has [1,259]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 166789120 unmapped: 14082048 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:57.255207+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 259 handle_osd_map epochs [259,260], i have 259, src has [1,260]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 260 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 260 ms_handle_reset con 0x55e148ba2400 session 0x55e14b351680
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 166830080 unmapped: 14041088 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 260 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 260 ms_handle_reset con 0x55e14772d800 session 0x55e14b352b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:58.255380+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167436288 unmapped: 13434880 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:59.255591+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167436288 unmapped: 13434880 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:00.255833+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2346037 data_alloc: 285212672 data_used: 11153408
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167436288 unmapped: 13434880 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:01.256031+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 260 heartbeat osd_stat(store_statfs(0x1b34e5000/0x0/0x1bfc00000, data 0x46fc4b2/0x48a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167559168 unmapped: 13312000 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:02.256249+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 260 handle_osd_map epochs [261,261], i have 260, src has [1,261]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167559168 unmapped: 13312000 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 261 handle_osd_map epochs [261,261], i have 261, src has [1,261]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:03.256472+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 261 handle_osd_map epochs [262,262], i have 261, src has [1,262]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167567360 unmapped: 13303808 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:04.256649+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 167567360 unmapped: 13303808 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 262 handle_osd_map epochs [261,262], i have 262, src has [1,262]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 262 ms_handle_reset con 0x55e14a168800 session 0x55e14a8e3860
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:05.256839+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 262 heartbeat osd_stat(store_statfs(0x1b34ba000/0x0/0x1bfc00000, data 0x4723dc7/0x48d4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.292941093s of 10.002080917s, submitted: 249
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2359399 data_alloc: 285212672 data_used: 11169792
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 169885696 unmapped: 10985472 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:06.257013+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 262 handle_osd_map epochs [262,263], i have 262, src has [1,263]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 263 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 263 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 263 ms_handle_reset con 0x55e14a169400 session 0x55e14a3552c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170016768 unmapped: 10854400 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:07.257141+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14c315000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 263 handle_osd_map epochs [263,264], i have 263, src has [1,264]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 264 handle_osd_map epochs [264,264], i have 264, src has [1,264]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 264 handle_osd_map epochs [264,264], i have 264, src has [1,264]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 264 ms_handle_reset con 0x55e14c315000 session 0x55e14a4f52c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170139648 unmapped: 10731520 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:08.257252+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 264 handle_osd_map epochs [265,265], i have 264, src has [1,265]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 265 ms_handle_reset con 0x55e14772d800 session 0x55e14a8e21e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170164224 unmapped: 10706944 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:09.257442+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170287104 unmapped: 10584064 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:10.257659+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 265 ms_handle_reset con 0x55e148ba2400 session 0x55e14a93e000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2372702 data_alloc: 285212672 data_used: 11194368
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 265 heartbeat osd_stat(store_statfs(0x1b22af000/0x0/0x1bfc00000, data 0x47833f5/0x493d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 265 handle_osd_map epochs [266,266], i have 265, src has [1,266]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 266 ms_handle_reset con 0x55e14a168800 session 0x55e14c1dc3c0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170319872 unmapped: 10551296 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:11.257838+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 266 ms_handle_reset con 0x55e14a169400 session 0x55e14c1dcd20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba3400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 266 handle_osd_map epochs [266,267], i have 266, src has [1,267]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 267 handle_osd_map epochs [267,267], i have 267, src has [1,267]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 267 handle_osd_map epochs [267,267], i have 267, src has [1,267]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 267 ms_handle_reset con 0x55e148ba3400 session 0x55e14a3130e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170336256 unmapped: 10534912 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:12.257958+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 267 handle_osd_map epochs [267,268], i have 267, src has [1,268]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 268 ms_handle_reset con 0x55e14772d800 session 0x55e1490e1c20
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 268 handle_osd_map epochs [268,268], i have 268, src has [1,268]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 268 handle_osd_map epochs [268,268], i have 268, src has [1,268]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170352640 unmapped: 10518528 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 268 heartbeat osd_stat(store_statfs(0x1b22a4000/0x0/0x1bfc00000, data 0x478d280/0x4948000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:13.258106+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 268 handle_osd_map epochs [268,269], i have 268, src has [1,269]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 269 handle_osd_map epochs [269,269], i have 269, src has [1,269]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 269 handle_osd_map epochs [269,269], i have 269, src has [1,269]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 269 ms_handle_reset con 0x55e148ba2400 session 0x55e14a0ba1e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170401792 unmapped: 10469376 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 269 handle_osd_map epochs [269,269], i have 269, src has [1,269]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:14.258282+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a168800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170409984 unmapped: 10461184 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 269 ms_handle_reset con 0x55e14a168800 session 0x55e14a2541e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14a169400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:15.258402+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 269 ms_handle_reset con 0x55e14a169400 session 0x55e14c1d2b40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.285099983s of 10.000877380s, submitted: 194
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2390729 data_alloc: 285212672 data_used: 11223040
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170655744 unmapped: 10215424 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:16.258554+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170770432 unmapped: 10100736 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:17.258730+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 269 heartbeat osd_stat(store_statfs(0x1b2240000/0x0/0x1bfc00000, data 0x47f29c8/0x49ad000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 269 handle_osd_map epochs [270,270], i have 269, src has [1,270]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 269 handle_osd_map epochs [270,270], i have 270, src has [1,270]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170803200 unmapped: 10067968 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:18.258919+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 270 heartbeat osd_stat(store_statfs(0x1b223c000/0x0/0x1bfc00000, data 0x47f4dfd/0x49b1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170827776 unmapped: 10043392 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:19.259143+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170827776 unmapped: 10043392 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:20.259396+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2397363 data_alloc: 285212672 data_used: 11223040
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170844160 unmapped: 10027008 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:21.259586+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170844160 unmapped: 10027008 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:22.259900+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 270 handle_osd_map epochs [271,271], i have 270, src has [1,271]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170655744 unmapped: 10215424 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:23.260075+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b21bf000/0x0/0x1bfc00000, data 0x486e05f/0x4a2e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170672128 unmapped: 10199040 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:24.260205+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170672128 unmapped: 10199040 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:25.260408+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b21bf000/0x0/0x1bfc00000, data 0x486e05f/0x4a2e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.806579590s of 10.001614571s, submitted: 64
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2407629 data_alloc: 285212672 data_used: 11239424
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b2191000/0x0/0x1bfc00000, data 0x489d94e/0x4a5d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170827776 unmapped: 10043392 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:26.260672+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170827776 unmapped: 10043392 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:27.260798+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170827776 unmapped: 10043392 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:28.260969+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170983424 unmapped: 9887744 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:29.261161+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b214c000/0x0/0x1bfc00000, data 0x48e34a0/0x4aa2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170983424 unmapped: 9887744 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:30.261390+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2411441 data_alloc: 285212672 data_used: 11243520
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170991616 unmapped: 9879552 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:31.261584+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170991616 unmapped: 9879552 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:32.261753+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 170991616 unmapped: 9879552 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:33.261997+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 171008000 unmapped: 9863168 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:34.262192+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 171008000 unmapped: 9863168 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:35.262385+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.870240211s of 10.004336357s, submitted: 31
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2416563 data_alloc: 285212672 data_used: 11243520
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b20ef000/0x0/0x1bfc00000, data 0x494011b/0x4aff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 171114496 unmapped: 9756672 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:36.262627+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b20ef000/0x0/0x1bfc00000, data 0x494011b/0x4aff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 171114496 unmapped: 9756672 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:37.262880+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 171122688 unmapped: 9748480 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:38.263109+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 171122688 unmapped: 9748480 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:39.263264+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 171122688 unmapped: 9748480 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:40.263439+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b206f000/0x0/0x1bfc00000, data 0x49bfac6/0x4b7f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2421537 data_alloc: 285212672 data_used: 11243520
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 172318720 unmapped: 8552448 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:41.263568+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 173506560 unmapped: 7364608 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:42.263824+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 271 ms_handle_reset con 0x55e148d22000 session 0x55e14a2e2960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 173752320 unmapped: 7118848 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:43.264149+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Got map version 54
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 174030848 unmapped: 6840320 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:44.264290+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 271 heartbeat osd_stat(store_statfs(0x1b2020000/0x0/0x1bfc00000, data 0x4a1068e/0x4bce000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 174030848 unmapped: 6840320 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:45.264429+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.759084702s of 10.004197121s, submitted: 316
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2427123 data_alloc: 285212672 data_used: 11243520
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 271 handle_osd_map epochs [271,272], i have 271, src has [1,272]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 174145536 unmapped: 6725632 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:46.264590+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 174145536 unmapped: 6725632 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:47.264807+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 272 heartbeat osd_stat(store_statfs(0x1b1fdc000/0x0/0x1bfc00000, data 0x4a52ac7/0x4c12000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 174145536 unmapped: 6725632 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:48.265004+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 174284800 unmapped: 6586368 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:49.265158+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 272 heartbeat osd_stat(store_statfs(0x1b1f96000/0x0/0x1bfc00000, data 0x4a97dbd/0x4c58000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 272 heartbeat osd_stat(store_statfs(0x1b1f96000/0x0/0x1bfc00000, data 0x4a97dbd/0x4c58000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 174391296 unmapped: 6479872 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:50.265343+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2436373 data_alloc: 285212672 data_used: 11255808
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 174473216 unmapped: 6397952 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:51.265505+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 174473216 unmapped: 6397952 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:52.265693+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 272 heartbeat osd_stat(store_statfs(0x1b1f76000/0x0/0x1bfc00000, data 0x4ab8191/0x4c78000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 173858816 unmapped: 7012352 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:53.265888+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 272 handle_osd_map epochs [273,273], i have 272, src has [1,273]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 173867008 unmapped: 7004160 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:54.266020+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 174915584 unmapped: 5955584 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:55.266139+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.005720139s of 10.328762054s, submitted: 88
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2447993 data_alloc: 285212672 data_used: 11268096
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 175144960 unmapped: 5726208 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:56.266282+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b1ea6000/0x0/0x1bfc00000, data 0x4b841de/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b1ea6000/0x0/0x1bfc00000, data 0x4b841de/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 175144960 unmapped: 5726208 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:57.266450+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 175144960 unmapped: 5726208 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:58.303834+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 174759936 unmapped: 6111232 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:59.303989+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 174759936 unmapped: 6111232 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:00.304172+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2467157 data_alloc: 285212672 data_used: 11268096
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 175013888 unmapped: 5857280 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:01.304307+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 174309376 unmapped: 6561792 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:02.304451+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b1e09000/0x0/0x1bfc00000, data 0x4c2398b/0x4de5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 175472640 unmapped: 5398528 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:03.304629+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b1dc2000/0x0/0x1bfc00000, data 0x4c6aa07/0x4e2c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 175677440 unmapped: 5193728 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:04.304852+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 175677440 unmapped: 5193728 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:05.305012+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2466745 data_alloc: 285212672 data_used: 11268096
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 176750592 unmapped: 4120576 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:06.305169+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.320298195s of 10.657917023s, submitted: 78
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 175702016 unmapped: 5169152 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:07.305310+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 177111040 unmapped: 3760128 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:08.305464+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b1d03000/0x0/0x1bfc00000, data 0x4d27b84/0x4eea000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:09.305633+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 177111040 unmapped: 3760128 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:10.305816+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 177111040 unmapped: 3760128 heap: 180871168 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b1cd3000/0x0/0x1bfc00000, data 0x4d593e7/0x4f1b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [1,0,0,1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2491173 data_alloc: 285212672 data_used: 11268096
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:11.305989+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 177487872 unmapped: 4431872 heap: 181919744 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:12.306117+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 177528832 unmapped: 4390912 heap: 181919744 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b1c47000/0x0/0x1bfc00000, data 0x4de42a5/0x4fa6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:13.306313+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 177528832 unmapped: 4390912 heap: 181919744 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:14.306467+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178749440 unmapped: 4218880 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:15.306652+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178749440 unmapped: 4218880 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2489631 data_alloc: 285212672 data_used: 11268096
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:16.306882+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178757632 unmapped: 4210688 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b1bf6000/0x0/0x1bfc00000, data 0x4e35bb3/0x4ff7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.309183121s of 10.700140953s, submitted: 85
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:17.306999+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178855936 unmapped: 4112384 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:18.307148+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178855936 unmapped: 4112384 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b1b95000/0x0/0x1bfc00000, data 0x4e94647/0x5057000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:19.307314+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178855936 unmapped: 4112384 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:20.307479+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178257920 unmapped: 4710400 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2496375 data_alloc: 285212672 data_used: 11268096
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:21.307655+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 177823744 unmapped: 5144576 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b1b57000/0x0/0x1bfc00000, data 0x4ed493d/0x5096000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:22.307817+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 177823744 unmapped: 5144576 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:23.307970+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 177823744 unmapped: 5144576 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:24.308165+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 177823744 unmapped: 5144576 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:25.308314+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 177823744 unmapped: 5144576 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2493981 data_alloc: 285212672 data_used: 11268096
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 273 heartbeat osd_stat(store_statfs(0x1b1b56000/0x0/0x1bfc00000, data 0x4ed4970/0x5096000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 273 handle_osd_map epochs [274,274], i have 273, src has [1,274]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 273 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 273 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:26.308504+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 177831936 unmapped: 5136384 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.769973755s of 10.077122688s, submitted: 90
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:27.308621+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178880512 unmapped: 4087808 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:28.308849+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178880512 unmapped: 4087808 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:29.309039+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178880512 unmapped: 4087808 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 274 heartbeat osd_stat(store_statfs(0x1b1b51000/0x0/0x1bfc00000, data 0x4ed6f73/0x509b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:30.309328+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178880512 unmapped: 4087808 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2498767 data_alloc: 285212672 data_used: 11280384
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178880512 unmapped: 4087808 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:31.803442+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 274 heartbeat osd_stat(store_statfs(0x1b1b52000/0x0/0x1bfc00000, data 0x4ed6fa6/0x509b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178880512 unmapped: 4087808 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:32.803579+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 274 handle_osd_map epochs [275,275], i have 274, src has [1,275]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178888704 unmapped: 4079616 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:33.803731+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178888704 unmapped: 4079616 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:34.803938+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178888704 unmapped: 4079616 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 275 heartbeat osd_stat(store_statfs(0x1b1b4d000/0x0/0x1bfc00000, data 0x4ed9361/0x509f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:35.804102+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2503871 data_alloc: 285212672 data_used: 11292672
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178888704 unmapped: 4079616 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:36.804229+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.891979218s of 10.009073257s, submitted: 39
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178888704 unmapped: 4079616 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:37.804462+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178905088 unmapped: 4063232 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:38.804644+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178905088 unmapped: 4063232 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:39.804829+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 275 heartbeat osd_stat(store_statfs(0x1b1b4c000/0x0/0x1bfc00000, data 0x4ed94f6/0x50a1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178905088 unmapped: 4063232 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:40.805027+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 275 handle_osd_map epochs [276,276], i have 275, src has [1,276]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2510975 data_alloc: 285212672 data_used: 11304960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 276 handle_osd_map epochs [276,276], i have 276, src has [1,276]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 276 heartbeat osd_stat(store_statfs(0x1b1b4c000/0x0/0x1bfc00000, data 0x4ed94f6/0x50a1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178929664 unmapped: 4038656 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:41.805235+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178929664 unmapped: 4038656 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:42.805412+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178929664 unmapped: 4038656 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:43.805606+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178937856 unmapped: 4030464 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:44.805819+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178937856 unmapped: 4030464 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:45.806027+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2510999 data_alloc: 285212672 data_used: 11304960
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178937856 unmapped: 4030464 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:46.806208+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 276 heartbeat osd_stat(store_statfs(0x1b1b4a000/0x0/0x1bfc00000, data 0x4edb92c/0x50a3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178937856 unmapped: 4030464 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:47.806418+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 276 handle_osd_map epochs [277,277], i have 276, src has [1,277]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.405011177s of 10.621738434s, submitted: 71
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178954240 unmapped: 4014080 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:48.806560+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178962432 unmapped: 4005888 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 277 heartbeat osd_stat(store_statfs(0x1b1b46000/0x0/0x1bfc00000, data 0x4eddc7b/0x50a6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:49.806742+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178962432 unmapped: 4005888 heap: 182968320 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:50.807024+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2514159 data_alloc: 285212672 data_used: 11317248
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178962432 unmapped: 5054464 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 277 heartbeat osd_stat(store_statfs(0x1b1b45000/0x0/0x1bfc00000, data 0x4eddd71/0x50a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:51.807191+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178962432 unmapped: 5054464 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:52.807429+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178962432 unmapped: 5054464 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 277 heartbeat osd_stat(store_statfs(0x1b1b44000/0x0/0x1bfc00000, data 0x4edde0c/0x50a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:53.807702+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178962432 unmapped: 5054464 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:54.807894+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178962432 unmapped: 5054464 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:55.808108+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2515387 data_alloc: 285212672 data_used: 11317248
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 277 handle_osd_map epochs [278,278], i have 277, src has [1,278]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178978816 unmapped: 5038080 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:56.808246+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178978816 unmapped: 5038080 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:57.808386+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.803047180s of 10.009399414s, submitted: 80
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179019776 unmapped: 4997120 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:58.808542+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 278 heartbeat osd_stat(store_statfs(0x1b1b42000/0x0/0x1bfc00000, data 0x4ee024a/0x50aa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179019776 unmapped: 4997120 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:59.808737+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179019776 unmapped: 4997120 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:00.809075+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2518101 data_alloc: 285212672 data_used: 11329536
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179019776 unmapped: 4997120 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:01.809304+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179019776 unmapped: 4997120 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:02.809570+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 278 handle_osd_map epochs [279,279], i have 278, src has [1,279]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179027968 unmapped: 4988928 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:03.809805+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 handle_osd_map epochs [279,279], i have 279, src has [1,279]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179052544 unmapped: 4964352 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3d000/0x0/0x1bfc00000, data 0x4ee26f9/0x50af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:04.809974+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179052544 unmapped: 4964352 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:05.810183+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2523015 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179060736 unmapped: 4956160 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:06.810351+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3f000/0x0/0x1bfc00000, data 0x4ee272b/0x50af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179068928 unmapped: 4947968 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:07.810521+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.987839699s of 10.120377541s, submitted: 47
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179068928 unmapped: 4947968 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:08.810711+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179068928 unmapped: 4947968 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:09.810985+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179068928 unmapped: 4947968 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:10.811189+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2522325 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179068928 unmapped: 4947968 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:11.811355+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3f000/0x0/0x1bfc00000, data 0x4ee2663/0x50ae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179068928 unmapped: 4947968 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:12.811478+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178888704 unmapped: 5128192 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:13.846010+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178905088 unmapped: 5111808 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:14.846216+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178905088 unmapped: 5111808 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:15.846349+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2524397 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178905088 unmapped: 5111808 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:16.846531+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3e000/0x0/0x1bfc00000, data 0x4ee26a4/0x50af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 178913280 unmapped: 5103616 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:17.846811+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3e000/0x0/0x1bfc00000, data 0x4ee26a4/0x50af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.133948326s of 10.280371666s, submitted: 31
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179970048 unmapped: 4046848 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3b000/0x0/0x1bfc00000, data 0x4ee275a/0x50b0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:18.847048+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179970048 unmapped: 4046848 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:19.847215+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179970048 unmapped: 4046848 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:20.847392+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2525813 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179994624 unmapped: 4022272 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:21.847572+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179994624 unmapped: 4022272 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3d000/0x0/0x1bfc00000, data 0x4ee2791/0x50b0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:22.847839+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 179994624 unmapped: 4022272 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3d000/0x0/0x1bfc00000, data 0x4ee2791/0x50b0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:23.847989+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 180002816 unmapped: 4014080 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:24.848183+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 180011008 unmapped: 4005888 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:25.848363+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2529557 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 180011008 unmapped: 4005888 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:26.848553+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 180011008 unmapped: 4005888 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:27.848779+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3d000/0x0/0x1bfc00000, data 0x4ee288a/0x50b1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 180019200 unmapped: 3997696 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:28.848918+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 180019200 unmapped: 3997696 heap: 184016896 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:29.849066+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 11.844007492s of 12.013623238s, submitted: 32
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181067776 unmapped: 3997696 heap: 185065472 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:30.849298+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2529127 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181067776 unmapped: 5046272 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:31.849487+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181067776 unmapped: 5046272 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:32.849706+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3f000/0x0/0x1bfc00000, data 0x4ee2727/0x50af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181067776 unmapped: 5046272 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:33.849955+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181075968 unmapped: 5038080 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:34.850149+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181075968 unmapped: 5038080 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:35.850319+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2528133 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181075968 unmapped: 5038080 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:36.850575+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181084160 unmapped: 5029888 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:37.850818+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3d000/0x0/0x1bfc00000, data 0x4ee272c/0x50af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181100544 unmapped: 5013504 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:38.850930+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3d000/0x0/0x1bfc00000, data 0x4ee272c/0x50af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3d000/0x0/0x1bfc00000, data 0x4ee272c/0x50af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181100544 unmapped: 5013504 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:39.851118+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.027934074s of 10.156760216s, submitted: 31
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3d000/0x0/0x1bfc00000, data 0x4ee272c/0x50af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181100544 unmapped: 5013504 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:40.851297+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2528859 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181116928 unmapped: 4997120 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3e000/0x0/0x1bfc00000, data 0x4ee26f7/0x50af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:41.851457+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181116928 unmapped: 4997120 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:42.851808+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3e000/0x0/0x1bfc00000, data 0x4ee26f7/0x50af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181116928 unmapped: 4997120 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:43.851964+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181125120 unmapped: 4988928 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:44.852147+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181125120 unmapped: 4988928 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:45.852341+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2528683 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181125120 unmapped: 4988928 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:46.852549+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3f000/0x0/0x1bfc00000, data 0x4ee2662/0x50ae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181125120 unmapped: 4988928 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:47.852703+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181125120 unmapped: 4988928 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:48.852876+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181125120 unmapped: 4988928 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:49.853088+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.890512466s of 10.000330925s, submitted: 21
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181125120 unmapped: 4988928 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:50.853291+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2528169 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181125120 unmapped: 4988928 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:51.853499+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181133312 unmapped: 4980736 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b3f000/0x0/0x1bfc00000, data 0x4ee2663/0x50ae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:52.853647+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181133312 unmapped: 4980736 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:53.853826+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181141504 unmapped: 4972544 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:54.853990+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181141504 unmapped: 4972544 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:55.854147+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2527993 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181141504 unmapped: 4972544 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:56.854321+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181157888 unmapped: 4956160 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:57.854539+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b40000/0x0/0x1bfc00000, data 0x4ee2630/0x50ae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181264384 unmapped: 4849664 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:58.854683+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181264384 unmapped: 4849664 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1b0e000/0x0/0x1bfc00000, data 0x4f108bd/0x50de000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:59.854857+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.869131088s of 10.000544548s, submitted: 28
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181264384 unmapped: 4849664 heap: 186114048 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:00.855100+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2544395 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182329344 unmapped: 4833280 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:01.855240+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181952512 unmapped: 5210112 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:02.855448+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181952512 unmapped: 5210112 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:03.855649+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1ac8000/0x0/0x1bfc00000, data 0x4f58fa2/0x5125000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [1,1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181714944 unmapped: 5447680 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:04.855841+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182181888 unmapped: 4980736 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:05.855993+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2542889 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182173696 unmapped: 4988928 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:06.856161+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182181888 unmapped: 4980736 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:07.856402+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1a50000/0x0/0x1bfc00000, data 0x4fd2b15/0x519e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182476800 unmapped: 4685824 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:08.856588+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182476800 unmapped: 4685824 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:09.856905+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.783621788s of 10.000998497s, submitted: 56
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 181420032 unmapped: 5742592 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:10.857122+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2555965 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182476800 unmapped: 4685824 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:11.857302+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b19af000/0x0/0x1bfc00000, data 0x5071262/0x523d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182476800 unmapped: 4685824 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:12.857452+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182476800 unmapped: 4685824 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:13.857714+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182657024 unmapped: 4505600 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:14.857864+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182665216 unmapped: 4497408 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:15.858027+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2559393 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182665216 unmapped: 4497408 heap: 187162624 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1567000/0x0/0x1bfc00000, data 0x50b9d84/0x5286000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:16.858219+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 183730176 unmapped: 4481024 heap: 188211200 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:17.858421+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 183730176 unmapped: 4481024 heap: 188211200 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:18.858579+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1515000/0x0/0x1bfc00000, data 0x5109a85/0x52d6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 183730176 unmapped: 4481024 heap: 188211200 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:19.858800+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.760993004s of 10.007385254s, submitted: 63
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 183730176 unmapped: 4481024 heap: 188211200 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:20.858995+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2567855 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 183730176 unmapped: 4481024 heap: 188211200 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:21.859149+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b14d3000/0x0/0x1bfc00000, data 0x514cef8/0x5319000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 183730176 unmapped: 4481024 heap: 188211200 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:22.859342+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 183984128 unmapped: 4227072 heap: 188211200 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:23.859543+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182337536 unmapped: 6922240 heap: 189259776 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:24.859744+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1477000/0x0/0x1bfc00000, data 0x51a7fa5/0x5375000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182337536 unmapped: 6922240 heap: 189259776 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:25.859937+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2568177 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 182566912 unmapped: 6692864 heap: 189259776 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:26.860100+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 183721984 unmapped: 5537792 heap: 189259776 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:27.860297+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 183730176 unmapped: 5529600 heap: 189259776 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:28.860465+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b143c000/0x0/0x1bfc00000, data 0x51e54d1/0x53b1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 183959552 unmapped: 5300224 heap: 189259776 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:29.860598+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.780093193s of 10.000996590s, submitted: 56
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 184066048 unmapped: 5193728 heap: 189259776 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:30.860875+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2583857 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 184229888 unmapped: 6078464 heap: 190308352 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:31.861048+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:32.861218+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 184229888 unmapped: 6078464 heap: 190308352 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b13dc000/0x0/0x1bfc00000, data 0x5246592/0x5412000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:33.861550+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 184229888 unmapped: 6078464 heap: 190308352 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:34.861799+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 184254464 unmapped: 6053888 heap: 190308352 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:35.861991+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 184532992 unmapped: 5775360 heap: 190308352 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2580355 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1393000/0x0/0x1bfc00000, data 0x528f0c3/0x545b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:36.862120+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 184532992 unmapped: 5775360 heap: 190308352 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:37.862458+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185589760 unmapped: 5767168 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:38.862690+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1343000/0x0/0x1bfc00000, data 0x52decf9/0x54aa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185868288 unmapped: 5488640 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:39.862902+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185868288 unmapped: 5488640 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.623038292s of 10.001656532s, submitted: 61
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1343000/0x0/0x1bfc00000, data 0x52decf9/0x54aa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:40.863104+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185589760 unmapped: 5767168 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2583277 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:41.863329+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185589760 unmapped: 5767168 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:42.863484+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185589760 unmapped: 5767168 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:43.863661+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185589760 unmapped: 5767168 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1340000/0x0/0x1bfc00000, data 0x52e2595/0x54ad000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:44.863820+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185606144 unmapped: 5750784 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:45.864003+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185606144 unmapped: 5750784 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2585349 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:46.864129+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185606144 unmapped: 5750784 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:47.864487+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185606144 unmapped: 5750784 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b133e000/0x0/0x1bfc00000, data 0x52e268b/0x54ae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:48.864661+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185606144 unmapped: 5750784 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:49.864885+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185606144 unmapped: 5750784 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 9.915193558s of 10.002569199s, submitted: 17
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:50.865151+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185606144 unmapped: 5750784 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586333 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b133f000/0x0/0x1bfc00000, data 0x52e2726/0x54af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:51.865329+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185614336 unmapped: 5742592 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b133f000/0x0/0x1bfc00000, data 0x52e2726/0x54af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:52.865898+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185614336 unmapped: 5742592 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b133d000/0x0/0x1bfc00000, data 0x52e27d8/0x54b0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:53.866056+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185614336 unmapped: 5742592 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:54.866217+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185614336 unmapped: 5742592 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:55.866394+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185630720 unmapped: 5726208 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2588357 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:56.866555+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185630720 unmapped: 5726208 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b133c000/0x0/0x1bfc00000, data 0x52e2756/0x54b0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:57.866805+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185614336 unmapped: 5742592 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:58.866955+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185614336 unmapped: 5742592 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b133c000/0x0/0x1bfc00000, data 0x52e2756/0x54b0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:59.867124+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185614336 unmapped: 5742592 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:00.882124+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185614336 unmapped: 5742592 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2588229 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:01.882335+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185614336 unmapped: 5742592 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 11.789787292s of 11.895949364s, submitted: 21
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:02.882521+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185614336 unmapped: 5742592 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:03.882680+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185614336 unmapped: 5742592 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b133f000/0x0/0x1bfc00000, data 0x52e2660/0x54af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [0,0,0,1])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:04.882949+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185622528 unmapped: 5734400 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:05.883141+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185630720 unmapped: 5726208 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1341000/0x0/0x1bfc00000, data 0x52e2595/0x54ad000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2585855 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:06.883346+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 ms_handle_reset con 0x55e147aca000 session 0x55e14a8e34a0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e14772d800
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185630720 unmapped: 5726208 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:07.883490+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185630720 unmapped: 5726208 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:08.883685+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185630720 unmapped: 5726208 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1343000/0x0/0x1bfc00000, data 0x52e24ca/0x54ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:09.883836+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185630720 unmapped: 5726208 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:10.884063+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185630720 unmapped: 5726208 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2584475 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:11.884231+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185630720 unmapped: 5726208 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:12.884396+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185638912 unmapped: 5718016 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:13.884575+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185638912 unmapped: 5718016 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1343000/0x0/0x1bfc00000, data 0x52e24ca/0x54ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:14.884710+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185638912 unmapped: 5718016 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1343000/0x0/0x1bfc00000, data 0x52e24ca/0x54ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:15.884854+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185638912 unmapped: 5718016 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2584475 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:16.885039+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185638912 unmapped: 5718016 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:17.885199+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185638912 unmapped: 5718016 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:18.885387+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185638912 unmapped: 5718016 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:19.885643+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185638912 unmapped: 5718016 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:20.885885+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185647104 unmapped: 5709824 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1343000/0x0/0x1bfc00000, data 0x52e24ca/0x54ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2584475 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:21.886104+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185655296 unmapped: 5701632 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:22.886254+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185655296 unmapped: 5701632 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1343000/0x0/0x1bfc00000, data 0x52e24ca/0x54ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:23.886413+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185655296 unmapped: 5701632 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:24.886561+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185655296 unmapped: 5701632 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:25.886857+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185655296 unmapped: 5701632 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2584475 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:26.887035+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185655296 unmapped: 5701632 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1343000/0x0/0x1bfc00000, data 0x52e24ca/0x54ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:27.887216+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 25.676549911s of 25.726005554s, submitted: 11
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185655296 unmapped: 5701632 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:28.887340+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185663488 unmapped: 5693440 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:29.887478+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185663488 unmapped: 5693440 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:30.887696+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185663488 unmapped: 5693440 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:31.887839+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2584475 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185663488 unmapped: 5693440 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1343000/0x0/0x1bfc00000, data 0x52e24ca/0x54ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:32.887990+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185663488 unmapped: 5693440 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:33.888171+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1343000/0x0/0x1bfc00000, data 0x52e24ca/0x54ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185663488 unmapped: 5693440 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:34.888393+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185663488 unmapped: 5693440 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:35.888586+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185671680 unmapped: 5685248 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:36.888800+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2584299 data_alloc: 285212672 data_used: 11341824
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185679872 unmapped: 5677056 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:37.888950+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185679872 unmapped: 5677056 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:38.891832+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185679872 unmapped: 5677056 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:39.892042+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 heartbeat osd_stat(store_statfs(0x1b1343000/0x0/0x1bfc00000, data 0x52e24ca/0x54ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185679872 unmapped: 5677056 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 12.339368820s of 12.366766930s, submitted: 5
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:40.892209+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _renew_subs
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _send_mon_message to mon.np0005548788 at v2:172.18.0.103:3300/0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 279 handle_osd_map epochs [280,280], i have 279, src has [1,280]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185696256 unmapped: 5660672 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:41.892425+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2588677 data_alloc: 285212672 data_used: 11354112
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185696256 unmapped: 5660672 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:42.892622+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.2 total, 600.0 interval
                                                          Cumulative writes: 22K writes, 87K keys, 22K commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.01 MB/s
                                                          Cumulative WAL: 22K writes, 8052 syncs, 2.84 writes per sync, written: 0.08 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 12K writes, 48K keys, 12K commit groups, 1.0 writes per commit group, ingest: 44.21 MB, 0.07 MB/s
                                                          Interval WAL: 12K writes, 5039 syncs, 2.50 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185696256 unmapped: 5660672 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 280 heartbeat osd_stat(store_statfs(0x1b133e000/0x0/0x1bfc00000, data 0x52e4a66/0x54af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:43.892853+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 280 heartbeat osd_stat(store_statfs(0x1b133e000/0x0/0x1bfc00000, data 0x52e4a66/0x54af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185696256 unmapped: 5660672 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:44.893031+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185696256 unmapped: 5660672 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:45.893228+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185696256 unmapped: 5660672 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:46.893398+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2587797 data_alloc: 285212672 data_used: 11354112
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185696256 unmapped: 5660672 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 280 heartbeat osd_stat(store_statfs(0x1b133f000/0x0/0x1bfc00000, data 0x52e4a66/0x54af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:47.893585+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 280 handle_osd_map epochs [280,281], i have 280, src has [1,281]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185704448 unmapped: 5652480 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:48.893814+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185704448 unmapped: 5652480 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:49.893978+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185704448 unmapped: 5652480 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:50.894226+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185704448 unmapped: 5652480 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:51.894363+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2591823 data_alloc: 285212672 data_used: 11366400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185704448 unmapped: 5652480 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 281 heartbeat osd_stat(store_statfs(0x1b133a000/0x0/0x1bfc00000, data 0x52e6e7f/0x54b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:52.894537+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185712640 unmapped: 5644288 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:53.894730+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185712640 unmapped: 5644288 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:54.894880+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 281 heartbeat osd_stat(store_statfs(0x1b133a000/0x0/0x1bfc00000, data 0x52e6e7f/0x54b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185712640 unmapped: 5644288 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:55.895074+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185712640 unmapped: 5644288 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:56.895205+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2591823 data_alloc: 285212672 data_used: 11366400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185712640 unmapped: 5644288 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:57.895393+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185712640 unmapped: 5644288 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 281 heartbeat osd_stat(store_statfs(0x1b133a000/0x0/0x1bfc00000, data 0x52e6e7f/0x54b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:58.895554+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185712640 unmapped: 5644288 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:59.895807+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185712640 unmapped: 5644288 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:00.896035+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185720832 unmapped: 5636096 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 281 heartbeat osd_stat(store_statfs(0x1b133a000/0x0/0x1bfc00000, data 0x52e6e7f/0x54b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:01.896168+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 281 heartbeat osd_stat(store_statfs(0x1b133a000/0x0/0x1bfc00000, data 0x52e6e7f/0x54b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2591823 data_alloc: 285212672 data_used: 11366400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185729024 unmapped: 5627904 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 281 heartbeat osd_stat(store_statfs(0x1b133a000/0x0/0x1bfc00000, data 0x52e6e7f/0x54b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:02.896557+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185729024 unmapped: 5627904 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:03.896708+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185729024 unmapped: 5627904 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:04.896857+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185729024 unmapped: 5627904 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:05.896996+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185729024 unmapped: 5627904 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 281 heartbeat osd_stat(store_statfs(0x1b133a000/0x0/0x1bfc00000, data 0x52e6e7f/0x54b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:06.897116+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2591823 data_alloc: 285212672 data_used: 11366400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185729024 unmapped: 5627904 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:07.897309+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185729024 unmapped: 5627904 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:08.897516+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185737216 unmapped: 5619712 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 281 heartbeat osd_stat(store_statfs(0x1b133a000/0x0/0x1bfc00000, data 0x52e6e7f/0x54b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:09.897717+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185737216 unmapped: 5619712 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:10.897997+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185737216 unmapped: 5619712 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:11.898195+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2591823 data_alloc: 285212672 data_used: 11366400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185745408 unmapped: 5611520 heap: 191356928 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:12.898343+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 281 heartbeat osd_stat(store_statfs(0x1b133a000/0x0/0x1bfc00000, data 0x52e6e7f/0x54b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e147aca000
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 32.518573761s of 32.653316498s, submitted: 57
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 194150400 unmapped: 13991936 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:13.898514+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185778176 unmapped: 22364160 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:14.898741+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 281 ms_handle_reset con 0x55e147aca000 session 0x55e149db61e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 281 handle_osd_map epochs [282,282], i have 281, src has [1,282]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185778176 unmapped: 22364160 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: handle_auth_request added challenge on 0x55e148ba2400
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:15.898931+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 282 handle_osd_map epochs [283,283], i have 282, src has [1,283]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185794560 unmapped: 22347776 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 283 heartbeat osd_stat(store_statfs(0x1b0335000/0x0/0x1bfc00000, data 0x62e93c2/0x64b8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 283 ms_handle_reset con 0x55e148ba2400 session 0x55e14a0bab40
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:16.899102+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 283 heartbeat osd_stat(store_statfs(0x1b2331000/0x0/0x1bfc00000, data 0x52eb939/0x54bb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2602416 data_alloc: 285212672 data_used: 11378688
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185802752 unmapped: 22339584 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:17.899353+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185802752 unmapped: 22339584 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:18.899516+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185802752 unmapped: 22339584 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:19.899717+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185802752 unmapped: 22339584 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 283 heartbeat osd_stat(store_statfs(0x1b2331000/0x0/0x1bfc00000, data 0x52eb939/0x54bb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:20.900057+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185802752 unmapped: 22339584 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:21.900586+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 283 heartbeat osd_stat(store_statfs(0x1b2331000/0x0/0x1bfc00000, data 0x52eb939/0x54bb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2602416 data_alloc: 285212672 data_used: 11378688
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185802752 unmapped: 22339584 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:22.900823+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185802752 unmapped: 22339584 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 283 handle_osd_map epochs [284,284], i have 283, src has [1,284]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 10.433533669s of 10.570907593s, submitted: 23
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:23.900969+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185819136 unmapped: 22323200 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:24.901135+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185827328 unmapped: 22315008 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:25.901294+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185827328 unmapped: 22315008 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:26.901849+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604746 data_alloc: 285212672 data_used: 11378688
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232e000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185827328 unmapped: 22315008 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:27.901997+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185827328 unmapped: 22315008 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:28.902244+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232e000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185827328 unmapped: 22315008 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:29.902673+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185827328 unmapped: 22315008 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:30.903183+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185827328 unmapped: 22315008 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:31.903550+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604746 data_alloc: 285212672 data_used: 11378688
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185827328 unmapped: 22315008 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:32.903713+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185835520 unmapped: 22306816 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:33.904000+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232e000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185835520 unmapped: 22306816 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:34.904191+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185835520 unmapped: 22306816 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:35.904398+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232e000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185835520 unmapped: 22306816 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:36.904570+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604746 data_alloc: 285212672 data_used: 11378688
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185835520 unmapped: 22306816 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:37.904778+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185835520 unmapped: 22306816 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:38.905089+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185835520 unmapped: 22306816 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:39.905333+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185835520 unmapped: 22306816 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:40.905511+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232e000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 185843712 unmapped: 22298624 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:41.906052+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore(/var/lib/ceph/osd/ceph-4) _kv_sync_thread utilization: idle 18.184650421s of 18.202966690s, submitted: 14
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 ms_handle_reset con 0x55e14a8e0400 session 0x55e14c26f0e0
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604042 data_alloc: 285212672 data_used: 11378688
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186179584 unmapped: 21962752 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:42.906277+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186187776 unmapped: 21954560 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:43.906437+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Got map version 55
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186376192 unmapped: 21766144 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:44.906603+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186376192 unmapped: 21766144 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:45.906747+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186376192 unmapped: 21766144 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:46.907045+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604042 data_alloc: 285212672 data_used: 11378688
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186376192 unmapped: 21766144 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:47.907216+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186376192 unmapped: 21766144 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:48.907410+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186392576 unmapped: 21749760 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:49.907602+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186392576 unmapped: 21749760 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:50.907837+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186392576 unmapped: 21749760 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:51.908050+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604042 data_alloc: 285212672 data_used: 11378688
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186392576 unmapped: 21749760 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:52.908182+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186392576 unmapped: 21749760 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:53.908406+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186392576 unmapped: 21749760 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:54.908646+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186392576 unmapped: 21749760 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:55.908997+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186400768 unmapped: 21741568 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:56.909524+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604042 data_alloc: 285212672 data_used: 11378688
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186417152 unmapped: 21725184 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:57.910107+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186417152 unmapped: 21725184 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:58.910612+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186417152 unmapped: 21725184 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:59.910903+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186417152 unmapped: 21725184 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:00.911096+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186417152 unmapped: 21725184 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:01.911235+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604042 data_alloc: 285212672 data_used: 11378688
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186417152 unmapped: 21725184 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:02.911381+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186425344 unmapped: 21716992 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:03.911529+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186441728 unmapped: 21700608 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:04.911696+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186441728 unmapped: 21700608 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:05.911855+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186441728 unmapped: 21700608 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:06.912018+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604042 data_alloc: 285212672 data_used: 11378688
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186441728 unmapped: 21700608 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:07.912160+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186441728 unmapped: 21700608 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:08.912331+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186441728 unmapped: 21700608 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:09.912476+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186441728 unmapped: 21700608 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:10.912662+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186441728 unmapped: 21700608 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:11.912792+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186466304 unmapped: 21676032 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:12.912918+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186466304 unmapped: 21676032 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:13.913074+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186466304 unmapped: 21676032 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:14.913223+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186466304 unmapped: 21676032 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:15.913378+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186466304 unmapped: 21676032 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:16.913514+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186466304 unmapped: 21676032 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:17.913659+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186466304 unmapped: 21676032 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:18.913840+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186466304 unmapped: 21676032 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:19.914005+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186490880 unmapped: 21651456 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:20.914199+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186490880 unmapped: 21651456 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:21.914397+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186490880 unmapped: 21651456 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:22.914526+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186490880 unmapped: 21651456 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:23.914707+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186490880 unmapped: 21651456 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:24.914860+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186490880 unmapped: 21651456 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:25.915015+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186490880 unmapped: 21651456 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:26.915165+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186490880 unmapped: 21651456 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:27.915356+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186515456 unmapped: 21626880 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:28.915543+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186515456 unmapped: 21626880 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:29.915672+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186515456 unmapped: 21626880 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:30.915841+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186515456 unmapped: 21626880 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:31.915985+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186515456 unmapped: 21626880 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:32.916122+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186515456 unmapped: 21626880 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:33.916259+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186523648 unmapped: 21618688 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:34.916424+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186523648 unmapped: 21618688 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:35.916582+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186540032 unmapped: 21602304 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:36.916725+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186540032 unmapped: 21602304 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:37.916852+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186540032 unmapped: 21602304 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:38.917034+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:39.917216+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186540032 unmapped: 21602304 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:40.917482+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186540032 unmapped: 21602304 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:41.917723+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186540032 unmapped: 21602304 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:42.918042+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186540032 unmapped: 21602304 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:43.918243+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186540032 unmapped: 21602304 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:44.918505+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186548224 unmapped: 21594112 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:45.918670+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186548224 unmapped: 21594112 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:46.918884+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186548224 unmapped: 21594112 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:47.919049+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186548224 unmapped: 21594112 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:48.919196+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186548224 unmapped: 21594112 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:49.919345+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186548224 unmapped: 21594112 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:50.919507+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186548224 unmapped: 21594112 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:51.919601+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186548224 unmapped: 21594112 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:52.919859+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186556416 unmapped: 21585920 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:53.919985+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186556416 unmapped: 21585920 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:54.920229+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186556416 unmapped: 21585920 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:55.920392+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186556416 unmapped: 21585920 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:56.920557+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186564608 unmapped: 21577728 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:57.920730+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186564608 unmapped: 21577728 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:58.920919+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186564608 unmapped: 21577728 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:59.921130+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186564608 unmapped: 21577728 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:00.921318+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186572800 unmapped: 21569536 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:01.921447+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186572800 unmapped: 21569536 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:02.921595+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186572800 unmapped: 21569536 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:03.921740+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186572800 unmapped: 21569536 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:04.921923+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186572800 unmapped: 21569536 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:05.922093+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186572800 unmapped: 21569536 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:06.922243+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186572800 unmapped: 21569536 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:07.922710+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186572800 unmapped: 21569536 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:08.922860+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186580992 unmapped: 21561344 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:09.923019+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186589184 unmapped: 21553152 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:10.923227+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186589184 unmapped: 21553152 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:11.923404+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186589184 unmapped: 21553152 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:12.923563+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186589184 unmapped: 21553152 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:13.923730+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186589184 unmapped: 21553152 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:14.923880+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186589184 unmapped: 21553152 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:15.942578+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186589184 unmapped: 21553152 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:16.943394+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186597376 unmapped: 21544960 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:17.943534+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186597376 unmapped: 21544960 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:18.943678+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186597376 unmapped: 21544960 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:19.943802+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186597376 unmapped: 21544960 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:20.943956+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186597376 unmapped: 21544960 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:21.944094+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186597376 unmapped: 21544960 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:22.944206+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186597376 unmapped: 21544960 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:23.944323+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186597376 unmapped: 21544960 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:24.944506+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186605568 unmapped: 21536768 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:25.944629+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186564608 unmapped: 21577728 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: do_command 'config diff' '{prefix=config diff}'
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: do_command 'config show' '{prefix=config show}'
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: do_command 'counter dump' '{prefix=counter dump}'
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: do_command 'counter schema' '{prefix=counter schema}'
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: osd.4 284 heartbeat osd_stat(store_statfs(0x1b232f000/0x0/0x1bfc00000, data 0x52edd52/0x54bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,2,3,5] op hist [])
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:26.944741+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186277888 unmapped: 21864448 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: bluestore.MempoolThread(0x55e1469bbb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604202 data_alloc: 285212672 data_used: 11382784
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: tick
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_tickets
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:27.944906+0000)
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: prioritycache tune_memory target: 5709084876 mapped: 186589184 unmapped: 21553152 heap: 208142336 old mem: 4047415775 new mem: 4047415775
Dec 06 10:32:58 np0005548789.localdomain ceph-osd[32665]: do_command 'log dump' '{prefix=log dump}'
Dec 06 10:32:58 np0005548789.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: from='client.59227 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: from='client.49836 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: from='client.69452 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: from='client.69446 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: pgmap v830: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: from='client.59254 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: from='client.69464 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: from='client.49863 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2083356224' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1654648070' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2756974343' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/997443539' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2621128474' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:58 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/880484158' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2460720482' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/842483459' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: from='client.59269 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: from='client.69476 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: from='client.49875 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: from='client.59275 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: from='client.69488 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: from='client.49890 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: from='client.59293 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3924164660' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2460720482' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/4179676265' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/4060345088' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/842483459' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1645509840' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3258516296' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:32:59 np0005548789.localdomain sshd[343836]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:32:59 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/330610004' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:33:00 np0005548789.localdomain crontab[343929]: (root) LIST (root)
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2196284679' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1440614881' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.69503 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.49905 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.59308 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.69521 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.49914 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: pgmap v831: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.59329 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.69536 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3258516296' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.59350 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.49932 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/330610004' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1470188650' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/4104655568' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2196284679' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3825277840' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1440614881' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 06 10:33:00 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2183890531' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 06 10:33:01 np0005548789.localdomain sshd[343836]: Invalid user ubuntu from 43.163.93.82 port 41044
Dec 06 10:33:01 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "node ls"} v 0)
Dec 06 10:33:01 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/887149509' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 06 10:33:01 np0005548789.localdomain sshd[343836]: Received disconnect from 43.163.93.82 port 41044:11:  [preauth]
Dec 06 10:33:01 np0005548789.localdomain sshd[343836]: Disconnected from invalid user ubuntu 43.163.93.82 port 41044 [preauth]
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2327447688' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2107682123' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: from='client.69551 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: from='client.69566 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: from='client.59356 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: from='client.69575 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: from='client.59377 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: from='client.49959 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2272977300' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/305260225' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/887149509' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2736339179' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "node ls"} v 0)
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2479036' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/466198046' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec 06 10:33:02 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3564364230' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 06 10:33:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.
Dec 06 10:33:02 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.
Dec 06 10:33:02 np0005548789.localdomain systemd[1]: tmp-crun.jUTzLw.mount: Deactivated successfully.
Dec 06 10:33:02 np0005548789.localdomain podman[344239]: 2025-12-06 10:33:02.918470457 +0000 UTC m=+0.074807407 container health_status bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:33:02 np0005548789.localdomain podman[344239]: 2025-12-06 10:33:02.92539598 +0000 UTC m=+0.081733080 container exec_died bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:33:02 np0005548789.localdomain systemd[1]: bb7a9bcbc6b5241be8139843cc74262124ff60e2712fcc5adc78da15e3143094.service: Deactivated successfully.
Dec 06 10:33:02 np0005548789.localdomain podman[344238]: 2025-12-06 10:33:02.967743319 +0000 UTC m=+0.124956425 container health_status 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, vcs-type=git)
Dec 06 10:33:02 np0005548789.localdomain podman[344238]: 2025-12-06 10:33:02.976477507 +0000 UTC m=+0.133690653 container exec_died 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, managed_by=edpm_ansible)
Dec 06 10:33:02 np0005548789.localdomain systemd[1]: 10ccb8cb18db7849aa21674e6bb5b9d3988304a3a1061b356419125284408a6e.service: Deactivated successfully.
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1424714830' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:47.564623+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b94e1000/0x0/0x1bfc00000, data 0x25249e8/0x25ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:48.564810+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951462 data_alloc: 301989888 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:49.564951+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:50.565085+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b94e1000/0x0/0x1bfc00000, data 0x25249e8/0x25ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:51.565261+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:52.565448+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:53.565590+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951462 data_alloc: 301989888 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b94e1000/0x0/0x1bfc00000, data 0x25249e8/0x25ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:54.565793+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:55.565962+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:56.566946+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:57.567109+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b94e1000/0x0/0x1bfc00000, data 0x25249e8/0x25ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:58.567272+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951462 data_alloc: 301989888 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:59.567434+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:00.567567+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:01.567699+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:02.567895+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b94e1000/0x0/0x1bfc00000, data 0x25249e8/0x25ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105725952 unmapped: 3219456 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:03.568056+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105717760 unmapped: 3227648 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951462 data_alloc: 301989888 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:04.568147+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105717760 unmapped: 3227648 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:05.568397+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105717760 unmapped: 3227648 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:06.568561+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b94e1000/0x0/0x1bfc00000, data 0x25249e8/0x25ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105717760 unmapped: 3227648 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b94e1000/0x0/0x1bfc00000, data 0x25249e8/0x25ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:07.568696+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b94e1000/0x0/0x1bfc00000, data 0x25249e8/0x25ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105717760 unmapped: 3227648 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:08.568809+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105717760 unmapped: 3227648 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951462 data_alloc: 301989888 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:09.568949+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 94 heartbeat osd_stat(store_statfs(0x1b94e1000/0x0/0x1bfc00000, data 0x25249e8/0x25ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105717760 unmapped: 3227648 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:10.569084+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105717760 unmapped: 3227648 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:11.569281+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Got map version 42
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Active mgr is now 
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc reconnect Terminating session with v2:172.18.0.106:6810/2148019987
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc reconnect No active mgr available yet
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 94 ms_handle_reset con 0x55d11a41fc00 session 0x55d11d1003c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be82000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 94 handle_osd_map epochs [95,95], i have 94, src has [1,95]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 63.981388092s of 63.989768982s, submitted: 1
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105906176 unmapped: 3039232 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:12.569417+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Got map version 43
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: get_auth_request con 0x55d119cd8400 auth_method 0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_configure stats_period=5
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106053632 unmapped: 2891776 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:13.569548+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106053632 unmapped: 2891776 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:14.569671+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106053632 unmapped: 2891776 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Got map version 44
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:15.569854+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106201088 unmapped: 2744320 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:16.570004+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Got map version 45
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:17.570145+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:18.570268+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:19.570417+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:20.570566+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:21.570735+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:22.570917+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:23.571074+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:24.571238+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:25.571471+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:26.571634+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:27.571847+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:28.572053+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:29.572381+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:30.572563+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:31.572731+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:32.572855+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:33.573040+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:34.573203+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:35.573420+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:36.573575+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:37.573686+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:38.573802+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:39.573962+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:40.574097+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:41.574231+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106389504 unmapped: 2555904 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:42.574409+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106397696 unmapped: 2547712 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:43.574564+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106397696 unmapped: 2547712 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:44.574705+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106397696 unmapped: 2547712 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:45.574946+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106397696 unmapped: 2547712 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:46.575134+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106397696 unmapped: 2547712 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:47.575315+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106397696 unmapped: 2547712 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:48.575518+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106397696 unmapped: 2547712 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:49.575675+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106397696 unmapped: 2547712 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:50.575821+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:51.575935+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:52.576105+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:53.576272+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:54.576389+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:55.576590+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:56.576820+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:57.577004+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:58.577186+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:59.577338+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:00.577491+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:01.577647+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:02.577817+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:03.578072+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:04.578336+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:05.578585+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:06.578831+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106405888 unmapped: 2539520 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:07.578982+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106414080 unmapped: 2531328 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:08.579206+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106414080 unmapped: 2531328 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:09.579411+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106414080 unmapped: 2531328 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:10.579602+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106414080 unmapped: 2531328 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:11.579832+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106414080 unmapped: 2531328 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:12.580031+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106414080 unmapped: 2531328 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:13.580256+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106414080 unmapped: 2531328 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:14.580432+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106414080 unmapped: 2531328 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:15.580728+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106414080 unmapped: 2531328 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:16.581009+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:17.581290+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:18.581491+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:19.581684+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:20.581870+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:21.582086+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:22.582299+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:23.582473+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:24.582654+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:25.582901+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:26.583062+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:27.583233+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:28.583411+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:29.583594+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:30.583786+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:31.583957+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106422272 unmapped: 2523136 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:32.584174+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106430464 unmapped: 2514944 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:33.584321+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106430464 unmapped: 2514944 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:34.584444+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106430464 unmapped: 2514944 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:35.584619+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106430464 unmapped: 2514944 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:36.584795+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106430464 unmapped: 2514944 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:37.584968+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106430464 unmapped: 2514944 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:38.585117+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106430464 unmapped: 2514944 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953790 data_alloc: 285212672 data_used: 22122496
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 heartbeat osd_stat(store_statfs(0x1b94de000/0x0/0x1bfc00000, data 0x2526f21/0x25af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:39.585282+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106430464 unmapped: 2514944 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:40.585434+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d952800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 89.460609436s of 89.475273132s, submitted: 3
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106438656 unmapped: 2506752 heap: 108945408 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:41.585526+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106487808 unmapped: 18194432 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:42.585657+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 95 handle_osd_map epochs [95,96], i have 95, src has [1,96]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Got map version 46
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 96 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 96 ms_handle_reset con 0x55d11d952800 session 0x55d11bc5f860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 105955328 unmapped: 18726912 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a875c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 96 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 96 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:43.585933+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107069440 unmapped: 17612800 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1160119 data_alloc: 285212672 data_used: 22134784
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:44.586037+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 96 heartbeat osd_stat(store_statfs(0x1b7868000/0x0/0x1bfc00000, data 0x41994aa/0x4226000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 96 handle_osd_map epochs [97,97], i have 96, src has [1,97]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 97 ms_handle_reset con 0x55d11a875c00 session 0x55d11b89cd20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:45.586208+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:46.586367+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:47.586546+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:48.586648+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1162519 data_alloc: 285212672 data_used: 22147072
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 97 heartbeat osd_stat(store_statfs(0x1b7863000/0x0/0x1bfc00000, data 0x419b9dd/0x422a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:49.586824+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:50.586954+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:51.587106+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:52.587276+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:53.587424+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1162519 data_alloc: 285212672 data_used: 22147072
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:54.587575+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 97 heartbeat osd_stat(store_statfs(0x1b7863000/0x0/0x1bfc00000, data 0x419b9dd/0x422a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:55.587737+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:56.588051+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:57.588183+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:58.588330+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 97 heartbeat osd_stat(store_statfs(0x1b7863000/0x0/0x1bfc00000, data 0x419b9dd/0x422a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1162519 data_alloc: 285212672 data_used: 22147072
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:59.588509+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:00.588684+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:01.588829+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 97 heartbeat osd_stat(store_statfs(0x1b7863000/0x0/0x1bfc00000, data 0x419b9dd/0x422a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:02.588995+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:03.589181+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1162519 data_alloc: 285212672 data_used: 22147072
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:04.589358+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:05.589561+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:06.589707+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 97 heartbeat osd_stat(store_statfs(0x1b7863000/0x0/0x1bfc00000, data 0x419b9dd/0x422a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:07.589899+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106962944 unmapped: 17719296 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:08.590067+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d118f03800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 27.616872787s of 27.921602249s, submitted: 41
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106971136 unmapped: 17711104 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1163168 data_alloc: 285212672 data_used: 22147072
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:09.590210+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 98 ms_handle_reset con 0x55d118f03800 session 0x55d11b7f2d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106971136 unmapped: 17711104 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:10.590331+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106971136 unmapped: 17711104 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:11.590506+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 98 heartbeat osd_stat(store_statfs(0x1b785e000/0x0/0x1bfc00000, data 0x419df33/0x422f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be83000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 106979328 unmapped: 17702912 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:12.590692+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 98 handle_osd_map epochs [98,99], i have 98, src has [1,99]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 99 ms_handle_reset con 0x55d11be83000 session 0x55d119c6a960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107839488 unmapped: 16842752 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:13.590939+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 99 heartbeat osd_stat(store_statfs(0x1b785b000/0x0/0x1bfc00000, data 0x41a0497/0x4232000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107839488 unmapped: 16842752 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1172244 data_alloc: 285212672 data_used: 22171648
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:14.591109+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107839488 unmapped: 16842752 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:15.591306+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 99 heartbeat osd_stat(store_statfs(0x1b785b000/0x0/0x1bfc00000, data 0x41a0497/0x4232000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107839488 unmapped: 16842752 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:16.591512+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107839488 unmapped: 16842752 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:17.591748+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107839488 unmapped: 16842752 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:18.592055+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 99 heartbeat osd_stat(store_statfs(0x1b785b000/0x0/0x1bfc00000, data 0x41a0497/0x4232000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 99 heartbeat osd_stat(store_statfs(0x1b785b000/0x0/0x1bfc00000, data 0x41a0497/0x4232000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107839488 unmapped: 16842752 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1172244 data_alloc: 285212672 data_used: 22171648
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:19.592263+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2dc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.965310097s of 11.214971542s, submitted: 62
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107839488 unmapped: 16842752 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:20.592432+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 99 handle_osd_map epochs [99,100], i have 99, src has [1,100]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 100 ms_handle_reset con 0x55d11de2dc00 session 0x55d119c6a3c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107855872 unmapped: 16826368 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:21.592555+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 100 handle_osd_map epochs [100,101], i have 100, src has [1,101]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107905024 unmapped: 16777216 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:22.592802+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107905024 unmapped: 16777216 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:23.592980+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107905024 unmapped: 16777216 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1184357 data_alloc: 285212672 data_used: 22183936
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:24.593134+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 101 heartbeat osd_stat(store_statfs(0x1b7850000/0x0/0x1bfc00000, data 0x41a5222/0x423c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107905024 unmapped: 16777216 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:25.593345+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 101 heartbeat osd_stat(store_statfs(0x1b7850000/0x0/0x1bfc00000, data 0x41a5222/0x423c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 101 heartbeat osd_stat(store_statfs(0x1b7850000/0x0/0x1bfc00000, data 0x41a5222/0x423c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107905024 unmapped: 16777216 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:26.593496+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be82400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107905024 unmapped: 16777216 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:27.593658+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 102 ms_handle_reset con 0x55d11be82400 session 0x55d119c6a1e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107929600 unmapped: 16752640 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:28.593811+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107945984 unmapped: 16736256 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1184641 data_alloc: 285212672 data_used: 22196224
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:29.593939+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107945984 unmapped: 16736256 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:30.594071+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 102 heartbeat osd_stat(store_statfs(0x1b784f000/0x0/0x1bfc00000, data 0x41a7386/0x423e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:31.594257+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107945984 unmapped: 16736256 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:32.594446+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107945984 unmapped: 16736256 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 102 heartbeat osd_stat(store_statfs(0x1b784f000/0x0/0x1bfc00000, data 0x41a7386/0x423e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:33.594657+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107945984 unmapped: 16736256 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:34.594827+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107945984 unmapped: 16736256 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1184641 data_alloc: 285212672 data_used: 22196224
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:35.595004+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107945984 unmapped: 16736256 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:36.595167+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107945984 unmapped: 16736256 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:37.595307+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107954176 unmapped: 16728064 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 16.879070282s of 17.161037445s, submitted: 85
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 heartbeat osd_stat(store_statfs(0x1b784b000/0x0/0x1bfc00000, data 0x41a979f/0x4242000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:38.595445+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 108011520 unmapped: 16670720 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:39.595598+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 108011520 unmapped: 16670720 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1187643 data_alloc: 285212672 data_used: 22196224
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:40.595812+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 108011520 unmapped: 16670720 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:41.596001+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 108011520 unmapped: 16670720 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 heartbeat osd_stat(store_statfs(0x1b784b000/0x0/0x1bfc00000, data 0x41a979f/0x4242000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:42.596171+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 108011520 unmapped: 16670720 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:43.596319+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 108011520 unmapped: 16670720 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:44.596478+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 108011520 unmapped: 16670720 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1187643 data_alloc: 285212672 data_used: 22196224
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d9fb400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11d9fb400 session 0x55d11d9fd0e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e137000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11e137000 session 0x55d11d9fda40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d118f03800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d118f03800 session 0x55d11d9fc1e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:45.596691+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 108011520 unmapped: 16670720 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:46.596870+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be82400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11be82400 session 0x55d11d9fcb40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 108019712 unmapped: 16662528 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be83000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 heartbeat osd_stat(store_statfs(0x1b764e000/0x0/0x1bfc00000, data 0x43a57d8/0x4440000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [1,1])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11be83000 session 0x55d11d9fd2c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d9fb400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11d9fb400 session 0x55d11a59f4a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d118f03800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d118f03800 session 0x55d11b777e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be82400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11be82400 session 0x55d119c6a3c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be83000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11be83000 session 0x55d119c6be00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:47.597072+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107872256 unmapped: 16809984 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11dfc6000 session 0x55d11d9fd4a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11b603c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11b603c00 session 0x55d11d9fdc20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:48.597234+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107872256 unmapped: 16809984 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d118f03800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d118f03800 session 0x55d11d9fd0e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be82400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.449080467s of 11.698691368s, submitted: 64
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 heartbeat osd_stat(store_statfs(0x1b708d000/0x0/0x1bfc00000, data 0x4966811/0x4a01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [0,0,0,2])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11be82400 session 0x55d11d9fc1e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:49.597372+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 107954176 unmapped: 16728064 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1258484 data_alloc: 285212672 data_used: 22196224
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be83000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:50.597506+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 108027904 unmapped: 16654336 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:51.597792+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 108724224 unmapped: 15958016 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 heartbeat osd_stat(store_statfs(0x1b7068000/0x0/0x1bfc00000, data 0x498a821/0x4a26000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:52.597951+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 111263744 unmapped: 13418496 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:53.598113+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 111296512 unmapped: 13385728 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 heartbeat osd_stat(store_statfs(0x1b7068000/0x0/0x1bfc00000, data 0x498a821/0x4a26000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:54.598398+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1287548 data_alloc: 285212672 data_used: 26165248
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 111296512 unmapped: 13385728 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:55.598688+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 111296512 unmapped: 13385728 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:56.598866+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 111296512 unmapped: 13385728 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:57.599038+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 111296512 unmapped: 13385728 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 heartbeat osd_stat(store_statfs(0x1b7068000/0x0/0x1bfc00000, data 0x498a821/0x4a26000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:58.599218+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 111296512 unmapped: 13385728 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:59.599395+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1287548 data_alloc: 285212672 data_used: 26165248
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 111296512 unmapped: 13385728 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:00.599560+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 111304704 unmapped: 13377536 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.676588058s of 11.714396477s, submitted: 9
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:01.599668+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 118865920 unmapped: 5816320 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:02.599841+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 119054336 unmapped: 5627904 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 heartbeat osd_stat(store_statfs(0x1b6953000/0x0/0x1bfc00000, data 0x5097821/0x5133000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:03.600087+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 116801536 unmapped: 7880704 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:04.600327+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1354530 data_alloc: 285212672 data_used: 27172864
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 116973568 unmapped: 7708672 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2c400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11de2c400 session 0x55d11d9fcd20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e136400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11e136400 session 0x55d119951e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c09b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11c09b400 session 0x55d11a5cc3c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:05.600588+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 116973568 unmapped: 7708672 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11be83000 session 0x55d11bc5f860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11dfc6000 session 0x55d119c6b680
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3066090828' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:06.600819+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 116989952 unmapped: 7692288 heap: 124682240 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c09b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11c09b400 session 0x55d11e6514a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d118f03800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:07.600957+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be82400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 123133952 unmapped: 7397376 heap: 130531328 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11be82400 session 0x55d11ea901e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d118f03800 session 0x55d11b80e780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be82400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:08.601167+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 115875840 unmapped: 17809408 heap: 133685248 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 heartbeat osd_stat(store_statfs(0x1b5c9c000/0x0/0x1bfc00000, data 0x5d56821/0x5df2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be83000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 ms_handle_reset con 0x55d11be83000 session 0x55d11ea90780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 104 ms_handle_reset con 0x55d11be82400 session 0x55d11c2d45a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c09b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 104 ms_handle_reset con 0x55d11dfc6000 session 0x55d11ea90b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 104 ms_handle_reset con 0x55d11c09b400 session 0x55d11d0c85a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8b800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:09.601361+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1494795 data_alloc: 301989888 data_used: 29908992
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 120324096 unmapped: 13361152 heap: 133685248 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 104 ms_handle_reset con 0x55d11ea8b800 session 0x55d11d100b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be82400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be83000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 104 ms_handle_reset con 0x55d11be83000 session 0x55d11ea910e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:10.601503+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 124469248 unmapped: 12058624 heap: 136527872 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c09b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 104 ms_handle_reset con 0x55d11c09b400 session 0x55d11ea912c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.972964287s of 10.004225731s, submitted: 204
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 105 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 105 ms_handle_reset con 0x55d11dfc6000 session 0x55d11ea914a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 105 ms_handle_reset con 0x55d11be82400 session 0x55d119c6b4a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8b800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:11.601676+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 124067840 unmapped: 12460032 heap: 136527872 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 106 ms_handle_reset con 0x55d11ea8b800 session 0x55d1199bed20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 106 heartbeat osd_stat(store_statfs(0x1b5158000/0x0/0x1bfc00000, data 0x689434d/0x6935000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be82400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 106 ms_handle_reset con 0x55d11be82400 session 0x55d11b8ea1e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be83000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:12.601818+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 106 ms_handle_reset con 0x55d11be83000 session 0x55d11bd485a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c09b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 106 ms_handle_reset con 0x55d11c09b400 session 0x55d118117e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 124174336 unmapped: 12353536 heap: 136527872 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:13.601955+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 120250368 unmapped: 16277504 heap: 136527872 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 106 ms_handle_reset con 0x55d11bd71800 session 0x55d1199c6d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:14.602096+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1432994 data_alloc: 285212672 data_used: 22220800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 116580352 unmapped: 19947520 heap: 136527872 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:15.602293+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 116580352 unmapped: 19947520 heap: 136527872 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dbc8800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 106 heartbeat osd_stat(store_statfs(0x1b6059000/0x0/0x1bfc00000, data 0x599387c/0x5a34000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:16.602441+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 117915648 unmapped: 18612224 heap: 136527872 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:17.602623+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 118079488 unmapped: 18448384 heap: 136527872 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d004800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11d004800 session 0x55d11ea91860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11bd71800 session 0x55d11ea91c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:18.602819+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be82400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11be82400 session 0x55d11ea91e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be83000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11be83000 session 0x55d11d1003c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 118095872 unmapped: 18432000 heap: 136527872 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c09b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11dfc6800 session 0x55d119c6b860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11c09b400 session 0x55d11d0c92c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11bd71800 session 0x55d11e650b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be82400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11be82400 session 0x55d11e651a40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be83000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11be83000 session 0x55d11e6514a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c09b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11c09b400 session 0x55d11e4525a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:19.603374+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1517477 data_alloc: 301989888 data_used: 27389952
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135323648 unmapped: 5185536 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11dfc6800 session 0x55d11b8ea780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 heartbeat osd_stat(store_statfs(0x1b5038000/0x0/0x1bfc00000, data 0x69b1d07/0x6a56000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:20.603534+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11bd71800 session 0x55d11e6503c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 126025728 unmapped: 14483456 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be82400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11be82400 session 0x55d11bcc74a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:21.603711+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be83000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 126025728 unmapped: 14483456 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11be83000 session 0x55d11bcf3c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c09b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.151296616s of 10.838096619s, submitted: 164
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 ms_handle_reset con 0x55d11c09b400 session 0x55d11c2d54a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d004000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c097000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c0af000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:22.603854+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 126033920 unmapped: 14475264 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 107 handle_osd_map epochs [107,108], i have 107, src has [1,108]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:23.604029+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 108 ms_handle_reset con 0x55d11c0af000 session 0x55d11bcc6960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 124354560 unmapped: 16154624 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:24.604193+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 108 heartbeat osd_stat(store_statfs(0x1b57bd000/0x0/0x1bfc00000, data 0x62281da/0x62cd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1563996 data_alloc: 285212672 data_used: 27090944
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 124682240 unmapped: 15826944 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:25.604448+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 127590400 unmapped: 12918784 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:26.604665+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 127590400 unmapped: 12918784 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:27.604819+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132366336 unmapped: 8142848 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:28.605021+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 133578752 unmapped: 6930432 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 108 ms_handle_reset con 0x55d11dbc8800 session 0x55d1199510e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e136000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:29.605144+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1729266 data_alloc: 301989888 data_used: 31518720
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 133742592 unmapped: 6766592 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 108 heartbeat osd_stat(store_statfs(0x1b445f000/0x0/0x1bfc00000, data 0x757c1da/0x7621000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:30.605288+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 131481600 unmapped: 9027584 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:31.605455+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 108 heartbeat osd_stat(store_statfs(0x1b4416000/0x0/0x1bfc00000, data 0x75cb1da/0x7670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 131563520 unmapped: 8945664 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.113204956s of 10.093416214s, submitted: 257
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:32.605645+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 131571712 unmapped: 8937472 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:33.605847+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 130850816 unmapped: 9658368 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:34.605988+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1732242 data_alloc: 301989888 data_used: 31637504
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 130850816 unmapped: 9658368 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:35.606156+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 130662400 unmapped: 9846784 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:36.606294+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 130662400 unmapped: 9846784 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 109 heartbeat osd_stat(store_statfs(0x1b43ef000/0x0/0x1bfc00000, data 0x75f55f3/0x769c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:37.606497+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 109 ms_handle_reset con 0x55d11d004000 session 0x55d11a59e1e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 109 ms_handle_reset con 0x55d11c097000 session 0x55d11c2d5860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 130662400 unmapped: 9846784 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:38.606646+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d005000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 109 ms_handle_reset con 0x55d11d005000 session 0x55d11bc5ab40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 130670592 unmapped: 9838592 heap: 140509184 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 109 handle_osd_map epochs [109,110], i have 109, src has [1,110]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 110 ms_handle_reset con 0x55d11bd71800 session 0x55d11bc5ba40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 110 handle_osd_map epochs [110,110], i have 110, src has [1,110]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 110 ms_handle_reset con 0x55d11bd71800 session 0x55d11a44c1e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c097000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:39.606800+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1882791 data_alloc: 301989888 data_used: 31649792
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 148201472 unmapped: 13680640 heap: 161882112 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 110 ms_handle_reset con 0x55d11c097000 session 0x55d11bcf30e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d004000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets getting new tickets!
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:40.607381+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _finish_auth 0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:40.608424+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135495680 unmapped: 26386432 heap: 161882112 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 110 handle_osd_map epochs [110,111], i have 110, src has [1,111]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 111 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 111 ms_handle_reset con 0x55d11d004000 session 0x55d11b8ebe00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d005000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 111 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:41.607561+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 111 heartbeat osd_stat(store_statfs(0x1b2a56000/0x0/0x1bfc00000, data 0x8f8eb36/0x9038000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135544832 unmapped: 26337280 heap: 161882112 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 111 handle_osd_map epochs [112,112], i have 111, src has [1,112]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.608332634s of 10.076329231s, submitted: 98
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 112 ms_handle_reset con 0x55d11d005000 session 0x55d11bf32780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dbc8800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 112 ms_handle_reset con 0x55d11dbc8800 session 0x55d11e452f00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:42.607741+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 112 ms_handle_reset con 0x55d11bd71800 session 0x55d11bcf3860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c097000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 112 ms_handle_reset con 0x55d11c097000 session 0x55d11bcf2b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135569408 unmapped: 26312704 heap: 161882112 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 112 handle_osd_map epochs [112,112], i have 112, src has [1,112]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:43.607936+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c096c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 112 heartbeat osd_stat(store_statfs(0x1b2a4c000/0x0/0x1bfc00000, data 0x8f9360c/0x9040000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,2,3,4,5] op hist [0,0,1])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 130981888 unmapped: 30900224 heap: 161882112 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 112 ms_handle_reset con 0x55d11c096c00 session 0x55d11bcf32c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d004400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:44.608099+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 112 ms_handle_reset con 0x55d11d004400 session 0x55d11b89cd20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d952800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 112 ms_handle_reset con 0x55d11d952800 session 0x55d119953c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1898904 data_alloc: 301989888 data_used: 27680768
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 131072000 unmapped: 30810112 heap: 161882112 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:45.608306+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 131072000 unmapped: 30810112 heap: 161882112 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:46.608583+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 112 ms_handle_reset con 0x55d11bd71800 session 0x55d1199c4780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c096c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 112 ms_handle_reset con 0x55d11c096c00 session 0x55d11c2d41e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c097000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 112 ms_handle_reset con 0x55d11c097000 session 0x55d11c2d43c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d004400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 112 handle_osd_map epochs [112,113], i have 112, src has [1,113]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 131112960 unmapped: 30769152 heap: 161882112 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11d004400 session 0x55d11c2d45a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d952800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c0af800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11c0af800 session 0x55d11e452000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11d952800 session 0x55d11b804000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c0af800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11c0af800 session 0x55d11b8041e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11bd71800 session 0x55d11a44c960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c096c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11c096c00 session 0x55d11b8eb0e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c097000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d004400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11d004400 session 0x55d11ea912c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11c097000 session 0x55d11e4521e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:47.608734+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153632768 unmapped: 8249344 heap: 161882112 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d004000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11d004000 session 0x55d11ea914a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:48.608889+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d005000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11d005000 session 0x55d11ea910e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11bd71800 session 0x55d11e453c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c09b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11c09b400 session 0x55d11d0c85a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 146661376 unmapped: 25673728 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c097000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d004000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11d004000 session 0x55d11a5cd2c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:49.609048+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d005000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11d005000 session 0x55d11c2d5e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 heartbeat osd_stat(store_statfs(0x1b0bc3000/0x0/0x1bfc00000, data 0xaa1ba87/0xaacb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.038835
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1728053248 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1660944384 meta_used: 2197931 data_alloc: 301989888 data_used: 38866944
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 146669568 unmapped: 25665536 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc7400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11dfc7400 session 0x55d11e6505a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e136c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 ms_handle_reset con 0x55d11e136c00 session 0x55d11d0c8d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 heartbeat osd_stat(store_statfs(0x1b0bc3000/0x0/0x1bfc00000, data 0xaa1ba87/0xaacb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11b603000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c0ae800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:50.609199+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc7000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 146726912 unmapped: 25608192 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 113 handle_osd_map epochs [114,114], i have 113, src has [1,114]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:51.609354+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 114 ms_handle_reset con 0x55d11dfc7000 session 0x55d11ea901e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136437760 unmapped: 35897344 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.998419762s of 10.131677628s, submitted: 262
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d004000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:52.609523+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 114 ms_handle_reset con 0x55d11d004000 session 0x55d11ea91a40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 114 ms_handle_reset con 0x55d11b603000 session 0x55d11bc5af00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d005000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 114 ms_handle_reset con 0x55d11c0ae800 session 0x55d119c6bc20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 114 ms_handle_reset con 0x55d11d005000 session 0x55d11b8a6d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 114 ms_handle_reset con 0x55d11e136000 session 0x55d11b8a70e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 127139840 unmapped: 45195264 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11b603000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c0ae800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:53.609647+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 114 ms_handle_reset con 0x55d11b603000 session 0x55d11bcf2780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 114 ms_handle_reset con 0x55d11c0ae800 session 0x55d11bf33c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 125050880 unmapped: 47284224 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:54.609807+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1569803 data_alloc: 285212672 data_used: 19173376
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 125050880 unmapped: 47284224 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:55.609998+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 125140992 unmapped: 47194112 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 114 heartbeat osd_stat(store_statfs(0x1b38ef000/0x0/0x1bfc00000, data 0x5db8fde/0x5e67000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:56.610192+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 114 handle_osd_map epochs [114,115], i have 114, src has [1,115]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 124461056 unmapped: 47874048 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d004000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 ms_handle_reset con 0x55d11d004000 session 0x55d11a5a9680
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:57.610307+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d005000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 125378560 unmapped: 46956544 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:58.610465+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c4a9400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 125476864 unmapped: 46858240 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:59.610613+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1626251 data_alloc: 301989888 data_used: 26263552
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 126697472 unmapped: 45637632 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:00.610809+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 126820352 unmapped: 45514752 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:01.610966+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 126820352 unmapped: 45514752 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b5821000/0x0/0x1bfc00000, data 0x5dbb41a/0x5e6c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:02.611109+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 126820352 unmapped: 45514752 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:03.611247+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b5821000/0x0/0x1bfc00000, data 0x5dbb41a/0x5e6c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 126820352 unmapped: 45514752 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:04.611407+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1627371 data_alloc: 301989888 data_used: 26431488
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 126820352 unmapped: 45514752 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.444602013s of 13.174457550s, submitted: 202
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8dc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dbc8c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:05.611623+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 126828544 unmapped: 45506560 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 ms_handle_reset con 0x55d11bd71800 session 0x55d119c6b4a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 ms_handle_reset con 0x55d11c097000 session 0x55d11ea90b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:06.611800+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b5822000/0x0/0x1bfc00000, data 0x5dbb41a/0x5e6c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 126844928 unmapped: 45490176 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:07.611958+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 126844928 unmapped: 45490176 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b5822000/0x0/0x1bfc00000, data 0x5dbb41a/0x5e6c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:08.612064+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 130850816 unmapped: 41484288 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:09.612220+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1736407 data_alloc: 301989888 data_used: 26652672
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136323072 unmapped: 36012032 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:10.612345+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136757248 unmapped: 35577856 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:11.612501+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136970240 unmapped: 35364864 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:12.612697+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137535488 unmapped: 34799616 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:13.612891+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b379a000/0x0/0x1bfc00000, data 0x7e3d41a/0x7eee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137617408 unmapped: 34717696 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b379a000/0x0/0x1bfc00000, data 0x7e3d41a/0x7eee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:14.613035+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1893871 data_alloc: 301989888 data_used: 27947008
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137641984 unmapped: 34693120 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b379a000/0x0/0x1bfc00000, data 0x7e3d41a/0x7eee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:15.613215+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137723904 unmapped: 34611200 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:16.613356+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.619566917s of 11.564517975s, submitted: 321
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137551872 unmapped: 34783232 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:17.613543+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b377c000/0x0/0x1bfc00000, data 0x7e6141a/0x7f12000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 ms_handle_reset con 0x55d11c4a9400 session 0x55d11b89c000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 ms_handle_reset con 0x55d11d005000 session 0x55d11a44cb40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c09b000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137715712 unmapped: 34619392 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 ms_handle_reset con 0x55d11ea8dc00 session 0x55d11b89cf00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 ms_handle_reset con 0x55d11dbc8c00 session 0x55d11b80f2c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 ms_handle_reset con 0x55d11c09b000 session 0x55d11b7f34a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:18.613874+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c4a9400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132816896 unmapped: 39518208 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 ms_handle_reset con 0x55d11c4a9400 session 0x55d11b897860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:19.614006+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1373827 data_alloc: 285212672 data_used: 19181568
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:20.614136+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:21.614286+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:22.614464+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:23.614651+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:24.614812+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1373827 data_alloc: 285212672 data_used: 19181568
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:25.615006+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:26.615280+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:27.615436+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:28.615609+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:29.615746+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1373827 data_alloc: 285212672 data_used: 19181568
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:30.615945+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:31.616099+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:32.616284+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:33.616434+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:34.616560+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1373827 data_alloc: 285212672 data_used: 19181568
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:35.616795+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:36.616942+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:37.617095+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:38.617268+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:39.617439+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1373827 data_alloc: 285212672 data_used: 19181568
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:40.617594+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:41.617745+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:42.617941+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:43.618102+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:44.618304+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1373827 data_alloc: 285212672 data_used: 19181568
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:45.618502+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:46.618679+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:47.618832+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132800512 unmapped: 39534592 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:48.619024+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132800512 unmapped: 39534592 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:49.619199+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1373827 data_alloc: 285212672 data_used: 19181568
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132800512 unmapped: 39534592 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:50.619349+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132800512 unmapped: 39534592 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:51.619516+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132800512 unmapped: 39534592 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:52.619681+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132800512 unmapped: 39534592 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:53.619902+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132800512 unmapped: 39534592 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:54.620047+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1373827 data_alloc: 285212672 data_used: 19181568
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132800512 unmapped: 39534592 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:55.620213+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132800512 unmapped: 39534592 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:56.620391+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132800512 unmapped: 39534592 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:57.620526+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132800512 unmapped: 39534592 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:58.620743+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132800512 unmapped: 39534592 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:59.620975+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1373827 data_alloc: 285212672 data_used: 19181568
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132800512 unmapped: 39534592 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:00.621179+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132816896 unmapped: 39518208 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:01.621437+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132816896 unmapped: 39518208 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:02.621636+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132816896 unmapped: 39518208 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:03.621816+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132816896 unmapped: 39518208 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:04.622018+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1373827 data_alloc: 285212672 data_used: 19181568
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132816896 unmapped: 39518208 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:05.622228+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132816896 unmapped: 39518208 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:06.622436+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132816896 unmapped: 39518208 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:07.622587+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132816896 unmapped: 39518208 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:08.622733+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:09.622998+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1373827 data_alloc: 285212672 data_used: 19181568
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:10.623192+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:11.623376+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:12.623570+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:13.623804+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:14.623982+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1373827 data_alloc: 285212672 data_used: 19181568
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:15.624250+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132825088 unmapped: 39510016 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:16.624408+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132833280 unmapped: 39501824 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:17.624605+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132833280 unmapped: 39501824 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:18.624792+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132833280 unmapped: 39501824 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:19.624989+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1373827 data_alloc: 285212672 data_used: 19181568
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132833280 unmapped: 39501824 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:20.625172+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132833280 unmapped: 39501824 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:21.625334+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119163c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 64.225265503s of 64.728683472s, submitted: 152
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 heartbeat osd_stat(store_statfs(0x1b741c000/0x0/0x1bfc00000, data 0x41c5323/0x4272000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 115 handle_osd_map epochs [116,116], i have 115, src has [1,116]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 116 ms_handle_reset con 0x55d119163c00 session 0x55d11bf32780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132857856 unmapped: 39477248 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8c000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea89400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:22.625504+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 116 handle_osd_map epochs [117,117], i have 116, src has [1,117]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 132907008 unmapped: 39428096 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 117 ms_handle_reset con 0x55d11ea89400 session 0x55d11a850b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 117 ms_handle_reset con 0x55d11ea8c000 session 0x55d11a44c1e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:23.626126+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2dc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 117 heartbeat osd_stat(store_statfs(0x1b7412000/0x0/0x1bfc00000, data 0x41c8517/0x427c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 117 handle_osd_map epochs [118,118], i have 117, src has [1,118]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 118 ms_handle_reset con 0x55d11de2dc00 session 0x55d1199be3c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119163c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134021120 unmapped: 38313984 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:24.626330+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c4a9400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1407522 data_alloc: 285212672 data_used: 19206144
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 118 handle_osd_map epochs [119,119], i have 118, src has [1,119]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134135808 unmapped: 38199296 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 119 handle_osd_map epochs [118,119], i have 119, src has [1,119]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:25.626494+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 119 ms_handle_reset con 0x55d119163c00 session 0x55d11bc5b2c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 119 handle_osd_map epochs [120,120], i have 119, src has [1,120]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 120 ms_handle_reset con 0x55d11c4a9400 session 0x55d119c6a1e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134152192 unmapped: 38182912 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:26.626647+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea89400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134160384 unmapped: 38174720 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:27.626819+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 120 handle_osd_map epochs [121,121], i have 120, src has [1,121]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 121 ms_handle_reset con 0x55d11ea89400 session 0x55d11a44c780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134225920 unmapped: 38109184 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11918d400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:28.626935+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 121 heartbeat osd_stat(store_statfs(0x1b73f4000/0x0/0x1bfc00000, data 0x41d5510/0x4297000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 121 handle_osd_map epochs [121,122], i have 121, src has [1,122]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 122 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 122 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134250496 unmapped: 38084608 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 122 ms_handle_reset con 0x55d11918d400 session 0x55d11b7f3a40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:29.627087+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 122 heartbeat osd_stat(store_statfs(0x1b73f6000/0x0/0x1bfc00000, data 0x41d5533/0x4298000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1425832 data_alloc: 285212672 data_used: 19230720
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134250496 unmapped: 38084608 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:30.627219+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 122 heartbeat osd_stat(store_statfs(0x1b73f1000/0x0/0x1bfc00000, data 0x41d7a82/0x429c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c0af800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134250496 unmapped: 38084608 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:31.627381+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 122 handle_osd_map epochs [123,123], i have 122, src has [1,123]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.747723579s of 10.366942406s, submitted: 166
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 123 handle_osd_map epochs [123,123], i have 123, src has [1,123]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 123 ms_handle_reset con 0x55d11c0af800 session 0x55d11a44d0e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134299648 unmapped: 38035456 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:32.627505+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d96b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 123 heartbeat osd_stat(store_statfs(0x1b73ed000/0x0/0x1bfc00000, data 0x41d9ff6/0x429f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134324224 unmapped: 38010880 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:33.627688+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8d400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 123 handle_osd_map epochs [124,124], i have 123, src has [1,124]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 124 ms_handle_reset con 0x55d11ea8d400 session 0x55d1199f4f00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134373376 unmapped: 37961728 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 124 ms_handle_reset con 0x55d11d96b400 session 0x55d1199f41e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc7400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:34.627817+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 124 ms_handle_reset con 0x55d11dfc7400 session 0x55d119950960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8a000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc7800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 124 ms_handle_reset con 0x55d11ea8a000 session 0x55d11a5cc1e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1432398 data_alloc: 285212672 data_used: 19234816
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134373376 unmapped: 37961728 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d005000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:35.627987+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 124 handle_osd_map epochs [125,125], i have 124, src has [1,125]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 125 handle_osd_map epochs [125,125], i have 125, src has [1,125]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 125 ms_handle_reset con 0x55d11d005000 session 0x55d1199c65a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134520832 unmapped: 37814272 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 125 ms_handle_reset con 0x55d11dfc7800 session 0x55d1199f4000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:36.628129+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d96b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc7400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 125 handle_osd_map epochs [125,126], i have 125, src has [1,126]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134684672 unmapped: 37650432 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:37.628337+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 126 handle_osd_map epochs [127,127], i have 126, src has [1,127]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 127 ms_handle_reset con 0x55d11d96b400 session 0x55d11b804b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134742016 unmapped: 37593088 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 127 heartbeat osd_stat(store_statfs(0x1b73e1000/0x0/0x1bfc00000, data 0x41e2992/0x42ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119163400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:38.628479+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 127 ms_handle_reset con 0x55d11dfc7400 session 0x55d11a5d6d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d005c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 127 handle_osd_map epochs [128,128], i have 127, src has [1,128]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 127 handle_osd_map epochs [127,128], i have 128, src has [1,128]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134840320 unmapped: 37494784 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:39.628643+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 128 ms_handle_reset con 0x55d11d005c00 session 0x55d119c6af00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 128 ms_handle_reset con 0x55d119163400 session 0x55d119950d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d005c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 128 handle_osd_map epochs [128,129], i have 128, src has [1,129]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1443612 data_alloc: 285212672 data_used: 19243008
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134971392 unmapped: 37363712 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:40.628818+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 129 ms_handle_reset con 0x55d11d005c00 session 0x55d1181163c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 129 heartbeat osd_stat(store_statfs(0x1b73e5000/0x0/0x1bfc00000, data 0x41e3a72/0x42a6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 129 heartbeat osd_stat(store_statfs(0x1b6fe4000/0x0/0x1bfc00000, data 0x41e6039/0x42aa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 134995968 unmapped: 37339136 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:41.628962+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 129 handle_osd_map epochs [130,130], i have 129, src has [1,130]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.669843674s of 10.142852783s, submitted: 424
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135020544 unmapped: 37314560 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:42.629135+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 130 heartbeat osd_stat(store_statfs(0x1b6fe4000/0x0/0x1bfc00000, data 0x41e6039/0x42aa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135020544 unmapped: 37314560 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:43.629296+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135020544 unmapped: 37314560 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:44.629472+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1444742 data_alloc: 285212672 data_used: 19259392
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135020544 unmapped: 37314560 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:45.629715+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135020544 unmapped: 37314560 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:46.630817+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 130 handle_osd_map epochs [131,131], i have 130, src has [1,131]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135036928 unmapped: 37298176 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:47.630984+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135045120 unmapped: 37289984 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:48.631154+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 131 heartbeat osd_stat(store_statfs(0x1b6fdb000/0x0/0x1bfc00000, data 0x41ea907/0x42b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 131 heartbeat osd_stat(store_statfs(0x1b6fdb000/0x0/0x1bfc00000, data 0x41ea907/0x42b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135045120 unmapped: 37289984 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:49.631302+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 131 heartbeat osd_stat(store_statfs(0x1b6fdb000/0x0/0x1bfc00000, data 0x41ea907/0x42b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1447744 data_alloc: 285212672 data_used: 19259392
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135045120 unmapped: 37289984 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:50.631458+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135045120 unmapped: 37289984 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:51.631592+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135045120 unmapped: 37289984 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:52.631734+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 131 heartbeat osd_stat(store_statfs(0x1b6fdb000/0x0/0x1bfc00000, data 0x41ea907/0x42b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 131 heartbeat osd_stat(store_statfs(0x1b6fdb000/0x0/0x1bfc00000, data 0x41ea907/0x42b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135045120 unmapped: 37289984 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:53.631856+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 131 heartbeat osd_stat(store_statfs(0x1b6fdb000/0x0/0x1bfc00000, data 0x41ea907/0x42b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135045120 unmapped: 37289984 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:54.632007+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1447744 data_alloc: 285212672 data_used: 19259392
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 131 heartbeat osd_stat(store_statfs(0x1b6fdb000/0x0/0x1bfc00000, data 0x41ea907/0x42b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135045120 unmapped: 37289984 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:55.632209+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135045120 unmapped: 37289984 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:56.632354+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135077888 unmapped: 37257216 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:57.632532+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135077888 unmapped: 37257216 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:58.632705+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135077888 unmapped: 37257216 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:59.632854+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1447744 data_alloc: 285212672 data_used: 19259392
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135077888 unmapped: 37257216 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:00.633006+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 131 heartbeat osd_stat(store_statfs(0x1b6fdb000/0x0/0x1bfc00000, data 0x41ea907/0x42b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135077888 unmapped: 37257216 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:01.633177+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 131 heartbeat osd_stat(store_statfs(0x1b6fdb000/0x0/0x1bfc00000, data 0x41ea907/0x42b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 131 heartbeat osd_stat(store_statfs(0x1b6fdb000/0x0/0x1bfc00000, data 0x41ea907/0x42b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135077888 unmapped: 37257216 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:02.633343+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135077888 unmapped: 37257216 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:03.633478+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 131 heartbeat osd_stat(store_statfs(0x1b6fdb000/0x0/0x1bfc00000, data 0x41ea907/0x42b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135077888 unmapped: 37257216 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:04.633596+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1447744 data_alloc: 285212672 data_used: 19259392
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135077888 unmapped: 37257216 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:05.633750+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 131 heartbeat osd_stat(store_statfs(0x1b6fdb000/0x0/0x1bfc00000, data 0x41ea907/0x42b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 131 handle_osd_map epochs [131,132], i have 131, src has [1,132]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 23.991611481s of 24.049470901s, submitted: 42
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:06.633936+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135086080 unmapped: 37249024 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 132 handle_osd_map epochs [132,133], i have 132, src has [1,133]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 133 handle_osd_map epochs [132,133], i have 133, src has [1,133]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 133 handle_osd_map epochs [133,133], i have 133, src has [1,133]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:07.634158+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135086080 unmapped: 37249024 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:08.634304+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135086080 unmapped: 37249024 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 133 handle_osd_map epochs [133,134], i have 133, src has [1,134]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 134 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 134 handle_osd_map epochs [132,134], i have 134, src has [1,134]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:09.634465+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135151616 unmapped: 37183488 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 134 heartbeat osd_stat(store_statfs(0x1b6fce000/0x0/0x1bfc00000, data 0x41f1972/0x42bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 134 heartbeat osd_stat(store_statfs(0x1b6fce000/0x0/0x1bfc00000, data 0x41f1972/0x42bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1461837 data_alloc: 285212672 data_used: 19271680
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:10.634621+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135151616 unmapped: 37183488 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c097c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 134 ms_handle_reset con 0x55d11c097c00 session 0x55d11a5a8960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:11.634795+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135184384 unmapped: 37150720 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 135 handle_osd_map epochs [135,136], i have 135, src has [1,136]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:12.634921+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135208960 unmapped: 37126144 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 136 heartbeat osd_stat(store_statfs(0x1b6fc6000/0x0/0x1bfc00000, data 0x41f62dc/0x42c6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [0,1])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:13.635053+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135208960 unmapped: 37126144 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 136 heartbeat osd_stat(store_statfs(0x1b6fc6000/0x0/0x1bfc00000, data 0x41f62dc/0x42c6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 136 heartbeat osd_stat(store_statfs(0x1b6fc6000/0x0/0x1bfc00000, data 0x41f62dc/0x42c6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:14.635235+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135208960 unmapped: 37126144 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 136 heartbeat osd_stat(store_statfs(0x1b6fc6000/0x0/0x1bfc00000, data 0x41f62dc/0x42c6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1467351 data_alloc: 285212672 data_used: 19283968
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:15.635453+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135208960 unmapped: 37126144 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:16.635633+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135208960 unmapped: 37126144 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.503304482s of 10.842945099s, submitted: 101
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 137 heartbeat osd_stat(store_statfs(0x1b6fc6000/0x0/0x1bfc00000, data 0x41f62dc/0x42c6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:17.635835+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 137 heartbeat osd_stat(store_statfs(0x1b6fc3000/0x0/0x1bfc00000, data 0x41f86f5/0x42ca000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135225344 unmapped: 37109760 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:18.636029+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135225344 unmapped: 37109760 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:19.636219+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135225344 unmapped: 37109760 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 137 heartbeat osd_stat(store_statfs(0x1b6fc3000/0x0/0x1bfc00000, data 0x41f86f5/0x42ca000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1469681 data_alloc: 285212672 data_used: 19283968
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:20.636370+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135225344 unmapped: 37109760 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:21.636529+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135233536 unmapped: 37101568 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d96bc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 137 ms_handle_reset con 0x55d11d96bc00 session 0x55d11b8a7e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:22.636735+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135233536 unmapped: 37101568 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 137 heartbeat osd_stat(store_statfs(0x1b6fc4000/0x0/0x1bfc00000, data 0x41f86f5/0x42ca000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119a70000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 138 ms_handle_reset con 0x55d119a70000 session 0x55d11bc5e780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:23.636916+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135258112 unmapped: 37076992 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 138 heartbeat osd_stat(store_statfs(0x1b6fbf000/0x0/0x1bfc00000, data 0x41fac28/0x42ce000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:24.637089+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135266304 unmapped: 37068800 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 138 ms_handle_reset con 0x55d11a41fc00 session 0x55d11b8a63c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119a70000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 138 ms_handle_reset con 0x55d119a70000 session 0x55d11c2d4b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1479560 data_alloc: 285212672 data_used: 19296256
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:25.637311+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135274496 unmapped: 37060608 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 138 ms_handle_reset con 0x55d11a41fc00 session 0x55d11a59a5a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c097c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 138 ms_handle_reset con 0x55d11c097c00 session 0x55d1199f5680
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:26.637458+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135299072 unmapped: 37036032 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 138 heartbeat osd_stat(store_statfs(0x1b6fc0000/0x0/0x1bfc00000, data 0x41fac28/0x42ce000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d005c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.079842567s of 10.222591400s, submitted: 43
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 138 ms_handle_reset con 0x55d11d005c00 session 0x55d11bcc6f00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d96bc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:27.637828+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135290880 unmapped: 37044224 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 139 ms_handle_reset con 0x55d11d96bc00 session 0x55d11d0c8b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:28.637944+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136364032 unmapped: 35971072 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:29.638069+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136364032 unmapped: 35971072 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119a70000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 139 ms_handle_reset con 0x55d119a70000 session 0x55d11b896960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1480234 data_alloc: 285212672 data_used: 19308544
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 139 heartbeat osd_stat(store_statfs(0x1b6fbb000/0x0/0x1bfc00000, data 0x41fd1af/0x42d2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:30.638391+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 139 ms_handle_reset con 0x55d11a41fc00 session 0x55d11b8972c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135241728 unmapped: 37093376 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:31.638552+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135241728 unmapped: 37093376 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c097c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 139 ms_handle_reset con 0x55d11c097c00 session 0x55d11a5a7680
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:32.638697+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d005c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135241728 unmapped: 37093376 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 139 ms_handle_reset con 0x55d11d005c00 session 0x55d11bc5e000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:33.638849+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135241728 unmapped: 37093376 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:34.639029+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135241728 unmapped: 37093376 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1484422 data_alloc: 285212672 data_used: 19308544
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:35.639253+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135241728 unmapped: 37093376 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 139 heartbeat osd_stat(store_statfs(0x1b6fbb000/0x0/0x1bfc00000, data 0x41fd1c0/0x42d3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:36.639498+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 135241728 unmapped: 37093376 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.910618782s of 10.080940247s, submitted: 55
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:37.639703+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136339456 unmapped: 35995648 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:38.639836+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136339456 unmapped: 35995648 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:39.640045+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136339456 unmapped: 35995648 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea89400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 140 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d96b800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 140 ms_handle_reset con 0x55d11d96b800 session 0x55d11b80f680
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1491276 data_alloc: 285212672 data_used: 19324928
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:40.640221+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136339456 unmapped: 35995648 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 140 handle_osd_map epochs [141,141], i have 140, src has [1,141]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 141 ms_handle_reset con 0x55d11ea89400 session 0x55d11a851c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:41.640371+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119a70000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 141 ms_handle_reset con 0x55d119a70000 session 0x55d11ea903c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136355840 unmapped: 35979264 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 141 ms_handle_reset con 0x55d11a41fc00 session 0x55d1199c4d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 142 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 142 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 142 heartbeat osd_stat(store_statfs(0x1b6fb1000/0x0/0x1bfc00000, data 0x4201b1c/0x42dc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:42.640525+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136347648 unmapped: 35987456 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119163c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 142 ms_handle_reset con 0x55d119163c00 session 0x55d11a5a6b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11918c800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 142 ms_handle_reset con 0x55d11918c800 session 0x55d11b7f2780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119163c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:43.640668+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136372224 unmapped: 35962880 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119a70000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 142 ms_handle_reset con 0x55d119a70000 session 0x55d11d9fcf00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 143 handle_osd_map epochs [143,143], i have 143, src has [1,143]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 143 ms_handle_reset con 0x55d119163c00 session 0x55d11e452960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 143 handle_osd_map epochs [143,143], i have 143, src has [1,143]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 143 ms_handle_reset con 0x55d11a41fc00 session 0x55d11e453e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea89400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 143 handle_osd_map epochs [143,143], i have 143, src has [1,143]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:44.640845+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 143 ms_handle_reset con 0x55d11ea89400 session 0x55d11b776960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136282112 unmapped: 36052992 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1509312 data_alloc: 285212672 data_used: 19349504
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:45.641026+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d9fa800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136314880 unmapped: 36020224 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 145 heartbeat osd_stat(store_statfs(0x1b6fa4000/0x0/0x1bfc00000, data 0x4208b5d/0x42e8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [1])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 145 ms_handle_reset con 0x55d11d9fa800 session 0x55d11c2d4f00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:46.641186+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119163c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136388608 unmapped: 35946496 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 145 ms_handle_reset con 0x55d119163c00 session 0x55d11bf32b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119a70000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 145 ms_handle_reset con 0x55d119a70000 session 0x55d119c6bc20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:47.641338+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136404992 unmapped: 35930112 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.724060059s of 10.507861137s, submitted: 207
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 145 ms_handle_reset con 0x55d11a41fc00 session 0x55d11a59b860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:48.641491+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 136380416 unmapped: 35954688 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a875800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:49.641631+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 145 ms_handle_reset con 0x55d11dfc6400 session 0x55d11c2d4d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11b603000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153133056 unmapped: 19202048 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 145 heartbeat osd_stat(store_statfs(0x1b5f9c000/0x0/0x1bfc00000, data 0x520b5d9/0x52f2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [0,0,1])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 145 ms_handle_reset con 0x55d11b603000 session 0x55d118117860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:50.641903+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1693694 data_alloc: 285212672 data_used: 19345408
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137453568 unmapped: 34881536 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:51.642053+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 146 heartbeat osd_stat(store_statfs(0x1b5796000/0x0/0x1bfc00000, data 0x5a0db3b/0x5af7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137519104 unmapped: 34816000 heap: 172335104 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 146 handle_osd_map epochs [146,147], i have 146, src has [1,147]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:52.642199+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 152985600 unmapped: 27746304 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119163c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 147 ms_handle_reset con 0x55d119163c00 session 0x55d11b89cd20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 148 ms_handle_reset con 0x55d11a875800 session 0x55d119952000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:53.642332+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137641984 unmapped: 43089920 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2d000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 148 handle_osd_map epochs [149,149], i have 148, src has [1,149]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:54.642482+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 149 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137773056 unmapped: 42958848 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 149 handle_osd_map epochs [148,149], i have 149, src has [1,149]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 149 ms_handle_reset con 0x55d11de2d000 session 0x55d11ea91a40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11fece800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 150 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 150 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:55.642655+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1554324 data_alloc: 285212672 data_used: 19369984
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137871360 unmapped: 42860544 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 150 ms_handle_reset con 0x55d11fece800 session 0x55d11bcc6b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 151 handle_osd_map epochs [151,151], i have 151, src has [1,151]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:56.642828+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137912320 unmapped: 42819584 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8c400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 151 ms_handle_reset con 0x55d11ea8c400 session 0x55d11b89c000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 151 heartbeat osd_stat(store_statfs(0x1b5de8000/0x0/0x1bfc00000, data 0x421922f/0x4304000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:57.642970+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137928704 unmapped: 42803200 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.852189064s of 10.462317467s, submitted: 375
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e137800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 152 ms_handle_reset con 0x55d11e137800 session 0x55d11b7f32c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:58.643108+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137945088 unmapped: 42786816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:59.643332+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137945088 unmapped: 42786816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:00.643498+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1557573 data_alloc: 285212672 data_used: 19382272
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137945088 unmapped: 42786816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 152 heartbeat osd_stat(store_statfs(0x1b5de4000/0x0/0x1bfc00000, data 0x421b730/0x4306000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:01.643694+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137945088 unmapped: 42786816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 152 handle_osd_map epochs [152,153], i have 152, src has [1,153]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:02.643888+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 153 heartbeat osd_stat(store_statfs(0x1b5de3000/0x0/0x1bfc00000, data 0x421dba1/0x430a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137945088 unmapped: 42786816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:03.644046+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137945088 unmapped: 42786816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:04.644194+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137945088 unmapped: 42786816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:05.644421+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1558673 data_alloc: 285212672 data_used: 19382272
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137945088 unmapped: 42786816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 153 heartbeat osd_stat(store_statfs(0x1b5de3000/0x0/0x1bfc00000, data 0x421dba1/0x430a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:06.644630+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137945088 unmapped: 42786816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:07.644842+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137945088 unmapped: 42786816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 153 handle_osd_map epochs [153,154], i have 153, src has [1,154]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:08.644981+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137945088 unmapped: 42786816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.420944214s of 10.487941742s, submitted: 44
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:09.645132+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c097c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137945088 unmapped: 42786816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 154 ms_handle_reset con 0x55d11c097c00 session 0x55d119c6ad20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:10.645349+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1566570 data_alloc: 285212672 data_used: 19394560
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea75c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 154 handle_osd_map epochs [155,155], i have 154, src has [1,155]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 155 handle_osd_map epochs [155,155], i have 155, src has [1,155]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 155 ms_handle_reset con 0x55d11ea75c00 session 0x55d1181170e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137945088 unmapped: 42786816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e136400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 155 ms_handle_reset con 0x55d11e136400 session 0x55d11a851e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea74000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:11.645562+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 155 handle_osd_map epochs [155,156], i have 155, src has [1,156]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 156 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 156 ms_handle_reset con 0x55d11ea74000 session 0x55d11a59e960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137953280 unmapped: 42778624 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 156 heartbeat osd_stat(store_statfs(0x1b5dd5000/0x0/0x1bfc00000, data 0x4224c64/0x4318000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c097c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 156 ms_handle_reset con 0x55d11c097c00 session 0x55d11b80eb40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:12.645798+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137969664 unmapped: 42762240 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:13.645998+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137969664 unmapped: 42762240 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:14.646171+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137969664 unmapped: 42762240 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 156 heartbeat osd_stat(store_statfs(0x1b5dd7000/0x0/0x1bfc00000, data 0x4224bf2/0x4316000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:15.646362+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1571086 data_alloc: 285212672 data_used: 19406848
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137969664 unmapped: 42762240 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:16.672593+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 156 heartbeat osd_stat(store_statfs(0x1b5dd7000/0x0/0x1bfc00000, data 0x4224bf2/0x4316000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137969664 unmapped: 42762240 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 156 handle_osd_map epochs [156,157], i have 156, src has [1,157]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:17.672748+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2d800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 157 handle_osd_map epochs [157,157], i have 157, src has [1,157]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 157 ms_handle_reset con 0x55d11de2d800 session 0x55d11bcc72c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137969664 unmapped: 42762240 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:18.672919+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 137969664 unmapped: 42762240 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.940179825s of 10.157337189s, submitted: 99
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea75800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 157 ms_handle_reset con 0x55d11ea75800 session 0x55d11bf33a40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8d800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Got map version 47
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 157 ms_handle_reset con 0x55d11ea8d800 session 0x55d11c2d50e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:19.673136+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 138264576 unmapped: 42467328 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:20.673344+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1577953 data_alloc: 285212672 data_used: 19406848
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 138264576 unmapped: 42467328 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:21.673534+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 138264576 unmapped: 42467328 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 157 heartbeat osd_stat(store_statfs(0x1b5dd0000/0x0/0x1bfc00000, data 0x42270c6/0x431b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:22.673659+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 138264576 unmapped: 42467328 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:23.673831+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 138264576 unmapped: 42467328 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 157 heartbeat osd_stat(store_statfs(0x1b5dd0000/0x0/0x1bfc00000, data 0x42270c6/0x431b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:24.673989+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 138264576 unmapped: 42467328 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Got map version 48
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 157 heartbeat osd_stat(store_statfs(0x1b5dd0000/0x0/0x1bfc00000, data 0x42270c6/0x431b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:25.674257+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1577953 data_alloc: 285212672 data_used: 19406848
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 138346496 unmapped: 42385408 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:26.674410+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 138362880 unmapped: 42369024 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:27.674567+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 138362880 unmapped: 42369024 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 157 heartbeat osd_stat(store_statfs(0x1b5dd3000/0x0/0x1bfc00000, data 0x42270f5/0x431b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:28.674803+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 138362880 unmapped: 42369024 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:29.674948+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 138362880 unmapped: 42369024 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.348354340s of 11.408683777s, submitted: 15
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:30.675105+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1578990 data_alloc: 285212672 data_used: 19406848
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 138371072 unmapped: 42360832 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:31.675252+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 138371072 unmapped: 42360832 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:32.675409+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 138371072 unmapped: 42360832 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea89800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:33.675524+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 157 ms_handle_reset con 0x55d11ea89800 session 0x55d11b7f3e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c097c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 139747328 unmapped: 40984576 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 157 ms_handle_reset con 0x55d11c097c00 session 0x55d11a59b4a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 157 heartbeat osd_stat(store_statfs(0x1b4ac3000/0x0/0x1bfc00000, data 0x5534157/0x562b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:34.675678+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140525568 unmapped: 40206336 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:35.675875+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 157 handle_osd_map epochs [158,158], i have 157, src has [1,158]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 157 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1776339 data_alloc: 285212672 data_used: 19419136
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140451840 unmapped: 40280064 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2d800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 158 ms_handle_reset con 0x55d11de2d800 session 0x55d119c6a1e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:36.676478+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140476416 unmapped: 40255488 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:37.676680+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140476416 unmapped: 40255488 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:38.676824+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140476416 unmapped: 40255488 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:39.676970+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 158 handle_osd_map epochs [158,159], i have 158, src has [1,159]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140509184 unmapped: 40222720 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 159 heartbeat osd_stat(store_statfs(0x1b5498000/0x0/0x1bfc00000, data 0x4b5a7ce/0x4c56000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.115469933s of 10.001636505s, submitted: 191
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:40.677115+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1688418 data_alloc: 285212672 data_used: 19431424
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140533760 unmapped: 40198144 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 159 heartbeat osd_stat(store_statfs(0x1b5494000/0x0/0x1bfc00000, data 0x4b5ccf1/0x4c59000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:41.677273+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119163c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 159 ms_handle_reset con 0x55d119163c00 session 0x55d11b897c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8c800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 159 ms_handle_reset con 0x55d11ea8c800 session 0x55d11b7763c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140550144 unmapped: 40181760 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:42.677411+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 159 handle_osd_map epochs [159,160], i have 159, src has [1,160]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 160 handle_osd_map epochs [160,160], i have 160, src has [1,160]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 160 handle_osd_map epochs [160,160], i have 160, src has [1,160]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140591104 unmapped: 40140800 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:43.677544+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 160 handle_osd_map epochs [160,161], i have 160, src has [1,161]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11fecf400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 161 handle_osd_map epochs [160,161], i have 161, src has [1,161]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140591104 unmapped: 40140800 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 161 ms_handle_reset con 0x55d11fecf400 session 0x55d11bcc70e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:44.677674+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140640256 unmapped: 40091648 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:45.677851+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1694341 data_alloc: 285212672 data_used: 19443712
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140664832 unmapped: 40067072 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:46.677975+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140664832 unmapped: 40067072 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 161 heartbeat osd_stat(store_statfs(0x1b508e000/0x0/0x1bfc00000, data 0x4b61757/0x4c5f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:47.678129+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140664832 unmapped: 40067072 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:48.678287+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140664832 unmapped: 40067072 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:49.678462+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140664832 unmapped: 40067072 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.662364006s of 10.126696587s, submitted: 121
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:50.678664+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1697409 data_alloc: 285212672 data_used: 19443712
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140722176 unmapped: 40009728 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:51.678837+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 161 heartbeat osd_stat(store_statfs(0x1b508e000/0x0/0x1bfc00000, data 0x4b61767/0x4c60000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140730368 unmapped: 40001536 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 161 handle_osd_map epochs [161,162], i have 161, src has [1,162]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:52.679016+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140771328 unmapped: 39960576 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:53.679193+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea89400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 162 ms_handle_reset con 0x55d11ea89400 session 0x55d11b89cf00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140771328 unmapped: 39960576 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:54.679397+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d005c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140771328 unmapped: 39960576 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 162 handle_osd_map epochs [163,163], i have 162, src has [1,163]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 ms_handle_reset con 0x55d11d005c00 session 0x55d11b8974a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:55.679613+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1712900 data_alloc: 285212672 data_used: 19468288
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140812288 unmapped: 39919616 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11fece400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 ms_handle_reset con 0x55d11fece400 session 0x55d11c2d5c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b5083000/0x0/0x1bfc00000, data 0x4b66188/0x4c6b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:56.679788+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140812288 unmapped: 39919616 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:57.679934+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140853248 unmapped: 39878656 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:58.680095+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140877824 unmapped: 39854080 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:59.680275+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8ac00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 ms_handle_reset con 0x55d11ea8ac00 session 0x55d11bcf30e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c0afc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 ms_handle_reset con 0x55d11c0afc00 session 0x55d11bcf32c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140877824 unmapped: 39854080 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:00.680472+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b5084000/0x0/0x1bfc00000, data 0x4b661b1/0x4c6a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1714000 data_alloc: 285212672 data_used: 19468288
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140877824 unmapped: 39854080 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:01.680637+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140877824 unmapped: 39854080 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:02.680826+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140877824 unmapped: 39854080 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.398462296s of 13.022570610s, submitted: 148
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:03.680948+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140902400 unmapped: 39829504 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:04.681074+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140902400 unmapped: 39829504 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:05.681260+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1715668 data_alloc: 285212672 data_used: 19468288
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea75800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140935168 unmapped: 39796736 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2d800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b5083000/0x0/0x1bfc00000, data 0x4b6621a/0x4c6b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [0,0,0,0,0,0,0,6])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:06.681380+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 ms_handle_reset con 0x55d11de2d800 session 0x55d11bcf2000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e136c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140836864 unmapped: 39895040 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 ms_handle_reset con 0x55d11e136c00 session 0x55d11bd48780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:07.681570+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140836864 unmapped: 39895040 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c4a9400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:08.681682+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 ms_handle_reset con 0x55d11c4a9400 session 0x55d11b777c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 39886848 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b43e3000/0x0/0x1bfc00000, data 0x58052ee/0x590b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:09.681851+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140845056 unmapped: 39886848 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 ms_handle_reset con 0x55d11dfc6400 session 0x55d11b7761e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:10.682209+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1825451 data_alloc: 285212672 data_used: 19468288
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140902400 unmapped: 39829504 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:11.682359+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2cc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b43e0000/0x0/0x1bfc00000, data 0x58053e5/0x590d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 ms_handle_reset con 0x55d11de2cc00 session 0x55d11a5d65a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140910592 unmapped: 39821312 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:12.682538+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140910592 unmapped: 39821312 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:13.682679+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.821046829s of 10.309316635s, submitted: 109
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140918784 unmapped: 39813120 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b507f000/0x0/0x1bfc00000, data 0x4b6640d/0x4c6d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:14.682852+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140918784 unmapped: 39813120 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:15.683053+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b507f000/0x0/0x1bfc00000, data 0x4b663aa/0x4c6a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1729716 data_alloc: 285212672 data_used: 19468288
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140918784 unmapped: 39813120 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:16.706141+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140918784 unmapped: 39813120 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b507f000/0x0/0x1bfc00000, data 0x4b663aa/0x4c6a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:17.706356+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140918784 unmapped: 39813120 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b507f000/0x0/0x1bfc00000, data 0x4b663aa/0x4c6a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:18.706522+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140918784 unmapped: 39813120 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:19.706691+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140918784 unmapped: 39813120 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:20.706885+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1727388 data_alloc: 285212672 data_used: 19468288
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140918784 unmapped: 39813120 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b5085000/0x0/0x1bfc00000, data 0x4b663d9/0x4c69000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:21.707122+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140918784 unmapped: 39813120 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:22.707302+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140918784 unmapped: 39813120 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:23.707466+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140918784 unmapped: 39813120 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:24.707678+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140918784 unmapped: 39813120 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b5085000/0x0/0x1bfc00000, data 0x4b663d9/0x4c69000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:25.707921+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1727388 data_alloc: 285212672 data_used: 19468288
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140918784 unmapped: 39813120 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:26.708257+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140918784 unmapped: 39813120 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:27.708502+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140926976 unmapped: 39804928 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b5085000/0x0/0x1bfc00000, data 0x4b663d9/0x4c69000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:28.708714+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140935168 unmapped: 39796736 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:29.708922+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140935168 unmapped: 39796736 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:30.709107+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1727388 data_alloc: 285212672 data_used: 19468288
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140935168 unmapped: 39796736 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:31.709350+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b5085000/0x0/0x1bfc00000, data 0x4b663d9/0x4c69000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140935168 unmapped: 39796736 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:32.709544+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140935168 unmapped: 39796736 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b5085000/0x0/0x1bfc00000, data 0x4b663d9/0x4c69000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:33.709664+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140935168 unmapped: 39796736 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:34.709887+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140935168 unmapped: 39796736 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:35.710141+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1727388 data_alloc: 285212672 data_used: 19468288
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140935168 unmapped: 39796736 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:36.710885+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140935168 unmapped: 39796736 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:37.711052+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140935168 unmapped: 39796736 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:38.711254+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 24.986505508s of 25.018226624s, submitted: 8
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 heartbeat osd_stat(store_statfs(0x1b5085000/0x0/0x1bfc00000, data 0x4b663d9/0x4c69000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 13K writes, 48K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.00 MB/s
                                                          Cumulative WAL: 13K writes, 4005 syncs, 3.29 writes per sync, written: 0.04 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 7263 writes, 22K keys, 7263 commit groups, 1.0 writes per commit group, ingest: 18.95 MB, 0.03 MB/s
                                                          Interval WAL: 7263 writes, 3188 syncs, 2.28 writes per sync, written: 0.02 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140959744 unmapped: 39772160 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:39.711489+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 140959744 unmapped: 39772160 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 163 handle_osd_map epochs [164,164], i have 163, src has [1,164]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 164 heartbeat osd_stat(store_statfs(0x1b5085000/0x0/0x1bfc00000, data 0x4b664a3/0x4c69000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:40.711601+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1731590 data_alloc: 285212672 data_used: 19480576
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142016512 unmapped: 38715392 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dbc8000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 164 ms_handle_reset con 0x55d11dbc8000 session 0x55d11a5d6780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:41.711801+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142016512 unmapped: 38715392 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:42.711960+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 164 heartbeat osd_stat(store_statfs(0x1b6061000/0x0/0x1bfc00000, data 0x4b68b09/0x4c6d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 164 handle_osd_map epochs [164,165], i have 164, src has [1,165]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142024704 unmapped: 38707200 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 165 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 165 heartbeat osd_stat(store_statfs(0x1b6061000/0x0/0x1bfc00000, data 0x4b68b09/0x4c6d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:43.712171+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142032896 unmapped: 38699008 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:44.712316+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142041088 unmapped: 38690816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 165 handle_osd_map epochs [166,166], i have 165, src has [1,166]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:45.712517+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 166 heartbeat osd_stat(store_statfs(0x1b605c000/0x0/0x1bfc00000, data 0x4b6b0f3/0x4c72000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1741129 data_alloc: 285212672 data_used: 19505152
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142041088 unmapped: 38690816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be83800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 166 ms_handle_reset con 0x55d11be83800 session 0x55d11c2d45a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:46.712686+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 166 handle_osd_map epochs [167,167], i have 166, src has [1,167]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142090240 unmapped: 38641664 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:47.712939+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142098432 unmapped: 38633472 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:48.713085+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 167 handle_osd_map epochs [168,168], i have 167, src has [1,168]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.025701523s of 10.386328697s, submitted: 109
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142098432 unmapped: 38633472 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:49.713239+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 168 heartbeat osd_stat(store_statfs(0x1b604f000/0x0/0x1bfc00000, data 0x4b72150/0x4c7e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c2b8c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 168 ms_handle_reset con 0x55d11c2b8c00 session 0x55d11c2d5e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142139392 unmapped: 38592512 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 168 handle_osd_map epochs [168,169], i have 168, src has [1,169]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:50.713412+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1749765 data_alloc: 285212672 data_used: 19517440
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142155776 unmapped: 38576128 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 169 handle_osd_map epochs [170,170], i have 169, src has [1,170]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c4a8c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 170 ms_handle_reset con 0x55d11c4a8c00 session 0x55d11c2d5680
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:51.713580+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142180352 unmapped: 38551552 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 170 handle_osd_map epochs [171,171], i have 170, src has [1,171]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119163c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 171 ms_handle_reset con 0x55d119163c00 session 0x55d11a59cb40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119163c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:52.713803+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142196736 unmapped: 38535168 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 171 handle_osd_map epochs [172,172], i have 171, src has [1,172]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 172 ms_handle_reset con 0x55d119163c00 session 0x55d11a59c960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:53.713962+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142213120 unmapped: 38518784 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:54.714159+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 172 heartbeat osd_stat(store_statfs(0x1b603f000/0x0/0x1bfc00000, data 0x4b7b7a6/0x4c8e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142213120 unmapped: 38518784 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 172 handle_osd_map epochs [172,173], i have 172, src has [1,173]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:55.714408+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1764037 data_alloc: 285212672 data_used: 19517440
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c2b8400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142237696 unmapped: 38494208 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c4a9000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 173 ms_handle_reset con 0x55d11c4a9000 session 0x55d11b896780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:56.714506+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 173 handle_osd_map epochs [173,174], i have 173, src has [1,174]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 142245888 unmapped: 38486016 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 174 heartbeat osd_stat(store_statfs(0x1b6035000/0x0/0x1bfc00000, data 0x4b8040a/0x4c98000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:57.714697+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 143294464 unmapped: 37437440 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:58.714863+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d953c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 174 ms_handle_reset con 0x55d11d953c00 session 0x55d1199f50e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 143302656 unmapped: 37429248 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 174 handle_osd_map epochs [175,175], i have 174, src has [1,175]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.674676895s of 10.171784401s, submitted: 167
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:59.715126+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 143319040 unmapped: 37412864 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:00.715315+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1772282 data_alloc: 285212672 data_used: 19529728
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 143310848 unmapped: 37421056 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:01.715479+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 175 heartbeat osd_stat(store_statfs(0x1b6031000/0x0/0x1bfc00000, data 0x4b82a7b/0x4c9c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 175 handle_osd_map epochs [175,176], i have 175, src has [1,176]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 143335424 unmapped: 37396480 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:02.715626+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 143360000 unmapped: 37371904 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:03.715841+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 143360000 unmapped: 37371904 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:04.715964+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 143384576 unmapped: 37347328 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Got map version 49
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:05.716160+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1785634 data_alloc: 285212672 data_used: 19529728
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 176 heartbeat osd_stat(store_statfs(0x1b602c000/0x0/0x1bfc00000, data 0x4b851a1/0x4ca2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 143417344 unmapped: 37314560 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:06.716338+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 176 heartbeat osd_stat(store_statfs(0x1b6029000/0x0/0x1bfc00000, data 0x4b85441/0x4ca5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea75400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 176 ms_handle_reset con 0x55d11ea75400 session 0x55d11bd48d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 176 handle_osd_map epochs [176,177], i have 176, src has [1,177]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 143384576 unmapped: 37347328 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:07.716504+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 143376384 unmapped: 37355520 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea89000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:08.716651+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 177 ms_handle_reset con 0x55d11ea89000 session 0x55d1199c5a40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea89000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 177 ms_handle_reset con 0x55d11ea89000 session 0x55d11bc5b0e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 143499264 unmapped: 37232640 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:09.716828+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 143499264 unmapped: 37232640 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:10.716949+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 177 heartbeat osd_stat(store_statfs(0x1b6028000/0x0/0x1bfc00000, data 0x4b87815/0x4ca6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 177 handle_osd_map epochs [178,178], i have 177, src has [1,178]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 177 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.962414742s of 11.423717499s, submitted: 128
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 177 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1792228 data_alloc: 285212672 data_used: 19554304
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144556032 unmapped: 36175872 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:11.717079+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144564224 unmapped: 36167680 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:12.717217+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144564224 unmapped: 36167680 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:13.717349+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 178 heartbeat osd_stat(store_statfs(0x1b6025000/0x0/0x1bfc00000, data 0x4b89eda/0x4ca9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144564224 unmapped: 36167680 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:14.717498+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144564224 unmapped: 36167680 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 178 heartbeat osd_stat(store_statfs(0x1b6025000/0x0/0x1bfc00000, data 0x4b89eda/0x4ca9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:15.717787+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1790690 data_alloc: 285212672 data_used: 19554304
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144564224 unmapped: 36167680 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:16.717961+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd70000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 178 ms_handle_reset con 0x55d11bd70000 session 0x55d11e452d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144580608 unmapped: 36151296 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 178 handle_osd_map epochs [179,179], i have 178, src has [1,179]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:17.718155+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dbc8000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11dbc8000 session 0x55d11e453680
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8c000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11ea8c000 session 0x55d11e452000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144654336 unmapped: 36077568 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:18.718321+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144654336 unmapped: 36077568 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:19.718573+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144654336 unmapped: 36077568 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:20.718751+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.805301666s of 10.050050735s, submitted: 85
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1796036 data_alloc: 285212672 data_used: 19566592
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 heartbeat osd_stat(store_statfs(0x1b6021000/0x0/0x1bfc00000, data 0x4b8c342/0x4cac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144654336 unmapped: 36077568 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:21.718995+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144654336 unmapped: 36077568 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:22.719124+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144654336 unmapped: 36077568 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 heartbeat osd_stat(store_statfs(0x1b6022000/0x0/0x1bfc00000, data 0x4b8c40c/0x4cac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:23.719305+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144654336 unmapped: 36077568 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:24.719445+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144654336 unmapped: 36077568 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:25.719674+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1796036 data_alloc: 285212672 data_used: 19566592
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144654336 unmapped: 36077568 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e137800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11e137800 session 0x55d11c2d5c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:26.719864+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144670720 unmapped: 36061184 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:27.720045+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e136c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11e136c00 session 0x55d11c2d45a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd70000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 heartbeat osd_stat(store_statfs(0x1b6021000/0x0/0x1bfc00000, data 0x4b8c4e6/0x4cad000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144728064 unmapped: 36003840 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11bd70000 session 0x55d11c2d5e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:28.720158+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144728064 unmapped: 36003840 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:29.720287+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144728064 unmapped: 36003840 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:30.720443+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1797809 data_alloc: 285212672 data_used: 19566592
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.400137901s of 10.506799698s, submitted: 26
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144728064 unmapped: 36003840 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:31.720590+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 heartbeat osd_stat(store_statfs(0x1b6021000/0x0/0x1bfc00000, data 0x4b8c538/0x4cad000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144728064 unmapped: 36003840 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:32.720860+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144744448 unmapped: 35987456 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:33.720993+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11bd71c00 session 0x55d11c2d5680
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144752640 unmapped: 35979264 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:34.721146+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144752640 unmapped: 35979264 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8dc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11ea8dc00 session 0x55d11a5d6780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:35.721365+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1811429 data_alloc: 285212672 data_used: 19566592
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144752640 unmapped: 35979264 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be83800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:36.721555+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11be83800 session 0x55d11b7763c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be83800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11be83800 session 0x55d11b7761e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd70000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11bd71c00 session 0x55d11a5d61e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11bd70000 session 0x55d11bcf30e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144752640 unmapped: 35979264 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:37.721721+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 heartbeat osd_stat(store_statfs(0x1b601d000/0x0/0x1bfc00000, data 0x4b8c5aa/0x4caf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e136c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11e136c00 session 0x55d11b89cf00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144760832 unmapped: 35971072 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:38.721888+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 heartbeat osd_stat(store_statfs(0x1b601f000/0x0/0x1bfc00000, data 0x4b8c538/0x4cad000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144777216 unmapped: 35954688 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:39.722075+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144777216 unmapped: 35954688 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:40.722284+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1803150 data_alloc: 285212672 data_used: 19566592
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144785408 unmapped: 35946496 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:41.722516+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 heartbeat osd_stat(store_statfs(0x1b6022000/0x0/0x1bfc00000, data 0x4b8c4d6/0x4cac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144785408 unmapped: 35946496 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:42.722688+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144785408 unmapped: 35946496 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:43.722899+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144785408 unmapped: 35946496 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:44.723096+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144785408 unmapped: 35946496 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:45.723330+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.943153381s of 14.352549553s, submitted: 84
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 heartbeat osd_stat(store_statfs(0x1b6022000/0x0/0x1bfc00000, data 0x4b8c4d6/0x4cac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1806686 data_alloc: 285212672 data_used: 19566592
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144785408 unmapped: 35946496 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:46.723521+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea88800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11ea88800 session 0x55d119c6a1e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144785408 unmapped: 35946496 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:47.723813+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2d800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11de2d800 session 0x55d11a59b4a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c0aec00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 ms_handle_reset con 0x55d11c0aec00 session 0x55d11b8eba40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144859136 unmapped: 35872768 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:48.723977+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 heartbeat osd_stat(store_statfs(0x1b6020000/0x0/0x1bfc00000, data 0x4b8c63b/0x4cae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144859136 unmapped: 35872768 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:49.724177+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144859136 unmapped: 35872768 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 179 handle_osd_map epochs [180,180], i have 179, src has [1,180]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:50.724331+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1812146 data_alloc: 285212672 data_used: 19578880
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144875520 unmapped: 35856384 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:51.724531+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144875520 unmapped: 35856384 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:52.724705+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 180 heartbeat osd_stat(store_statfs(0x1b601c000/0x0/0x1bfc00000, data 0x4b8ec06/0x4cb1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 180 ms_handle_reset con 0x55d11ea8b400 session 0x55d11e4530e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144883712 unmapped: 35848192 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:53.724903+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8d800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 180 ms_handle_reset con 0x55d11ea8d800 session 0x55d11e4525a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144883712 unmapped: 35848192 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:54.725060+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144924672 unmapped: 35807232 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d97a800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 180 ms_handle_reset con 0x55d11d97a800 session 0x55d11bc5b0e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:55.725238+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 180 heartbeat osd_stat(store_statfs(0x1b601d000/0x0/0x1bfc00000, data 0x4b8ec06/0x4cb1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1811980 data_alloc: 285212672 data_used: 19578880
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144949248 unmapped: 35782656 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:56.725366+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 180 handle_osd_map epochs [181,181], i have 180, src has [1,181]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.410224915s of 11.726423264s, submitted: 89
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144957440 unmapped: 35774464 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:57.725523+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144957440 unmapped: 35774464 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:58.725711+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144957440 unmapped: 35774464 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:59.725899+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144957440 unmapped: 35774464 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:00.726099+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 181 heartbeat osd_stat(store_statfs(0x1b6018000/0x0/0x1bfc00000, data 0x4b910ba/0x4cb6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1817799 data_alloc: 285212672 data_used: 19591168
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e136c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 181 ms_handle_reset con 0x55d11e136c00 session 0x55d1199c5a40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 144998400 unmapped: 35733504 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:01.726314+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145006592 unmapped: 35725312 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:02.726470+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119c2d400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 181 ms_handle_reset con 0x55d119c2d400 session 0x55d11b897c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119a70000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 181 ms_handle_reset con 0x55d119a70000 session 0x55d11b896780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145039360 unmapped: 35692544 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:03.726688+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145039360 unmapped: 35692544 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:04.726860+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d953000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 181 handle_osd_map epochs [181,182], i have 181, src has [1,182]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 182 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145080320 unmapped: 35651584 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:05.727066+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 182 heartbeat osd_stat(store_statfs(0x1b6012000/0x0/0x1bfc00000, data 0x4b93686/0x4cbb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1830043 data_alloc: 285212672 data_used: 19603456
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 182 handle_osd_map epochs [182,183], i have 182, src has [1,183]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145080320 unmapped: 35651584 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 183 handle_osd_map epochs [183,183], i have 183, src has [1,183]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 183 ms_handle_reset con 0x55d11d953000 session 0x55d11a632f00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:06.727205+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 183 handle_osd_map epochs [183,184], i have 183, src has [1,184]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.666190147s of 10.001499176s, submitted: 96
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145088512 unmapped: 35643392 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d952800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:07.727337+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 184 ms_handle_reset con 0x55d11d952800 session 0x55d11bcf32c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 184 handle_osd_map epochs [184,185], i have 184, src has [1,185]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 185 ms_handle_reset con 0x55d11bd71800 session 0x55d11e452960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145096704 unmapped: 35635200 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:08.727522+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c096c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 185 ms_handle_reset con 0x55d11c096c00 session 0x55d11e452780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145104896 unmapped: 35627008 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8ac00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:09.727689+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 185 handle_osd_map epochs [186,186], i have 185, src has [1,186]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145104896 unmapped: 35627008 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:10.727829+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 186 ms_handle_reset con 0x55d11ea8ac00 session 0x55d11ea90d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be83000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 186 ms_handle_reset con 0x55d11be83000 session 0x55d11bf32d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 186 heartbeat osd_stat(store_statfs(0x1b6000000/0x0/0x1bfc00000, data 0x4b9d1ed/0x4ccd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1845142 data_alloc: 285212672 data_used: 19615744
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145113088 unmapped: 35618816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:11.727978+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 186 handle_osd_map epochs [186,187], i have 186, src has [1,187]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145670144 unmapped: 35061760 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:12.728124+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 187 ms_handle_reset con 0x55d11dfc6000 session 0x55d11bf33c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145694720 unmapped: 35037184 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Got map version 50
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:13.728301+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 187 heartbeat osd_stat(store_statfs(0x1b5fff000/0x0/0x1bfc00000, data 0x4b9f2f9/0x4cce000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145743872 unmapped: 34988032 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:14.728517+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145743872 unmapped: 34988032 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:15.728920+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1844246 data_alloc: 285212672 data_used: 19628032
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145743872 unmapped: 34988032 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:16.729134+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 187 handle_osd_map epochs [188,188], i have 187, src has [1,188]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.627459526s of 10.038949013s, submitted: 363
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145743872 unmapped: 34988032 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:17.729288+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 188 handle_osd_map epochs [189,189], i have 188, src has [1,189]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145752064 unmapped: 34979840 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:18.729608+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 189 heartbeat osd_stat(store_statfs(0x1b5ff9000/0x0/0x1bfc00000, data 0x4ba3caa/0x4cd4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145752064 unmapped: 34979840 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:19.729805+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145727488 unmapped: 35004416 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:20.729954+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 189 heartbeat osd_stat(store_statfs(0x1b5ff9000/0x0/0x1bfc00000, data 0x4ba3d74/0x4cd4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1854744 data_alloc: 285212672 data_used: 19652608
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 189 heartbeat osd_stat(store_statfs(0x1b5ff9000/0x0/0x1bfc00000, data 0x4ba3d74/0x4cd4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145727488 unmapped: 35004416 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:21.730161+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 189 handle_osd_map epochs [190,190], i have 189, src has [1,190]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145727488 unmapped: 35004416 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:22.730426+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e137000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 145743872 unmapped: 34988032 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 190 ms_handle_reset con 0x55d11e137000 session 0x55d11a5a8b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:23.730686+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 190 heartbeat osd_stat(store_statfs(0x1b5ff2000/0x0/0x1bfc00000, data 0x4ba63a7/0x4cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154198016 unmapped: 26533888 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:24.730861+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 190 heartbeat osd_stat(store_statfs(0x1b5ff2000/0x0/0x1bfc00000, data 0x4ba6379/0x4cdc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,2,3,4,5] op hist [0,0,0,0,0,0,0,0,1])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 146939904 unmapped: 33792000 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:25.731039+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2138599 data_alloc: 285212672 data_used: 19664896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 146972672 unmapped: 33759232 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:26.731201+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 190 handle_osd_map epochs [191,191], i have 190, src has [1,191]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 190 handle_osd_map epochs [191,191], i have 191, src has [1,191]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:27.731435+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 147021824 unmapped: 33710080 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.728066444s of 10.287567139s, submitted: 131
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:28.731701+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 147038208 unmapped: 33693696 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 191 heartbeat osd_stat(store_statfs(0x1b13ec000/0x0/0x1bfc00000, data 0x93a8850/0x94e1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:29.731923+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 155443200 unmapped: 25288704 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:30.732206+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 147070976 unmapped: 33660928 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 191 handle_osd_map epochs [192,192], i have 191, src has [1,192]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 192 handle_osd_map epochs [192,192], i have 192, src has [1,192]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 192 handle_osd_map epochs [192,192], i have 192, src has [1,192]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 192 heartbeat osd_stat(store_statfs(0x1b0beb000/0x0/0x1bfc00000, data 0x9ba898a/0x9ce3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x532f9b7), peers [0,2,3,4,5] op hist [0,0,0,0,0,0,0,0,1])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2531557 data_alloc: 285212672 data_used: 19689472
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:31.732402+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 149233664 unmapped: 31498240 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:32.732531+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 149348352 unmapped: 31383552 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:33.732655+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 149413888 unmapped: 31318016 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 192 heartbeat osd_stat(store_statfs(0x1ad249000/0x0/0x1bfc00000, data 0xc3aaf84/0xc4e5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:34.732838+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 149610496 unmapped: 31121408 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:35.733016+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 149757952 unmapped: 30973952 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3080191 data_alloc: 285212672 data_used: 19689472
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:36.733166+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 150896640 unmapped: 29835264 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 192 handle_osd_map epochs [193,193], i have 192, src has [1,193]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 192 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:37.733285+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 151011328 unmapped: 29720576 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.141380310s of 10.063614845s, submitted: 109
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:38.733458+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 151175168 unmapped: 29556736 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:39.733658+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 151289856 unmapped: 29442048 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 193 heartbeat osd_stat(store_statfs(0x1a6a44000/0x0/0x1bfc00000, data 0x12bad69e/0x12cea000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:40.733821+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 151355392 unmapped: 29376512 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3654085 data_alloc: 285212672 data_used: 19701760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:41.734038+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 151470080 unmapped: 29261824 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:42.734274+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 151625728 unmapped: 29106176 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:43.734433+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 152674304 unmapped: 28057600 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:44.734602+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 152821760 unmapped: 27910144 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 193 heartbeat osd_stat(store_statfs(0x1a3244000/0x0/0x1bfc00000, data 0x163ad768/0x164ea000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [0,0,0,0,0,0,0,0,1])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:45.734766+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161300480 unmapped: 19431424 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 193 heartbeat osd_stat(store_statfs(0x1a1244000/0x0/0x1bfc00000, data 0x183ad842/0x184ea000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4019567 data_alloc: 285212672 data_used: 19701760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:46.734857+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153018368 unmapped: 27713536 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:47.735040+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153149440 unmapped: 27582464 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.108864784s of 10.142596245s, submitted: 59
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:48.735208+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161685504 unmapped: 19046400 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 193 heartbeat osd_stat(store_statfs(0x19e243000/0x0/0x1bfc00000, data 0x1b3ad953/0x1b4eb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8c800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:49.735417+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153427968 unmapped: 27303936 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:50.735576+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153485312 unmapped: 27246592 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4462237 data_alloc: 285212672 data_used: 19701760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:51.735741+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Got map version 51
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153698304 unmapped: 27033600 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:52.735978+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153845760 unmapped: 26886144 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 193 heartbeat osd_stat(store_statfs(0x19ca44000/0x0/0x1bfc00000, data 0x1cbada58/0x1ccea000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:53.736205+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153993216 unmapped: 26738688 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:54.736343+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154050560 unmapped: 26681344 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:55.736498+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154189824 unmapped: 26542080 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4899801 data_alloc: 285212672 data_used: 19701760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:56.736631+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154288128 unmapped: 26443776 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 193 heartbeat osd_stat(store_statfs(0x199a45000/0x0/0x1bfc00000, data 0x1fbadad7/0x1fce9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:57.736831+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154345472 unmapped: 26386432 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.974666595s of 10.031741142s, submitted: 64
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:58.737056+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154550272 unmapped: 26181632 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:59.737327+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154615808 unmapped: 26116096 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8bc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 193 ms_handle_reset con 0x55d11ea8bc00 session 0x55d11b8974a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 193 handle_osd_map epochs [194,194], i have 193, src has [1,194]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c0aec00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:00.737466+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8dc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea88800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154812416 unmapped: 25919488 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 194 ms_handle_reset con 0x55d11ea8dc00 session 0x55d11a5cc960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5409827 data_alloc: 285212672 data_used: 19718144
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:01.737607+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 194 handle_osd_map epochs [195,195], i have 194, src has [1,195]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 155025408 unmapped: 25706496 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 195 ms_handle_reset con 0x55d11ea88800 session 0x55d119c6bc20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 195 ms_handle_reset con 0x55d11c0aec00 session 0x55d11b8ea5a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:02.737804+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 195 heartbeat osd_stat(store_statfs(0x194235000/0x0/0x1bfc00000, data 0x253b308c/0x254f7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 155025408 unmapped: 25706496 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:03.737961+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 163545088 unmapped: 17186816 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:04.738147+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156327936 unmapped: 24403968 heap: 180731904 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 195 heartbeat osd_stat(store_statfs(0x192a37000/0x0/0x1bfc00000, data 0x26bb308c/0x26cf7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11918d400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c096c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 195 ms_handle_reset con 0x55d11c096c00 session 0x55d119951e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 195 ms_handle_reset con 0x55d11918d400 session 0x55d11bf32780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:05.738364+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156041216 unmapped: 33087488 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 6034949 data_alloc: 285212672 data_used: 19718144
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:06.738516+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 164528128 unmapped: 24600576 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c4a9400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 195 ms_handle_reset con 0x55d11c4a9400 session 0x55d11a44c780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc7400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11fece800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea88c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 195 ms_handle_reset con 0x55d11ea88c00 session 0x55d11ea903c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dbc9400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 195 handle_osd_map epochs [196,196], i have 195, src has [1,196]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 195 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:07.738679+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 196 ms_handle_reset con 0x55d11fece800 session 0x55d11d9fd0e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156336128 unmapped: 32792576 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.484739304s of 10.031317711s, submitted: 187
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 196 handle_osd_map epochs [196,197], i have 196, src has [1,197]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 197 ms_handle_reset con 0x55d11dbc9400 session 0x55d11b89cd20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 197 ms_handle_reset con 0x55d11a41fc00 session 0x55d11bcc6d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d120ca3000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:08.738813+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 197 ms_handle_reset con 0x55d120ca3000 session 0x55d11ea90d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 197 ms_handle_reset con 0x55d11dfc7400 session 0x55d11b777c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156442624 unmapped: 32686080 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e443c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:09.739015+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 197 heartbeat osd_stat(store_statfs(0x18f22d000/0x0/0x1bfc00000, data 0x2a3b7c7f/0x2a501000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156483584 unmapped: 32645120 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d12175f000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ddfe400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 197 ms_handle_reset con 0x55d12175f000 session 0x55d11b7770e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 197 ms_handle_reset con 0x55d11a41fc00 session 0x55d11bcc7860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 197 handle_osd_map epochs [197,198], i have 197, src has [1,198]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 198 ms_handle_reset con 0x55d11ddfe400 session 0x55d11e452960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:10.739209+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.69590 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.59407 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.49983 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: pgmap v832: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.69614 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1780513470' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2229859225' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2327447688' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2107682123' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3153028272' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3333518524' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2479036' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/545032456' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/466198046' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3265217251' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1245768976' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3564364230' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/471931038' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/329508633' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2266612871' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1424714830' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1424359202' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3066090828' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dbc9400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 198 handle_osd_map epochs [198,198], i have 198, src has [1,198]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156614656 unmapped: 32514048 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 198 ms_handle_reset con 0x55d11dbc9400 session 0x55d11e452000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 198 ms_handle_reset con 0x55d11e443c00 session 0x55d11e4530e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc7400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2020587 data_alloc: 285212672 data_used: 19742720
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:11.739337+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154787840 unmapped: 34340864 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 198 handle_osd_map epochs [199,199], i have 198, src has [1,199]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:12.739478+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154189824 unmapped: 34938880 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 199 ms_handle_reset con 0x55d11dfc7400 session 0x55d11a5d61e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:13.739672+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc7400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154189824 unmapped: 34938880 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 199 handle_osd_map epochs [199,200], i have 199, src has [1,200]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:14.739842+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 200 ms_handle_reset con 0x55d11dfc7400 session 0x55d11c2d5c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154206208 unmapped: 34922496 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:15.740131+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 200 handle_osd_map epochs [200,201], i have 200, src has [1,201]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 201 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 201 heartbeat osd_stat(store_statfs(0x1b4a2d000/0x0/0x1bfc00000, data 0x4bbe8b7/0x4d01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154238976 unmapped: 34889728 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2026451 data_alloc: 285212672 data_used: 19767296
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:16.740347+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153640960 unmapped: 35487744 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 201 handle_osd_map epochs [201,202], i have 201, src has [1,202]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:17.740544+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153640960 unmapped: 35487744 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e442000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.863880157s of 10.136446953s, submitted: 428
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 202 ms_handle_reset con 0x55d11e442000 session 0x55d1199c4b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 202 handle_osd_map epochs [202,203], i have 202, src has [1,203]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 203 ms_handle_reset con 0x55d11a41fc00 session 0x55d11c2d45a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 203 heartbeat osd_stat(store_statfs(0x1b4a22000/0x0/0x1bfc00000, data 0x4bc3533/0x4d0c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d120ca2000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:18.740660+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c2b8c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 203 ms_handle_reset con 0x55d120ca2000 session 0x55d1199510e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153665536 unmapped: 35463168 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a875800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 203 handle_osd_map epochs [203,204], i have 203, src has [1,204]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 204 ms_handle_reset con 0x55d11c2b8c00 session 0x55d11d100000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:19.740808+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 204 ms_handle_reset con 0x55d11a875800 session 0x55d11b8970e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153698304 unmapped: 35430400 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc7400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 204 ms_handle_reset con 0x55d11dfc7400 session 0x55d11b8ead20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 204 handle_osd_map epochs [204,205], i have 204, src has [1,205]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 204 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 205 ms_handle_reset con 0x55d11a41fc00 session 0x55d11b8ea960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:20.740994+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153755648 unmapped: 35373056 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e442000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 205 handle_osd_map epochs [206,206], i have 205, src has [1,206]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d120ca2000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:21.741159+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2062221 data_alloc: 285212672 data_used: 19791872
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153223168 unmapped: 35905536 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 206 ms_handle_reset con 0x55d11e442000 session 0x55d119a763c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 206 handle_osd_map epochs [206,207], i have 206, src has [1,207]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:22.741286+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 207 ms_handle_reset con 0x55d120ca2000 session 0x55d11b8a6f00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153231360 unmapped: 35897344 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:23.741409+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153346048 unmapped: 35782656 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 207 heartbeat osd_stat(store_statfs(0x1b4a0a000/0x0/0x1bfc00000, data 0x4bcf5a2/0x4d21000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 207 handle_osd_map epochs [207,208], i have 207, src has [1,208]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:24.741559+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153362432 unmapped: 35766272 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:25.744099+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 208 ms_handle_reset con 0x55d11a41fc00 session 0x55d11dcff2c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a875800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 153362432 unmapped: 35766272 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 208 heartbeat osd_stat(store_statfs(0x1b4a0a000/0x0/0x1bfc00000, data 0x4bd1afe/0x4d23000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 208 ms_handle_reset con 0x55d11a875800 session 0x55d11dcff4a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:26.744271+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2060978 data_alloc: 285212672 data_used: 19791872
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154419200 unmapped: 34709504 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 208 handle_osd_map epochs [208,209], i have 208, src has [1,209]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:27.744419+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154443776 unmapped: 34684928 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:28.744571+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154443776 unmapped: 34684928 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:29.744809+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154443776 unmapped: 34684928 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:30.763736+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 209 heartbeat osd_stat(store_statfs(0x1b4a09000/0x0/0x1bfc00000, data 0x4bd3fd3/0x4d24000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154443776 unmapped: 34684928 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.098771095s of 13.334545135s, submitted: 357
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c096c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 209 ms_handle_reset con 0x55d11c096c00 session 0x55d11dcffa40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:31.763870+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2063603 data_alloc: 285212672 data_used: 19804160
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154443776 unmapped: 34684928 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 209 handle_osd_map epochs [209,210], i have 209, src has [1,210]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11fece400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:32.764021+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154476544 unmapped: 34652160 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 210 heartbeat osd_stat(store_statfs(0x1b4a09000/0x0/0x1bfc00000, data 0x4bd406e/0x4d25000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 210 handle_osd_map epochs [211,211], i have 210, src has [1,211]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 211 ms_handle_reset con 0x55d11fece400 session 0x55d11dcffc20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:33.764170+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154509312 unmapped: 34619392 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:34.764321+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154451968 unmapped: 34676736 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 211 handle_osd_map epochs [211,212], i have 211, src has [1,212]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 212 ms_handle_reset con 0x55d11dfc6400 session 0x55d11a5cc1e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:35.764479+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154451968 unmapped: 34676736 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 212 ms_handle_reset con 0x55d11dfc6400 session 0x55d11a5a65a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 212 handle_osd_map epochs [212,213], i have 212, src has [1,213]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 213 ms_handle_reset con 0x55d11a41fc00 session 0x55d11bc5fa40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:36.764616+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 213 heartbeat osd_stat(store_statfs(0x1b49fb000/0x0/0x1bfc00000, data 0x4bdafea/0x4d33000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2083221 data_alloc: 285212672 data_used: 19836928
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a875800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 213 ms_handle_reset con 0x55d11a875800 session 0x55d1199c70e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c096c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154476544 unmapped: 34652160 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 213 handle_osd_map epochs [214,214], i have 213, src has [1,214]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 214 ms_handle_reset con 0x55d11c096c00 session 0x55d11d9fc3c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:37.764747+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11fece400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 214 ms_handle_reset con 0x55d11fece400 session 0x55d12175c1e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11fece400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154517504 unmapped: 34611200 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 214 ms_handle_reset con 0x55d11fece400 session 0x55d12175c5a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:38.764954+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154550272 unmapped: 34578432 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 214 heartbeat osd_stat(store_statfs(0x1b49f2000/0x0/0x1bfc00000, data 0x4bdfa96/0x4d3a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:39.765082+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 214 ms_handle_reset con 0x55d11a41fc00 session 0x55d12175da40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154566656 unmapped: 34562048 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:40.765222+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a875800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 154591232 unmapped: 34537472 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 214 handle_osd_map epochs [214,215], i have 214, src has [1,215]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.667528152s of 10.117173195s, submitted: 127
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 215 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:41.765357+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 215 ms_handle_reset con 0x55d11a875800 session 0x55d12175dc20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2094623 data_alloc: 285212672 data_used: 19849216
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 155664384 unmapped: 33464320 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 215 handle_osd_map epochs [215,216], i have 215, src has [1,216]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:42.765520+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 155672576 unmapped: 33456128 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d120ca3400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d120ca2800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 216 ms_handle_reset con 0x55d120ca2800 session 0x55d11e3885a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 216 ms_handle_reset con 0x55d120ca3400 session 0x55d11e388b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:43.765695+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 216 heartbeat osd_stat(store_statfs(0x1b45e8000/0x0/0x1bfc00000, data 0x4be459d/0x4d45000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 155680768 unmapped: 33447936 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d120ca3400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 216 ms_handle_reset con 0x55d120ca3400 session 0x55d11e389680
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 216 ms_handle_reset con 0x55d11a41fc00 session 0x55d11e389c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a875800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 216 ms_handle_reset con 0x55d11a875800 session 0x55d12175d0e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:44.765868+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 155688960 unmapped: 33439744 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:45.766111+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 155688960 unmapped: 33439744 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 216 heartbeat osd_stat(store_statfs(0x1b45e6000/0x0/0x1bfc00000, data 0x4be46e1/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:46.766264+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2104154 data_alloc: 285212672 data_used: 19849216
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 155688960 unmapped: 33439744 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:47.766444+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 216 heartbeat osd_stat(store_statfs(0x1b45e6000/0x0/0x1bfc00000, data 0x4be46e1/0x4d47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 155688960 unmapped: 33439744 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:48.766638+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 155688960 unmapped: 33439744 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c096c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:49.766822+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 216 ms_handle_reset con 0x55d11c096c00 session 0x55d11a5a90e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 216 handle_osd_map epochs [217,217], i have 216, src has [1,217]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 155738112 unmapped: 33390592 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:50.766978+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 217 handle_osd_map epochs [217,218], i have 217, src has [1,218]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 218 ms_handle_reset con 0x55d11dfc6400 session 0x55d11dcfe780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 155803648 unmapped: 33325056 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 218 ms_handle_reset con 0x55d11dfc6400 session 0x55d11a5cc960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 218 ms_handle_reset con 0x55d11a41fc00 session 0x55d11d9fcb40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:51.767163+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2110654 data_alloc: 285212672 data_used: 19861504
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.787451744s of 10.353865623s, submitted: 183
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a875800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 155959296 unmapped: 33169408 heap: 189128704 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c096c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:52.767322+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 218 heartbeat osd_stat(store_statfs(0x1b45e2000/0x0/0x1bfc00000, data 0x4be9352/0x4d4c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [1,0,0,0,0,0,0,2,3])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168656896 unmapped: 33079296 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 218 ms_handle_reset con 0x55d11dfc6000 session 0x55d119a76f00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:53.767488+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c09b800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168681472 unmapped: 33054720 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 218 ms_handle_reset con 0x55d11c09b800 session 0x55d11a44d860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:54.767636+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156082176 unmapped: 45654016 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:55.767816+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156090368 unmapped: 45645824 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 218 heartbeat osd_stat(store_statfs(0x1aade3000/0x0/0x1bfc00000, data 0xe3e937b/0xe54b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:56.767950+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3401422 data_alloc: 285212672 data_used: 19861504
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168722432 unmapped: 33013760 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 218 handle_osd_map epochs [219,219], i have 218, src has [1,219]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:57.768158+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 160333824 unmapped: 41402368 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:58.768371+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 164560896 unmapped: 37175296 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:59.768518+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167002112 unmapped: 34734080 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:00.768687+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168820736 unmapped: 32915456 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:01.768826+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 219 heartbeat osd_stat(store_statfs(0x1a01de000/0x0/0x1bfc00000, data 0x18feb7dc/0x19150000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [0,0,0,0,0,0,0,0,1])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4589040 data_alloc: 285212672 data_used: 19873792
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 7.671572685s of 10.091114044s, submitted: 137
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 162201600 unmapped: 39534592 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:02.769023+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 219 heartbeat osd_stat(store_statfs(0x19e5de000/0x0/0x1bfc00000, data 0x1abeb7dc/0x1ad50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [0,0,0,0,0,0,0,0,1])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156229632 unmapped: 45506560 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:03.769204+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156246016 unmapped: 45490176 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:04.769351+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156278784 unmapped: 45457408 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:05.769530+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156319744 unmapped: 45416448 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 219 ms_handle_reset con 0x55d11a875800 session 0x55d1199c43c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 219 heartbeat osd_stat(store_statfs(0x196ddc000/0x0/0x1bfc00000, data 0x223eb964/0x22552000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 219 ms_handle_reset con 0x55d11c096c00 session 0x55d11b8eb4a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:06.769753+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a875800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 219 ms_handle_reset con 0x55d11a875800 session 0x55d11ea91860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5434971 data_alloc: 285212672 data_used: 19873792
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156401664 unmapped: 45334528 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:07.770049+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 219 ms_handle_reset con 0x55d11a41fc00 session 0x55d11e389860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 45441024 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea88400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 219 ms_handle_reset con 0x55d11ea88400 session 0x55d11d9fc5a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:08.770219+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156311552 unmapped: 45424640 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 219 handle_osd_map epochs [219,220], i have 219, src has [1,220]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:09.770384+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156319744 unmapped: 45416448 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 220 handle_osd_map epochs [221,221], i have 220, src has [1,221]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:10.770547+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 221 handle_osd_map epochs [220,221], i have 221, src has [1,221]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156336128 unmapped: 45400064 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 221 handle_osd_map epochs [222,222], i have 221, src has [1,222]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:11.770706+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2267258 data_alloc: 285212672 data_used: 19886080
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156344320 unmapped: 45391872 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 8.693782806s of 10.169593811s, submitted: 320
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 222 handle_osd_map epochs [221,222], i have 222, src has [1,222]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 222 ms_handle_reset con 0x55d11ea8c800 session 0x55d119c6ad20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 222 heartbeat osd_stat(store_statfs(0x1b45cf000/0x0/0x1bfc00000, data 0x4bf2b77/0x4d5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:12.770935+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2d000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 222 ms_handle_reset con 0x55d11de2d000 session 0x55d11bcf2780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156598272 unmapped: 45137920 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 222 handle_osd_map epochs [222,223], i have 222, src has [1,223]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Got map version 52
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:13.771075+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 223 heartbeat osd_stat(store_statfs(0x1b45d0000/0x0/0x1bfc00000, data 0x4bf2cad/0x4d5e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156606464 unmapped: 45129728 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e137800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 223 ms_handle_reset con 0x55d11e137800 session 0x55d11b8965a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:14.771232+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156606464 unmapped: 45129728 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:15.771451+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156631040 unmapped: 45105152 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:16.771581+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2272826 data_alloc: 285212672 data_used: 19902464
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 223 handle_osd_map epochs [223,224], i have 223, src has [1,224]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 224 handle_osd_map epochs [224,224], i have 224, src has [1,224]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 224 ms_handle_reset con 0x55d11bd71800 session 0x55d11d0c8000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156639232 unmapped: 45096960 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 224 handle_osd_map epochs [224,225], i have 224, src has [1,225]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:17.771730+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 225 heartbeat osd_stat(store_statfs(0x1b45c4000/0x0/0x1bfc00000, data 0x4bf9d97/0x4d68000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156639232 unmapped: 45096960 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:18.771931+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156639232 unmapped: 45096960 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:19.772139+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dfc6400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 225 ms_handle_reset con 0x55d11dfc6400 session 0x55d11d0c85a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea89400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156647424 unmapped: 45088768 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:20.772318+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 225 handle_osd_map epochs [226,226], i have 225, src has [1,226]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 226 ms_handle_reset con 0x55d11ea89400 session 0x55d11d0c8d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 156639232 unmapped: 45096960 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:21.772482+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 226 ms_handle_reset con 0x55d11bd71800 session 0x55d11d0c81e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2285519 data_alloc: 285212672 data_used: 19927040
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 226 handle_osd_map epochs [226,227], i have 226, src has [1,227]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 157704192 unmapped: 44032000 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.287421227s of 10.016274452s, submitted: 434
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2d000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 227 handle_osd_map epochs [228,228], i have 227, src has [1,228]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 228 handle_osd_map epochs [227,228], i have 228, src has [1,228]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:22.772684+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 228 heartbeat osd_stat(store_statfs(0x1b45c2000/0x0/0x1bfc00000, data 0x4bfc389/0x4d6c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 228 ms_handle_reset con 0x55d11de2d000 session 0x55d11d0c9c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 158760960 unmapped: 42975232 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:23.772849+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e136400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 228 ms_handle_reset con 0x55d11e136400 session 0x55d11d0c9e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea68800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 228 ms_handle_reset con 0x55d11ea68800 session 0x55d11d0c94a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bf77400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 158769152 unmapped: 42967040 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 228 ms_handle_reset con 0x55d11bf77400 session 0x55d11d0c8f00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:24.773013+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2d000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 228 ms_handle_reset con 0x55d11bd71800 session 0x55d11bc5b680
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e136400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 228 handle_osd_map epochs [229,229], i have 228, src has [1,229]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 158318592 unmapped: 43417600 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 229 ms_handle_reset con 0x55d11de2d000 session 0x55d11e389e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 229 ms_handle_reset con 0x55d11e136400 session 0x55d11bc5be00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:25.773152+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 158318592 unmapped: 43417600 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea68800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d96b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 229 ms_handle_reset con 0x55d11d96b400 session 0x55d118117e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 229 ms_handle_reset con 0x55d11ea68800 session 0x55d11bc5bc20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:26.773335+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2378993 data_alloc: 285212672 data_used: 19951616
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d96b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 229 ms_handle_reset con 0x55d11d96b400 session 0x55d11d0c9e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 158318592 unmapped: 43417600 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2d000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 229 ms_handle_reset con 0x55d11de2d000 session 0x55d11d0c9c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e136400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 229 handle_osd_map epochs [230,230], i have 229, src has [1,230]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d004000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:27.773453+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11e137000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 230 ms_handle_reset con 0x55d11e136400 session 0x55d11d0c81e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 230 heartbeat osd_stat(store_statfs(0x1b372a000/0x0/0x1bfc00000, data 0x5a889fa/0x5c03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 230 ms_handle_reset con 0x55d11d004000 session 0x55d11b776960
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 159383552 unmapped: 42352640 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dbc8c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 230 handle_osd_map epochs [230,231], i have 230, src has [1,231]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 231 ms_handle_reset con 0x55d11dbc8c00 session 0x55d11a5ccb40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 231 ms_handle_reset con 0x55d11e137000 session 0x55d11ea91e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d004000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:28.773606+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 231 ms_handle_reset con 0x55d11bd71800 session 0x55d11bc5b4a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 159383552 unmapped: 42352640 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 231 handle_osd_map epochs [231,232], i have 231, src has [1,232]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 232 ms_handle_reset con 0x55d11d004000 session 0x55d11d0c85a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:29.773784+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 232 handle_osd_map epochs [231,232], i have 232, src has [1,232]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 232 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 159424512 unmapped: 42311680 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:30.773984+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d96b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 232 heartbeat osd_stat(store_statfs(0x1b371e000/0x0/0x1bfc00000, data 0x5a8d58c/0x5c0d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 232 ms_handle_reset con 0x55d11d96b400 session 0x55d11e453680
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 159244288 unmapped: 42491904 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dbc8c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 232 ms_handle_reset con 0x55d11dbc8c00 session 0x55d11e4525a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:31.774181+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bd71800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d004000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2453905 data_alloc: 285212672 data_used: 19963904
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 232 heartbeat osd_stat(store_statfs(0x1b34ec000/0x0/0x1bfc00000, data 0x5cc451a/0x5e42000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.018497467s of 10.003925323s, submitted: 248
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 232 ms_handle_reset con 0x55d11d004000 session 0x55d11b8a61e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 166510592 unmapped: 35225600 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 232 handle_osd_map epochs [233,233], i have 232, src has [1,233]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:32.774409+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d96b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 233 ms_handle_reset con 0x55d11d96b400 session 0x55d11a44de00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 233 ms_handle_reset con 0x55d11bd71800 session 0x55d11bcc63c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 233 heartbeat osd_stat(store_statfs(0x1b2984000/0x0/0x1bfc00000, data 0x682aa58/0x69a9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161284096 unmapped: 40452096 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 233 handle_osd_map epochs [233,234], i have 233, src has [1,234]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:33.774568+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 160956416 unmapped: 40779776 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 234 handle_osd_map epochs [234,235], i have 234, src has [1,235]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:34.774735+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 235 handle_osd_map epochs [234,235], i have 235, src has [1,235]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bf76000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 235 ms_handle_reset con 0x55d11bf76000 session 0x55d11a59a5a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161005568 unmapped: 40730624 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:35.775020+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c2b9c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161013760 unmapped: 40722432 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:36.775162+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2463745 data_alloc: 285212672 data_used: 19976192
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 235 handle_osd_map epochs [236,236], i have 235, src has [1,236]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161021952 unmapped: 40714240 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 236 handle_osd_map epochs [236,236], i have 236, src has [1,236]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 236 handle_osd_map epochs [236,236], i have 236, src has [1,236]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 236 handle_osd_map epochs [237,237], i have 236, src has [1,237]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 236 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 236 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 236 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 237 ms_handle_reset con 0x55d11c2b9c00 session 0x55d11bcc70e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:37.775354+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161046528 unmapped: 40689664 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:38.775579+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 237 heartbeat osd_stat(store_statfs(0x1b3d63000/0x0/0x1bfc00000, data 0x544801d/0x55c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161046528 unmapped: 40689664 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 237 ms_handle_reset con 0x55d11ea8b400 session 0x55d1199c4d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 237 handle_osd_map epochs [238,238], i have 237, src has [1,238]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8d800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 237 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 238 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:39.775740+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 238 ms_handle_reset con 0x55d11ea8d800 session 0x55d11bcf32c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161071104 unmapped: 40665088 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:40.775921+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161079296 unmapped: 40656896 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 238 handle_osd_map epochs [238,239], i have 238, src has [1,239]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:41.776062+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2366114 data_alloc: 285212672 data_used: 19984384
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161087488 unmapped: 40648704 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 239 handle_osd_map epochs [239,240], i have 239, src has [1,240]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.439701080s of 10.255721092s, submitted: 261
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 240 heartbeat osd_stat(store_statfs(0x1b458a000/0x0/0x1bfc00000, data 0x4c1ad96/0x4da1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:42.776224+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161103872 unmapped: 40632320 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:43.776377+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161103872 unmapped: 40632320 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:44.776549+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161103872 unmapped: 40632320 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:45.776733+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 240 handle_osd_map epochs [241,241], i have 240, src has [1,241]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161112064 unmapped: 40624128 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:46.776896+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2368222 data_alloc: 285212672 data_used: 19984384
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 241 heartbeat osd_stat(store_statfs(0x1b4585000/0x0/0x1bfc00000, data 0x4c1f86e/0x4da8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161112064 unmapped: 40624128 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 241 handle_osd_map epochs [241,242], i have 241, src has [1,242]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:47.777112+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 242 heartbeat osd_stat(store_statfs(0x1b4585000/0x0/0x1bfc00000, data 0x4c1f86e/0x4da8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bf76c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161112064 unmapped: 40624128 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:48.777233+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 242 ms_handle_reset con 0x55d11bf76c00 session 0x55d11c2d4b40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 160956416 unmapped: 40779776 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:49.777429+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 160956416 unmapped: 40779776 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:50.777570+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 242 ms_handle_reset con 0x55d11a41fc00 session 0x55d11bcf34a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 160956416 unmapped: 40779776 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ddfe400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 242 ms_handle_reset con 0x55d11ddfe400 session 0x55d11e388780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:51.777716+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11be82c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 242 ms_handle_reset con 0x55d11be82c00 session 0x55d11e3883c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 242 heartbeat osd_stat(store_statfs(0x1b3993000/0x0/0x1bfc00000, data 0x5810ceb/0x599b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d953c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2471266 data_alloc: 285212672 data_used: 19984384
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 242 ms_handle_reset con 0x55d11d953c00 session 0x55d11a5a8780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161161216 unmapped: 40574976 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 242 handle_osd_map epochs [242,243], i have 242, src has [1,243]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.663251877s of 10.008812904s, submitted: 156
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ddfe400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 243 handle_osd_map epochs [243,243], i have 243, src has [1,243]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:52.777986+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11dbc9400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 160645120 unmapped: 41091072 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:53.778131+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 160858112 unmapped: 40878080 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:54.778284+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 160956416 unmapped: 40779776 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 243 heartbeat osd_stat(store_statfs(0x1b3562000/0x0/0x1bfc00000, data 0x583d1ff/0x59cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:55.778487+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161054720 unmapped: 40681472 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 243 heartbeat osd_stat(store_statfs(0x1b3562000/0x0/0x1bfc00000, data 0x583d1ff/0x59cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:56.778640+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2496632 data_alloc: 285212672 data_used: 22171648
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161054720 unmapped: 40681472 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:57.778869+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161054720 unmapped: 40681472 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:58.779039+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161054720 unmapped: 40681472 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:59.779267+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161054720 unmapped: 40681472 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 243 heartbeat osd_stat(store_statfs(0x1b3562000/0x0/0x1bfc00000, data 0x583d1ff/0x59cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:00.779430+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161054720 unmapped: 40681472 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:01.779623+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2498198 data_alloc: 285212672 data_used: 22171648
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 243 heartbeat osd_stat(store_statfs(0x1b3561000/0x0/0x1bfc00000, data 0x583d354/0x59cc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161062912 unmapped: 40673280 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:02.779841+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 243 heartbeat osd_stat(store_statfs(0x1b3561000/0x0/0x1bfc00000, data 0x583d354/0x59cc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161062912 unmapped: 40673280 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:03.779979+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161062912 unmapped: 40673280 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:04.780136+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.606501579s of 12.687765121s, submitted: 25
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 161062912 unmapped: 40673280 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:05.780303+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 243 heartbeat osd_stat(store_statfs(0x1b26f4000/0x0/0x1bfc00000, data 0x66a43ef/0x6834000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 166412288 unmapped: 35323904 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:06.780429+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2624702 data_alloc: 285212672 data_used: 23355392
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 166494208 unmapped: 35241984 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:07.780597+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 166920192 unmapped: 34816000 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 243 heartbeat osd_stat(store_statfs(0x1b26cc000/0x0/0x1bfc00000, data 0x66c43ef/0x6854000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:08.780817+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 166920192 unmapped: 34816000 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:09.780989+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 166936576 unmapped: 34799616 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:10.781143+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 166936576 unmapped: 34799616 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 243 heartbeat osd_stat(store_statfs(0x1b26cb000/0x0/0x1bfc00000, data 0x66c4554/0x6855000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:11.781336+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2636784 data_alloc: 285212672 data_used: 23318528
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 166969344 unmapped: 34766848 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 243 heartbeat osd_stat(store_statfs(0x1b26d7000/0x0/0x1bfc00000, data 0x66c461d/0x6856000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:12.781531+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 166969344 unmapped: 34766848 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:13.781923+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 166969344 unmapped: 34766848 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:14.782207+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 166969344 unmapped: 34766848 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:15.782479+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 243 heartbeat osd_stat(store_statfs(0x1b26d9000/0x0/0x1bfc00000, data 0x66c464c/0x6855000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 166969344 unmapped: 34766848 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:16.782689+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2635806 data_alloc: 285212672 data_used: 23318528
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 166985728 unmapped: 34750464 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:17.782833+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 166985728 unmapped: 34750464 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.818944931s of 13.349108696s, submitted: 121
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:18.783051+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167002112 unmapped: 34734080 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 243 heartbeat osd_stat(store_statfs(0x1b26d9000/0x0/0x1bfc00000, data 0x66c464a/0x6855000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:19.783256+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167002112 unmapped: 34734080 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:20.783456+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167002112 unmapped: 34734080 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:21.783642+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2634906 data_alloc: 285212672 data_used: 23318528
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167002112 unmapped: 34734080 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:22.783833+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167002112 unmapped: 34734080 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:23.784020+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 243 heartbeat osd_stat(store_statfs(0x1b26da000/0x0/0x1bfc00000, data 0x66c46b2/0x6853000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167002112 unmapped: 34734080 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:24.784178+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167002112 unmapped: 34734080 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:25.784375+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 243 handle_osd_map epochs [243,244], i have 243, src has [1,244]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 244 heartbeat osd_stat(store_statfs(0x1b26da000/0x0/0x1bfc00000, data 0x66c46b2/0x6853000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167010304 unmapped: 34725888 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:26.784546+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2638242 data_alloc: 285212672 data_used: 23339008
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167010304 unmapped: 34725888 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:27.784801+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167010304 unmapped: 34725888 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:28.785019+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.128591537s of 10.320652008s, submitted: 49
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167010304 unmapped: 34725888 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:29.785281+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167010304 unmapped: 34725888 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:30.785526+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11a41fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 244 ms_handle_reset con 0x55d11a41fc00 session 0x55d11bcf3e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 244 heartbeat osd_stat(store_statfs(0x1b26d6000/0x0/0x1bfc00000, data 0x66c6cb5/0x6857000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167010304 unmapped: 34725888 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:31.785672+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 244 handle_osd_map epochs [245,245], i have 244, src has [1,245]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 244 handle_osd_map epochs [245,245], i have 245, src has [1,245]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c09bc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119a70800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 245 ms_handle_reset con 0x55d11c09bc00 session 0x55d11a5a6780
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2646651 data_alloc: 285212672 data_used: 23351296
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167018496 unmapped: 34717696 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:32.785827+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 245 ms_handle_reset con 0x55d119a70800 session 0x55d11b8eb860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 245 handle_osd_map epochs [245,246], i have 245, src has [1,246]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167100416 unmapped: 34635776 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:33.786017+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d121822800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 246 ms_handle_reset con 0x55d121822800 session 0x55d11e4534a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167100416 unmapped: 34635776 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:34.786238+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 246 handle_osd_map epochs [246,247], i have 246, src has [1,247]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d121822400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d951400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea68000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 247 ms_handle_reset con 0x55d11d951400 session 0x55d119953c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167116800 unmapped: 34619392 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:35.786523+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 247 heartbeat osd_stat(store_statfs(0x1b26c6000/0x0/0x1bfc00000, data 0x66ce300/0x6867000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 247 handle_osd_map epochs [248,248], i have 247, src has [1,248]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 247 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 247 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 248 ms_handle_reset con 0x55d11ea68000 session 0x55d11e388d20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c4a9000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 248 ms_handle_reset con 0x55d11c4a9000 session 0x55d11b805e00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 248 ms_handle_reset con 0x55d121822400 session 0x55d11e452f00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168165376 unmapped: 33570816 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:36.786888+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2663139 data_alloc: 285212672 data_used: 23351296
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168165376 unmapped: 33570816 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:37.787043+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c0c3400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168181760 unmapped: 33554432 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:38.787225+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2c000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 248 handle_osd_map epochs [249,249], i have 248, src has [1,249]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.129745483s of 10.251703262s, submitted: 38
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 249 ms_handle_reset con 0x55d11c0c3400 session 0x55d11bcf3a40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de2c800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 249 ms_handle_reset con 0x55d11de2c800 session 0x55d11bf32f00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168263680 unmapped: 33472512 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:39.787459+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Got map version 53
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 249 heartbeat osd_stat(store_statfs(0x1b26ba000/0x0/0x1bfc00000, data 0x66d3fc1/0x6872000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168460288 unmapped: 33275904 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:40.787705+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d12175fc00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 249 ms_handle_reset con 0x55d12175fc00 session 0x55d11d1003c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168476672 unmapped: 33259520 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 249 handle_osd_map epochs [249,250], i have 249, src has [1,250]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:41.787841+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d96a400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d120ca1400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 250 ms_handle_reset con 0x55d11d96a400 session 0x55d11d1001e0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670427 data_alloc: 285212672 data_used: 23384064
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168476672 unmapped: 33259520 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:42.788015+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 250 ms_handle_reset con 0x55d120ca1400 session 0x55d11b80ef00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 250 handle_osd_map epochs [251,251], i have 250, src has [1,251]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168484864 unmapped: 33251328 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:43.788240+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168484864 unmapped: 33251328 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea8b400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:44.788386+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 251 heartbeat osd_stat(store_statfs(0x1b26b5000/0x0/0x1bfc00000, data 0x66d8a57/0x6878000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168501248 unmapped: 33234944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:45.788610+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 251 handle_osd_map epochs [252,252], i have 251, src has [1,252]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 252 ms_handle_reset con 0x55d11ea8b400 session 0x55d11bc5f860
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168501248 unmapped: 33234944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:46.788782+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11bf76000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 252 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2675461 data_alloc: 285212672 data_used: 23384064
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168501248 unmapped: 33234944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 252 handle_osd_map epochs [252,253], i have 252, src has [1,253]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:47.789002+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 253 ms_handle_reset con 0x55d11bf76000 session 0x55d11bc5f2c0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168509440 unmapped: 33226752 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 253 handle_osd_map epochs [253,254], i have 253, src has [1,254]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 254 handle_osd_map epochs [253,253], i have 254, src has [1,253]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:48.789154+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 254 heartbeat osd_stat(store_statfs(0x1b26a9000/0x0/0x1bfc00000, data 0x66df973/0x6884000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168525824 unmapped: 33210368 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:49.789334+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.552460670s of 10.953345299s, submitted: 143
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 254 ms_handle_reset con 0x55d11dbc9400 session 0x55d11ea91c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 254 ms_handle_reset con 0x55d11ddfe400 session 0x55d119c6ab40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168525824 unmapped: 33210368 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:50.789494+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 254 handle_osd_map epochs [254,255], i have 254, src has [1,255]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d120ca2800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 255 ms_handle_reset con 0x55d120ca2800 session 0x55d11b8a7c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167133184 unmapped: 34603008 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:51.789664+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2452805 data_alloc: 285212672 data_used: 20054016
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 255 heartbeat osd_stat(store_statfs(0x1b4146000/0x0/0x1bfc00000, data 0x4c41043/0x4de5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 255 handle_osd_map epochs [255,256], i have 255, src has [1,256]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167133184 unmapped: 34603008 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:52.789850+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168181760 unmapped: 33554432 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:53.789975+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 256 handle_osd_map epochs [257,257], i have 256, src has [1,257]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167141376 unmapped: 34594816 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:54.790203+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11ea69800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167141376 unmapped: 34594816 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 257 handle_osd_map epochs [257,258], i have 257, src has [1,258]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:55.790369+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 258 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167141376 unmapped: 34594816 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 258 handle_osd_map epochs [258,259], i have 258, src has [1,259]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:56.790532+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 259 handle_osd_map epochs [259,259], i have 259, src has [1,259]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 259 handle_osd_map epochs [259,259], i have 259, src has [1,259]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2462205 data_alloc: 285212672 data_used: 20066304
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167141376 unmapped: 34594816 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 259 handle_osd_map epochs [260,260], i have 259, src has [1,260]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:57.790639+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 260 ms_handle_reset con 0x55d11ea69800 session 0x55d11a59dc20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 260 handle_osd_map epochs [259,260], i have 260, src has [1,260]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 260 heartbeat osd_stat(store_statfs(0x1b4139000/0x0/0x1bfc00000, data 0x4c4cf20/0x4df4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167165952 unmapped: 34570240 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:58.790815+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167165952 unmapped: 34570240 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:59.791034+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.168146133s of 10.007822037s, submitted: 290
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167165952 unmapped: 34570240 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:00.791292+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167165952 unmapped: 34570240 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:01.791545+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2462919 data_alloc: 285212672 data_used: 20066304
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 260 handle_osd_map epochs [260,261], i have 260, src has [1,261]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167174144 unmapped: 34562048 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:02.791712+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 261 heartbeat osd_stat(store_statfs(0x1b413a000/0x0/0x1bfc00000, data 0x4c4cfed/0x4df3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 261 heartbeat osd_stat(store_statfs(0x1b4136000/0x0/0x1bfc00000, data 0x4c4f45e/0x4df7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 261 handle_osd_map epochs [261,262], i have 261, src has [1,262]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167174144 unmapped: 34562048 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:03.791885+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 262 handle_osd_map epochs [262,262], i have 262, src has [1,262]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 262 handle_osd_map epochs [262,262], i have 262, src has [1,262]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 262 heartbeat osd_stat(store_statfs(0x1b4133000/0x0/0x1bfc00000, data 0x4c51a8f/0x4dfb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167174144 unmapped: 34562048 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:04.792062+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c2b8800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 262 ms_handle_reset con 0x55d11c2b8800 session 0x55d11bcf25a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167174144 unmapped: 34562048 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:05.792413+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167174144 unmapped: 34562048 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:06.792563+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 262 handle_osd_map epochs [263,263], i have 262, src has [1,263]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2471619 data_alloc: 285212672 data_used: 20078592
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167174144 unmapped: 34562048 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:07.792731+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 263 handle_osd_map epochs [264,264], i have 263, src has [1,264]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 264 heartbeat osd_stat(store_statfs(0x1b4129000/0x0/0x1bfc00000, data 0x4c56517/0x4e03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167174144 unmapped: 34562048 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:08.792926+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 264 handle_osd_map epochs [264,265], i have 264, src has [1,265]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167174144 unmapped: 34562048 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:09.793160+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c0c3400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 265 ms_handle_reset con 0x55d11c0c3400 session 0x55d119a77a40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167174144 unmapped: 34562048 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:10.793292+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 265 handle_osd_map epochs [266,266], i have 265, src has [1,266]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.506402016s of 10.695487022s, submitted: 76
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d96a400
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 266 ms_handle_reset con 0x55d11d96a400 session 0x55d11d1005a0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167174144 unmapped: 34562048 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:11.793691+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 266 handle_osd_map epochs [267,267], i have 266, src has [1,267]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 267 heartbeat osd_stat(store_statfs(0x1b4122000/0x0/0x1bfc00000, data 0x4c5b029/0x4e0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2483947 data_alloc: 285212672 data_used: 20078592
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d004000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 267 ms_handle_reset con 0x55d11d004000 session 0x55d119a77680
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167174144 unmapped: 34562048 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:12.793850+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 267 handle_osd_map epochs [268,268], i have 267, src has [1,268]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 268 handle_osd_map epochs [268,269], i have 268, src has [1,269]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:13.794008+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:14.794202+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 269 handle_osd_map epochs [269,269], i have 269, src has [1,269]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:15.794475+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:16.794647+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2491911 data_alloc: 285212672 data_used: 20078592
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:17.794805+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 269 handle_osd_map epochs [270,270], i have 269, src has [1,270]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 270 heartbeat osd_stat(store_statfs(0x1b4114000/0x0/0x1bfc00000, data 0x4c6211d/0x4e18000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:18.794972+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:19.795122+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 270 heartbeat osd_stat(store_statfs(0x1b4111000/0x0/0x1bfc00000, data 0x4c64552/0x4e1c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:20.795323+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:21.795522+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2493873 data_alloc: 285212672 data_used: 20078592
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 270 handle_osd_map epochs [270,271], i have 270, src has [1,271]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.617444038s of 11.845149040s, submitted: 98
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:22.795680+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 271 heartbeat osd_stat(store_statfs(0x1b4111000/0x0/0x1bfc00000, data 0x4c64552/0x4e1c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:23.795853+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:24.796041+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 271 heartbeat osd_stat(store_statfs(0x1b410e000/0x0/0x1bfc00000, data 0x4c6699a/0x4e1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:25.796229+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:26.796391+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 271 heartbeat osd_stat(store_statfs(0x1b410f000/0x0/0x1bfc00000, data 0x4c66b2e/0x4e1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2495497 data_alloc: 285212672 data_used: 20078592
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:27.796606+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:28.796825+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:33:03.418 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:33:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:33:03.419 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:33:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:33:03.419 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:33:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:33:03.420 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:29.796983+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:30.797134+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:31.797282+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2495497 data_alloc: 285212672 data_used: 20078592
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:32.797425+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 271 heartbeat osd_stat(store_statfs(0x1b410f000/0x0/0x1bfc00000, data 0x4c66b2e/0x4e1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:33.797574+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167182336 unmapped: 34553856 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.423744202s of 11.486595154s, submitted: 27
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:34.797694+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167190528 unmapped: 34545664 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:35.797876+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167190528 unmapped: 34545664 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:36.798065+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167190528 unmapped: 34545664 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2495513 data_alloc: 285212672 data_used: 20078592
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:37.798242+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167190528 unmapped: 34545664 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 271 heartbeat osd_stat(store_statfs(0x1b410f000/0x0/0x1bfc00000, data 0x4c66c5d/0x4e1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:38.798394+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167190528 unmapped: 34545664 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:39.798537+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167190528 unmapped: 34545664 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 271 heartbeat osd_stat(store_statfs(0x1b410e000/0x0/0x1bfc00000, data 0x4c66d5d/0x4e20000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:40.798719+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167190528 unmapped: 34545664 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:41.798888+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167190528 unmapped: 34545664 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2498857 data_alloc: 285212672 data_used: 20078592
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:42.798998+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 271 ms_handle_reset con 0x55d11de2c000 session 0x55d11b8a6000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167460864 unmapped: 34275328 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:43.799124+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 271 heartbeat osd_stat(store_statfs(0x1b410d000/0x0/0x1bfc00000, data 0x4c66e5d/0x4e21000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167452672 unmapped: 34283520 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Got map version 54
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.941683769s of 10.007706642s, submitted: 244
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:44.799292+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167460864 unmapped: 34275328 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:45.799501+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167460864 unmapped: 34275328 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 271 handle_osd_map epochs [271,272], i have 271, src has [1,272]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:46.799644+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167477248 unmapped: 34258944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2502437 data_alloc: 285212672 data_used: 20090880
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:47.799874+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167477248 unmapped: 34258944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:48.800030+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167477248 unmapped: 34258944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 272 heartbeat osd_stat(store_statfs(0x1b4109000/0x0/0x1bfc00000, data 0x4c69458/0x4e24000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:49.800204+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167477248 unmapped: 34258944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 272 heartbeat osd_stat(store_statfs(0x1b4109000/0x0/0x1bfc00000, data 0x4c69458/0x4e24000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:50.800345+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167477248 unmapped: 34258944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:51.800498+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167477248 unmapped: 34258944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2502827 data_alloc: 285212672 data_used: 20090880
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:52.800648+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167477248 unmapped: 34258944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 272 handle_osd_map epochs [273,273], i have 272, src has [1,273]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b4108000/0x0/0x1bfc00000, data 0x4c695af/0x4e25000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:53.800842+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167493632 unmapped: 34242560 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.833016396s of 10.008376122s, submitted: 59
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:54.800999+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167493632 unmapped: 34242560 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:55.801211+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167493632 unmapped: 34242560 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:56.801385+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167493632 unmapped: 34242560 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2507163 data_alloc: 285212672 data_used: 20103168
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:57.801540+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167493632 unmapped: 34242560 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:58.801713+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167493632 unmapped: 34242560 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b4105000/0x0/0x1bfc00000, data 0x4c6bae1/0x4e28000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:59.801916+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167493632 unmapped: 34242560 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:00.802038+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167493632 unmapped: 34242560 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:01.802227+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167501824 unmapped: 34234368 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2506891 data_alloc: 285212672 data_used: 20103168
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:02.802366+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167501824 unmapped: 34234368 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:03.802504+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167501824 unmapped: 34234368 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:04.802656+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167501824 unmapped: 34234368 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b4101000/0x0/0x1bfc00000, data 0x4c6bdd3/0x4e2a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:05.802869+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167501824 unmapped: 34234368 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.149744987s of 12.275115013s, submitted: 27
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b4101000/0x0/0x1bfc00000, data 0x4c6bdd3/0x4e2a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:06.803058+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167501824 unmapped: 34234368 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2510043 data_alloc: 285212672 data_used: 20103168
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:07.803276+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167501824 unmapped: 34234368 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:08.803410+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167501824 unmapped: 34234368 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b4105000/0x0/0x1bfc00000, data 0x4c6bd3c/0x4e29000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:09.803599+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167501824 unmapped: 34234368 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b4105000/0x0/0x1bfc00000, data 0x4c6bd3c/0x4e29000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:10.803732+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167510016 unmapped: 34226176 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:11.803879+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167510016 unmapped: 34226176 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2511347 data_alloc: 285212672 data_used: 20103168
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:12.804097+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167510016 unmapped: 34226176 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b4101000/0x0/0x1bfc00000, data 0x4c6be44/0x4e28000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:13.804251+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167518208 unmapped: 34217984 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:14.804407+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167518208 unmapped: 34217984 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:15.804955+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167518208 unmapped: 34217984 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:16.805091+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167518208 unmapped: 34217984 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2510643 data_alloc: 285212672 data_used: 20103168
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:17.805247+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.309055328s of 11.439734459s, submitted: 30
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167518208 unmapped: 34217984 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b4106000/0x0/0x1bfc00000, data 0x4c6be42/0x4e28000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:18.805390+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167518208 unmapped: 34217984 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:19.805563+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167518208 unmapped: 34217984 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:20.805704+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168681472 unmapped: 33054720 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:21.805880+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168787968 unmapped: 32948224 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2522775 data_alloc: 285212672 data_used: 20103168
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:22.806055+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 169000960 unmapped: 32735232 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 273 heartbeat osd_stat(store_statfs(0x1b40b5000/0x0/0x1bfc00000, data 0x4cbeb54/0x4e79000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:23.806217+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 169107456 unmapped: 32628736 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:24.806355+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167583744 unmapped: 34152448 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:25.806531+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 273 handle_osd_map epochs [273,274], i have 273, src has [1,274]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167845888 unmapped: 33890304 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:26.806696+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167845888 unmapped: 33890304 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 274 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2530783 data_alloc: 285212672 data_used: 20115456
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:27.806869+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167845888 unmapped: 33890304 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 274 heartbeat osd_stat(store_statfs(0x1b4029000/0x0/0x1bfc00000, data 0x4d48417/0x4f04000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:28.807056+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.662422180s of 10.985755920s, submitted: 76
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167444480 unmapped: 34291712 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:29.807211+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 167444480 unmapped: 34291712 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:30.807370+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168787968 unmapped: 32948224 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:31.807507+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168632320 unmapped: 33103872 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 274 handle_osd_map epochs [274,275], i have 274, src has [1,275]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:32.807636+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2544801 data_alloc: 285212672 data_used: 20127744
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168632320 unmapped: 33103872 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:33.807841+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 275 handle_osd_map epochs [275,275], i have 275, src has [1,275]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 168763392 unmapped: 32972800 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 275 heartbeat osd_stat(store_statfs(0x1b3f5e000/0x0/0x1bfc00000, data 0x4e0f6e7/0x4fcf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 275 heartbeat osd_stat(store_statfs(0x1b3f1f000/0x0/0x1bfc00000, data 0x4e4e64b/0x500e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:34.808004+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 169017344 unmapped: 32718848 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:35.808234+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 170344448 unmapped: 31391744 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:36.808409+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 170344448 unmapped: 31391744 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:37.808589+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2556279 data_alloc: 285212672 data_used: 20127744
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 170344448 unmapped: 31391744 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 275 heartbeat osd_stat(store_statfs(0x1b3eb0000/0x0/0x1bfc00000, data 0x4ebed94/0x507e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:38.808797+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.784550667s of 10.182422638s, submitted: 95
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 170385408 unmapped: 31350784 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:39.809007+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 170385408 unmapped: 31350784 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:40.809156+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 275 handle_osd_map epochs [276,276], i have 275, src has [1,276]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 276 handle_osd_map epochs [276,276], i have 276, src has [1,276]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 170967040 unmapped: 30769152 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 276 handle_osd_map epochs [276,276], i have 276, src has [1,276]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:41.809303+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 171106304 unmapped: 30629888 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:42.809469+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2573067 data_alloc: 285212672 data_used: 20140032
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 172285952 unmapped: 29450240 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 276 heartbeat osd_stat(store_statfs(0x1b3db8000/0x0/0x1bfc00000, data 0x4fb42f4/0x5175000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:43.809625+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 172654592 unmapped: 29081600 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:44.809797+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 171974656 unmapped: 29761536 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:45.810010+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 172113920 unmapped: 29622272 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:46.810240+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 172548096 unmapped: 29188096 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:47.810429+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 276 handle_osd_map epochs [277,277], i have 276, src has [1,277]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2579449 data_alloc: 285212672 data_used: 20152320
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 277 handle_osd_map epochs [277,277], i have 277, src has [1,277]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 172654592 unmapped: 29081600 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 277 handle_osd_map epochs [277,277], i have 277, src has [1,277]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 277 heartbeat osd_stat(store_statfs(0x1b3cc2000/0x0/0x1bfc00000, data 0x50a8574/0x526b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:48.810633+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 173703168 unmapped: 28033024 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:49.810839+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 173850624 unmapped: 27885568 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.947267532s of 11.369501114s, submitted: 122
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:50.810977+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 174120960 unmapped: 27615232 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:51.811151+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 174252032 unmapped: 27484160 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:52.811325+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2591347 data_alloc: 285212672 data_used: 20152320
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 174260224 unmapped: 27475968 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:53.811456+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 277 heartbeat osd_stat(store_statfs(0x1b3c3a000/0x0/0x1bfc00000, data 0x5131778/0x52f4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 174260224 unmapped: 27475968 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 277 heartbeat osd_stat(store_statfs(0x1b3c3a000/0x0/0x1bfc00000, data 0x5131778/0x52f4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:54.811619+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 174260224 unmapped: 27475968 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:55.811841+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 173735936 unmapped: 28000256 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 277 handle_osd_map epochs [278,278], i have 277, src has [1,278]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:56.811999+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 174792704 unmapped: 26943488 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:57.812121+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 278 handle_osd_map epochs [278,278], i have 278, src has [1,278]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2610481 data_alloc: 285212672 data_used: 20164608
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 278 heartbeat osd_stat(store_statfs(0x1b29f2000/0x0/0x1bfc00000, data 0x51d6ddf/0x539b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 175022080 unmapped: 26714112 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:58.812301+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 174694400 unmapped: 27041792 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:59.812479+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 174694400 unmapped: 27041792 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.617480278s of 10.000386238s, submitted: 97
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:00.812623+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 176807936 unmapped: 24928256 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:01.812864+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 177963008 unmapped: 23773184 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:02.813026+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 278 handle_osd_map epochs [278,279], i have 278, src has [1,279]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2620745 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 177963008 unmapped: 23773184 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b2923000/0x0/0x1bfc00000, data 0x52a5c0a/0x546b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:03.813194+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 177971200 unmapped: 23764992 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:04.813362+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 178454528 unmapped: 23281664 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:05.813593+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 178528256 unmapped: 23207936 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:06.813780+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 178528256 unmapped: 23207936 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b2869000/0x0/0x1bfc00000, data 0x535b2fd/0x5523000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:07.813923+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2636119 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 178905088 unmapped: 22831104 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:08.814150+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 178905088 unmapped: 22831104 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:09.814317+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 178905088 unmapped: 22831104 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.594232559s of 10.000697136s, submitted: 104
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:10.814499+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b2389000/0x0/0x1bfc00000, data 0x543f5c7/0x5605000, compress 0x0/0x0/0x0, omap 0x649, meta 0x826f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 180084736 unmapped: 21651456 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b2389000/0x0/0x1bfc00000, data 0x543f5c7/0x5605000, compress 0x0/0x0/0x0, omap 0x649, meta 0x826f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:11.814635+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 180084736 unmapped: 21651456 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:12.814823+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2656493 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 181133312 unmapped: 20602880 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:13.815038+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 180322304 unmapped: 21413888 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:14.815211+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b22c4000/0x0/0x1bfc00000, data 0x5501691/0x56c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x826f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 180330496 unmapped: 21405696 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:15.815391+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 180510720 unmapped: 21225472 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:16.815743+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 180920320 unmapped: 20815872 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:17.815986+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2667129 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 181829632 unmapped: 19906560 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:18.816125+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 181829632 unmapped: 19906560 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:19.816255+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 182272000 unmapped: 19464192 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.597408295s of 10.006043434s, submitted: 87
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:20.816380+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b21ca000/0x0/0x1bfc00000, data 0x55ff62e/0x57c4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x826f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 182779904 unmapped: 18956288 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:21.816523+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 182779904 unmapped: 18956288 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:22.816724+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2672867 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 183058432 unmapped: 18677760 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:23.816818+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 182886400 unmapped: 18849792 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b0fe6000/0x0/0x1bfc00000, data 0x56436e0/0x5808000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:24.816980+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 182935552 unmapped: 18800640 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:25.817198+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 182910976 unmapped: 18825216 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:26.817489+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 182910976 unmapped: 18825216 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:27.817632+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2692303 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 184139776 unmapped: 17596416 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:28.817776+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 184492032 unmapped: 17244160 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:29.817956+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b0eab000/0x0/0x1bfc00000, data 0x577b9c8/0x5942000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 184180736 unmapped: 17555456 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.615580559s of 10.003980637s, submitted: 75
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b0eab000/0x0/0x1bfc00000, data 0x577b9c8/0x5942000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:30.818083+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b0e49000/0x0/0x1bfc00000, data 0x57df2d0/0x59a5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 185458688 unmapped: 16277504 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:31.818311+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 185884672 unmapped: 15851520 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:32.818502+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2703257 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 185884672 unmapped: 15851520 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:33.818716+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 185581568 unmapped: 16154624 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b0dfd000/0x0/0x1bfc00000, data 0x5828faa/0x59f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:34.818857+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 185761792 unmapped: 15974400 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:35.819102+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 185761792 unmapped: 15974400 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:36.819254+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b0d63000/0x0/0x1bfc00000, data 0x58c4140/0x5a8b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 185761792 unmapped: 15974400 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b0d63000/0x0/0x1bfc00000, data 0x58c4140/0x5a8b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:37.819440+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2719739 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 187293696 unmapped: 14442496 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b1d0a000/0x0/0x1bfc00000, data 0x591b1ca/0x5ae2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:38.819611+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 187293696 unmapped: 14442496 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:39.819827+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 187293696 unmapped: 14442496 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.631224632s of 10.001991272s, submitted: 94
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:40.819995+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 188432384 unmapped: 13303808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:41.820105+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 188432384 unmapped: 13303808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:42.820250+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2726639 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 188432384 unmapped: 13303808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:43.820419+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 188375040 unmapped: 13361152 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b1c3b000/0x0/0x1bfc00000, data 0x59eac44/0x5bb1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:44.820579+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 188383232 unmapped: 13352960 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:45.820842+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 188383232 unmapped: 13352960 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:46.821029+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 188506112 unmapped: 13230080 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:47.821182+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2742813 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 189571072 unmapped: 12165120 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:48.821424+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b1b1a000/0x0/0x1bfc00000, data 0x5b0b112/0x5cd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 189571072 unmapped: 12165120 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:49.821637+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 189243392 unmapped: 12492800 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.606970787s of 10.000104904s, submitted: 93
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:50.821808+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 190504960 unmapped: 11231232 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b1ab8000/0x0/0x1bfc00000, data 0x5b6c72e/0x5d34000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:51.822025+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 190504960 unmapped: 11231232 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:52.822182+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2755965 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 191078400 unmapped: 10657792 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:53.822387+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b1a6d000/0x0/0x1bfc00000, data 0x5bb635b/0x5d7e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b1a6d000/0x0/0x1bfc00000, data 0x5bb635b/0x5d7e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 190758912 unmapped: 10977280 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:54.822537+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 190767104 unmapped: 10969088 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:55.822816+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 190775296 unmapped: 10960896 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:56.823011+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 190775296 unmapped: 10960896 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:57.823181+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2756205 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 191823872 unmapped: 9912320 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:58.823340+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b19ae000/0x0/0x1bfc00000, data 0x5c7acbb/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192110592 unmapped: 9625600 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:59.823496+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192110592 unmapped: 9625600 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:00.823674+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192110592 unmapped: 9625600 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:01.823854+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192110592 unmapped: 9625600 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:02.824014+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2751131 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192110592 unmapped: 9625600 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b19af000/0x0/0x1bfc00000, data 0x5c7ad55/0x5e3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:03.824200+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 13.588920593s of 13.938811302s, submitted: 85
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192110592 unmapped: 9625600 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:04.824442+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192110592 unmapped: 9625600 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:05.824637+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192118784 unmapped: 9617408 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:06.824850+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192118784 unmapped: 9617408 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:07.825052+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b19ae000/0x0/0x1bfc00000, data 0x5c7ae81/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2752707 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:33:03.445 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:33:03 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:33:03.446 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192118784 unmapped: 9617408 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:08.825245+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192118784 unmapped: 9617408 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:09.825461+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b19ae000/0x0/0x1bfc00000, data 0x5c7ae81/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192118784 unmapped: 9617408 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:10.825639+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192118784 unmapped: 9617408 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:11.825835+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192118784 unmapped: 9617408 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:12.826027+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2751115 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192118784 unmapped: 9617408 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:13.826173+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192118784 unmapped: 9617408 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:14.826344+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192118784 unmapped: 9617408 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:15.826571+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b19af000/0x0/0x1bfc00000, data 0x5c7ae1f/0x5e3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192118784 unmapped: 9617408 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b19af000/0x0/0x1bfc00000, data 0x5c7ae1f/0x5e3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:16.826749+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.871628761s of 12.908679962s, submitted: 8
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192118784 unmapped: 9617408 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:17.826939+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2751339 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192118784 unmapped: 9617408 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:18.827095+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192126976 unmapped: 9609216 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:19.827358+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192126976 unmapped: 9609216 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b19ae000/0x0/0x1bfc00000, data 0x5c7ae84/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:20.827560+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192126976 unmapped: 9609216 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:21.827735+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192126976 unmapped: 9609216 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:22.827887+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2751339 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192126976 unmapped: 9609216 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b19ae000/0x0/0x1bfc00000, data 0x5c7ae84/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:23.828087+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192126976 unmapped: 9609216 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:24.828264+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192135168 unmapped: 9601024 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:25.828469+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b19ae000/0x0/0x1bfc00000, data 0x5c7ae84/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192135168 unmapped: 9601024 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:26.828658+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192135168 unmapped: 9601024 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:27.829220+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2751339 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192135168 unmapped: 9601024 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:28.829420+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192135168 unmapped: 9601024 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:29.829583+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b19ae000/0x0/0x1bfc00000, data 0x5c7aee9/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192135168 unmapped: 9601024 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:30.829741+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192135168 unmapped: 9601024 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:31.829939+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b19ae000/0x0/0x1bfc00000, data 0x5c7aee9/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.998890877s of 15.012193680s, submitted: 2
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192135168 unmapped: 9601024 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:32.830162+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2751163 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192143360 unmapped: 9592832 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:33.830276+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b19ae000/0x0/0x1bfc00000, data 0x5c7af4e/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192143360 unmapped: 9592832 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:34.830458+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192143360 unmapped: 9592832 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:35.830702+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192143360 unmapped: 9592832 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:36.844277+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b19ae000/0x0/0x1bfc00000, data 0x5c7af4e/0x5e40000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192143360 unmapped: 9592832 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:37.844439+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2752867 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192143360 unmapped: 9592832 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:38.844625+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192143360 unmapped: 9592832 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:39.844839+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192159744 unmapped: 9576448 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:40.845008+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b19a9000/0x0/0x1bfc00000, data 0x5c7d885/0x5e44000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [0,0,1])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192192512 unmapped: 9543680 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:41.845197+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b196d000/0x0/0x1bfc00000, data 0x5cbafcf/0x5e81000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192192512 unmapped: 9543680 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:42.845355+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2762581 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192192512 unmapped: 9543680 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:43.845527+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.892515182s of 11.971315384s, submitted: 18
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b196d000/0x0/0x1bfc00000, data 0x5cbafcf/0x5e81000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192372736 unmapped: 9363456 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:44.845684+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192372736 unmapped: 9363456 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:45.845834+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192372736 unmapped: 9363456 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:46.846020+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192602112 unmapped: 9134080 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:47.846158+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2766181 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192602112 unmapped: 9134080 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:48.846340+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b1911000/0x0/0x1bfc00000, data 0x5d17529/0x5edd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 191692800 unmapped: 10043392 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:49.846540+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 191692800 unmapped: 10043392 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:50.846717+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 191692800 unmapped: 10043392 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:51.846898+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192741376 unmapped: 8994816 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:52.847066+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b189e000/0x0/0x1bfc00000, data 0x5d8a02f/0x5f50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2771225 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192913408 unmapped: 8822784 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:53.847260+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 192913408 unmapped: 8822784 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:54.847440+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b189e000/0x0/0x1bfc00000, data 0x5d8a02f/0x5f50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.936290741s of 11.091415405s, submitted: 30
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193028096 unmapped: 8708096 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:55.847626+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193167360 unmapped: 8568832 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:56.847830+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193167360 unmapped: 8568832 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:57.847996+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2777829 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193167360 unmapped: 8568832 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:58.848203+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b183c000/0x0/0x1bfc00000, data 0x5dec43a/0x5fb2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193290240 unmapped: 8445952 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:28:59.848370+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 194338816 unmapped: 7397376 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:00.848531+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 194338816 unmapped: 7397376 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:01.848695+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 194756608 unmapped: 6979584 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:02.848838+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2779381 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b17b8000/0x0/0x1bfc00000, data 0x5e70d28/0x6036000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 194756608 unmapped: 6979584 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:03.849074+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 194756608 unmapped: 6979584 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:04.849270+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b17b2000/0x0/0x1bfc00000, data 0x5e766bd/0x603c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.903612137s of 10.071923256s, submitted: 33
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193454080 unmapped: 8282112 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:05.849484+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b177d000/0x0/0x1bfc00000, data 0x5eab593/0x6071000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193454080 unmapped: 8282112 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:06.849636+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 ms_handle_reset con 0x55d11d97b000 session 0x55d11d101c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d119a70c00
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193454080 unmapped: 8282112 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:07.849820+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 ms_handle_reset con 0x55d11918c000 session 0x55d11b805c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11d97b000
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2783005 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193552384 unmapped: 8183808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:08.850013+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193552384 unmapped: 8183808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:09.850156+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193552384 unmapped: 8183808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:10.850328+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193552384 unmapped: 8183808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:11.850557+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b1752000/0x0/0x1bfc00000, data 0x5ed67f7/0x609c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193552384 unmapped: 8183808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:12.850716+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2784353 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193552384 unmapped: 8183808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:13.850869+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193552384 unmapped: 8183808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:14.851026+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.948145866s of 10.004994392s, submitted: 13
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193552384 unmapped: 8183808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:15.851246+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193552384 unmapped: 8183808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:16.851452+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b1746000/0x0/0x1bfc00000, data 0x5ee26ae/0x60a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193552384 unmapped: 8183808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:17.851652+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2785673 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193552384 unmapped: 8183808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:18.851818+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193552384 unmapped: 8183808 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:19.852035+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193560576 unmapped: 8175616 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:20.852217+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193560576 unmapped: 8175616 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:21.852473+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193560576 unmapped: 8175616 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:22.852667+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b1746000/0x0/0x1bfc00000, data 0x5ee26ae/0x60a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2785673 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193560576 unmapped: 8175616 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:23.852843+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193560576 unmapped: 8175616 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:24.853026+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b1746000/0x0/0x1bfc00000, data 0x5ee26ae/0x60a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193560576 unmapped: 8175616 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:25.853310+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193560576 unmapped: 8175616 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:26.853509+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b1746000/0x0/0x1bfc00000, data 0x5ee26ae/0x60a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193560576 unmapped: 8175616 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:27.853685+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.628934860s of 12.628935814s, submitted: 0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2787417 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193650688 unmapped: 8085504 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:28.853859+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193650688 unmapped: 8085504 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:29.854038+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b172a000/0x0/0x1bfc00000, data 0x5efdf45/0x60c4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193650688 unmapped: 8085504 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:30.854199+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193650688 unmapped: 8085504 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:31.854344+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193650688 unmapped: 8085504 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:32.854548+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2787433 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193650688 unmapped: 8085504 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:33.854732+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b1711000/0x0/0x1bfc00000, data 0x5f171ad/0x60dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193650688 unmapped: 8085504 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:34.854843+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b1711000/0x0/0x1bfc00000, data 0x5f171ad/0x60dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193699840 unmapped: 8036352 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:35.855061+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 heartbeat osd_stat(store_statfs(0x1b16c2000/0x0/0x1bfc00000, data 0x5f64065/0x612c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193699840 unmapped: 8036352 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:36.855299+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193708032 unmapped: 8028160 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:37.855503+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2797221 data_alloc: 285212672 data_used: 20176896
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.838793755s of 10.596912384s, submitted: 26
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193937408 unmapped: 7798784 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:38.855654+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 9000.1 total, 600.0 interval
                                                          Cumulative writes: 24K writes, 90K keys, 24K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.01 MB/s
                                                          Cumulative WAL: 24K writes, 8386 syncs, 2.88 writes per sync, written: 0.07 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 42K keys, 11K commit groups, 1.0 writes per commit group, ingest: 35.98 MB, 0.06 MB/s
                                                          Interval WAL: 11K writes, 4381 syncs, 2.51 writes per sync, written: 0.04 GB, 0.06 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193937408 unmapped: 7798784 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:39.855846+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193937408 unmapped: 7798784 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:40.856019+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _renew_subs
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _send_mon_message to mon.np0005548789 at v2:172.18.0.104:3300/0
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 279 handle_osd_map epochs [280,280], i have 279, src has [1,280]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 280 handle_osd_map epochs [280,280], i have 280, src has [1,280]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 280 handle_osd_map epochs [280,280], i have 280, src has [1,280]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193945600 unmapped: 7790592 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:41.856186+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 280 heartbeat osd_stat(store_statfs(0x1b1677000/0x0/0x1bfc00000, data 0x5faf2c2/0x6176000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193945600 unmapped: 7790592 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:42.856322+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2798019 data_alloc: 285212672 data_used: 20189184
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193945600 unmapped: 7790592 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:43.856478+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 193945600 unmapped: 7790592 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:44.856667+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195125248 unmapped: 6610944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:45.856852+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 280 heartbeat osd_stat(store_statfs(0x1b164b000/0x0/0x1bfc00000, data 0x5fdbe2d/0x61a3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195125248 unmapped: 6610944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:46.856994+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195125248 unmapped: 6610944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:47.857120+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2801147 data_alloc: 285212672 data_used: 20189184
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 280 handle_osd_map epochs [281,281], i have 280, src has [1,281]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:48.857258+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195076096 unmapped: 6660096 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:49.857464+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195076096 unmapped: 6660096 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 281 heartbeat osd_stat(store_statfs(0x1b1646000/0x0/0x1bfc00000, data 0x5fde246/0x61a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:50.857692+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195076096 unmapped: 6660096 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:51.857895+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195076096 unmapped: 6660096 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:52.858102+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195084288 unmapped: 6651904 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2802981 data_alloc: 285212672 data_used: 20201472
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:53.858334+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195084288 unmapped: 6651904 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 281 heartbeat osd_stat(store_statfs(0x1b1646000/0x0/0x1bfc00000, data 0x5fde246/0x61a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:54.858495+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195084288 unmapped: 6651904 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:55.858694+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195084288 unmapped: 6651904 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 281 heartbeat osd_stat(store_statfs(0x1b1646000/0x0/0x1bfc00000, data 0x5fde246/0x61a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:56.858879+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195084288 unmapped: 6651904 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:57.859073+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195084288 unmapped: 6651904 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2802981 data_alloc: 285212672 data_used: 20201472
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:58.859225+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195084288 unmapped: 6651904 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:29:59.859404+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195084288 unmapped: 6651904 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:00.859583+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195100672 unmapped: 6635520 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 281 heartbeat osd_stat(store_statfs(0x1b1646000/0x0/0x1bfc00000, data 0x5fde246/0x61a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:01.859783+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195100672 unmapped: 6635520 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:02.859974+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195100672 unmapped: 6635520 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2802981 data_alloc: 285212672 data_used: 20201472
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:03.860148+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195100672 unmapped: 6635520 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:04.860317+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195100672 unmapped: 6635520 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:05.860536+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195100672 unmapped: 6635520 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:06.860683+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195100672 unmapped: 6635520 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 281 heartbeat osd_stat(store_statfs(0x1b1646000/0x0/0x1bfc00000, data 0x5fde246/0x61a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:07.860970+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195100672 unmapped: 6635520 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2802981 data_alloc: 285212672 data_used: 20201472
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:08.861179+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195117056 unmapped: 6619136 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 281 heartbeat osd_stat(store_statfs(0x1b1646000/0x0/0x1bfc00000, data 0x5fde246/0x61a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:09.861326+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195125248 unmapped: 6610944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:10.861495+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195125248 unmapped: 6610944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:11.861666+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195125248 unmapped: 6610944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 281 heartbeat osd_stat(store_statfs(0x1b1646000/0x0/0x1bfc00000, data 0x5fde246/0x61a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:12.861864+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195125248 unmapped: 6610944 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11de01800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 34.233470917s of 34.410053253s, submitted: 62
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2805867 data_alloc: 285212672 data_used: 20201472
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:13.861991+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195149824 unmapped: 6586368 heap: 201736192 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:14.862164+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195289088 unmapped: 11108352 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 281 handle_osd_map epochs [281,282], i have 281, src has [1,282]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 282 handle_osd_map epochs [282,282], i have 282, src has [1,282]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 282 ms_handle_reset con 0x55d11de01800 session 0x55d11bcc7c20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: handle_auth_request added challenge on 0x55d11c2b8800
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:15.862369+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195297280 unmapped: 11100160 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 282 handle_osd_map epochs [282,283], i have 282, src has [1,283]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 283 handle_osd_map epochs [283,283], i have 283, src has [1,283]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 283 handle_osd_map epochs [283,283], i have 283, src has [1,283]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 283 ms_handle_reset con 0x55d11c2b8800 session 0x55d11b80fa40
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:16.862549+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195354624 unmapped: 11042816 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 283 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe2d00/0x61af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:17.862732+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195362816 unmapped: 11034624 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2814341 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:18.862882+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195362816 unmapped: 11034624 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:19.863015+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195362816 unmapped: 11034624 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 283 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe2d00/0x61af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:20.863202+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195379200 unmapped: 11018240 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:21.863355+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195379200 unmapped: 11018240 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:22.863480+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 283 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe2d00/0x61af000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195395584 unmapped: 11001856 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2814341 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 283 handle_osd_map epochs [284,284], i have 283, src has [1,284]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.213387489s of 10.580154419s, submitted: 86
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:23.863676+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195403776 unmapped: 10993664 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:24.863843+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195411968 unmapped: 10985472 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:25.864048+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163a000/0x0/0x1bfc00000, data 0x5fe5119/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195411968 unmapped: 10985472 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:26.864308+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163a000/0x0/0x1bfc00000, data 0x5fe5119/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195411968 unmapped: 10985472 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:27.864570+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195411968 unmapped: 10985472 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163a000/0x0/0x1bfc00000, data 0x5fe5119/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2814452 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:28.864689+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195411968 unmapped: 10985472 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163a000/0x0/0x1bfc00000, data 0x5fe5119/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:29.864936+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195411968 unmapped: 10985472 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163a000/0x0/0x1bfc00000, data 0x5fe5119/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:30.865105+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195411968 unmapped: 10985472 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:31.865264+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195411968 unmapped: 10985472 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:32.865535+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195428352 unmapped: 10969088 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163a000/0x0/0x1bfc00000, data 0x5fe5119/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2814452 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:33.865702+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195428352 unmapped: 10969088 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:34.865921+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195428352 unmapped: 10969088 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:35.866527+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195428352 unmapped: 10969088 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:36.866669+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195428352 unmapped: 10969088 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163a000/0x0/0x1bfc00000, data 0x5fe5119/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:37.866960+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195420160 unmapped: 10977280 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2814452 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:38.867068+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195420160 unmapped: 10977280 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:39.867196+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195420160 unmapped: 10977280 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:40.867334+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195428352 unmapped: 10969088 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:41.867685+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 18.185007095s of 18.203737259s, submitted: 20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 ms_handle_reset con 0x55d11c2b8400 session 0x55d11a59cd20
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195739648 unmapped: 10657792 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:42.867850+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195739648 unmapped: 10657792 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:43.868006+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Got map version 55
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3354697053,v1:172.18.0.108:6811/3354697053]
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195747840 unmapped: 10649600 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:44.868190+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195747840 unmapped: 10649600 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:45.868399+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195747840 unmapped: 10649600 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:46.868633+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195747840 unmapped: 10649600 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:47.868849+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195747840 unmapped: 10649600 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:48.869050+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195756032 unmapped: 10641408 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:49.869197+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195764224 unmapped: 10633216 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:50.869351+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195764224 unmapped: 10633216 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:51.869532+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195764224 unmapped: 10633216 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:52.869869+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195764224 unmapped: 10633216 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:53.870006+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195764224 unmapped: 10633216 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:54.870147+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195764224 unmapped: 10633216 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:55.870344+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195764224 unmapped: 10633216 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:56.870490+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195805184 unmapped: 10592256 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:57.870655+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195805184 unmapped: 10592256 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:58.870866+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195805184 unmapped: 10592256 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:30:59.871005+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195805184 unmapped: 10592256 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:00.871182+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195805184 unmapped: 10592256 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:01.871419+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195805184 unmapped: 10592256 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:02.871554+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195805184 unmapped: 10592256 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:03.871720+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195805184 unmapped: 10592256 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:04.871948+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195821568 unmapped: 10575872 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:05.872185+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195821568 unmapped: 10575872 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:06.872463+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195821568 unmapped: 10575872 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:07.900699+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195821568 unmapped: 10575872 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:08.900845+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195821568 unmapped: 10575872 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:09.901015+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195821568 unmapped: 10575872 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:10.901239+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195821568 unmapped: 10575872 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:11.901447+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195821568 unmapped: 10575872 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:12.901634+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195846144 unmapped: 10551296 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:13.901851+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195846144 unmapped: 10551296 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:14.902264+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195854336 unmapped: 10543104 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:15.902441+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195854336 unmapped: 10543104 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:16.902602+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195854336 unmapped: 10543104 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:17.902857+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195854336 unmapped: 10543104 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:18.903043+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195854336 unmapped: 10543104 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:19.903243+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195854336 unmapped: 10543104 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:20.903464+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195862528 unmapped: 10534912 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:21.903657+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195862528 unmapped: 10534912 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:22.903848+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195862528 unmapped: 10534912 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:23.904984+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195862528 unmapped: 10534912 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:24.905280+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195862528 unmapped: 10534912 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:25.907021+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195862528 unmapped: 10534912 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:26.907186+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195862528 unmapped: 10534912 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:27.907566+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195862528 unmapped: 10534912 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:28.907933+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195870720 unmapped: 10526720 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:29.908094+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195870720 unmapped: 10526720 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:30.908238+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195870720 unmapped: 10526720 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:31.908398+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195870720 unmapped: 10526720 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:32.908847+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195870720 unmapped: 10526720 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:33.909223+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195870720 unmapped: 10526720 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:34.909374+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195870720 unmapped: 10526720 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:35.909686+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195870720 unmapped: 10526720 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:36.909844+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195878912 unmapped: 10518528 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:37.910012+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195878912 unmapped: 10518528 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:38.910261+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195878912 unmapped: 10518528 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:39.910524+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195878912 unmapped: 10518528 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:40.910665+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195878912 unmapped: 10518528 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:41.910802+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195878912 unmapped: 10518528 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:42.910998+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195878912 unmapped: 10518528 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:43.911175+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195878912 unmapped: 10518528 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:44.911527+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195887104 unmapped: 10510336 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:45.911853+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195887104 unmapped: 10510336 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:46.912168+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195887104 unmapped: 10510336 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:47.912307+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195895296 unmapped: 10502144 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:48.912529+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195895296 unmapped: 10502144 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:49.912679+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195895296 unmapped: 10502144 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:50.912838+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195895296 unmapped: 10502144 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:51.913051+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195895296 unmapped: 10502144 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:52.913214+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195903488 unmapped: 10493952 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:53.913398+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195903488 unmapped: 10493952 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:54.913592+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195903488 unmapped: 10493952 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:55.913829+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195903488 unmapped: 10493952 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:56.913939+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195903488 unmapped: 10493952 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:57.914084+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195911680 unmapped: 10485760 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:58.914287+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195911680 unmapped: 10485760 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:31:59.914460+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195911680 unmapped: 10485760 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:00.914611+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195919872 unmapped: 10477568 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:01.914794+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195919872 unmapped: 10477568 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:02.914930+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195919872 unmapped: 10477568 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:03.915035+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195919872 unmapped: 10477568 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:04.915235+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195919872 unmapped: 10477568 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:05.915452+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195919872 unmapped: 10477568 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:06.915615+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195919872 unmapped: 10477568 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:07.915825+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195919872 unmapped: 10477568 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:08.915964+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195928064 unmapped: 10469376 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:09.916141+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195928064 unmapped: 10469376 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:10.916335+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195928064 unmapped: 10469376 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:11.916488+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195928064 unmapped: 10469376 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:12.916639+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195928064 unmapped: 10469376 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:13.916827+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195928064 unmapped: 10469376 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:14.916964+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195928064 unmapped: 10469376 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:15.917145+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195928064 unmapped: 10469376 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:16.917272+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195936256 unmapped: 10461184 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:17.917407+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195944448 unmapped: 10452992 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:18.917555+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195944448 unmapped: 10452992 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:19.917735+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195944448 unmapped: 10452992 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:20.917964+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195944448 unmapped: 10452992 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:21.918116+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195944448 unmapped: 10452992 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:22.918234+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195944448 unmapped: 10452992 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:23.919265+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195944448 unmapped: 10452992 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:24.919427+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195952640 unmapped: 10444800 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:25.919563+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195952640 unmapped: 10444800 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:26.919674+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195952640 unmapped: 10444800 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:27.919799+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195960832 unmapped: 10436608 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:28.919918+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: osd.1 284 heartbeat osd_stat(store_statfs(0x1b163b000/0x0/0x1bfc00000, data 0x5fe532c/0x61b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,2,3,4,5] op hist [])
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: bluestore.MempoolThread(0x55d118187b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2813796 data_alloc: 285212672 data_used: 20213760
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195960832 unmapped: 10436608 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:29.920058+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: do_command 'config diff' '{prefix=config diff}'
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 196059136 unmapped: 10338304 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: do_command 'config show' '{prefix=config show}'
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: do_command 'counter dump' '{prefix=counter dump}'
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: do_command 'counter schema' '{prefix=counter schema}'
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:30.920210+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195854336 unmapped: 10543104 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:31.920351+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: prioritycache tune_memory target: 5709084876 mapped: 195870720 unmapped: 10526720 heap: 206397440 old mem: 4047415775 new mem: 4047415775
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: tick
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_tickets
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:32:32.920485+0000)
Dec 06 10:33:03 np0005548789.localdomain ceph-osd[31726]: do_command 'log dump' '{prefix=log dump}'
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/4062777244' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain rsyslogd[760]: imjournal from <localhost:ceph-osd>: begin to drop messages due to rate-limiting
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1342647983' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec 06 10:33:03 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3932188867' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 06 10:33:04 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:33:04.177 282197 DEBUG oslo_service.periodic_task [None req-fd40418a-ae44-4351-b144-9b83228b14b4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2348448191' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1182164294' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/294876320' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2422302273' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1710489525' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: pgmap v833: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/4062777244' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1935362394' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2953662664' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1342647983' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2985416432' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/321501752' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3932188867' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/908175058' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3193103153' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2348448191' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1689303103' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1182164294' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3268197787' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec 06 10:33:04 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/946269881' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1230936092' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1464918040' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/77878866' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2370934995' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/730220607' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3268197787' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/946269881' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/4280751732' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2574211342' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/4154228913' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2389094995' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1798206855' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1230936092' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1464918040' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1107750515' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 06 10:33:05 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.
Dec 06 10:33:05 np0005548789.localdomain systemd[1]: Starting Hostname Service...
Dec 06 10:33:05 np0005548789.localdomain systemd[1]: tmp-crun.ClUwjk.mount: Deactivated successfully.
Dec 06 10:33:05 np0005548789.localdomain podman[344693]: 2025-12-06 10:33:05.882090886 +0000 UTC m=+0.070544596 container health_status 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:33:05 np0005548789.localdomain podman[344693]: 2025-12-06 10:33:05.889949127 +0000 UTC m=+0.078402827 container exec_died 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:33:05 np0005548789.localdomain systemd[1]: 44e8baf5fd6841d0aefb15fe2359de24b840fa2fbe07d09ba5f718006ae6ace6.service: Deactivated successfully.
Dec 06 10:33:05 np0005548789.localdomain systemd[1]: Started Hostname Service.
Dec 06 10:33:06 np0005548789.localdomain ceph-mon[298582]: from='client.50115 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:06 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3451307951' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 06 10:33:06 np0005548789.localdomain ceph-mon[298582]: from='client.69737 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:06 np0005548789.localdomain ceph-mon[298582]: from='client.59539 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:06 np0005548789.localdomain ceph-mon[298582]: from='client.59533 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:06 np0005548789.localdomain ceph-mon[298582]: pgmap v834: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:33:06 np0005548789.localdomain ceph-mon[298582]: from='client.50127 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:06 np0005548789.localdomain ceph-mon[298582]: from='client.50124 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:06 np0005548789.localdomain ceph-mon[298582]: from='client.69758 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:06 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/322497280' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 06 10:33:06 np0005548789.localdomain ceph-mon[298582]: from='client.59560 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:06 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec 06 10:33:06 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/488060457' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "versions"} v 0)
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3447850753' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 06 10:33:07 np0005548789.localdomain sshd[344889]: main: sshd: ssh-rsa algorithm is disabled
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: from='client.59566 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: from='client.50136 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: from='client.50139 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: from='client.69773 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: from='client.69779 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: from='client.59572 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: from='client.50145 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: from='client.69785 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: from='client.59587 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: from='client.69791 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/488060457' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: from='client.50157 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1050406131' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3447850753' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec 06 10:33:07 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2289626626' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.917 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'name': 'test', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005548789.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'hostId': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.922 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2079a91c-a7a7-489b-b349-ddee7e42cd0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:33:07.918463', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'f4973dba-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.167960843, 'message_signature': '6390229aef43ccc8d76e3b13025f3f8db25e1b58a429a9daa23c8837717051d9'}]}, 'timestamp': '2025-12-06 10:33:07.922895', '_unique_id': 'd6ec8b0e97424b22a12e9a454d157b20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.924 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc1877e6-6cb6-4bf7-8f81-3ebb003f1c4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:33:07.925150', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'f497a2a0-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.167960843, 'message_signature': '3b317b0c2168ecd8153c03893186606b0885659737bfe1dbd7b5b3459dbb5b95'}]}, 'timestamp': '2025-12-06 10:33:07.925376', '_unique_id': 'bbded9dae191422e940e36cdca81f6d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.925 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.951 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.951 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ec122aa-cdf1-43bc-b54f-d15a015e85ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:33:07.926390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f49baa12-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.175716991, 'message_signature': 'a37bd2b814f932e4108f2ecdd1cbeb03d02180cf36c40b728d2e1abd3ca9ea2f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:33:07.926390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f49bb4a8-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.175716991, 'message_signature': '1c69b95c818fbb9e7ca85a4fa30cffe2479fd65f09ca11c433582897be7dd6a4'}]}, 'timestamp': '2025-12-06 10:33:07.952045', '_unique_id': 'bb9e522dc5b74625a441da327af984be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.952 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.953 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.962 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.962 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd8d76e6-42df-46a8-9be1-f7d7156d4991', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:33:07.953302', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f49d4d68-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.202635707, 'message_signature': '72e2a074d6ffe7b6d6d6a66334bfec0ab32c877693b0353256a6c9fc9c2d3de4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:33:07.953302', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f49d55c4-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.202635707, 'message_signature': '7f06fcb16533fe87771514f6c29ad56975a4f3fdad893bdc358f93a03ae1a287'}]}, 'timestamp': '2025-12-06 10:33:07.962720', '_unique_id': '87c3b7da077d40af8b6c96b26c476815'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.963 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '406cd6f8-cb02-4c42-bc49-f2f8b54dc1c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:33:07.963945', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'f49d8dd2-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.167960843, 'message_signature': '71e18a0f2e96ca281f1cc840ac6e1676629cea89a85ad5759ddd3d71ecdfa053'}]}, 'timestamp': '2025-12-06 10:33:07.964160', '_unique_id': '9f2c6357e8734183a57196c272e07132'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.964 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8aaeffd-3cc5-4cdf-a7cb-edc87f5ffab8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:33:07.965116', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'f49dbb72-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.167960843, 'message_signature': '9f6897dabba597a80c270e14c83989dcaacf56f12a5b20296e4958d56e857b9a'}]}, 'timestamp': '2025-12-06 10:33:07.965327', '_unique_id': 'ec5d5c713f9f42d2a007880628fcc1db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.965 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.966 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.966 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.966 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2977fb00-6d3b-4d24-a5d3-aca4e43c5848', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:33:07.966321', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f49dea66-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.175716991, 'message_signature': 'e00dc73c324b337b0c0de56615a235775d218884b7a737996342c8796f07822a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:33:07.966321', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f49df164-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.175716991, 'message_signature': '0617b1fc084bec6d730fbdcbbdcd91169d1c5334fbd45ec651f7d49576ee684c'}]}, 'timestamp': '2025-12-06 10:33:07.966693', '_unique_id': '3664de32f51b4872bc6cf417e4e01cd3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.967 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05efe716-2b0a-44b0-8d67-ca1c8c94b16f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:33:07.967677', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f49e1fe0-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.175716991, 'message_signature': '365e973298a4f85ea899f57d679b04da4199bc2744614edd7dfcfd6d086e77ce'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:33:07.967677', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f49e26f2-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.175716991, 'message_signature': 'a61446d22fcff3591d517d5370e9bfd5ad3ada67670fb19a4e1df5760dbd596e'}]}, 'timestamp': '2025-12-06 10:33:07.968064', '_unique_id': 'c47904d65dac42d982c3af4c64ee2f85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.968 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57be3955-7906-4ee4-8104-491d275e8bc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:33:07.969019', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'f49e53ca-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.167960843, 'message_signature': '069d4fc5a5d403d4c6f6cc03c1bcd8fd05498aadd7f2da5d03c22915be41b15c'}]}, 'timestamp': '2025-12-06 10:33:07.969225', '_unique_id': 'cd30e09823184f75abd361c0bf8af891'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.969 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a10c98b-dcaa-41b4-a5d8-9342c0174679', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:33:07.970158', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'f49e8052-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.167960843, 'message_signature': '16b1fc6f8f8dbce4a3b01abfbbe529b9aa78f68f8f95c5c0f89f2fa342c6eeae'}]}, 'timestamp': '2025-12-06 10:33:07.970366', '_unique_id': 'bd36801865c14d6aa04c112de2e05a41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.970 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.971 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.971 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.971 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.971 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 1252245154 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.971 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.latency volume: 27668224 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2af4b501-d7a1-4851-a12a-80ff9e20da67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1252245154, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:33:07.971469', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f49eb374-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.175716991, 'message_signature': '1b6039228f8d86eb08acc63a8d56114ed28cb3cec7c221f14ce3f693817c1519'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27668224, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:33:07.971469', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f49ebba8-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.175716991, 'message_signature': '4667d8ecbd5eb7ac1e4b550ea08ca555991f3c26acb1ebecd6c62a609c3b6b82'}]}, 'timestamp': '2025-12-06 10:33:07.971875', '_unique_id': '05322ce1070945f0ab9d48b353ce24c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.972 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f4335c5-0688-4454-8363-0a727e8b950e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:33:07.972851', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f49ee984-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.175716991, 'message_signature': '618ab98cde4edccffe7ee8eec689ac314fe214c194ad55d73e3b674bf51c0f1c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:33:07.972851', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f49ef0a0-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.175716991, 'message_signature': '43166610d2cb386fccaf66c34623f63856c53064096e3a9b34ba82f4048f69a7'}]}, 'timestamp': '2025-12-06 10:33:07.973230', '_unique_id': '999a491f6de54d488a8b0ee44861cf88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.973 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.974 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.974 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.974 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.974 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83867794-d042-4896-8d8e-439554be3fe2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:33:07.974437', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f49f2764-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.202635707, 'message_signature': '12ac13ce7f0337ce7b1edd7966a1800829f75d085a7438c35e4526a0ca80932f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:33:07.974437', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f49f2eb2-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.202635707, 'message_signature': '3791d284078a607de30a8f11296047111f21fee5bab5e628949c22aba1a5ca39'}]}, 'timestamp': '2025-12-06 10:33:07.974856', '_unique_id': '59ad2d25ba814f3e9ffa332b64dd76b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.975 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.987 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/cpu volume: 21940000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2887edd-88b9-448b-a997-98deb59c01fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21940000000, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:33:07.975851', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'f4a11e48-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.236519286, 'message_signature': 'f1e18032799605d39c7761d2351f44e523d180ff348913e81dab4c3306350fea'}]}, 'timestamp': '2025-12-06 10:33:07.987522', '_unique_id': '89b89489fb324723bf3398fa87081774'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.988 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d84bedb-d655-4892-b10b-1113aea15c31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:33:07.988562', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'f4a14f44-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.167960843, 'message_signature': '6f2b5e123388e6a05342a666bfcd47a3d3410763c36ae22af32e0cb82ecca02a'}]}, 'timestamp': '2025-12-06 10:33:07.988785', '_unique_id': '54b5f37804ae47fcbace18e101fb4708'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.989 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a4ceef8-e90b-46d3-b607-1ac59f352bba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:33:07.989715', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'f4a17cda-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.167960843, 'message_signature': 'b1af47741f3aff1b51a3452f6a1c0859714f1239e845f4cccc9c3b9a4b2b8e72'}]}, 'timestamp': '2025-12-06 10:33:07.989938', '_unique_id': 'dfe4a260673641a0bf0b438dc3e418c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.990 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62899e4e-81c4-418d-acfd-e3f6314c9670', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:33:07.990869', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'f4a1a94e-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.167960843, 'message_signature': '4f411403eba2ab17b9b73c65e35aa35f3b3ff9bcc209c85866ad46a071825837'}]}, 'timestamp': '2025-12-06 10:33:07.991074', '_unique_id': 'e486577734004a318f4203c1b2a41b5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.991 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/memory.usage volume: 51.80859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b205d0f-1b87-4999-9c39-d78f3759a5dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.80859375, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'timestamp': '2025-12-06T10:33:07.992023', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'f4a1d63a-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.236519286, 'message_signature': '2dbe0fdefb5a2c64e4c6bc2def12ab81589136bb01441d766a770d594aa693ac'}]}, 'timestamp': '2025-12-06 10:33:07.992217', '_unique_id': 'b30c366237d042dfb5dccee607b14267'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.992 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 1525105336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.read.latency volume: 106716064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78e46dd8-0cbb-4a60-a33e-3bf50bb0881e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1525105336, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:33:07.993141', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f4a201fa-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.175716991, 'message_signature': '0af236e015ecd7e026a60c0d3699cea534b3f586d086f0a4f61ee6b505827cc9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106716064, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:33:07.993141', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f4a208ee-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.175716991, 'message_signature': '0bd668f96d14f4d3554f6abe7f88add944ca61602861141390969d7dfdd8922c'}]}, 'timestamp': '2025-12-06 10:33:07.993510', '_unique_id': '6907714a3c904ce1931058dce41073ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.993 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.994 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.994 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.994 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab10d4af-02b9-4908-ab5c-30c642ea07c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vda', 'timestamp': '2025-12-06T10:33:07.994567', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f4a23af8-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.202635707, 'message_signature': 'fdb1e42a40d089b309f6097efd36eee1a213288ae19810076b9529b39792da21'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 
'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-vdb', 'timestamp': '2025-12-06T10:33:07.994567', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f4a244a8-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.202635707, 'message_signature': '5c5d91372d810a4de8edd6fb16e95b9dd3b3893ab4c8c24a99741b2173497440'}]}, 'timestamp': '2025-12-06 10:33:07.995040', '_unique_id': '60030d69cf724d67ad5da20c000395a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.995 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 DEBUG ceilometer.compute.pollsters [-] b7ed0a2e-9350-4933-9334-4e5e08d3e6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da2de8de-25d0-4a66-acba-850ed2aad55c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ff0049f3313348bdb67886d170c1c765', 'user_name': None, 'project_id': '3d603431c0bb4967bafc7a0aa6108bfe', 'project_name': None, 'resource_id': 'instance-00000002-b7ed0a2e-9350-4933-9334-4e5e08d3e6aa-tap86fc0b7a-fb', 'timestamp': '2025-12-06T10:33:07.996064', 'resource_metadata': {'display_name': 'test', 'name': 'tap86fc0b7a-fb', 'instance_id': 'b7ed0a2e-9350-4933-9334-4e5e08d3e6aa', 'instance_type': 'm1.small', 'host': '2da4d478d8e46de98c9126bc3117f8e5f52968a2b8b7dab159eddaf1', 'instance_host': 'np0005548789.localdomain', 'flavor': {'id': '3b9dcd46-fa1b-4714-ba2b-665da2f67af6', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'e0d06706-da90-478a-9829-34b75a3ce049'}, 'image_ref': 'e0d06706-da90-478a-9829-34b75a3ce049', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:64:77:f3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap86fc0b7a-fb'}, 'message_id': 'f4a27428-d28e-11f0-aaf2-fa163e118844', 'monotonic_time': 13606.167960843, 'message_signature': 'b467f9a22d63709870dc2686fc81554754a91e52f8775ac6ec42104161e5a18b'}]}, 'timestamp': '2025-12-06 10:33:07.996266', '_unique_id': '69ed200d2c2f4c7486d88d71a615368a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     yield
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 06 10:33:07 np0005548789.localdomain ceilometer_agent_compute[238351]: 2025-12-06 10:33:07.996 12 ERROR oslo_messaging.notify.messaging 
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1041013098' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.69803 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.59605 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.69815 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.50175 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.59617 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: pgmap v835: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/375595248' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1050443617' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.50187 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/2289626626' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.69827 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.59629 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/730980905' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1525022835' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1041013098' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/2525006127' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 06 10:33:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:33:08.475 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 06 10:33:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:33:08.480 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:33:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:33:08.480 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:33:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:33:08.481 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:33:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:33:08.482 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 06 10:33:08 np0005548789.localdomain nova_compute[282193]: 2025-12-06 10:33:08.485 282197 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:08 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:08 np0005548789.localdomain sshd[344889]: Received disconnect from 14.194.101.210 port 36972:11: Bye Bye [preauth]
Dec 06 10:33:08 np0005548789.localdomain sshd[344889]: Disconnected from authenticating user root 14.194.101.210 port 36972 [preauth]
Dec 06 10:33:08 np0005548789.localdomain systemd[1]: Started /usr/bin/podman healthcheck run d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.
Dec 06 10:33:08 np0005548789.localdomain systemd[1]: tmp-crun.HX2QZr.mount: Deactivated successfully.
Dec 06 10:33:08 np0005548789.localdomain podman[345139]: 2025-12-06 10:33:08.905644382 +0000 UTC m=+0.091088465 container health_status d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 06 10:33:08 np0005548789.localdomain podman[345139]: 2025-12-06 10:33:08.946307391 +0000 UTC m=+0.131751474 container exec_died d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 06 10:33:08 np0005548789.localdomain systemd[1]: d25404de82bf33c2d8ac6199349771ccea807f639a7289be7b7e445bb6613538.service: Deactivated successfully.
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "config dump"} v 0)
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/759059760' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3853367064' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2625254279' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/759059760' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/41112147' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec 06 10:33:09 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/995061289' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 06 10:33:10 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "df"} v 0)
Dec 06 10:33:10 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/509212967' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 06 10:33:10 np0005548789.localdomain ceph-mon[298582]: from='client.59683 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:10 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:33:10 np0005548789.localdomain ceph-mon[298582]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:33:10 np0005548789.localdomain ceph-mon[298582]: pgmap v836: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:33:10 np0005548789.localdomain ceph-mon[298582]: from='client.50256 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:10 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/995061289' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 06 10:33:10 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2611527651' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 06 10:33:10 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3002096115' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 06 10:33:10 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/509212967' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 06 10:33:10 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "fs dump"} v 0)
Dec 06 10:33:10 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3948867050' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 06 10:33:11 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "fs ls"} v 0)
Dec 06 10:33:11 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1936156670' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 06 10:33:11 np0005548789.localdomain kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 06 10:33:11 np0005548789.localdomain kernel: cfg80211: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 06 10:33:11 np0005548789.localdomain kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 06 10:33:11 np0005548789.localdomain kernel: cfg80211: failed to load regulatory.db
Dec 06 10:33:11 np0005548789.localdomain ceph-mon[298582]: from='client.69911 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:11 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/1083196786' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 06 10:33:11 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/1474627565' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 06 10:33:11 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/3948867050' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 06 10:33:11 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/3836096139' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 06 10:33:11 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/2126088694' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 06 10:33:11 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1936156670' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 06 10:33:12 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mds stat"} v 0)
Dec 06 10:33:12 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1607197834' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Dec 06 10:33:12 np0005548789.localdomain ceph-mon[298582]: pgmap v837: 177 pgs: 177 active+clean; 235 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Dec 06 10:33:12 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.106:0/356022594' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 06 10:33:12 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3406386216' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 06 10:33:12 np0005548789.localdomain ceph-mon[298582]: from='client.59728 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:33:12 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.108:0/3352183805' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 06 10:33:12 np0005548789.localdomain ceph-mon[298582]: from='client.? 172.18.0.107:0/1607197834' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Dec 06 10:33:12 np0005548789.localdomain ceph-mon[298582]: mon.np0005548789@1(peon) e15 handle_command mon_command({"prefix": "mon dump"} v 0)
Dec 06 10:33:12 np0005548789.localdomain ceph-mon[298582]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1106523137' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
